The Color Dependence of Interstellar Polarization

1965 ◽  
Vol 7 ◽  
pp. 45-50
Author(s):  
Alfred Behr

Measurements of the interstellar polarization of starlight at different wavelengths have been reported by several observers in references 1 to 8. With a few exceptions the wavelength range of the measurements is still relatively small, and up to now only a few dozen stars have been measured with high precision. There is still a remarkable discrepancy between the results of different observers, as can be seen in figure 1. The scatter of the individual values is significantly larger than the probable errors given by the respective authors. One of the most serious error sources seems to be the instrumental polarization, which is itself highly color dependent (see ref. 9). Nevertheless, even if a much larger individual error is assumed, a remarkable conclusion can be drawn: the wavelength dependence of polarization is not the same for all stars.

1965 ◽  
Vol 7 ◽  
pp. 57-64
Author(s):  
Thomas Gehrels

The wavelength dependence of polarization as observed in 32 stars, for which the Henry Draper numbers are given, is shown in figure 1. Details of some of these observations are presented in reference 1. The equipment is now being used with the new 154-cm Catalina reflector of the Lunar and Planetary Laboratory at the University of Arizona. The instrumental polarizations are nearly zero. The data processing and observing techniques have been further improved; the precision is mainly determined by statistics, such that the internal probable error in the percentage polarization is ±0.03 percent (±0.0006 magnitude) for a half-hour observation per filter on objects brighter than about 7 magnitudes. The wavelength λ ranges from 0.33 to 0.95 μ, covered by seven filters with bandwidths of about 0.05 μ. The wavelength range is being extended to 1.2, 1.6, and 2.2 μ and, with high-altitude ballooning, to 0.28 and 0.22 μ.


1998 ◽  
Vol 11 (1) ◽  
pp. 583-583
Author(s):  
S. Röser ◽  
U. Bastian ◽  
K.S. de Boer ◽  
E. Høg ◽  
E. Schilbach ◽  
...  

DIVA (Double Interferometer for Visual Astrometry) is a Fizeau interferometer on a small satellite. It will perform astrometric and photometric observations of at least 4 million stars. A launch in 2002 and a minimum mission length of 24 months are planned. A detailed description of the experiment can be obtained from the DIVA homepage at http://www.aip.de:8080/~dso/diva. An overview is given by Röser et al., 1997. The limiting magnitude of DIVA is about V = 15 for spectral types earlier than M0, but is about V = 17.5 for stars later than M5. Table 1 gives a short overview of DIVA's performance. DIVA will carry out a sky survey complete to V = 12.5. For the first time this survey will comprise precise photometry in at least 8 bands in the wavelength range from 400 to 1000 nm. DIVA will improve parallaxes by a factor of 3 compared to Hipparcos, and proper motions by at least a factor of 2 and, in combination with the Hipparcos observations, by a factor of 10 for Hipparcos stars. At least 30 times as many stars as Hipparcos will be observed, and in doing so DIVA will fill the gap in observations between Hipparcos and GAIA. DIVA's combined astrometric and photometric measurements of high precision will have important impacts on astronomy and astrophysics in the next decade.


Complacency potential is an important measure for avoiding performance errors, such as neglecting to detect a system failure. This study updates and expands upon Singh, Molloy, and Parasuraman's 1993 Complacency-Potential Rating Scale (CPRS). We updated and expanded the CPRS questions to include technology commonly used today and how frequently that technology is used. The goals of our study were to update the scale, analyze it for factor shifts and internal consistency, and explore correlations between the individual values for each factor and the frequency-of-use questions. We hypothesized that the factors would not shift from the original and the revised CPRS's four subscales. Our research found that the revised CPRS consisted of only three subscales, with the following Cronbach's alpha values: Confidence, 0.599; Safety/Reliability, 0.534; and Trust, 0.201. Correlations between the subscales, the revised complacency potential, and the frequency-of-use questions are also discussed.
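The internal-consistency figures quoted above are Cronbach's alpha values. As a minimal sketch (using simulated Likert responses, not the study's data), alpha follows directly from the per-item and summed-scale variances:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = items.shape[1]                         # items in the subscale
    item_vars = items.var(axis=0, ddof=1)      # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 5-point Likert responses for a hypothetical 4-item subscale:
# correlated items (a shared "base" attitude plus per-item noise).
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(30, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(30, 4)), 1, 5).astype(float)
print(round(cronbach_alpha(scores), 3))
```

Items that all track the same latent attitude drive alpha toward 1, while nearly independent items drive it toward 0, which is why the Trust subscale's 0.201 signals weak internal consistency.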


During the latter part of 1902 and the early months of 1903 I resolved to take as many observations of the rates of dissipation of positive and negative electric charges as possible, and to continue them over the whole 24 hours of the day and, when opportunity offered, over longer periods. There appeared to be little information regarding the rate of dispersion during the night hours. At about the same time that these observations were being made, Nilsson was doing similar work at Upsala, and found a noticeable maximum value for atmospheric conductivity at about midnight. The observations were made on the Canterbury Plains of New Zealand, at a station about 20 feet above sea-level and about five miles due west from the sea coast. The apparatus used was Elster and Geitel's Zerstreuungsapparat, and the formula of reduction used was that given by them, viz.: E = (1/t) log(V₀/V) − (n/t′) log(V′₀/V′). In this formula E is proportional to the conductivity of the gas surrounding the instrument, for positive or negative charges as the case may be. The constant n, the ratio of the capacity without the cylinder to the capacity with the cylinder, was determined by me to be 0.47, as the instrument was always used with the protecting cover. The cover was always at one height above the base of the instrument, and was set so as to be as nearly co-axial with the discharging cylinder as could be judged by eye. No attempt was made to determine the actual capacity of the condenser cylinder and protecting cover, which would be a somewhat variable quantity owing (1) to the differences on different days in attempting to cause the two to be co-axial; (2) to a certain amount of looseness in the fit of the shank of the cylinder in its hole. The value given above for n is the mean of several determinations made with different settings of the cover and cylinder. The individual values of n varied over about 0.03.
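As a hedged sketch of the reduction formula (variable names and the sample readings are illustrative assumptions, not values from the paper; the logarithm is taken as natural, as is usual for an exponential fall of potential):

```python
import math

def dissipation_coefficient(t, v0, v, n, t_leak, v0_leak, v_leak):
    """Elster-Geitel reduction: E = (1/t) log(V0/V) - (n/t') log(V'0/V').

    The first term is the observed fall of potential with the charged
    cylinder in place; the second subtracts the instrument's own leakage,
    scaled by the capacity ratio n (0.47 in the text). E is proportional
    to the conductivity of the surrounding gas.
    """
    return (1.0 / t) * math.log(v0 / v) - (n / t_leak) * math.log(v0_leak / v_leak)

# Hypothetical readings: potential falls from 200 to 150 V in 10 minutes
# with the cylinder, and from 200 to 196 V in 10 minutes from leakage alone.
E = dissipation_coefficient(10.0, 200.0, 150.0, 0.47, 10.0, 200.0, 196.0)
print(round(E, 5))
```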


GEOMATICA ◽  
2014 ◽  
Vol 68 (3) ◽  
pp. 183-194 ◽  
Author(s):  
M. Leslar ◽  
B. Hu ◽  
J.G. Wang

The understanding of the effects of error on Mobile Terrestrial LiDAR (MTL) point clouds has not kept pace with their popularity. In this study, comprehensive error analyses based on error propagation theory and a global sensitivity study were carried out to quantitatively describe the effects of various error sources in an MTL system on the point cloud. Two scenarios were envisioned: the first used the uncertainties for measurement and calibration variables that are normally expected for MTL systems as they exist today, and the second used an ideal situation in which the measurement and calibration values have been well adjusted. It was found that the highest proportion of error in the point cloud can be attributed to the boresight and lever-arm parameters for MTL systems calibrated using non-rigorous methods. In particular, under loosely controlled error conditions, the LiDAR-to-INS Z lever arm and the LiDAR-to-INS roll angle contributed more error to the output point cloud than any other parameter, including the INS position. Under tightly controlled error conditions, the INS position became the dominant source of error in the point cloud. In addition, conditional variance analysis showed that the majority of the error in a point cloud can be attributed to the individual variables; errors caused by interactions between the diverse variables are minimal and can be regarded as insignificant.
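A minimal sketch of the first-order error propagation idea used in such analyses (the toy coordinate model and parameter names below are placeholders, not the authors' MTL geolocation equations):

```python
import numpy as np

def first_order_contributions(f, means, sigmas, h=1e-6):
    """First-order error propagation: each input's contribution to the
    output variance is (df/dx_i)^2 * sigma_i^2, with the partial
    derivative evaluated numerically at the mean point."""
    means = np.asarray(means, float)
    contrib = []
    for i, s in enumerate(sigmas):
        up = means.copy()
        dn = means.copy()
        up[i] += h
        dn[i] -= h
        dfdx = (f(up) - f(dn)) / (2 * h)
        contrib.append(dfdx ** 2 * s ** 2)
    return np.array(contrib)

# Toy model of one point coordinate: range r, a roll-like angle a (rad),
# and a lever-arm offset z -- purely illustrative.
def point_z(x):
    r, a, z = x
    return r * np.sin(a) + z

c = first_order_contributions(point_z, [50.0, 0.02, 0.1],
                              [0.01, 0.0005, 0.005])
shares = c / c.sum()   # fraction of output variance per parameter
```

Even in this toy, the small angular uncertainty dominates because it is multiplied by the long range, which mirrors the abstract's finding that boresight angles and lever arms can outweigh the INS position error.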


2021 ◽  
pp. 002202212110447
Author(s):  
Plamen Akaliyski ◽  
Christian Welzel ◽  
Michael Harris Bond ◽  
Michael Minkov

Nations have been questioned as meaningful units for analyzing culture due to their allegedly limited variance-capturing power and large internal heterogeneity. Against this skepticism, we argue that culture is by definition a collective phenomenon and focusing on individual differences contradicts the very concept of culture. Through the “miracle of aggregation,” we can eliminate random noise and arbitrary variation at the individual level in order to distill the central cultural tendencies of nations. Accordingly, we depict national culture as a gravitational field that socializes individuals into the orbit of a nation’s central cultural tendency. Even though individuals are also exposed to other gravitational forces, subcultures in turn gravitate within the limited orbit of their national culture. Using data from the World Values Survey, we show that individual values cluster in concentric circles around their nation’s cultural gravity center. We reveal the miracle of aggregation by demonstrating that nations capture the bulk of the variation in the individuals’ cultural values once they are aggregated into lower-level territorial units such as towns and sub-national regions. We visualize the gravitational force of national cultures by plotting various intra-national groups from five large countries that form distinct national clusters. Contrary to many scholars’ intuitions, alternative social aggregates, such as ethnic, linguistic, and religious groups, as well as diverse socio-demographic categories, add negligible explained variance to that already captured by nations.
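A minimal simulated illustration of the "miracle of aggregation" (synthetic data, not the World Values Survey): individual values are noisy draws around national means, yet group averages recover those centers, and an eta-squared decomposition shows how much variance the national grouping captures:

```python
import numpy as np

# Synthetic illustration: 20 "nations", 500 "individuals" each, with
# within-nation noise twice the between-nation spread.
rng = np.random.default_rng(1)
nation_means = rng.normal(0.0, 1.0, size=20)
samples = nation_means[:, None] + rng.normal(0.0, 2.0, size=(20, 500))

# Aggregation averages out individual noise and recovers the centers.
group_means = samples.mean(axis=1)

# Share of total individual-level variance captured by nation (eta-squared).
n = samples.shape[1]
grand = samples.mean()
ss_between = n * ((group_means - grand) ** 2).sum()
ss_total = ((samples - grand) ** 2).sum()
eta_sq = ss_between / ss_total
```

With these settings nations capture only a modest share of individual-level variance, yet the aggregated means track the true national centers closely, which is the article's point about central cultural tendencies surviving individual noise.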


2020 ◽  
Vol 37 (4) ◽  
pp. 687-703 ◽  
Author(s):  
Michael Schlundt ◽  
J. Thomas Farrar ◽  
Sebastien P. Bigorre ◽  
Albert J. Plueddemann ◽  
Robert A. Weller

Abstract: The comparison of equivalent neutral winds obtained from (i) four WHOI buoys in the subtropics and (ii) scatterometer estimates at those locations reveals a root-mean-square (RMS) difference of 0.56–0.76 m s⁻¹. To investigate this RMS difference, different buoy wind error sources were examined. These buoys are particularly well suited to examine two important sources of buoy wind errors because 1) redundant anemometers and a comparison with numerical flow simulations allow us to quantitatively assess flow distortion errors, and 2) 1-min sampling at the buoys allows us to examine the sensitivity of buoy temporal sampling/averaging in the buoy–scatterometer comparisons. The interanemometer difference varies as a function of wind direction relative to the buoy wind vane and is consistent with the effects of flow distortion expected from numerical flow simulations. Comparison between the anemometers and scatterometer winds supports the interpretation that the interanemometer disagreement, which can be up to 5% of the wind speed, is due to flow distortion. These insights motivated an empirical correction to the individual anemometer records, and the subsequent comparison with scatterometer estimates shows good agreement.
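A hedged sketch of the comparison metric and the kind of direction-dependent correction described above (the cosine form, amplitude, and phase are illustrative assumptions, not the authors' fitted correction):

```python
import numpy as np

def rms_difference(buoy, scat):
    """Root-mean-square (RMS) difference between collocated
    wind-speed series (m/s)."""
    d = np.asarray(buoy, float) - np.asarray(scat, float)
    return float(np.sqrt(np.mean(d ** 2)))

def correct_flow_distortion(speed, rel_dir_deg, amp=0.05, phase_deg=0.0):
    """Toy multiplicative correction: flow distortion is modeled as a
    speed bias of up to `amp` (5%, the figure quoted above) varying with
    wind direction relative to the buoy vane."""
    theta = np.radians(np.asarray(rel_dir_deg, float) - phase_deg)
    return np.asarray(speed, float) / (1.0 + amp * np.cos(theta))
```

Applying such a correction to each anemometer record before recomputing the RMS difference is the workflow the abstract describes.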


Author(s):  
Т.А. Нестик

The results of an empirical study (N = 1600) on the attitudes of Russians toward new technologies are presented. It is shown that a positive individual attitude toward new technologies is supported by values of openness to change and is negatively associated with values of conservation. We clarify the relationship between the cognitive, affective, and behavioral components of personal attitudes toward new technologies. We identified several socio-psychological types of assessment of new technologies ("indifferent", "relying on social support", "discerning", "pragmatists", "safety-oriented"), as well as socio-psychological types of personal attitudes toward new technologies ("technophiles", "anxious supporters of technical progress", "technophobes", and "indifferent to technology").
The socio-psychological predictors of techno-optimism, technophobia, technophilia and willingness to use new technologies were identified. Based on the research carried out, it can be concluded that technophilia and technophobia are not opposite poles of the same scale, but different phenomena related to each other.


The place of the state in the theory of shocks is predetermined by the increasing importance of the subjective component of the processes of self-movement of systemic integrities. The main problem is that the state formalizes the actions of subjects as economic agents, abstracting from the social conditions that generate the individual values a person implements in the economy. So, the economic subject acquires its individual values in a society with sharp polarization of citizens' incomes, inequality of opportunity, a shrinking middle class, and an ineffective public healthcare system, as demonstrated by the coronavirus pandemic. As a result, a fundamental problem arises: the discrepancy between society and economy, and between the formal and informal institutions that predetermine the opportunistic behavior of economic subjects. Thus, the state persistently strives for financial stability in the economy while abstracting from the problems of social disunity.


Author(s):  
Steven Kim

We pass judgment on the value of scientific results. For example, we may feel that the periodic table of elements is of greater significance than the understanding that hydrogen is lighter than oxygen. For one thing, the latter fact may be deduced from an understanding of the periodic table. In this way, we may ascribe a measure of quality to an idea. The quality of an idea is subjective and cannot be assigned an absolute value. However, it is possible to give a partial ordering and claim that idea A is of higher, lower, or equal worth in relation to idea B. The judgment of the relative importance of ideas is made routinely, for example, by an instructor in delivering a lecture or writing a book. The assignment of value occurs implicitly in the selection of topics and their relative emphasis. Once we admit a preference ordering among ideas, we may also assign an arbitrary numerical scale to them. This practice is standard in the field of economics, where a preference ordering among goods suggests a measure of utility. Since each consumer has individual tastes and needs, the resulting utility function varies from one person to another even for the same basket of goods. Further, the preferences are subjective and relative, rather than absolute. As a result, the level of utility can be based only on a conceptual scale. The basic measure of utility is an arbitrary unit called a util. In a similar way, we may assign a quality metric in terms of a granular unit of a qual. A person may assign a particular set of quals to a portfolio of ideas based on his own tastes and predilections. A second person may offer a completely different set of quality values. This conception of an individual ordering of ideas is consistent with the view of difficulty and creativity as relative rather than absolute parameters. To pursue this line of reasoning, we may also speak of the combined quality of two or more ideas.
The value of a set of ideas may be greater than, equal to, or less than the sum of the individual values.
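A minimal sketch of the qual bookkeeping described above (the function names and the additive-plus-synergy form are illustrative assumptions, not the author's formalism):

```python
def compare(qual_a, qual_b):
    """Partial ordering of two ideas by the quals one rater assigns them."""
    if qual_a > qual_b:
        return "higher"
    if qual_a < qual_b:
        return "lower"
    return "equal"

def combined_quality(quals, synergy=0.0):
    """Total worth of a set of ideas: the sum of the individual quals
    plus a synergy term, which may be positive (ideas reinforce each
    other), zero, or negative (ideas overlap), so the whole can be
    greater than, equal to, or less than the sum of the parts."""
    return sum(quals) + synergy
```

A second rater would simply supply a different set of quals, reproducing the text's point that the ordering is subjective and relative rather than absolute.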

