Charting nearby dust clouds using Gaia data only

2019 ◽  
Vol 631 ◽  
pp. A32 ◽  
Author(s):  
R. H. Leike ◽  
T. A. Enßlin

Aims. Highly resolved maps of the local Galactic dust are an important ingredient for sky emission models. Over almost the whole electromagnetic spectrum one can see imprints of dust, many of which originate from dust clouds within 300 pc. A detailed 3D reconstruction of these local dust clouds enables dedicated studies, helps to quantify their impact on other observables, and is a necessary milestone for larger reconstructions, as every sightline to more distant objects passes through the local dust. Methods. To infer the dust density we use parallax and extinction estimates published by the Gaia collaboration in their second data release (DR2). We model the dust as a log-normal process using a hierarchical Bayesian model. We also nonparametrically infer the kernel of the log-normal process, which corresponds to the physical spatial correlation power spectrum of the log-density. Results. Using only data from Gaia DR2, we reconstruct the 3D dust density and its spatial correlation spectrum in a 600 pc cube centered on the Sun. We report a spectral index of the logarithmic dust density of 3.1 on Fourier scales with wavelengths between 2 and 125 pc. The resulting 3D dust map, the power spectrum, and posterior samples are publicly available for download.
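
As a rough illustration of the reconstruction target, a minimal sketch (not the authors' inference code; grid size, box length, and normalization are illustrative assumptions) draws a log-normal density field whose log-density has a power-law spectrum with the reported index of 3.1:

```python
# Minimal sketch, not the authors' pipeline: sample a 3D log-normal density
# field whose log-density has power spectrum P(k) ~ k^{-3.1}, the index
# reported in the abstract. Grid size and normalization are assumptions.
import numpy as np

n, box = 128, 600.0                      # grid points per axis; box side in pc
k = 2 * np.pi * np.fft.fftfreq(n, d=box / n)
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
k_mag = np.sqrt(kx**2 + ky**2 + kz**2)
k_mag[0, 0, 0] = np.inf                  # suppress the divergent zero mode

amplitude = k_mag ** (-3.1 / 2)          # square root of the power spectrum
white = np.fft.fftn(np.random.standard_normal((n, n, n)))
log_rho = np.real(np.fft.ifftn(white * amplitude))
log_rho /= log_rho.std()                 # illustrative unit-variance scaling

rho = np.exp(log_rho)                    # log-normal dust density, arbitrary units
```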

2020 ◽  
Vol 500 (2) ◽  
pp. 2532-2542
Author(s):  
Linda Blot ◽  
Pier-Stefano Corasaniti ◽  
Yann Rasera ◽  
Shankar Agarwal

ABSTRACT Future galaxy surveys will provide accurate measurements of the matter power spectrum across an unprecedented range of scales and redshifts. The analysis of these data will require accurate modelling of the imprint of non-linearities of the matter density field. In particular, these non-linearities induce a non-Gaussian contribution to the data covariance that needs to be properly taken into account to obtain unbiased cosmological parameter inferences. Here, we study the cosmological dependence of the matter power spectrum covariance using a dedicated suite of N-body simulations, the Dark Energy Universe Simulation–Parallel Universe Runs (DEUS-PUR) Cosmo. These consist of 512 realizations for 10 different cosmologies in which we vary the matter density Ωm, the amplitude of density fluctuations σ8, the reduced Hubble parameter h, and a constant dark energy equation of state w by approximately 10 per cent. We use these data to evaluate the first and second derivatives of the power spectrum covariance with respect to a fiducial Λ-cold dark matter cosmology. We find that the variations can be as large as 150 per cent depending on the scale, redshift, and model parameter considered. By performing a Fisher matrix analysis we explore the impact of different choices in modelling the cosmological dependence of the covariance. Our results suggest that fixing the covariance to a fiducial cosmology can significantly affect the recovered parameter errors, and that modelling the cosmological dependence of the variance while keeping the correlation coefficient fixed can alleviate this effect.
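
A minimal sketch of the two basic numerical steps, assuming a plain array layout rather than the DEUS-PUR Cosmo data products: estimating the sample covariance from many realizations, and taking central finite differences of that covariance across cosmologies:

```python
# Minimal sketch under assumed inputs (not the paper's pipeline): sample
# covariance of P(k) over realizations, and its first/second derivatives with
# respect to one parameter via central differences around the fiducial model.
import numpy as np

def sample_covariance(pk):
    """pk: array of shape (n_realizations, n_kbins)."""
    return np.cov(pk, rowvar=False)

def cov_derivatives(cov_minus, cov_fid, cov_plus, delta):
    """Central differences for dC/dtheta and d2C/dtheta2, step size +-delta."""
    d1 = (cov_plus - cov_minus) / (2.0 * delta)
    d2 = (cov_plus - 2.0 * cov_fid + cov_minus) / delta**2
    return d1, d2

# Mock example: 512 realizations, 50 k-bins, ~10 per cent step in a parameter.
rng = np.random.default_rng(0)
cov_fid = sample_covariance(rng.normal(1.0, 0.05, size=(512, 50)))
```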


2021 ◽  
Author(s):  
Tarun Grover ◽  
Jamie Stuart Andrews ◽  
Irfan Ahmed ◽  
Ibnu Hafidz Arief

Abstract Unconventional resource plays, herein referred to as source rock plays, have significantly increased the supply of hydrocarbons to the world. However, the majority of companies developing these resource plays have struggled to generate consistent positive cash flows, even during periods of stable commodity prices and after successfully reducing development costs. The fundamental reasons for this poor financial performance include the rush to lease acreage and drill wells to hold acreage, delayed mapping of sweet spots, slow acknowledgement of high geological variability, and significant capital spent on trial and error to narrow down optimal combinations of well spacing and stimulation designs. The objective of this paper is to present a systematic, integrated, multidisciplinary analysis of several unconventional plays worldwide which, if used consistently, can lead to significantly improved economics. We present an analysis of several unconventional plays in the US and Argentina with fluid systems ranging from dry gas to black oil. We utilize publicly available well stimulation and production datasets, along with laboratory-measured core data, to evaluate the sweet spots, the measure of well productivity, and the variability in well productivity. We investigate the design parameters that show the strongest correlation to well productivity. This step allows us to normalize well productivity in such a way that the underlying productivity variability due to geology is extracted. We can thus identify the number of wells that should be drilled to establish geology-driven productivity variability. Finally, we investigate the impact of well spacing on well productivity. The data indicate that, for any well, first-year cumulative production is a robust measure of ultimate well productivity. The injected slurry volume shows the best correlation to well productivity, and "completion-normalized" well productivity can be defined as first-year cumulative production per barrel of injected slurry volume. However, if well spacing is smaller than the created hydraulic fracture network, the potential gain in well productivity is negated, leading to poor economics. Normalized well productivity is log-normally distributed in any play due to the log-normal distribution of permeability, and the sweet spots will generally be defined by the most permeable portions of the play. Normalized well productivity is shown to be independent of the areal scale of any play. We show that in every play analyzed, typically 20-50 wells (with successful stimulation and production) are sufficient to extract the log-normal productivity distribution, depending on play size and target intervals. We demonstrate that once the log-normal behavior is anticipated, creating production profiles with p10-p50-p90 values is straightforward. The data analysis as presented can be easily replicated by any operator worldwide, which makes it useful in evaluating unconventional resource play opportunities.
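
Since the paper's central claim is that normalized productivity is log-normal, a minimal sketch of the implied workflow (mock well data; all numbers are placeholder assumptions, not the paper's dataset) is:

```python
# Minimal sketch with mock data, not the paper's dataset: fit a log-normal to
# completion-normalized productivity (first-year cumulative production per
# barrel of injected slurry) and read off p10/p50/p90 values.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical sample of 35 wells, within the 20-50 range the paper cites.
normalized = rng.lognormal(mean=-1.0, sigma=0.6, size=35)

mu = np.log(normalized).mean()
sigma = np.log(normalized).std(ddof=1)
z = 1.2816                       # standard normal 90th-percentile quantile
p10 = np.exp(mu + z * sigma)     # optimistic case (exceeded by 10% of wells)
p50 = np.exp(mu)                 # median
p90 = np.exp(mu - z * sigma)     # conservative case
```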


2017 ◽  
Vol 21 (9) ◽  
pp. 4573-4589 ◽  
Author(s):  
Liang Gao ◽  
Limin Zhang ◽  
Mengqian Lu

Abstract. Rainfall is the primary trigger of landslides in Hong Kong; hence, rainstorm spatial distribution is an important piece of information in landslide hazard analysis. The primary objective of this paper is to quantify the spatial correlation characteristics of three landslide-triggering large storms in Hong Kong. The spatial maximum rolling rainfall is represented by a rotated ellipsoid trend surface and a random field of residuals. The maximum rolling 4, 12, 24, and 36 h rainfall amounts of these storms are assessed via surface trend fitting, and the spatial correlation of the detrended residuals is determined by studying the scales of fluctuation along eight directions. The principal directions of the surface trend are between 19 and 43°, and the major and minor axis lengths are 83–386 and 55–79 km, respectively. The scales of fluctuation of the residuals are found to be between 5 and 30 km. The spatial distribution parameters for the three large rainstorms are similar to those for four ordinary rainfall events. The proposed rainfall spatial distribution model and parameters help define the impact area, rainfall intensity, and local topographic effects for future landslide hazard evaluation.
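
As one simple stand-in for the trend-surface step (the paper's exact parametrization is not given in the abstract; the inputs and quadratic form are assumptions), a rotated elliptical trend can be fit by least squares and its principal directions read from the quadratic-form eigenvectors:

```python
# Minimal sketch, not the authors' code: least-squares fit of a rotated
# quadratic (elliptical) trend surface to gauge rainfall; the principal
# directions follow from the eigenvectors of the quadratic form.
import numpy as np

def fit_quadratic_trend(x, y, z):
    """Fit z ~ a*x^2 + b*x*y + c*y^2 + d*x + e*y + f at gauge locations."""
    A = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
    coeff, *_ = np.linalg.lstsq(A, z, rcond=None)
    a, b, c = coeff[:3]
    # Eigenvectors of the 2x2 quadratic form give major/minor axis directions.
    _, vecs = np.linalg.eigh(np.array([[a, b / 2.0], [b / 2.0, c]]))
    axis_angles = np.degrees(np.arctan2(vecs[1], vecs[0]))  # one per eigenvector
    return coeff, axis_angles
```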


2018 ◽  
Vol 855 ◽  
pp. 1116-1129 ◽  
Author(s):  
Nicolas Tobin ◽  
Leonardo P. Chamorro

Using a physics-based approach, we infer the impact of the coherence of atmospheric turbulence on the power fluctuations of wind farms. Application of the random-sweeping hypothesis reveals correlations characterized by advection and turbulent diffusion of coherent motions. These motions contribute to local peaks and troughs in the power spectrum of the combined units at frequencies corresponding to the advection time between turbines, which diminish in magnitude at high frequencies. Experimental inspection supports the random-sweeping hypothesis in predicting spectral characteristics, although the magnitude of the coherence spectrum appears to be over-predicted. This deviation is attributed to the presence of turbine wakes and appears to be a function of the turbulence approaching the first turbine in a pair.
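
The spectral mechanism can be illustrated with a two-turbine toy model (consistent with the abstract but not the authors' derivation; the spectra, delay, and coherence decay are placeholder assumptions): the advection delay tau imprints oscillations at multiples of 1/tau, damped at high frequency:

```python
# Toy model, not the paper's derivation: combined power spectrum of two
# turbines with coherent inflow. The advection delay tau produces peaks and
# troughs at multiples of 1/tau; a decaying coherence damps them at high
# frequency. All numbers are placeholder assumptions.
import numpy as np

f = np.linspace(1e-3, 1.0, 1000)            # frequency, Hz
s1 = s2 = f ** (-5.0 / 3.0)                 # placeholder inertial-range spectra
tau = 30.0                                  # advection time between turbines, s
coherence = np.exp(-0.4 * f * tau)          # placeholder turbulent-diffusion decay

s_total = s1 + s2 + 2.0 * coherence * np.sqrt(s1 * s2) * np.cos(2 * np.pi * f * tau)
```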


Nanophotonics ◽  
2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Diego R. Abujetas ◽  
Nuno de Sousa ◽  
Antonio García-Martín ◽  
José M. Llorens ◽  
José A. Sánchez-Gil

Abstract Bound states in the continuum (BICs) emerge throughout physics as leaky/resonant modes that remain, however, highly localized. They have attracted much attention in photonics, and especially in metasurfaces. One of their most outstanding features is their divergent Q-factors, indeed arbitrarily large upon approaching the BIC condition (quasi-BICs). Here, we investigate how to tune quasi-BICs in magneto-optic (MO) all-dielectric metasurfaces. The impact of the applied magnetic field in the BIC parameter space is revealed for a metasurface consisting of lossless semiconductor spheres with MO response. Through our coupled electric/magnetic dipole formulation, the MO activity is found to manifest itself through the interference of the out-of-plane electric/magnetic dipole resonances with the (MO-induced) in-plane magnetic/electric dipole, leading to a rich, magnetically tuned quasi-BIC phenomenology, resembling the behavior of Brewster quasi-BICs for tilted vertical-dipole resonant metasurfaces. Such resemblance underlies our proposed design for a fast MO switch of a Brewster quasi-BIC by simply reversing the driving magnetic field. This MO-active BIC behavior is further confirmed in the optical regime for a realistic Bi:YIG nanodisk metasurface through numerical calculations. Our results present various mechanisms to magneto-optically manipulate BICs and quasi-BICs, which could be exploited throughout the electromagnetic spectrum with applications in lasing, filtering, and sensing.
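
A generic feature behind this tunability (textbook quasi-BIC scaling, not the paper's coupled electric/magnetic dipole model; all numbers are placeholders) is that the Q factor diverges quadratically around the BIC point, so shifting that point, e.g., with an applied magnetic field, retunes the resonance:

```python
# Generic quasi-BIC scaling, not the paper's coupled-dipole formulation: near
# a symmetry-protected BIC the Q factor diverges as Q ~ 1/(k - k_BIC)^2; an
# external control (standing in for the applied magnetic field) is modeled as
# shifting k_BIC. Numbers are placeholders.
import numpy as np

k_par = np.linspace(-0.2, 0.2, 401)   # in-plane wavevector, arbitrary units
k_bic = 0.05                          # field-shifted BIC position (assumption)
q0 = 10.0                             # overall scale (assumption)
q_factor = q0 / np.maximum((k_par - k_bic) ** 2, 1e-9)  # diverges at quasi-BIC
```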


2007 ◽  
Vol 73 (18) ◽  
pp. 5760-5766 ◽  
Author(s):  
James J. McDevitt ◽  
Ka Man Lai ◽  
Stephen N. Rudnick ◽  
E. Andres Houseman ◽  
Melvin W. First ◽  
...  

ABSTRACT Interest in airborne smallpox transmission has been renewed because of concerns regarding the potential use of smallpox virus as a biothreat agent. Air disinfection via upper-room 254-nm germicidal UV (UVC) light in public buildings may reduce the impact of primary agent releases, prevent secondary airborne transmission, and be effective before public health authorities are aware of a smallpox outbreak. We characterized the susceptibility of vaccinia virus aerosols, as a surrogate for smallpox, to UVC light by using a benchtop, one-pass aerosol chamber. We evaluated virus susceptibility to UVC doses ranging from 0.1 to 3.2 J/m², three relative humidity (RH) levels (20%, 60%, and 80%), and suspensions of virus in either water or synthetic respiratory fluid. Dose-response plots show that vaccinia virus susceptibility increased with decreasing RH. These plots also show a significant nonlinear component and a poor fit when a first-order decay model is used, but a reasonable fit when virus susceptibility is assumed to follow a log-normal distribution. The overall effects of RH (P < 0.0001) and the suspending medium (P = 0.014) were statistically significant, and each factor remained significant (P < 0.0001) after controlling for the other. Virus susceptibility did not appear to be a function of virus particle size. This work provides an essential scientific basis for the design of effective upper-room UVC installations for preventing airborne transmission of smallpox virus by characterizing the susceptibility of an important orthopoxvirus to UVC exposure.
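
The log-normal susceptibility model named in the abstract can be sketched as a mixture of first-order decays (parameter values are illustrative assumptions, not the fitted ones):

```python
# Minimal sketch of the model form named in the abstract (parameters are
# illustrative, not the fitted values): surviving fraction at UVC dose D when
# the rate constant k is log-normally distributed across virions, compared
# with a single-rate first-order decay.
import numpy as np

rng = np.random.default_rng(2)
k = rng.lognormal(mean=0.0, sigma=0.8, size=100_000)   # m^2/J, assumed spread

doses = np.linspace(0.0, 3.2, 33)                      # J/m^2, tested range
survival_lognormal = np.array([np.exp(-k * d).mean() for d in doses])
survival_first_order = np.exp(-k.mean() * doses)       # under-predicts the tail
```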


Author(s):  
Zh. S. Abdimuratov ◽  
Zh. D. Manbetova ◽  
M. N. Imankul ◽  
K. S. Chezhimbayeva ◽  
A. Zh. Sagyndikova

Electromagnetic impact (EMI) of sufficient level can temporarily disrupt the functioning of cellular equipment and its processing, transmission, and storage of information. We consider possible electromagnetic compatibility (EMC) problems of a mobile phone and a cellular base station (BS) under the influence of electromagnetic radiation (EMR) from other sources, and the negative impact of such radiation on their functioning. Because the energy of an HF electromagnetic field (EMF) can pass through a protective case and affect shielded radio electronic equipment (REE), we describe the possible negative consequences of high-energy EMF exposure for REE. Potential adverse effects of the skin effect, electrostatic discharge, and electromagnetic pulses on electronic devices under certain conditions are also given. It is shown that the constructional method of protecting REE from external electromagnetic factors consists in reducing the collected and transmitted EMF energy by improving the design, placement, and installation of equipment. Components from some 5G vendors that are resistant to external interference are listed, and possibilities for reducing the radiation level of a cell phone are noted. The necessity of an integrated approach to solving EMC problems is substantiated, combining structural, circuit, and structural-functional methods of EMC provision. The new 5G (fifth generation) standard will operate at higher frequencies than previous generations: because the electromagnetic spectrum below 6 GHz is congested, 5G networks will rely on wireless radio access systems operating at 30–100 GHz, that is, in the lower band of the extremely high frequency (EHF) range of 30–300 GHz.
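
As a supplement to the shielding discussion, the standard skin-depth formula indicates how thin an effective conductive barrier can be at the cited 30–100 GHz band (a sketch using textbook copper values, not taken from the article):

```python
# Standard skin-depth formula (textbook supplement, not from the article):
# delta = 1 / sqrt(pi * f * mu * sigma) for a conductive shield, evaluated at
# the 30-100 GHz band cited for 5G. Copper values are textbook constants.
import numpy as np

MU0 = 4e-7 * np.pi        # vacuum permeability, H/m
SIGMA_CU = 5.8e7          # copper conductivity, S/m

def skin_depth(f_hz, mu=MU0, sigma=SIGMA_CU):
    return 1.0 / np.sqrt(np.pi * f_hz * mu * sigma)

for f in (30e9, 100e9):   # band edges mentioned in the text
    print(f"{f / 1e9:.0f} GHz: skin depth = {skin_depth(f) * 1e6:.2f} um")
```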


Author(s):  
John B. Meisel ◽  
John Navin ◽  
Timothy S. Sullivan

Thirty years ago, the United States Federal Communications Commission (FCC) gave birth to the mobile wireless industry by granting two licenses in each cellular geographic market across the United States. Over the next three decades the FCC continually provided more access to the electromagnetic spectrum, a critical input for the provision of mobile wireless communications services, in part to promote a more competitive market structure in the industry. One objective of this chapter is to describe and analyze trends in the overall competitiveness of the mobile wireless market during this period by utilizing a modified Porter competitive-forces framework. This analysis is supplemented with an analysis of the most recent proposed merger in the mobile wireless industry, between AT&T and T-Mobile. The proposed merger exemplifies a continuing trend in the industry: consolidation of national mobile wireless carriers. The chapter analyzes the impact of the proposed merger on the ability of the remaining mobile wireless carriers to constrain the market power of the national wireless carriers. Specifically, the arguments for and against the merger by major stakeholders are reviewed. There are signs that the mobile wireless industry may return to a duopoly structure. Recommendations regarding the horizontal merger are offered.


2019 ◽  
Vol 11 (6) ◽  
pp. 1742 ◽  
Author(s):  
Ruoyu Yang ◽  
Weidong Chen

To assess the present situation regarding SO2 emissions in China, this paper identifies problems and puts forward countermeasures and suggestions. It analyzes the spatial correlation, influencing factors, and regulatory tools of air pollution in 30 provinces of the Chinese mainland from 2006–2015. The results of exploratory spatial data analysis (ESDA) show that SO2 emissions have obvious positive spatial correlations, and that atmospheric pollution in China exhibits clear spatial spillover effects and spatial agglomeration characteristics. On this basis, the present study analyzes the impact of seven socioeconomic (SE) factors and seven policy tools on air pollution by constructing a STIRPAT model and a spatial econometric model. We found that population pressure, affluence, energy consumption (EC), industrial development level (ID), urbanization level (UL), and the degree of marketization significantly promote the increase of SO2 emissions, whereas technology and governmental supervision of the environment have significant inhibitory effects. China's air pollution is currently curbed largely because the government has adopted a large number of powerful command-and-control supervision measures; air pollution treatment resembles a government-led "political movement". The effect of the market is relatively weak and public force has not been effectively exerted. In the future, a comprehensive mix of regulatory tools is needed, as well as encouraging public participation, strengthening third-party supervision, and building a diversified, all-encompassing supervision mechanism.
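
The abstract does not name the exact ESDA statistic; global Moran's I is the usual choice, and a minimal sketch (with an assumed contiguity weight matrix) is:

```python
# Minimal sketch assuming global Moran's I as the ESDA statistic (the abstract
# does not name it): spatial autocorrelation of provincial SO2 emissions under
# a row-standardized contiguity weight matrix w.
import numpy as np

def morans_i(x, w):
    """x: emissions per province (n,); w: spatial weights (n, n), zero diagonal."""
    w = w / w.sum(axis=1, keepdims=True)   # row-standardize
    z = x - x.mean()
    n = len(x)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)  # I > 0: positive correlation
```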


2020 ◽  
Vol 499 (2) ◽  
pp. 2598-2607
Author(s):  
Mike (Shengbo) Wang ◽  
Florian Beutler ◽  
David Bacon

ABSTRACT Relativistic effects in clustering observations have been shown to introduce scale-dependent corrections to the galaxy overdensity field on large scales, which may hamper the detection of primordial non-Gaussianity fNL through the scale-dependent halo bias. The amplitude of relativistic corrections depends not only on the cosmological background expansion, but also on the redshift evolution and sensitivity to the luminosity threshold of the tracer population being examined, as parametrized by the evolution bias be and magnification bias s. In this work, we propagate luminosity function measurements from the extended Baryon Oscillation Spectroscopic Survey (eBOSS) to be and s for the quasar (QSO) sample, and thereby derive constraints on relativistic corrections to its power spectrum multipoles. Although one could mitigate the impact on the fNL signature by adjusting the redshift range or the luminosity threshold of the tracer sample being considered, we suggest that, for future surveys probing large cosmic volumes, relativistic corrections should be forward modelled from the tracer luminosity function including its uncertainties. This will be important to quasar clustering measurements on scales k ∼ 10⁻³ h Mpc⁻¹ in upcoming surveys such as the Dark Energy Spectroscopic Instrument (DESI), where relativistic corrections can overwhelm the expected fNL signature at low redshifts z ≲ 1 and become comparable to fNL ≃ 1 in the power spectrum quadrupole at redshifts z ≳ 2.5.
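
The two bias parameters have standard definitions that can be evaluated numerically from tabulated luminosity-function products (assumed inputs, not the eBOSS catalogues themselves):

```python
# Minimal sketch of the standard definitions (assumed tabulated inputs, not
# the eBOSS QSO luminosity function itself): evolution bias b_e from the
# comoving number density, and magnification bias s from cumulative counts
# near the survey magnitude limit.
import numpy as np

def evolution_bias(z, n_bar):
    """b_e = -dln(n_bar)/dln(1+z), by finite differences on a redshift grid."""
    return -np.gradient(np.log(n_bar), np.log(1.0 + z))

def magnification_bias(m, counts_cum, m_limit):
    """s = dlog10 N(<m)/dm, evaluated at the magnitude limit m_limit."""
    slope = np.gradient(np.log10(counts_cum), m)
    return np.interp(m_limit, m, slope)
```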

