Photonic technologies for undersampling and compressive sensing of high-speed RF signals

Author(s):  
George C. Valley ◽  
George A. Sefler ◽  
T. Justin Shaw ◽  
Andrew D. Stapleton

2021 ◽  
Author(s):  
Hatice Kosek

Subcarrier multiplexed (SCM) transmission of multimedia radio signals such as CATV (5-860 MHz), cellular wireless (900 MHz), and wireless LAN (2.4 GHz) over fiber is frequently used to deliver broadband services cost-effectively. These multi-channel radio-over-fiber (ROF) links can connect enhanced wireless hotspots that support high-speed wireless LAN services or low-speed cellular services to different customers from the same antenna. The SCM signals need to be demultiplexed, preferably in the optical domain, for many reasons: prefiltering SCM signals with fiber-based optical filters permits the use of inexpensive photodetectors and increases network flexibility. However, realizing optical demultiplexing at the sub-GHz level is challenging, and it necessitates optical filters with high selectivity, low insertion loss, and low distortion. We developed a novel sub-picometer all-optical bandpass filter by creating a resonance cavity from two closely matched fiber Bragg gratings (FBGs). This filter has a bandwidth of 120 MHz at -3 dB, 360 MHz at -10 dB, and 1.5 GHz at -20 dB. Experimental results showed that the filter is capable of separating two radio frequency (RF) signals spaced as close as 50 MHz without significant distortion. When this demultiplexer was employed to optically separate 2.4 GHz and 900 MHz radio signals, it was found to be linear from -38 dBm to +6 dBm with ~25.5 dB isolation, and there was no significant increase in the bit error rate (BER) of the underlying multimedia data. The results verified that the fabricated narrow bandpass filter is a promising candidate for demultiplexing RF signals in subcarrier-multiplexed networks.
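A two-FBG resonance cavity acts much like a Fabry-Perot etalon, whose isolated passband is approximately Lorentzian. As a sanity check on the reported widths, the sketch below computes the full width of a Lorentzian passband with a 120 MHz FWHM at each attenuation level; the Lorentzian lineshape is an assumption for illustration, since the abstract reports only the measured widths.

```python
import numpy as np

# Reported -3 dB full width of the filter (Hz); the Lorentzian shape is assumed
fwhm = 120e6

def lorentzian_width(atten_db, fwhm):
    """Full width of a Lorentzian passband |H|^2 = 1 / (1 + (2 f / FWHM)^2)
    at a given attenuation (dB below the peak)."""
    ratio = 10.0 ** (atten_db / 10.0)      # linear power ratio, e.g. 10 for -10 dB
    return fwhm * np.sqrt(ratio - 1.0)

print(lorentzian_width(3.0103, fwhm) / 1e6)  # ~120 MHz at -3 dB (by construction)
print(lorentzian_width(10, fwhm) / 1e6)      # 360 MHz at -10 dB, matching the report
print(lorentzian_width(20, fwhm) / 1e6)      # ~1194 MHz at -20 dB vs. 1.5 GHz measured
```

The -10 dB width predicted by an ideal Lorentzian matches the reported 360 MHz exactly, while the -20 dB skirt is somewhat wider in the measurement (1.5 GHz vs. ~1.2 GHz), consistent with real FBG cavities rolling off more slowly than the ideal lineshape in the far wings.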


2020 ◽  
Vol 24 (06) ◽  
pp. 83-90
Author(s):  
Ali Mohammad A. AL-Hussain ◽  
Maher K. Mahmood
The compressive sensing (CS) technique is used to solve the problem of the high sampling rates required for wideband spectrum sensing, where a high-speed analog-to-digital converter would otherwise be needed. Such converters lead to difficult hardware implementation, long sensing and detection times, and high power consumption. The proposed approach combines energy-based detection with CS and investigates the probability of detection and the probability of false alarm as functions of SNR, showing the effect of compression on the spectrum-sensing performance of a cognitive radio system. The discrete cosine transform (DCT) is used as the sparse representation basis of the received signal, and a random matrix as the measurement matrix. An ℓ1-norm algorithm is used to reconstruct the original signal. Closed-form expressions for the probability of detection and the probability of false alarm are derived. Computer simulation clearly shows that the compression ratio, recovery error, and SNR level affect the probability of detection.
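The pipeline described above (DCT sparse basis, random measurement matrix, ℓ1 reconstruction, then energy detection) can be sketched as follows. The abstract does not name its ℓ1 solver, so this sketch uses ISTA (iterative soft thresholding) as a stand-in; all dimensions and thresholds are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 96, 5   # signal length, sub-Nyquist measurements, sparsity (illustrative)

# Orthonormal DCT-II basis; column k of Psi is the k-th DCT basis vector
n = np.arange(N)
Psi = np.cos(np.pi * (2 * n[:, None] + 1) * n[None, :] / (2 * N))
Psi[:, 0] *= np.sqrt(1.0 / N)
Psi[:, 1:] *= np.sqrt(2.0 / N)

# A K-sparse coefficient vector and the corresponding received signal
c_true = np.zeros(N)
c_true[rng.choice(N, K, replace=False)] = rng.normal(0, 1, K)
s = Psi @ c_true

# Compressive measurements through a random Gaussian matrix (M < N)
Phi = rng.normal(0, 1.0 / np.sqrt(M), (M, N))
y = Phi @ s

# ISTA: minimize ||y - A c||^2 + lam * ||c||_1 over the DCT coefficients c
A = Phi @ Psi
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
lam = 0.01
c = np.zeros(N)
for _ in range(500):
    z = c - step * (A.T @ (A @ c - y))                   # gradient step
    c = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)  # soft threshold

s_hat = Psi @ c

# Energy detection on the reconstruction: declare the band occupied when the
# reconstructed energy exceeds a threshold (the threshold here is illustrative)
occupied = np.sum(s_hat ** 2) > 0.5 * np.sum(s ** 2)
```

With M = 96 measurements of a length-256, 5-sparse signal, ISTA recovers the signal closely and the energy detector fires; lowering M or the SNR degrades the recovery error and hence the detection probability, which is the trade-off the paper quantifies.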


2018 ◽  
Vol 2 (1) ◽  
pp. 2 ◽  
Author(s):  
Paolo Pozzi ◽  
Laura Maddalena ◽  
Nicolò Ceffa ◽  
Oleg Soloviev ◽  
Gleb Vdovin ◽  
...  

The use of spatial light modulators to project computer-generated holograms is a common strategy for optogenetic stimulation of multiple structures of interest within a three-dimensional volume. A common requirement when addressing multiple targets sparsely distributed in three dimensions is the generation of a point cloud, focusing excitation light at multiple diffraction-limited locations throughout the sample. Calculation of such holograms is most commonly performed with either the high-speed, low-performance random superposition algorithm or the low-speed, high-performance Gerchberg–Saxton algorithm. This paper presents a variation of the Gerchberg–Saxton algorithm that, by performing iterations on only a subset of the data, according to compressive sensing principles, is rendered significantly faster while maintaining high-quality outputs. The algorithm is presented in high-efficiency and high-uniformity variants. All source code for the method implementation is available as Supplementary Materials and as open-source software. The method was tested computationally against existing algorithms, and the results were confirmed experimentally on a custom setup for in vivo multiphoton optogenetics. The results clearly show that the proposed method can achieve computational speed close to that of the random superposition algorithm while retaining the high performance of the Gerchberg–Saxton algorithm, with minimal loss of hologram quality.
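The baseline that the paper accelerates can be sketched as the classic Gerchberg–Saxton loop for a multi-spot hologram: alternate between the SLM plane (where the field must have unit amplitude and free phase) and the focal plane (where the amplitude must match the target point cloud), propagating with FFTs. The compressive variant proposed in the paper would iterate on only a random subset of SLM pixels; the sketch below shows the standard full-resolution loop, with all sizes illustrative rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 128          # SLM pixels per side (illustrative)
n_spots = 10     # diffraction-limited targets in the point cloud

# Random, non-overlapping target spot positions in the focal (Fourier) plane
idx = rng.choice(N * N, n_spots, replace=False)
ys, xs = np.unravel_index(idx, (N, N))
target = np.zeros((N, N))
target[ys, xs] = 1.0

# Gerchberg-Saxton: enforce unit amplitude on the SLM and the target
# amplitudes in the focal plane, keeping only the phase at each step
phase = rng.uniform(0, 2 * np.pi, (N, N))
for _ in range(50):
    focal = np.fft.fft2(np.exp(1j * phase))          # propagate SLM -> focal plane
    focal = target * np.exp(1j * np.angle(focal))    # impose target amplitudes
    phase = np.angle(np.fft.ifft2(focal))            # back-propagate, keep SLM phase

# Diffraction efficiency: fraction of total power delivered to the spots
intensity = np.abs(np.fft.fft2(np.exp(1j * phase))) ** 2
efficiency = intensity[ys, xs].sum() / intensity.sum()
```

Each iteration costs two N×N FFTs, which is why subsampling the SLM pixels, as the paper's compressive variant does, directly reduces the per-iteration cost.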

