Abstract
We present a detailed theoretical, numerical, and experimental study of the effects of laser phase noise on the performance of phase-sensitive optical time-domain reflectometry (φ-OTDR) sensors that use optical pulse compression (OPC). Pulse compression is a technique that improves the received signal amplitude by increasing the effective energy of the pulses launched into the fiber without degrading the spatial resolution of the measurements. It is therefore a valuable tool for extending the range of these sensors and mitigating fiber attenuation constraints. However, it has been observed that the limited coherence of the laser source degrades the actual performance enhancement that this method can provide. Here, we derive a theoretical model that quantifies this degradation for any type of OPC, such as those based on linear frequency modulation (LFM) pulses or on perfect periodic autocorrelation (PPA) bipolar bit sequences. The model facilitates numerical estimation of the sensitivity of the φ-OTDR measurements and yields theoretical expressions for the mean and the variance of the phase-noise-perturbed backscatter response. These results are validated via numerical simulations and experiments in φ-OTDR setups using both LFM and PPA OPC. Furthermore, we demonstrate the use of the model to investigate the basic trade-offs involved in the design of OPC φ-OTDR systems.
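The mechanism the abstract describes can be illustrated with a minimal sketch: an LFM (chirp) pulse is compressed by a matched filter, concentrating the pulse energy into a narrow peak, and a Wiener-process laser phase noise multiplied onto the pulse reduces that peak. All parameters below (sample rate, bandwidth, linewidth) are hypothetical illustration values, not taken from the paper, and the phase-noise model is the standard Lorentzian-linewidth random walk rather than the paper's specific derivation.

```python
import numpy as np

# Hypothetical illustration parameters (not from the paper)
fs = 100e6   # sample rate, Hz
T = 10e-6    # pulse duration, s
B = 20e6     # chirp bandwidth, Hz
dv = 1e6     # laser linewidth, Hz (exaggerated to make the effect visible)

t = np.arange(0, T, 1 / fs)
N = len(t)
k = B / T    # chirp rate, Hz/s

# Unit-amplitude LFM pulse and its matched filter (conjugate time reversal)
pulse = np.exp(1j * np.pi * k * t**2)
mf = np.conj(pulse[::-1])

# Ideal compression: the zero-lag peak equals the pulse energy (= N samples)
peak_clean = np.abs(np.convolve(pulse, mf)).max()

# Laser phase noise as a Wiener process: Gaussian increments with
# variance 2*pi*dv/fs per sample (Lorentzian-linewidth model)
rng = np.random.default_rng(0)
phi = np.cumsum(rng.normal(0.0, np.sqrt(2 * np.pi * dv / fs), N))
noisy_pulse = pulse * np.exp(1j * phi)

# The same matched filter no longer adds all samples coherently,
# so the compressed peak drops below N — the degradation studied here.
peak_noisy = np.abs(np.convolve(noisy_pulse, mf)).max()

print(f"clean peak = {peak_clean:.1f}, noisy peak = {peak_noisy:.1f}, N = {N}")
```

With finite linewidth the coherent sum over the pulse duration is partially scrambled, which is why longer pulses (larger compression gain in the ideal case) are increasingly sensitive to phase noise — the trade-off the model quantifies.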
Original language | English |
---|---|
Pages (from-to) | 2561-2569 |
Number of pages | 9 |
Journal | Journal of Lightwave Technology |
Volume | 40 |
Issue number | 8 |
DOIs | |
State | Published - 15 Apr 2022 |
Keywords
- Phase noise
- distributed acoustic sensing
- linear frequency modulation
- optical pulse compression
- optical time domain reflectometry
- perfect periodic autocorrelation codes
All Science Journal Classification (ASJC) codes
- Atomic and Molecular Physics, and Optics