With the current rise of research on transparent lightwave-communications networks, interest has focused on the requirements that such networks will impose on their constituent lightwave terminal, amplification, and switching components. It is becoming clear, for example, that transparent networks may impose stringent requirements on the in-band crosstalk of their switching components [1]-[3], chiefly due to the efficiency with which such crosstalk converts laser phase noise to intensity noise, and thus to photocurrent fluctuations in the network's receivers. However, although this noise-conversion process is efficient when signal and crosstalk fields exhibit the worst case of matched polarizations, the intensity noise vanishes entirely for orthogonal polarizations. This raises the practical question of whether deployed networks must be designed for precisely worst-case operation. Here we show theoretically and experimentally that the interference process generates virtually worst-case noise performance an unexpectedly large fraction of the time, and that networks must consequently be designed for worst-case operation. This implies that the random polarization states arising in deployed networks will not significantly relax the in-band crosstalk performance required of their components.
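The polarization dependence described above can be illustrated with a simple Monte Carlo sketch. Under the hedged assumption that the crosstalk polarization state is uniformly distributed on the Poincaré sphere relative to the signal (an illustrative model, not necessarily the one used in this work), the relative beat-noise power between two states separated by Poincaré angle θ is (1 + cos θ)/2, which is 1 for matched and 0 for orthogonal polarizations:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Uniform points on the Poincare sphere: cos(theta) is uniform on [-1, 1],
# where theta is the Poincare-sphere angle between signal and crosstalk states.
cos_theta = rng.uniform(-1.0, 1.0, N)

# Field overlap |<e_s|e_x>|^2 = (1 + cos(theta)) / 2 gives the relative
# crosstalk beat-noise power (worst case = 1, orthogonal polarizations = 0).
rel_noise_power = (1.0 + cos_theta) / 2.0

# Fraction of random polarization pairs within 3 dB of the worst case:
frac_within_3dB = np.mean(rel_noise_power >= 0.5)
print(f"within 3 dB of worst case: {frac_within_3dB:.3f}")  # ~0.5
```

Under this uniform-sphere assumption the relative noise power is itself uniform on [0, 1], so fully half of all random polarization pairs fall within 3 dB of the matched-polarization worst case, consistent with the claim that near-worst-case noise arises an unexpectedly large fraction of the time.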