Abstract
An optical delay interferometer (ODI) is employed to suppress the pattern effect imposed on a 10 Gb/s return-to-zero (RZ) data stream when it is amplified by a semiconductor optical amplifier (SOA) operated in deep gain saturation. The experimental results confirm that the scheme effectively mitigates this impairment for the RZ format, achieving far better performance than the SOA alone.
| Original language | English |
|---|---|
| Article number | 085005 |
| Journal | Optical Engineering |
| Volume | 49 |
| Issue number | 8 |
| DOIs | |
| Publication status | Published - Aug 2010 |