



Optical Amplification

Originally published February 2000
by Carlo Kopp
© 2000, 2005 Carlo Kopp

Perhaps one of the most remarkable and important pieces of technology to emerge in the last quarter of the 20th century, the optical fibre is literally the backbone of the "digital revolution". In the two decades since its entry into the market, the optical fibre has almost completely displaced the copper cable in long haul applications, and continues to displace copper in medium and short haul applications.

The most recent development in optical communications is the idea of optical amplification, which yields significant gains over the established technology of semiconductor based amplification. In this month's article, I will discuss the idiosyncrasies and limitations of established technology, and explore the basic ideas behind optical amplification.

Performance in Conventional Optical Links

The decisive measure of the worth of any communications medium is its transmission performance, which at the most fundamental level is measured by the signal loss through a finite length of a channel, and the bandwidth limiting effects which constrain the speed of the channel.

Signal loss is a very simple metric, which compares the power fed in at one end of a channel with the power extracted at the other end. The greater the loss, the less power is available to a receiver, and the more difficult it becomes to extract a signal from the ever present noise with a reasonable error rate. Indeed this is a central aspect of Shannon's communications theory. Therefore the lower the loss rate of the medium, the more useful it becomes for carrying information.
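
To put this in concrete terms, Shannon's capacity relation C = B log2(1 + S/N) ties the achievable bit rate directly to the signal power surviving the channel. A small Python sketch, using purely illustrative bandwidth, loss and noise figures:

    import math

    def shannon_capacity(bandwidth_hz, signal_w, noise_w):
        """Shannon channel capacity in bits/s: C = B * log2(1 + S/N)."""
        return bandwidth_hz * math.log2(1.0 + signal_w / noise_w)

    # Illustrative figures: 1 mW fed in, 30 dB of channel loss, 1 GHz of
    # bandwidth, and 0.1 microwatts of noise at the receiver.
    received_w = 1e-3 * 10 ** (-30.0 / 10.0)
    print(shannon_capacity(1e9, received_w, 1e-7))   # ~ 3.5 Gbit/s

Every 10 dB knocked off the channel loss buys a tenfold increase in received power, which is precisely why the loss rate of the medium matters so much.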

"Bandwidth limiting effects" are a very rough metric, which really covers a whole range of possible phenomena, all of which result in limitations in the data rate we can push through a medium regardless of the throughput limiting effects of signal loss itself.

Therefore the ideal communications channel is one with infinite bandwidth and zero loss. The closer we can get to this ideal with a practical channel, the greater the possible throughput through this channel.

Prior to the advent of the optical fibre, the best cable technology available was the coaxial cable, colloquially known as coax. Coax is still widely used, and is the primary distribution medium for the cable TV network (a somewhat short-sighted decision in this observer's view).

A coaxial cable is essentially an annular waveguide, with a tubular metal shield and a solid metal core. The annular waveguide cavity between the core and the shield contains a dielectric, either gas, or a solid or foamed plastic like polyethylene or Teflon. The performance of a coax cable is mostly limited by the frequency dependency of the dielectric losses, and the ever present skin effect. Even with an excellent dielectric like air or gas, the skin effect, which confines electrical currents to an increasingly shallow depth in the shield and core, will ultimately limit the usefulness of coax for high speed data. A coaxial cable which has a loss rate of about 30 dB/km at 100 MHz will typically lose thousands of dB/km or more at 10 GHz, and is thus unusable.
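
To see roughly why, consider that the skin effect component of coax loss grows with the square root of frequency, while the dielectric component grows roughly linearly with it. A hedged sketch, with coefficients chosen purely to reproduce the 30 dB/km at 100 MHz figure above:

    import math

    def coax_loss_db_per_km(f_hz, k_skin=2.5e-3, k_diel=5e-8):
        """Rough coax attenuation model: a skin effect term scaling with
        sqrt(f) plus a dielectric term scaling with f. The coefficients
        are illustrative only, not measured cable data."""
        return k_skin * math.sqrt(f_hz) + k_diel * f_hz

    print(coax_loss_db_per_km(100e6))   # 30 dB/km at 100 MHz
    print(coax_loss_db_per_km(10e9))    # hundreds of dB/km at 10 GHz

A real cable will typically do worse again than this simple model suggests, which is how the thousands of dB/km figure arises.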

A typical communications link of this period was built using transistorised analogue, and later digital, repeaters, separated by lengths of coax of the order of a kilometre. To span hundreds of kilometres, you needed hundreds of repeaters. Therefore the economics of coax for long haul transmission are not spectacular, exacerbated by the severe degradation of cable performance with aging and corrosion. A decade is considered to be a reasonable life for a coax cable.

The basic idea behind optical fibres was to create an optical rather than electrical waveguide, to exploit the potentially very low loss rates of very pure glass materials. While very simple in concept, the implementation of an optical cable is anything but trivial, indeed major obstacles in pure glass manufacture and fibre fabrication techniques, as well as materials and design techniques in semiconductor lasers, had to be overcome before the potential of the "light pipe" could be realised.

Since the basic material is glass, the durability problems of metal/dielectric coaxial cables disappear. The primary limitation to the life of contemporary fibre cables is the degradation of the organic materials used in the mechanical construction of the cable itself!

The optical fibre traps light internally by using a cladding and core with different optical refractive indices. Light fed into the core cannot escape since it is internally reflected at the boundary between the core and cladding.
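
The geometry of this trapping is easy to quantify. Snell's law gives the critical angle for total internal reflection at the core/cladding boundary, and the difference in indices sets the numerical aperture, ie the acceptance cone at the fibre face. A sketch with illustrative index values:

    import math

    n_core, n_clad = 1.48, 1.46   # illustrative refractive indices

    # Critical angle at the core/cladding boundary, measured from the
    # normal: rays striking the boundary more obliquely are trapped.
    theta_c = math.degrees(math.asin(n_clad / n_core))

    # Numerical aperture: sine of the acceptance half angle in air.
    na = math.sqrt(n_core**2 - n_clad**2)

    print(f"critical angle: {theta_c:.1f} degrees")   # ~ 80.6
    print(f"numerical aperture: {na:.3f}")            # ~ 0.242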

An optical communications link therefore comprises a high speed laser at one end, a length of fibre, and a high speed receiver at the other end. An electrical signal is used to modulate the laser, and an optical detector in the receiver produces a faint electrical output which is electrically amplified and then demodulated to extract the signal. A simple model, not unlike the coaxial cable link, but using an electrical/optical and optical/electrical conversion at either end of the cable.

As always, the devil is in the details. To best understand the limitations of this scheme, it is helpful to digress into the materials issues, and the evolutionary history of optical fibres.

The loss performance of a fibre is determined by the scattering and absorption the optical signal experiences in the glass. This depends upon the purity of the glass, its homogeneity, and the material properties of the glass itself.

Absorption losses arise from impurities, such as OH (ie water) ions and transition metals in silica glasses, but also from resonances in the glass itself, and harmonics of these. These losses vary with an inverse sixth power law of the wavelength. For typical silica glasses, "windows" of very low loss arise at wavelengths of 0.81 microns, 1.3 microns and 1.55 microns. The intrinsic losses at the latter two wavelengths are typically about 0.3 and 0.15 dB/km respectively. This is between 1/100 and 1/200 the dB/km loss of a coaxial cable! An interesting statistic is that the first transatlantic fibre cable used 95 repeaters, spaced about 70 km apart.
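
Because dB of loss add linearly with distance, repeater spacing follows from a simple power budget. A sketch, with assumed (not quoted) launch power and receiver sensitivity figures:

    def max_span_km(launch_dbm, sensitivity_dbm, loss_db_per_km, margin_db=3.0):
        """Maximum unrepeatered span for a given fibre loss, assuming a
        simple loss limited power budget with a fixed design margin."""
        budget_db = launch_dbm - sensitivity_dbm - margin_db
        return budget_db / loss_db_per_km

    # Assumed figures: 0 dBm launched, -30 dBm receiver sensitivity.
    print(max_span_km(0.0, -30.0, 0.3))    # ~ 90 km at 1.3 microns
    print(max_span_km(0.0, -30.0, 0.15))   # ~ 180 km at 1.55 microns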

In practice many silica fibres can get very close to this loss performance, although fabrication becomes quite finicky. Inhomogeneous glass structure, at a microscopic level, introduces an inverse fourth power Rayleigh scattering loss.

Fibre fabrication technology converged quite rapidly to very low loss fibre designs, and the required solid state laser technology for 1.31 and 1.55 micron operation appeared by the mid eighties.

The bigger issue in fibre design was achieving Gigabit/s and faster transmission speeds over significant distances. To understand why, we have to explore the structure of the fibre, and the limitations of the receivers and laser transmitters.

At this point in time we can classify optical fibre designs into four discrete generations. These are step index multimode fibres, graded index multimode fibres, single mode fibres and advanced single mode fibres.

The simplest fibre design is the step index (SI) fibre, in which the refractive indices of the core and cladding are constant through the cross section of the fibre. A typical fibre of this ilk has a 100 micron core and 140 micron cladding. It has one very ugly limitation, which is termed modal dispersion. The light which is coupled into the fibre travels different distances, depending upon the angle at which it enters the fibre. As a result, part of your signal arrives earlier, and part later. This effect gets worse the longer the fibre cable is. For all practical purposes, the step index fibre is limited to short haul, low speed links.
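
The scale of the problem is easy to estimate with ray optics: the most oblique trapped ray travels roughly n_core/n_clad times further than the axial ray. A sketch with illustrative indices:

    C = 3.0e8                      # speed of light in vacuum, m/s
    n_core, n_clad = 1.48, 1.46    # illustrative refractive indices

    def modal_spread_s(length_km):
        """Delay spread between the axial ray and the most oblique
        guided ray in a step index fibre (ray optics approximation)."""
        length_m = length_km * 1000.0
        return (length_m * n_core / C) * (n_core / n_clad - 1.0)

    spread = modal_spread_s(1.0)
    print(f"spread: {spread * 1e9:.0f} ns over 1 km")              # ~ 68 ns
    print(f"rough limit: {1.0 / (2 * spread) / 1e6:.0f} Mbit/s")   # ~ 7

Tens of nanoseconds of spread per kilometre caps such a fibre at around 10 Megabit/s over a kilometre, and proportionally less over longer runs.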

By the early eighties, the step index fibre was supplanted by the more sophisticated graded index (GI) fibre, in which the refractive index between the core and cladding changes smoothly, rather than abruptly, following a power law or Gaussian profile. The idea was to bend the light rays in such a manner as to minimise the modal dispersion. A typical GI fibre has a 62.5 micron core and 125 micron cladding diameter. While the GI fibre was an enormous improvement over the SI fibre, it still fell short.

The mainstay of today's long haul communications is the single mode (SM) fibre. An SM fibre has a core so small in diameter that it supports only a single mode of transmission, and thus modal dispersion is eliminated. Such fibres typically have core diameters of single microns. The price to be paid for almost unlimited speed is however the small diameter, and thus much more difficult coupling of light into the fibre. Indeed a good part of the higher cost of SM lasers is the need to add a precise optical coupling arrangement to get the light from the laser into the fibre with minimal losses.
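
How small is small enough can be quantified by the fibre's normalised frequency, the V number: the fibre carries only one mode when V falls below about 2.405. A sketch with illustrative single mode fibre parameters:

    import math

    def v_number(core_radius_um, wavelength_um, numerical_aperture):
        """Normalised frequency of a fibre. Only the fundamental mode
        propagates when V < 2.405."""
        return 2.0 * math.pi * core_radius_um / wavelength_um * numerical_aperture

    # Illustrative fibre: 9 micron core (4.5 micron radius), NA of 0.12.
    v = v_number(4.5, 1.55, 0.12)
    print(f"V = {v:.2f}, single mode: {v < 2.405}")   # V ~ 2.19, True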

Once the modal dispersion speed limit was broken, another one was found. This was chromatic or colour dispersion. Chromatic dispersion arises as a result of the dependency of propagation velocity of light upon its wavelength. This in turn is a result of the refractive index of the fibre varying with wavelength. In practical terms this means that a pulse of light transmitted at slightly differing wavelengths arrives slightly earlier or later at the far end of the fibre, depending on the wavelength of the light.

The conventional semiconductor laser is an "impure" light source. Due to the physics of the laser cavity design, it typically puts out light with a Gaussian colour spectrum, centred on its nominal wavelength.

If we are trying to pump pulses through a fibre at Gigabit/s rates, the pulses will be spread in time, in a manner reflecting the colour spectrum of the laser we are using and the length of the fibre.
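
The spreading is conventionally estimated from the fibre's dispersion coefficient D, quoted in picoseconds of spread per nanometre of source linewidth, per kilometre of fibre. A sketch with typical textbook figures for standard fibre at 1.55 microns:

    def pulse_spread_ps(d_ps_per_nm_km, length_km, linewidth_nm):
        """Chromatic dispersion pulse spread: dt = D * L * dlambda."""
        return d_ps_per_nm_km * length_km * linewidth_nm

    # D ~ 17 ps/(nm.km) is a typical figure for standard fibre at 1.55
    # microns; the source linewidths are illustrative.
    print(pulse_spread_ps(17.0, 100.0, 2.0))   # 3400 ps: hopeless at Gbit/s
    print(pulse_spread_ps(17.0, 100.0, 0.1))   # 170 ps: workable

At Gigabit/s rates a bit slot is well under a nanosecond, so the difference between a 2 nm and a 0.1 nm linewidth source is the difference between a dead link and a working one.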

It is a fortuitous accident that the chromatic dispersion of a silica glass fibre is minimised at 1.3 microns, which also happens to be a region of decent loss performance. As a result most early SM fibre systems operated at 1.3 microns.

Advances were subsequently made in laser technology, to produce lasers which had a single dominant spectral line, ie truly spectrally pure lasers. The distributed feedback laser (DFB) incorporates internal corrugations which form a diffraction grating. This in effect tunes the laser very sharply to a single wavelength, thereby defeating the chromatic dispersion effect in the glass.

The DFB laser allowed the much lower loss 1.55 micron transmission region to be exploited, virtually halving the dB/km loss seen in a basic silica glass fibre.

This generation of fibre transmission equipment first appeared in the mid eighties. I had the opportunity to participate in the design of a 140 Megabit/s commercial fibre system, using 1.31 micron technology, in 1984, while also carrying out performance testing of 400 Megabit/s 1.55 micron hardware. At that stage we began to run into the next performance barrier to be beaten, which was the speed (and cost) of the electronics at either end of the fibre, and by default in every repeater.

While the digital portions of the designs could be readily built using Emitter Coupled Logic (ECL) gate arrays and glue chips, the analogue portions of the designs, required to interface to optical detectors and lasers, were a genuine headache. The rate at which the ones and zeroes are clocked is significantly higher than the carrier frequencies of most TV stations!

Laser drivers and wideband front end receivers for such speeds are significantly more difficult to design well than narrowband RF hardware. This is because the circuits must have very low phase distortion, to avoid unwanted shape distortion of the transmitted and received optical signal. I still take much pleasure in the knowledge that I was one of a select few who had mastered this artform.

The result is increasingly expensive hardware with increasing speeds. With the need to insert repeaters every 50 to 200 km, this severely impacted the economics of link design.

Laser performance also proved to be a big issue, since the startup transients of the lasers were often quite nasty.

The basic modulation speed of the lasers became the next target for improvement. In the decade between 1984 and 1994 we saw laser modulation speeds rise from 4.5 GHz to a staggering 20 GHz.

While the availability of such lasers raised the achievable link speeds, the economics of the electrical repeaters inserted between fibre spans did not dramatically improve. Using GaAs MMIC receivers, GaAs logic and very fast ECL, on printed circuit board layouts which must be hand crafted by high speed analogue design engineers, is not a recipe for cheap, economical mass production designs! The problem is further complicated by the basic inflexibility of such hardware, which must be crafted around a specific bit rate and modulation technique. If you want to change either, you have to ditch every single repeater in the link. In an extremely competitive, deregulated telecommunications market, this imposes strong pressures for the rapid amortisation of the investment, yet the same competitive pressures force a minimal profit margin per unit of bandwidth.

Detector sensitivity also became an issue for long haul links, since with increasing distance between repeaters the number of photons in a single bit dropped to dozens. Extracting them from the noise in the detector became increasingly difficult, imposing limits on repeater spacing.
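
The photon arithmetic is sobering. A sketch, with an assumed received power level:

    H = 6.626e-34   # Planck's constant, J.s
    C = 3.0e8       # speed of light in vacuum, m/s

    def photons_per_bit(power_dbm, bit_rate_bps, wavelength_m=1.55e-6):
        """Mean photon count per bit at a given received power level."""
        power_w = 1e-3 * 10 ** (power_dbm / 10.0)
        return (power_w / bit_rate_bps) / (H * C / wavelength_m)

    # Assumed: -50 dBm (10 nW) arriving at 2.5 Gbit/s.
    print(f"{photons_per_bit(-50.0, 2.5e9):.0f} photons per bit")   # ~ 31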

The solution to these problems is direct optical amplification.

Optical Amplification

The basic idea behind optical amplification dates back to the early days of laser research. The basic physics of a laser are based on the idea of exciting ("pumping") a volume of a suitable material (solid or gas) in such a manner, that impinging photons of a specific wavelength can stimulate the emission of further photons of the same wavelength. Therefore if you shine light at that wavelength into one end of a volume of so excited material, you get much brighter light coming out of the opposite end. If you place suitable mirrors at either end, the photons will bounce back and forth and you get an optical oscillator, termed a laser. If one of the mirrors is slightly leaky, you can extract optical power from the laser.

The first serious application of optical amplification was in high power military laser experiments performed in the US during the late sixties and seventies. A low power laser was used as a "master oscillator", and the light from it was fed into a cascade of "power amplifier" stages, which boosted it to a much higher power level. This is termed a MOPA (Master Oscillator Power Amplifier) arrangement.

While the MOPA idea was well understood very early, adapting it to the world of optical fibres was not that simple.

The payoff in using this idea for boosting signals in fibre links is very high. An optical amplifier doesn't care about the signal modulation, it simply spits out some large number of photons for every photon it is fed with, faithfully reproducing whatever the original modulation was. It is thus inherently an extremely fast, low distortion amplifier. It is also a very simple amplifier, since there is no need for any conversion between optical and electrical signals. All that is required is an external source of optical excitation to "pump" the lasing medium in the amplifier.

The gain of an optical amplifier, ie the ratio of photons emitted per photons received (or more precisely the ratio of output optical power to input optical power), is dependent upon the length of the amplifier and the gain per unit length of the excited medium. Therefore very high gains can be achieved by cascading multiple optical amplifier stages. Noise performance can be excellent compared to conventional electro-optical systems.
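
In a uniformly pumped medium the photon flux grows exponentially along the amplifier, so gain expressed in dB grows linearly with length, at least until the pump is depleted and the gain saturates. A sketch with an assumed gain coefficient:

    import math

    def gain_db(gain_per_m, length_m):
        """Small signal gain of a uniformly pumped amplifier:
        G = exp(g * L), so dB gain is proportional to length."""
        return 10.0 * math.log10(math.exp(gain_per_m * length_m))

    # Assumed gain coefficient of 0.3 per metre over a 20 m fibre spool.
    print(f"{gain_db(0.3, 20.0):.1f} dB")   # ~ 26 dB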

By eliminating complex and inflexible electro-optical repeaters in links, costs can be significantly improved and the longevity of a cable installation significantly extended. Performance upgrades in many instances will require only the replacement of the line terminal equipment at either end of the link.

By the early eighties the race was on in the research community to find suitable laser technology for adaptation to the unique fibre environment. Early research explored Raman effect amplification and adaptations of established semiconductor lasers. Both proved to be disappointing, with Raman effect designs requiring unreasonably large pumping power due to their low efficiency, and semiconductor lasers having high distortion.

The breakthrough came in the mid eighties, when a research group at Southampton in the UK devised a rare earth ion doping technique for silica fibres, building on the same sixties research which led to the now standard military Nd:YAG laser, used for bomb guidance. The first Erbium Doped Fibre Amplifier (EDFA) results were published in 1987.

The next important development was the 1.48 micron InGaAsP laser diode, the pump power source required to optically excite the Erbium ions embedded in the glass, to amplify at 1.55 microns. Japan's NTT published their results in 1989.

With a compact and efficient pump source and doped fibre technology the practical EDFA became a reality.

To build one, the starting point is a spool of Erbium doped optical fibre of a suitable length. The input, where the EDFA is coupled to the end of a fibre link, uses a wavelength selective coupler. The coupler is used to feed the flow of pumping photons from the pump laser into the EDFA. These photons propagate along the fibre, exciting it. The "signal" photons pass through the coupler into the EDFA, and are amplified in number as they pass through the excited fibre. When they reach the output end of the spool, they are fed into an optical splitter. Most of the photons go into the next segment of the fibre link, but some are split off to feed a local optical detector. The electrical output from the detector is then used in a negative feedback loop to control the power level produced by the pump laser. In this manner the EDFA gain can be quite precisely controlled.
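
The control loop can be sketched as a simple discrete time integrating controller: the monitored output is compared against a target, and the pump power nudged to close the error. Everything below, including the linear pump to gain model, is an illustrative assumption rather than a real EDFA model:

    def run_gain_control(target_out_mw, input_mw, steps=100, ki=0.5):
        """Toy EDFA pump control loop: an integrating controller trims
        the pump until the monitored output power hits the target."""
        pump_mw = 10.0
        output_mw = 0.0
        for _ in range(steps):
            gain = 1.0 + 2.0 * pump_mw          # toy pump to gain model
            output_mw = input_mw * gain
            error = target_out_mw - output_mw   # negative feedback
            pump_mw += ki * error
        return pump_mw, output_mw

    pump, out = run_gain_control(target_out_mw=10.0, input_mw=0.1)
    print(f"pump settles near {pump:.1f} mW, output near {out:.2f} mW")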

While conceptually the EDFA is fairly simple, the laser physics and system design issues can be quite complex (serious readers are directed to http://132.203.76.61:591/copl/lco/anglais/index.html, which has an excellent collection of material online).

Commercially available EDFA technology at this time covers a wide range of packaging and performance specifications, and a number of different pumping wavelengths and pumping designs. Current designs commonly employ a two stage arrangement, using silica glass, combinations of silica glass and fluoride glass, or fluoride glass fibres alone.

Noise figures typically vary between 4.5 and 9 dB (competitive with GaAs electrical receivers), gains between 25 and 40 dB, and output power levels between 13 and 20 dBm. We are also seeing the first commercial designs optimised for 1.3 micron systems, using praseodymium doped fluoride fibre amplifier (PDFFA) technology, and achieving similar performance to 1.55 micron EDFAs.
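
These figures matter most when stages are chained, whether inside a two stage amplifier or along a long link: noise figures combine via the familiar Friis cascade formula, with the first stage dominating if its gain is high. A sketch using figures within the quoted ranges:

    import math

    def cascade_nf_db(stages):
        """Friis formula for the noise figure of cascaded amplifiers.
        stages: list of (gain_db, nf_db) tuples in signal order."""
        f_total, g_running = 0.0, 1.0
        for i, (g_db, nf_db) in enumerate(stages):
            f = 10 ** (nf_db / 10.0)
            f_total += f if i == 0 else (f - 1.0) / g_running
            g_running *= 10 ** (g_db / 10.0)
        return 10.0 * math.log10(f_total)

    # Two stages within the quoted ranges: 30 dB gain at 5 dB NF,
    # followed by 25 dB gain at 7 dB NF.
    print(f"{cascade_nf_db([(30.0, 5.0), (25.0, 7.0)]):.2f} dB")   # ~ 5.01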

At the time of writing the leading players in the market were JDS-FITEL, ORTEL, FTI, Bosch Telecom, NTT and Galileo Corp.

Optical amplifiers will see further improvements in coming years, as the technology matures and is further refined. We are already seeing significant reductions in the cost of long haul telecommunications, and this trend will continue as the technology further proliferates in the market. For the foreseeable future the performance bottleneck for long haul fibre links will continue to be in the line terminal equipment, limited by laser and electronic circuit costs.


