Research Horizons

The Vapor Cell Atomic Clock

John Coffer, Jeremy Milne (in back), and James Camparo stand in front of their laser-pumped, rubidium-atom, vapor-cell-clock test bed.

Advanced atomic clocks suitable for space deployment must support extended periods of autonomous constellation operation, which enhances system robustness; they can also lower a mission control station’s workload. Air Force space programs that depend on precise timekeeping, such as GPS, Milstar, and Advanced EHF (AEHF), impose constraints on the size, weight, power, and environmental sensitivity of spacecraft atomic frequency standards.

James Camparo, Electronics and Photonics Laboratory, said, “The specific objective of this effort is to develop prototypes of advanced rubidium vapor-cell and cesium atomic-beam spacecraft clocks, and to aim the development of these prototypes toward improving performance while reducing the overall size, weight, and power of the clock.” The development of these prototypes is designed to help solve the scientific and engineering problems confronting next-generation spacecraft clocks. Those working on this effort also include John Coffer and He Wang, both of the Photonics Technology Department.

The operation of an atomic clock requires the creation of a population imbalance between two atomic states connected by a microwave transition: the greater the imbalance, the better the frequency stability of the clock. In current rubidium clocks, such as those used for GPS, the population imbalance is created by optical pumping with a discharge lamp. For these devices, fractional population imbalances of ~0.1 percent are typical. Theoretical work conducted by Camparo and his team has shown that the population imbalance could be increased by nearly two orders of magnitude using a diode laser. Additionally, efforts are underway to use coherent (laser-induced) atomic excitation processes to generate atomic clock signals without a population imbalance. These efforts are aimed at chip-scale atomic clocks and take advantage of a phenomenon called coherent population trapping (CPT). While most research organizations focus on ground-based standards, The Aerospace Corporation’s laser-pumped rubidium clock activities concentrate on compact devices suitable for space applications. Two significant problems in this area include understanding the origin of excess noise in laser-pumped clock signals (and developing means for its mitigation), and creating means for smart-clock technology (i.e., a clock that senses and corrects perturbations that could lead to frequency instability).
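The link between population imbalance and frequency stability follows from a standard rule of thumb for passive atomic frequency standards (a textbook relation, not stated in the article): the short-term Allan deviation scales inversely with both the atomic line quality factor and the signal-to-noise ratio,

```latex
\sigma_y(\tau) \;\approx\; \frac{1}{Q \cdot \mathrm{SNR}}\,\tau^{-1/2},
\qquad Q = \frac{\nu_0}{\Delta\nu},
```

so a nearly hundredfold increase in population imbalance, which raises the clock signal and hence the SNR, translates directly into improved short-term stability.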

Michael Huang adjusts a diode laser used to generate an atomic clock signal. In this experiment, the microwave signal is superimposed on an optical carrier, a technology that has allowed atomic clocks to reach “chip-scale” dimensions.

In the cesium-beam clocks used in GPS, Milstar, and AEHF, a population imbalance between atomic states is achieved by passing an atomic beam through state-selecting magnets. These magnets transmit less than 1 percent of the atoms in the beam. Previous studies prepared by The Aerospace Corporation showed that 100 percent of the beam could be used if magnetic state selection was replaced with laser optical pumping. In addition to increasing clock signal, optical state preparation uses the clock’s cesium supply efficiently, increasing clock lifetime. Though laser-pumped beam clocks in many other laboratories are large instruments in carefully controlled environments, the efforts at The Aerospace Corporation focus on compact, lightweight devices suitable for spacecraft use.

A second major application of lasers in cesium beam clocks relates to atomic momentum manipulation. Using lasers to slow the speed of atoms (i.e., longitudinal cooling) increases the time that the atoms spend in a microwave cavity, thus narrowing the clock transition’s line shape. Transverse cooling results in the beam’s collimation and “brightening,” thus improving the clock’s signal-to-noise ratio. A significant technological problem addressed in this area is the creation of a cold, continuously operating (as opposed to pulsed) atomic-beam clock for use onboard spacecraft.
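The narrowing can be made quantitative with a standard textbook estimate (an assumption of this sketch, not a figure from the article): for a microwave interrogation time $T$ in the cavity, the clock transition’s linewidth is transit-time limited,

```latex
\Delta\nu \;\sim\; \frac{1}{2T},
```

so halving the atomic speed doubles the time spent in the cavity and roughly halves the linewidth, directly raising the line $Q$ on which clock stability depends.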

Camparo said, “Over the years, MOIE atomic clock investigations have provided the basis for continuous technical support to the Air Force and national security space programs. This support has primarily been to on-orbit anomaly resolution, assistance in manufacturer clock development efforts, and simulations of system-level timekeeping.”

In the coming year, The Aerospace Corporation’s research team will continue to operate its atomic-clock flight simulation test bed for Milstar/AEHF rubidium atomic clocks. In particular, this will include exercising the rubidium clock under stressing conditions and developing means to mimic the behavior of a mixed Milstar/AEHF constellation. The team is also investigating the operation of the RF-discharge lamps that produce the atomic signal in the rubidium clocks flown on Milstar, AEHF, and GPS satellites. These investigations have shown that RF power variations in the lamp circuitry affect operation primarily by heating the rubidium vapor within the lamp, a finding that may help explain the anomalous on-orbit clock frequency jumps observed for a number of GPS satellites. The team also continues to examine integrity monitoring for the GPS system, in which the clock autonomously senses that a problem has occurred and sets the satellite’s navigation message to nonstandard code. While the second-harmonic signal from the rubidium clock is used as a status-of-health indicator, how this signal depends on various clock parameters is not well understood; research is aimed at addressing that question. Finally, the team constructed a Monte Carlo simulation of AEHF system timekeeping and used it to verify the contractor’s ability to meet certain system-level requirements.
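A Monte Carlo timekeeping simulation of this general kind can be sketched in a few lines. The snippet below is a minimal illustration, not the actual AEHF model: each clock’s fractional frequency is modeled as white frequency noise plus a random-walk component and a deterministic drift (the noise levels are assumed, purely illustrative values), and the time error is the running integral of frequency across an ensemble of clocks.

```python
import numpy as np

rng = np.random.default_rng(0)

def clock_time_error(n_steps, dt, white, rw_step, drift):
    """Simulate one clock's accumulated time error (seconds).

    white:   white frequency noise level (fractional frequency per step)
    rw_step: per-step size of the random-walk frequency component
    drift:   linear fractional-frequency drift rate (1/s)
    """
    freq = np.cumsum(rng.normal(0.0, rw_step, n_steps))   # random-walk frequency
    freq += rng.normal(0.0, white, n_steps)               # white frequency noise
    freq += drift * dt * np.arange(n_steps)               # deterministic drift
    return np.cumsum(freq) * dt                           # integrate to time error

# Ensemble of 20 clocks over one day at 100-second steps (illustrative parameters)
errors = np.array([clock_time_error(864, 100.0, 1e-12, 1e-15, 1e-19)
                   for _ in range(20)])
worst = np.abs(errors[:, -1]).max()
print(f"worst-case ensemble time error after 1 day: {worst * 1e9:.1f} ns")
```

Checking the worst-case spread of such an ensemble against a time-error budget is the kind of system-level requirement verification the article describes.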

Advanced Visible and Infrared Focal-Plane Sensors

Bruce Lambert works on the dynamic infrared modulation transfer function measurement system.

Space-based electro-optical (EO) imaging systems collect vast quantities of data across various spectral regimes from a wide range of orbital altitudes. These systems range in size and complexity from units as small as consumer cameras to structures as large as NASA’s Webb infrared (IR) telescope/observatory with its 20-foot-diameter primary mirror. At the heart of an EO system, focal-plane imaging chips convert optical data into electronic analog (and eventually digital) signals for each pixel.

An Aerospace study, “Advanced Visible and Infrared Sensors,” has been investigating characteristics of these devices—in particular, signal, noise, and image quality. Funded by the Mission Oriented Investigation and Experimentation (MOIE) program, the study has examined how those properties are analytically modeled, as well as their experimental characterization. The experimental work is particularly important in diagnosing anomalies and design errors and in describing the devices’ fundamental imaging properties, thus providing feedback for design improvement.

Terry Lomheim, distinguished engineer in the Sensor Systems Subdivision, explained that “Visible and IR focal-plane devices are complex, mixed-mode (analog and digital) light-sensing integrated circuits (ICs). The most familiar ones—charge-coupled devices (CCD) and complementary metal-oxide semiconductor (CMOS) focal planes designed for detecting light in visible wavelengths—are part of cell phone cameras, camcorders, and digital still cameras. They consist of single (monolithic) silicon IC chips with numerous pixels wherein light enters through the frontside of the device.” Lomheim is the team’s principal investigator, and his coinvestigators are Jonathan Bray and Bruce Lambert of the EO Device Evaluation Lab and Jerry Johnson and Jeff Harbold of the Visible and IR Sensor Systems Department.

One motivation behind the project, Lomheim noted, is the fact that lower payload mass, power, and volume result in lower sensor-system life-cycle costs: “Smaller payload mass and power level increase compatibility with commercial spacecraft buses, for instance, and allow the use of lower-cost launch systems,” he said. “Improved radiation hardness may allow the use of orbital altitudes that are associated with higher space radiation dose levels, but are more cost-effective in terms of overall sensor constellation architecture.” Moreover, he said, advances in payload signal processing can reduce the cost of ground systems. “Visible and IR camera systems that collect images in many spectral bands, measure changes in the polarization of light, or operate at extremely low light levels all might enhance the information-extraction ability—and therefore the utility—of space EO sensor missions.”

The DOD and NASA have used advanced versions of these devices for several decades, and the architectures are maturing, with higher detection efficiencies, improved sensitivity, higher frame rate (the rate at which unique images are consecutively produced), larger pixel formats, and on-chip analog-to-digital conversion (ADC). On-chip ADC enables these devices to operate in a “photons in, bits out” manner.

Advanced devices include extremely thin silicon imagers that collect light through the backside for enhanced sensitivity, as well as hybrid imagers. In the hybrid imagers, a grid of light-sensing pixels is mated to a corresponding grid of pixel unit cells inside a readout integrated circuit (ROIC). These pixel unit cells process the signal photocurrent, converting it to signals in the voltage domain. Each one contains a photocurrent/charge-to-voltage conversion preamplifier with a minimum of three transistors.

The wide variety of EO camera applications dictates a wide diversity of focal-plane requirements for parameters such as line rate, frame rate, dynamic range, linearity, operability, noise/sensitivity levels, and radiation hardness. As a result, a broad range of operating characteristics is needed, one that includes distinctly different focal-plane pixel unit cell electronics, multiplexing circuits, numbers of ADCs, operating modes, and operating temperatures. Aerospace has been examining how to optimize focal-plane designs to meet the appropriate signal-to-noise and image-quality requirements despite the limitations of the detecting material technologies and the IC manufacturing process.

In the area of advanced signal and noise modeling, Lomheim’s team has concentrated on focal plane arrays (FPAs) with built-in ADC capability, novel unit cell ROICs, circuits optimized for processing multispectral and hyperspectral data, and new detector technologies that span the visible to longwave IR region. Special pixel unit cells capable of wide dynamic range and low noise may further enhance these applications.

In 2009, the team also concentrated on predictive signal and noise models for focal-plane devices that use smaller photolithographic design features. These devices will be manufactured using 0.18-micron CMOS design rules and must function at cryogenic temperatures. This represents a new operating regime for the key transistors in the mixed-mode pixel unit cell circuits.

Aerospace-developed narrowband Er:YAG laser seed source. An Er:YAG crystal is configured in a nonplanar ring oscillator (NPRO) geometry to achieve narrowband output at 1645 nanometers with a line width less than 1 megahertz. The output of the NPRO will seed a larger Q-switched laser to generate high-peak-power narrowband pulses for eye-safe LIDAR applications. The observable green emission, derived from an optical upconversion process, traces the infrared optical path within the NPRO resonator.

Simplified frequency-agile midwave-infrared source concept with optical parametric amplification (OPA) stage. Tunable 3-micron infrared (IR) light is generated by mixing Nd:YAG laser pulses with the output of a tunable 1.5-micron laser diode within a difference frequency generation (DFG) crystal. The infrared output from the crystal is subsequently amplified within a second nonlinear OPA crystal.

Aerospace is also studying image-quality measurements. CCD visible focal-plane technology—the workhorse for advanced imaging systems since the mid- to late 1970s—is gradually being replaced by frontside- and backside-illuminated monolithic CMOS and hybrid silicon PIN visible focal-plane approaches. (A PIN photodiode serves as a light-receiving element to convert an optical signal into an electrical one.) New system applications of interest include panchromatic and multispectral sensors that require large-area, high-frame-rate two-dimensional arrays also capable of ultrawide dynamic range (i.e., full sunlight to night imaging). Key figures of merit for these focal planes include the modulation transfer function (MTF) or, equivalently, the point spread function (PSF); noise floor; well capacity; and uncorrected pixel gain and offset nonuniformity effects. Precise characterization of these parameters for a high-performance visible focal plane requires precise multicolor calibration of the optical system. Image quality is sensitive to spatial noise effects, which are determined empirically by nonuniformity and nonlinearity characterization over the pixel dynamic range. When the low end of this dynamic range covers lunar illumination, simulation in a laboratory setting requires optical setups involving multiple light sources and extreme “light tightness.”

In another image-quality measurement activity, Aerospace’s MTF and spot-scan characterization capabilities have been refined to enable precision-staring pixel spot-scanning over a wide range of spectral wavelengths. In this technique, a small spot of light is generated and moved around a pixel for diagnostic purposes. The Aerospace effort involved a confocal microscopic setup aimed at detailed pixel inspection in support of the spot-scanning work. The work of Lomheim’s team improves the corporation’s ability to cover these new measurement regimes to support SMC and other customers developing large, small-pixel visible/IR focal planes for an ultrawide dynamic range.

The MOIE project has scrutinized the process of modeling the focal plane sensors, with productive results. Understanding the properties of new imaging devices is vital to the design and planning of imaging systems, and one way Aerospace is achieving this understanding is through modeling the spectral MTF and PSF characteristics of the latest focal-plane pixel designs. The MTF characterizes, as a function of spatial frequency, the response of the array to increasing spatial detail in the scene being observed. The PSF describes a system’s response to a point source, like a star. Such modeling will provide crucial design guidance in the development of these large arrays.
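The two figures of merit are equivalent because they form a Fourier-transform pair: the MTF is the normalized modulus of the Fourier transform of the PSF. A minimal numerical sketch of that conversion (using an assumed Gaussian PSF on an illustrative grid; this is not Aerospace’s modeling code):

```python
import numpy as np

def mtf_from_psf(psf):
    """Compute the 2-D MTF as the normalized modulus of the PSF's Fourier transform."""
    otf = np.fft.fftshift(np.fft.fft2(psf))  # optical transfer function, DC at center
    mtf = np.abs(otf)
    return mtf / mtf.max()                   # normalize so the DC response is 1

# Illustrative Gaussian PSF on a 64 x 64 pixel grid
x = np.arange(64) - 32
xx, yy = np.meshgrid(x, x)
psf = np.exp(-(xx**2 + yy**2) / (2 * 3.0**2))

mtf = mtf_from_psf(psf)
print(mtf[32, 32])  # center (zero spatial frequency) response
```

The same transform, applied to empirical spot-scan PSF data rather than an analytic model, underlies the measurement methodology described later in this section.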

In CMOS visible imager MTF and PSF modeling, Aerospace has refined numerical two-dimensional Fourier transform methodology for converting empirical spot-scan–generated PSF data to a precision MTF description of a pixel response. This has proven successful and has clarified certain effects, thought to have been data anomalies, as real physical effects in the pixel response. The effort employed a new process involving the mapping of multipixel data into a single effective pixel grid. This allows much shorter data-collection times and avoids data uncertainties associated with systematic drifts and slow instabilities in the spot-scanning optical setup. The effort also demonstrated how model development directly affects experimental research work and vice versa.

The Aerospace sensor project has completed a significant upgrade to its experimental color-dependent spot-scan capability. The updated configuration includes additional diagnostic tools that more completely characterize the operation of the system and a confocal microscope fitted into the optical system for more precise determination of spot focus. The new configuration permits acquisition of highly accurate and repeatable wavelength-dependent pixel response data with time-reduction factors as high as 100.

Aerospace used two independent experimental techniques to derive wavelength-dependent MTF data for two CMOS imagers: a tilted-knife-edge method, with an Offner relay optical reimaging system, and the spot-scanning method described above. These techniques quantified the impact of design and manufacturing variations on the color-dependent MTF characteristics of the imagers. Specific diffusion-related and pixel circuitry layout effects were precisely correlated to the measured spectrally dependent MTF degradation.

This MOIE project’s improved, efficient MTF/PSF laboratory characterization capability has enabled the detailed color-dependent characterization of a frontside-illuminated CMOS imager (developed by JPL) using precision spot-scanning and corresponding/confirming tilted-knife-edge MTF characterization. Lomheim described the imager’s electronics: “This CMOS imager has a spacing between adjacent photodiode pixels of 9 microns. Its photodiodes are formed between an n well and p epitaxial layer, characterized by a lower doping level and hence a much deeper depletion depth than would prevail for typical cell phone camera CMOS imagers. For this device, the photodiode area is inscribed toward the center of a pixel pitch and surrounded by pixel electronics and an opaque contact along one direction and pairs of overlying metal lines along the orthogonal direction.”

This type of detailed pixel-level examination of the relationship between the device manufacturing layer parametrics and the imager’s EO imaging capability is essential to improving this technology and guiding it toward the future goals and requirements of Aerospace customers.

Lasers For Space Applications

In 1971, Aerospace performed its first illumination of a Defense Support Program satellite in orbit to calibrate the sensor on board. The illumination from the ground was accomplished with a hydrogen fluoride (HF) laser, which emits light near 3 microns. For the next 25 years, this laser was used for all Aerospace satellite illuminations and became the cornerstone of Aerospace’s laser beacon effort. Aerospace’s success led to an increasing demand for this capability, as well as the desire to illuminate satellites from multiple ground sites. This prompted the need to develop a more reliable, transportable, and user-friendly replacement for the HF laser. By the mid-1990s, an Aerospace Mission Oriented Investigation and Experimentation (MOIE) effort began for this purpose, and led to the development and implementation of two solid-state 3-micron sources—an Er:YAG laser and an optical parametric oscillator (OPO). World-record output power and efficiencies were achieved with both devices.

Ongoing research involves the evaluation and development of new laser technologies for improving defense capabilities in remote sensing and satellite sensor calibration.

“Our most recent laser development efforts have focused on a 3-micron wavelength-agile source for remote detection of toxic chemical species; a narrowband eye-safe 1.6-micron laser for various light detection and ranging (LIDAR) applications, including cloud, wind, and plume detection; and a 4.5-micron laser source for national security space applications,” said Todd Rose, principal investigator of the project and laboratory manager in the Photonics Technology Department (PTD). Coinvestigators from PTD include DaWun Chen, senior scientist, and Steven Beck, department director.

“Frequency-agile laser sources are useful for remote sensing applications that use differential absorption LIDAR, or DIAL, to detect trace chemical species in the gas phase. DIAL can be used to track plumes of toxic industrial chemical vapor formed by accidental or adversary-caused release near populated areas or other areas of interest. Detection of multiple species in a timely manner requires laser systems whose frequency (color) can be tuned quickly and accurately to select spectral absorption features of target gas species,” Rose said. The team is working on a rapidly tunable 3-micron DIAL source, which is based on a nonlinear optical approach called difference frequency generation and optical amplification. “The goal of this effort is to demonstrate a 10-watt wavelength-agile system using an available high-power 37-kilohertz-repetition-rate Nd:YAG laser pump and commercially available telecom tunable laser diodes,” Rose said.
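The DIAL retrieval itself rests on comparing the return at an absorbed (“on”) wavelength with the return at a nearby reference (“off”) wavelength; by the Beer-Lambert law, the path-averaged number density of the target gas follows from the log ratio of the two returns. A minimal sketch with assumed, purely illustrative numbers (not values from the article):

```python
import math

def dial_concentration(p_on, p_off, delta_sigma, path_m):
    """Path-averaged number density (molecules/m^3) from DIAL returns.

    p_on, p_off: received powers at the absorbed and reference wavelengths
    delta_sigma: differential absorption cross section (m^2)
    path_m:      one-way path length to the scatterer (m)
    The factor of 2 accounts for the round trip through the absorbing gas.
    """
    return math.log(p_off / p_on) / (2.0 * delta_sigma * path_m)

# Illustrative: 20 percent extra on-line attenuation over a 1 km path
n = dial_concentration(p_on=0.8, p_off=1.0, delta_sigma=1e-23, path_m=1000.0)
print(f"path-averaged density: {n:.3e} molecules/m^3")
```

Because the retrieval depends only on the ratio of the two returns, rapid on/off wavelength switching, the frequency agility Rose describes, lets a single system interrogate absorption features of multiple species in quick succession.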

For defense applications, a tunable OPO is being developed to provide output near 4.5 microns. This device will be pumped with a 20-watt, 1.9-micron thulium fiber laser and will generate midwave-infrared output via a nonlinear optical process similar to difference frequency generation. A second approach, using a pulsed holmium YAG laser to pump an OPO, is also being pursued. Other LIDAR applications, such as the characterization of winds in the vicinity of aircraft or target identification on a battlefield, necessitate the use of eye-safe sources. A compact narrowband Q-switched (pulsed) Er:YAG laser is being constructed for this purpose. A key component of this laser system is a new Aerospace-developed nonplanar ring oscillator (NPRO) that provides narrowband seed light at 1.6 microns. This is the first demonstration of an NPRO operating in this important eye-safe wavelength region.

Back to the Spring 2010 Table of Contents