Measured bars of light
Martin Rowe, Senior Technical Editor, September 1, 2007
Daniel Shi, a staff test engineer at Coherent (Santa Clara, CA), is responsible for metrology, calibration, and measurement-system analysis for production test of diode lasers, including single-emitter, bar, and stack products. These semiconductor devices produce light at levels from less than 1 W up to 1500 W. Coherent’s diode lasers are used in industrial, medical, and military applications that require high power levels. Martin Rowe spoke with Shi by telephone to learn more about his work.
Q: What is a diode-laser bar?
A: It’s an array of laser diodes that produce photons from electrical power. Bars can produce 100 W of light (we’re developing higher-power bars). Each bar is cleaved from a semiconductor wafer, and a typical bar measures 1 cm long and 150 µm thick. The third dimension, normally referred to as “cavity length,” varies from less than 1 mm to several millimeters. We can stack bars vertically or horizontally to increase the power of each packaged final product.
Q: What are your roles in test and measurement?
A: I’m responsible for three areas: metrology, calibration, and measurement-system analysis for production.
Q: Please differentiate between metrology and calibration.
A: Testing a laser bar involves measurements such as optical power, wavelength, and operational current. In my metrology work, I investigate instrumentation and procedures to best measure these properties. Our ability to measure these properties effectively influences our internal operations, from development through pass/fail decisions on the production line.
My calibration work involves correctly transferring known standards from outside sources such as NIST to dozens of production stations and carefully maintaining them to ensure measurement accuracy. I lead a group of technicians, equipment engineers, and manufacturing engineers in developing calibration procedures for production test stations. Measurement accuracy is important so that we meet customer expectations.
Q: How do you analyze measurement systems?
A: Often, if you repeat a measurement on the same station, the numerical results will differ slightly. In measurement-system analysis, I use statistical theory and analysis tools on test data to determine the performance and capability of test stations. I pick samples from the normal production stream and repeatedly run them through the targeted test station under controlled conditions.
I assess the sources of the discrepancies between measurements, the so-called "uncertainty," and determine how much of the error comes from the equipment. From that data, I can determine whether the station or the process needs improvement.
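The repeatability analysis Shi describes can be sketched in a few lines. This is a minimal illustration, not Coherent's actual procedure; the readings and the 100-W bar are hypothetical.

```python
import statistics

# Hypothetical repeated optical-power readings (W) for one sample bar
# run through the same test station under controlled conditions.
readings = [98.7, 99.1, 98.9, 99.0, 98.8, 99.2]

mean_power = statistics.mean(readings)      # best estimate of the bar's output
repeatability = statistics.stdev(readings)  # station repeatability (1 sigma)

# Relative spread, which can be compared against the station's error budget.
relative_uncertainty = repeatability / mean_power

print(f"mean = {mean_power:.2f} W")
print(f"repeatability (1 sigma) = {repeatability:.3f} W")
print(f"relative = {relative_uncertainty:.3%}")
```

A full gauge study would repeat this over several parts, operators, and stations to separate equipment error from process variation.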
Q: How do you make production measurements?
A: We use a light-integrating sphere to make wavelength and power measurements. The sphere integrates the total output light, and we attenuate it by allowing only a small amount to reach a calibrated photodetector. We calibrate the photodetector with a NIST-traceable thermopile and optical meter. Their combined uncertainty is ±3%. Measurements also include a profile of a device's light output along the emitting surface [figure], which we call a "near-field measurement."
Diode-laser arrays emit light from one surface only and often require profiling.
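Independent error sources in a calibration chain are conventionally combined as a root sum of squares. The component values below are assumptions chosen for illustration; the article states only the ±3% combined figure, not its breakdown.

```python
import math

# Hypothetical component uncertainties (fractional, 1 sigma) in the chain
# from the NIST-traceable standard to the production photodetector.
u_thermopile = 0.02   # thermopile standard
u_meter      = 0.015  # optical meter
u_transfer   = 0.015  # transfer to the production photodetector

# Independent sources add in quadrature (root sum of squares).
u_combined = math.sqrt(u_thermopile**2 + u_meter**2 + u_transfer**2)
print(f"combined uncertainty = ±{u_combined:.1%}")
```

With these assumed components the result comes out near ±3%, which shows why no single instrument in the chain can be much worse than a couple of percent.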
Q: Is ±3% uncertainty sufficient?
A: In most cases, yes. Some customers, however, want better. Thus, we've embarked on a six-sigma program to further reduce uncertainty in production. It's difficult in many ways. One difficulty arises from a lack of high-accuracy standards. To the best of my knowledge, the uncertainty of the best optical standard that NIST can provide is 0.5%. There's no 1-W or 10-W standard available at the wavelengths we need. Even if we had such a standard, we'd still face technical challenges in accurately transferring it to our instruments.