Ethernet vs. Camera Link
John Egri, Imperx, May 1, 2006
A need for speed
Over the past decade, test engineers' perception of video as a component in a variety of automated testing applications has gone from "Why would I need it?" to "Gotta have it." Test engineers have found that the visual information acquired from high-performance digital cameras can be useful in many test-related situations, from assembly-robot guidance and verification to flexure measurement of consumer-electronic subassemblies during vibration testing.
While specialized industrial digital cameras can acquire the image data, they still have to pipe that data to a host computer, where it can be integrated with measurements obtained through more traditional data-acquisition means. There are numerous ways to connect cameras and computers, but currently only two video-acquisition standards are fast enough and have a large enough industrial installed base to qualify as top contenders: Camera Link and Gigabit Ethernet (Figure 1).
|Figure 1. Camera Link cameras (left) use one or two D-shell connectors to transmit digital video information, while Gigabit Ethernet cameras (right) use a standard Gigabit Ethernet connector.|
The Automated Imaging Association (AIA), a consortium of more than 200 camera and frame-grabber vendors, developed Camera Link to provide a standard interconnection between digital cameras and frame grabbers. Prior to Camera Link's introduction in 2000, a number of nonstandard parallel interfaces employed either low-voltage differential signaling (LVDS) or RS-422 signaling. Different connectors and pin-outs made cable production difficult and confusing, and no standard communications method for configuring cameras existed. Figure 2 shows the hardware structure behind "base" mode Camera Link, which is based on the earlier Channel Link interface. Channel Link consists of a set of driver/receiver pairs, each sending an LVDS signal directly over a pair of conductors. Each driver accepts 28-bit single-ended data, plus a clock. The 28 bits include 24 bits of video data (organized as three 8-bit ports) and four strobes (FVAL, frame valid; LVAL, line valid; DVAL, data valid; and a spare), multiplexed 7:1 and serialized. The resulting four data streams plus clock transfer over five LVDS pairs. The clock rate runs from 20 to 85 MHz, yielding up to 2.04 Gbps over a 10-m (maximum) cable.
Figure 2. The Camera Link architecture is based on Channel Link (top, highlighted in blue), which consists of a set of driver/receiver pairs, each sending an LVDS signal directly over a pair of conductors. Camera Link (yellow) includes an additional LVDS receiver and driver, and it also adds four more LVDS pairs for discrete general-purpose camera control signals.
Camera Link uses Channel Link for video data, strobes, and clock. It then adds an additional LVDS receiver and driver to provide a bidirectional asynchronous communications channel; the serial line to the frame grabber is called SerTFG, while the serial line to the camera is called SerTC. Camera Link also adds four more LVDS pairs for discrete general-purpose camera-control signals. These signals are often used for external triggering or dynamic exposure control.
The standard provides three signaling modes with progressively greater bandwidth: base, medium, and full (Table 1). Base mode provides four video data lines, as shown in Figure 2. Medium mode adds a second Channel Link to double the number of digital data bits to 48, giving a bandwidth of 4.08 Gbps. Full mode adds a third Channel Link, upping the number of data bits to 64 and the bandwidth to 5.44 Gbps. Base mode uses a single 26-pin D-shell connector. The medium and full modes require two connectors to provide the additional conductor paths.
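The mode bandwidths quoted above follow directly from the number of video data bits multiplied by the maximum pixel clock. A short sketch (the 85-MHz figure is the top of the clock range stated earlier):

```python
# Peak Camera Link video bandwidth per mode: video data bits x max pixel clock.
# Values reproduce the article's 2.04 / 4.08 / 5.44 Gbps figures.
MAX_CLOCK_HZ = 85e6  # top of the 20-85 MHz Camera Link clock range

MODES = {"base": 24, "medium": 48, "full": 64}  # video data bits per clock

for name, bits in MODES.items():
    gbps = bits * MAX_CLOCK_HZ / 1e9
    print(f"{name}: {gbps:.2f} Gbps")
```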
Vision meets Ethernet
Gigabit Ethernet is a generic name for an Ethernet system capable of transporting digital signals in full-duplex mode at 1 Gbps over a maximum cable length of 100 m. Various flavors of Gigabit Ethernet carry signals over various types of links, such as copper, fiber optics, and microwave. Figure 3 shows the most common, 1000Base-T (Ref. 1), which uses five-level (quinary) pulse amplitude modulation (PAM5) to transmit full-duplex 1-Gbps signals over a category 5 cable containing four unshielded twisted pairs of conductors (Ref. 2).
In Gigabit Ethernet, the sending computer breaks its information into discrete packets, which it sends individually over the link. Each packet includes headers and footers carrying protocol information, such as sending and receiving IP addresses. This leads to an encapsulation penalty.
|Figure 3. Gigabit Ethernet employs PAM5 encoding to provide 1-Gbps full-duplex transmission over a four-twisted-pair category 5 cable. Adapted from Ref. 1.|
How large the encapsulation penalty becomes in a video-transmission application depends on the camera setup, over which the user has some control. The tradeoff is that smaller packets lead to less latency but larger encapsulation penalties (and therefore less efficient use of the bandwidth), while larger packets result in a lower encapsulation penalty but contribute to greater transmission latency. Latency is an integral issue with all Ethernet schemes. Whether this interferes with conducting the test depends on the test's goals and how the test program is set up.
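The packet-size tradeoff can be made concrete with a back-of-the-envelope efficiency calculation. The sketch below assumes video travels as UDP/IP payload in standard Ethernet frames (an illustrative assumption; as discussed later, no ratified video-format standard existed for Gigabit Ethernet at the time):

```python
# Rough wire efficiency of Gigabit Ethernet vs. packet size, assuming
# UDP/IP encapsulation (illustrative assumption, not a ratified standard).
# Fixed cost per packet: preamble+SFD (8), Ethernet header (14), IP header
# (20), UDP header (8), FCS (4), and inter-frame gap (12) = 66 bytes.
PER_PACKET_OVERHEAD_BYTES = 8 + 14 + 20 + 8 + 4 + 12

def efficiency(payload_bytes):
    """Fraction of the raw 1-Gbps line rate left for video payload."""
    return payload_bytes / (payload_bytes + PER_PACKET_OVERHEAD_BYTES)

# Small packets waste bandwidth but lower latency; large packets do the opposite.
for size in (64, 512, 1472):
    print(f"{size:4d}-byte payload: {efficiency(size):.1%} efficient")
```

With 64-byte payloads barely half the line rate carries video, while near-maximum 1472-byte payloads push efficiency above 95 percent, at the cost of longer per-packet latency.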
Because Ethernet is a computer-to-computer communication standard, it requires "intelligent" devices capable of protocol processing. The appearance of so-called "smart cameras" that feature built-in microprocessors, and their ready acceptance as the favored choice for incorporating video into test applications, has made Gigabit Ethernet a viable choice.
Choosing your poison
Both Gigabit Ethernet and Camera Link are capable of carrying digital video signals from high-performance industrial cameras to data-acquisition host computers. Which will do the better job in a given situation depends on the application's characteristics.
Gigabit Ethernet has shown itself to be a stellar performer in applications requiring long link lengths and simple cabling schemes and where trigger timing and latency are not issues. For example, this standard is the overwhelming favorite for video surveillance applications.
Gigabit Ethernet is ideal for many applications in which a cable length of 10 m is simply inadequate, such as in control systems for large commercial aircraft, radar systems, or high-energy physics experiments. These physically spread-out applications can benefit from the use of video to verify whether, for example, an aircraft control surface has moved the correct amount.
Additionally, Ethernet's easy cabling solution might be an advantage to engineers setting up a production facility with multiple pick-and-place robots or several bare-board and solder-inspection stations. All that is needed to hook up a Gigabit Ethernet link is to plug in the ends of a standard cable of the right length. Keep in mind, though, that most computer Ethernet ports are not Gigabit Ethernet ports (see "When does Ethernet mean 'Gigabit Ethernet'?").
The fact that Gigabit Ethernet is used in a wide range of applications, reaching far beyond just those requiring the sharing of video signals, offers its own advantages: reduced cost and increased availability of technical assistance. Wider acceptance leads to larger manufacturing volumes, greater economies of production scale, and reduced costs for components such as cables and connectors. Wider acceptance also leads to larger user communities and greater availability of experts in the setup and use of such systems.
Despite its short cable length, Camera Link does offer advantages of its own. It includes separate out-of-band pathways for camera-control signals, making it possible to modify camera settings on the fly without interfering with video transmission, as well as for discrete control signals such as triggers, which are critically important for many data-acquisition and test applications. Because control and video travel on separate paths, the full bandwidth goes to the video payload with no protocol overhead or encapsulation penalty, and no added latency.
Being specifically a video-transport standard, Camera Link provides all of the protocols needed to move a signal from the image sensor to the computer memory. Ethernet, being a general data-communications standard, does not have a formal description of how to format video signals. While such a video standard (called GigE Vision) is in committee, it has not yet been completed or ratified. In the meantime, it is up to the equipment manufacturer to specify video data formatting. Since Camera Link already has such a description, you can be confident that every Camera Link camera will work with every Camera Link frame grabber.
While requiring the use of a frame grabber increases installation complexity and expense, the Camera Link frame grabber provides resources that can help reduce the host computer's processor workload. Frame grabbers generally have an onboard buffering capability to ease computer-bus traffic congestion. High-end frame grabbers also can apply some simple hardware-based image-processing algorithms, such as thresholding and filtering, to further reduce the load on the host computer.
With its full-mode bandwidth of 5.44 Gbps, Camera Link provides up to a factor of 5 greater bandwidth than the 1-Gbps speed of Gigabit Ethernet. So, for high-speed, high-resolution applications, Camera Link is generally the better choice.
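That bandwidth gap translates directly into frame-rate headroom. The sketch below uses a hypothetical 1-megapixel, 8-bit camera (illustrative numbers, ignoring all protocol overhead) to compare the ceilings of each link:

```python
# Illustrative frame-rate ceilings for a hypothetical 1-Mpixel, 8-bit camera,
# ignoring protocol overhead: frames/s = link bandwidth / bits per frame.
FRAME_BITS = 1_000_000 * 8  # 1 Mpixel x 8 bits/pixel

LINKS_BPS = {
    "Gigabit Ethernet": 1.00e9,
    "Camera Link base": 2.04e9,
    "Camera Link full": 5.44e9,
}

for name, bps in LINKS_BPS.items():
    print(f"{name}: up to {bps / FRAME_BITS:.0f} frames/s")
```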
Most inspection and test applications are much smaller than commercial aircraft and thus do not require the long cable lengths that are available with Gigabit Ethernet. For example, a wave-solder inspection system is easily served by Camera Link's 10-m cables. In fact, the network latency of Ethernet could be a critical problem for an inspection system checking for defects on bare boards moving down a high-speed conveyor. A trigger signal delayed by Ethernet latency would make reliable inspection impossible. While installing separate trigger lines to coordinate video acquisition with data acquisition may be a viable solution, it often defeats the whole point of using a networking standard.
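To see why trigger latency matters on a conveyor, consider how timing jitter maps to board-position uncertainty. The numbers below are hypothetical, chosen only to illustrate the scale of the problem:

```python
# How trigger-latency jitter becomes board-position uncertainty on a moving
# conveyor. Both input values are hypothetical, for illustration only.
conveyor_speed_m_per_s = 0.5   # assumed conveyor speed
latency_jitter_s = 1e-3        # assumed worst-case network trigger jitter

uncertainty_mm = conveyor_speed_m_per_s * latency_jitter_s * 1000
print(f"Position uncertainty: {uncertainty_mm:.2f} mm")
```

Even a millisecond of jitter smears the capture position by half a millimeter at this speed, which can be fatal for fine-pitch bare-board inspection; a hardware trigger path avoids the problem entirely.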
Frame-grabber suppliers have been manufacturing LVDS frame grabbers, which evolved into Camera Link frame grabbers, for more than 20 years. These companies produce market-tested software development kits (SDKs) and imaging libraries that provide image-analysis algorithms used in many machine-vision and inspection applications. Gigabit Ethernet, being a newer technology, lacks these extensive imaging libraries. Gigabit Ethernet camera vendors are currently developing such software tools, and soon many Camera Link frame-grabber vendors will also offer imaging SDKs based on Gigabit Ethernet platforms.
The choice of whether to use Gigabit Ethernet or Camera Link in a particular test application is up to the system integrator. In most cases, one or more application characteristics will mandate one or the other standard. In other applications, there will be no clear winner—either will do a good job. At that point, it's a matter of personal preference.
Table 1. Comparison of Camera Link mode specifications
|Mode|8-bit ports|Video data bits|Connectors|Bandwidth (Gbps)|
|Base|A, B, C|24|1|2.04|
|Medium|A, B, C, D, E, F|48|2|4.08|
|Full|A, B, C, D, E, F, G, H|64|2|5.44|