Demand for digital: Challenges and solutions for high-speed ADCs and RADAR systems
Modern radar systems are being challenged on a number of fronts, with operational requirements now including multi-function processing and dynamic mode adjustment. Moreover, recent changes in frequency allocations mean that many radar systems may operate in close proximity to communications infrastructure and other spectrally demanding systems. With further spectrum congestion anticipated in the coming years, the problem is expected to be compounded to the point that radar systems will need to adapt at run time to their environmental and operational requirements, which is driving the move toward cognitive and digital radar systems.
The need for more digital signal processing is pushing the radar signal chain to transition to digital as early as possible, moving the analog-to-digital converter (ADC) closer to the antenna, which in turn introduces a number of challenging system-level considerations. To explore this further, Figure 1 illustrates a high-level overview of a typical current X-band radar system. Within this system, two analog mixing stages are typically used: the first mixes the pulsed radar return down to a frequency of around 1 GHz, and the second to an IF in the region of 100 MHz to 200 MHz, enabling the signal to be sampled by an ADC running at 200 MSPS or below with a resolution of 12 bits or higher.
Figure 1 – Example Radar Receiver Architecture Utilizing 1st and 2nd IFs
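As a sanity check on this two-stage frequency plan, the arithmetic below walks a return signal through both mixers and the sampling stage. The RF, LO, and IF values are illustrative assumptions consistent with the description above, not figures from a specific design:

```python
# Frequency plan for a two-stage downconversion receiver.
# All values are illustrative assumptions.
rf = 9.5e9          # X-band return frequency (Hz), assumed
lo1 = 8.5e9         # first LO (Hz), assumed
if1 = rf - lo1      # first IF: 1.0 GHz
lo2 = 0.85e9        # second LO (Hz), assumed
if2 = if1 - lo2     # second IF: 150 MHz
fs = 200e6          # ADC sample rate (200 MSPS)

# 150 MHz lies in the second Nyquist zone of a 200 MSPS converter
# (Nyquist frequency 100 MHz), so it folds down to fs - if2 = 50 MHz.
alias = fs - if2
print(if1 / 1e9, if2 / 1e6, alias / 1e6)
```

The second IF is deliberately placed so that undersampling folds it into the first Nyquist zone without overlapping the signal bandwidth.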
Within this architecture, functions such as frequency agility and pulse compression may be implemented in the analog domain, which can require signal-processing modifications and adjustments; for the most part, however, system functionality is limited by the digitization rate. It should be noted that even sampling at 200 MSPS has enabled a significant leap forward in radar processing, but the next stage in this evolution requires migrating even further toward the all-digital radar.
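Pulse compression is one of the functions that benefits most from moving into the digital domain. A minimal matched-filter sketch is shown below; the chirp parameters and echo delay are illustrative assumptions chosen only to demonstrate the technique:

```python
import numpy as np

# Digital pulse compression via matched filtering of an LFM chirp.
# Parameters are illustrative, not tied to a particular radar.
fs = 200e6                 # sample rate (Hz)
T = 10e-6                  # pulse width (s)
B = 50e6                   # chirp bandwidth (Hz)
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)   # baseband LFM pulse

# Simulated return: the chirp delayed by 300 samples in a longer window.
rx = np.zeros(4096, dtype=complex)
rx[300:300 + chirp.size] = chirp

# Matched filter = cross-correlation with the transmitted pulse
# (np.correlate conjugates its second argument).
mf = np.correlate(rx, chirp, mode="valid")
peak = int(np.argmax(np.abs(mf)))
print(peak)   # compressed peak appears at the 300-sample delay
```

The time-bandwidth product (here T*B = 500) sets the compression gain, which is why wideband digital waveform generation and capture are so attractive.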
In recent years, gigasample-per-second (GSPS) ADCs have been pushing the transition to digital nearer to the antenna by moving the digitization point in the system to just after the first mixing stage. Using a GSPS converter with an analog input bandwidth in excess of 1.5 GHz already supports digitization of the first IF, but in many cases the performance of current GSPS ADCs has limited the acceptability of this solution, as the linearity and noise spectral density of the devices have not met system requirements.
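To see why a converter with more than 1.5 GHz of analog bandwidth can digitize the first IF directly, it helps to work out which Nyquist zone the IF lands in. The helper below does that calculation; the 2.5 GSPS rate is an assumed example, not a reference to a specific device:

```python
# Nyquist-zone bookkeeping for direct IF sampling. Sample rate is assumed.
def nyquist_zone(f, fs):
    """1-indexed Nyquist zone of frequency f when sampled at fs."""
    return int(f // (fs / 2)) + 1

def alias_freq(f, fs):
    """Frequency to which f folds after sampling at fs."""
    f = f % fs
    return fs - f if f > fs / 2 else f

fs = 2.5e9      # assumed GSPS ADC sample rate
f_if = 1.0e9    # first IF of around 1 GHz, per the architecture above
print(nyquist_zone(f_if, fs), alias_freq(f_if, fs) / 1e9)
```

At 2.5 GSPS the 1 GHz first IF sits in the first Nyquist zone, so no frequency folding occurs and the second mixing stage can be eliminated entirely.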
Furthermore, until recently high-speed ADCs predominantly used parallel low-voltage differential signaling (LVDS) interfaces as the means of moving data between the ADC and the digital signal processing platform, typically an FPGA. However, using an LVDS data bus to output the data from the converter brings some technical challenges, as a single LVDS bus would need to operate well beyond the maximum rate specified by the IEEE standard and beyond what an FPGA can handle.
To accommodate this, the output data is de-multiplexed onto two or, more commonly, four LVDS buses to reduce the data rate per bus. For example, a 10-bit ADC operating at sample rates in excess of 2 GSPS typically requires its output to be de-multiplexed by a factor of four, creating a 40-bit-wide LVDS bus. With many radar systems, particularly phased arrays, using multiple GSPS ADCs, this quickly becomes an unmanageable hardware development, with so many lanes to be routed and matched in length, to say nothing of the number of FPGA pins required for the interconnect.
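The interface sizing above can be checked with some back-of-the-envelope arithmetic. The 2.5 GSPS figure is an assumed example consistent with the "in excess of 2 GSPS" range mentioned:

```python
# LVDS interface sizing for a 10-bit GSPS converter, de-multiplexed by 4.
# The sample rate is an assumed example.
bits = 10
fs = 2.5e9                           # sample rate (Hz), assumed
demux = 4
total_rate = bits * fs               # aggregate output: 25 Gbps
lanes = bits * demux                 # 40 differential LVDS pairs
rate_per_lane = total_rate / lanes   # 625 Mbps per lane
print(lanes, rate_per_lane / 1e6)
```

Multiply those 40 pairs by the number of converters in a phased array, and the routing, length-matching, and FPGA pin-count burden described above becomes clear.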