
How to interpret a linear power supply’s data sheet, Part 1

January 02, 2013

Although variable DC power supplies might appear to be relatively simple devices common to every engineer’s benchtop, they are actually sophisticated instruments. They must reliably deliver voltage and current that is stable, precise, and clean, no matter the type of load—resistive, inductive, capacitive, low impedance, high impedance, steady state, or variable. One popular type of variable DC power supply is the linear power supply (Figure 1), which is durable and accurate and delivers power with low noise.

A linear power supply’s simple, direct feedback mechanisms provide excellent load regulation and overall stability. A power supply’s specifications are intended to clarify both its performance and its limitations. However, with no consistent industry standard for expressing these specifications, there is often significant variation from one manufacturer’s data sheet to another’s. Choosing the most appropriate power supply for a specific application requires understanding what those specs convey or (sometimes) what they may be intended to de-emphasize.


Figure 1. Simplified block diagram of a programmable linear power supply.

Reading a Linear Power Supply’s Data Sheet
Although a linear power supply’s data sheet may seem to list many different specifications, they can all be grouped into three logical categories: accuracy and resolution, stability, and AC characteristics.
Most DC power supplies have two modes of operation: constant voltage (CV) mode, in which the power supply regulates the output voltage based on the user settings, and constant current (CC) mode, in which the power supply regulates the current. The mode in use depends not only on the user settings but on the resistance of the load as well. Different specifications apply when a power supply is in CV mode than when it is in CC mode.
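For a purely resistive load, the crossover between these two modes can be predicted from the voltage setting, the current limit, and the load resistance. The short sketch below illustrates that logic; the function name and the example values are illustrative assumptions rather than figures from any particular instrument.

```python
def operating_point(v_set, i_limit, r_load):
    """Predict CV/CC behavior for a purely resistive load.

    If the current the load would draw at the full voltage setting stays
    under the current limit, the supply regulates voltage (CV); otherwise
    it regulates current (CC) and the voltage settles at I * R.
    """
    i_demand = v_set / r_load            # current the load would draw in CV mode
    if i_demand <= i_limit:
        return "CV", v_set, i_demand     # mode, output voltage, output current
    return "CC", i_limit * r_load, i_limit

# Example: a 5V setting with a 3A limit into a 1 ohm load would demand 5A,
# so the supply crosses into CC mode and the output settles at 3V and 3A.
print(operating_point(5.0, 3.0, 1.0))
```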

Specs Related to Accuracy and Resolution
At any given time, either voltage or current is being regulated by the power supply and matches the setting within the instrument’s accuracy.
•    In CV mode, the output voltage is determined by the voltage setting within the accuracy specifications of the instrument. The delivered current is based on the magnitude of the load impedance.
•    In CC mode, the output current is determined by the current limit setting. The resulting voltage is based on the load impedance.
Historically, DC power supplies used potentiometers to set output voltage or current. Today, microprocessors receive input from the user interface or from a remote interface. A digital-to-analog converter (DAC) takes the digital setting and translates it into an analog value that is used as the reference for the analog regulator. The quality of this conversion and regulation process determines the setting resolution and accuracy values.

Voltage and current settings (sometimes listed in the data sheet as limits or programmed values) have resolution and accuracy specifications associated with them. The resolution of these settings indicates the minimum increment by which the output can be adjusted, and the accuracy describes the extent to which the value of the output matches a national or international standard such as the National Institute of Standards and Technology’s (NIST) voltage and current standards. Setting and readback specifications are different parameters, and they should be considered separately. Good performance on readback accuracy does not necessarily mean good performance in setting accuracy.

Most DC power supplies provide built-in capabilities for measuring the voltage and current being delivered by the power supply output. Because they are reading the voltage and current back into the power supply, the readings these measurement circuits produce are often called readback values. Most professional power supplies incorporate digital meters that use analog-to-digital converters; for these internal instruments, the specifications are similar to those for a digital multimeter. The power supply displays these readback values on its front panel and can also transmit them over a remote interface.
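For readers who use the remote interface, a minimal PyVISA sketch of querying readback values is shown below. The VISA resource address is a placeholder, and the MEASure-style SCPI queries are common on programmable supplies but should be confirmed against the instrument’s programming manual.

```python
# A minimal sketch of reading back voltage and current over a remote interface.
# The resource address is a placeholder; verify the SCPI commands against the
# supply's programming manual before relying on them.
import pyvisa

rm = pyvisa.ResourceManager()
supply = rm.open_resource("USB0::0x05E6::0x2200::INSTR")  # placeholder VISA address

volts = float(supply.query("MEASure:VOLTage?"))  # readback voltage
amps = float(supply.query("MEASure:CURRent?"))   # readback current
print(f"readback: {volts:.3f} V, {amps:.4f} A")

supply.close()
rm.close()
```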

Setting accuracy determines how close the regulated parameter is to its theoretical value as defined by an international or national standard. Output uncertainty in a power supply is largely due to error terms in the DAC, including quantization error. Setting accuracy is determined by measuring the regulated variable with a traceable, precision measurement system connected to the output of the power supply. Setting accuracy is given as:
±(% of setting + offset)

For example, Keithley’s Model 2200-32-3 Programmable 32V/3A DC power supply has a voltage setting accuracy specification of ±(0.03% + 3mV). Therefore, when it is set to deliver 5V, the uncertainty in the output value is (5V)(0.0003) + 3mV, or 4.5mV. Current setting accuracy is specified and calculated similarly.
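That arithmetic is easy to mechanize. The sketch below reproduces the ±(% of setting + offset) calculation using the voltage figures quoted above; the function name is arbitrary, and any current-accuracy numbers you substitute should come from the instrument’s own data sheet.

```python
def setting_uncertainty(setting, gain_error_pct, offset):
    """Return the ± uncertainty for a spec of the form ±(% of setting + offset)."""
    return setting * (gain_error_pct / 100.0) + offset

# Voltage setting accuracy of ±(0.03% + 3mV) applied to a 5V setting:
u = setting_uncertainty(5.0, 0.03, 0.003)
print(f"±{u * 1e3:.1f} mV")  # ±4.5 mV, matching the worked example above
```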

Setting resolution is the smallest change in a voltage or current setting that can be selected on the power supply. This parameter is sometimes called programming resolution. The resolution specification limits the number of discrete levels that can be set. Often, this is defined by a combination of the number of user interface digits available and the number of bits available in the DAC. A DAC with more bits has finer control of its output and can deliver more distinct values for the control loop to use as a reference. However, with corrections for offset and gain errors, there will be less resolution than the number of bits in the DAC would suggest.
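The effect of those corrections on usable resolution can be sketched with a simple back-of-the-envelope calculation. The DAC width, output range, and trim headroom below are assumptions chosen for illustration, not the design of any particular supply.

```python
# Why calibration headroom leaves less usable resolution than the raw bit count suggests.
BITS = 16
V_FULL_SCALE = 32.0
TRIM_FRACTION = 0.02                  # share of the code range reserved at each end
                                      # for gain and offset correction (assumed)

total_codes = 2 ** BITS
usable_codes = int(total_codes * (1 - 2 * TRIM_FRACTION))

ideal_step = V_FULL_SCALE / (total_codes - 1)
effective_step = V_FULL_SCALE / (usable_codes - 1)

print(f"ideal step:     {ideal_step * 1e3:.3f} mV")      # about 0.49 mV
print(f"effective step: {effective_step * 1e3:.3f} mV")  # coarser once trim headroom is set aside
```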

Changing a setting in a single step of resolution may not always cause a corresponding change in the output. However, the setting accuracy specification governs the relationship between settings and output, and a calibrated instrument should perform within this tolerance.

Setting resolution may be expressed as an absolute unit value or as a percentage of full scale. For example, the voltage setting resolution on the Keithley 2200-32-3 is 1mV and the current setting resolution is 0.1mA.

Readback accuracy is sometimes called meter accuracy. It indicates how close the internally measured values are to the actual value at the output (which may itself deviate from the setting by the setting accuracy). Just as with a digital multimeter, this is determined using a traceable reference standard. Readback accuracy is expressed as:
±(% of measured value + offset)
Readback resolution is the smallest change in internally measured output voltage or current that the power supply can discern. It is usually expressed as an absolute value but may also be given as a percentage of full scale. The voltage readback resolution on the Keithley 2200-32-3 is 1mV and the current readback resolution is 0.1mA. See Figure 2.
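Because setting accuracy and readback accuracy are specified independently, the value shown on the meter can legitimately differ from the programmed value by the sum of the two tolerances. The sketch below illustrates that worst-case disagreement; the setting spec is the one quoted earlier, while the readback spec shown is a placeholder that should be replaced with the figures from the data sheet.

```python
def spec_uncertainty(value, gain_error_pct, offset):
    """Return the ± uncertainty for a spec of the form ±(% of value + offset)."""
    return value * (gain_error_pct / 100.0) + offset

v_set = 5.0
setting_u = spec_uncertainty(v_set, 0.03, 0.003)   # ±(0.03% + 3mV) setting spec
readback_u = spec_uncertainty(v_set, 0.03, 0.003)  # placeholder readback spec

# Worst case: the output sits at one edge of the setting band while the meter
# errs in the opposite direction by its own readback tolerance.
print(f"display may differ from the 5V setting by up to ±{(setting_u + readback_u) * 1e3:.1f} mV")
```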


Figure 2. The least significant digits on the upper display correspond to the 1mV and 0.1mA readback resolution of Keithley Series 2200 instruments. The least significant digits on the lower display correspond to the setting resolution.
