Real world RS-485: Low power, low EMI
While RS-485 has been around for some time, it is still a viable network technology because of its simple signaling, modest voltage levels, and ease of implementation. Industrial sensors still use the interface for those reasons.
Because it is not a recent standard, many engineers apply cookbook solutions that are neither optimized for the application nor as robust as expected. The net effect is parts larger than they need to be (such as a higher-power terminating resistor) or EMI problems. A few key considerations can help an engineer zero in on the right solution quickly, first time, every time.
There are many great white papers on the subject. This article fills in missing details and offers simplifications that can enhance the performance of your design.
Figure 1 Basic RS-485 topology
There are three primary tools available to the designer to manage EMI (shielding is covered separately at the end of the article).
1. Device speed
2. Transceiver operating voltage
3. Terminating resistor currents
It has been said before, but it bears mentioning again: do not use a baud rate faster than the application needs, and that includes the speed of the transceivers. Transceivers are available in different speed grades that affect the rise/fall times of the signals. For instance, many RS-485 links run below 1Mbps, so a device such as TI's SN75HVD12DR is a good choice. For 128kbps links, the slower Intersil part would suffice.
The slower rise time of these parts (e.g., 100ns) is more than fast enough for these applications and minimizes radiated EMI. It may also reduce susceptibility to nearby noise sources, since a slower device responds less to fast transients. Read the transceiver's specs: many standard devices run at 10Mbps or faster, which is typically far more than these links need.
Table 1 Example RS-485 transceivers and speeds
EMI is proportional to the voltage swing of any signal: reduce the swing and you reduce the EMI radiated by the connections. Many newer devices are fully rated to operate at 3.3V while satisfying the minimum requirements of the RS-485 signaling standard. As a bonus, 3.3V is now more common than 5V in many system designs. What do we give up at the lower voltage? Speed capability and noise immunity are both reduced. But if the device is rated for the required speed, and shielding is used, a 3.3V RS-485 signal is usually adequate. As always, it is up to the designer to consider all relevant conditions and check the datasheet. As a side note, resist the urge to add a capacitor across the input of a transceiver unless you verify that the resulting cutoff frequency is 5-10 times the signaling rate (1/2 the baud rate); otherwise it will degrade the signal.
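The capacitor guideline above is easy to check numerically. The sketch below assumes a simple RC low-pass formed by the termination resistance and an added capacitor; the 115.2kbps link, 120Ω termination, and 1nF capacitor are illustrative values, not from any datasheet.

```python
import math

def rc_cutoff_hz(r_ohms: float, c_farads: float) -> float:
    """-3 dB cutoff of a first-order RC low-pass: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

baud = 115_200                  # example link speed (assumption)
signaling_rate = baud / 2.0     # fundamental of an alternating bit pattern

# Example: 120 ohm termination with a 1 nF cap across the pair
fc = rc_cutoff_hz(120.0, 1e-9)
print(f"cutoff {fc/1e6:.2f} MHz vs 5x rate {5*signaling_rate/1e6:.2f} MHz")
assert fc >= 5 * signaling_rate   # the article's 5-10x guideline holds here
```

Here the cutoff lands near 1.3MHz, comfortably above five times the 57.6kHz signaling rate, so this particular capacitor would be acceptable; a larger value might not be.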
RS-485 has a wide operating voltage range, from an Rx threshold of 200mV to a maximum differential signal of 10V. Usually 2V P-P is the minimum recommended drive level, and 3.3V devices will meet that criterion while interfacing just fine with 5V-powered receivers, providing reasonable signal-to-noise levels, especially for shorter runs. Keep in mind that if you need high speed (above roughly 5Mbps), you may need 5V power, so check the datasheet.
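Those two numbers, the 200mV receiver threshold and the 2V recommended drive, imply a healthy margin before noise becomes a problem. A quick sketch of that margin:

```python
import math

rx_threshold = 0.200   # V, RS-485 receiver switching threshold (from the standard)
drive = 2.0            # V, minimum recommended differential drive level

# Ratio of drive to threshold, expressed in dB
margin_db = 20.0 * math.log10(drive / rx_threshold)
print(f"noise margin: {drive/rx_threshold:.0f}:1 ({margin_db:.0f} dB)")
```

A 10:1 (20dB) margin is why 3.3V drive is usually adequate on shorter, shielded runs; cable attenuation and coupled noise eat into it on long unshielded ones.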
Because EMI problems can be magnetic in nature, the currents flowing through the terminating resistors are a factor. Magnetic interference can be more difficult to control, as copper has a relative permeability of about 1, so an aggressor circuit may couple into the link in spite of nearby shielding. Lower transient currents reduce the magnetic signature and help minimize coupling to other nearby circuits.
How do we do that? Isn’t the terminating resistor value fixed? No it isn’t, as long as your cable is not “electrically long” relative to the edge rate of your signals. There is no rule that says you can’t raise the value for other engineering reasons. If the primary issue or concern is susceptibility rather than radiation, then the lower the termination resistance, the better. But as in all engineering designs, there are trade-offs. Comparing a 5V/120Ω system with a 3.3V/499Ω system shows a factor-of-six reduction in current.
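The factor-of-six claim is simple Ohm's-law arithmetic; a sketch of the comparison from the text:

```python
# Worst-case DC current through a differential termination is roughly the
# driver's differential output voltage divided by the termination resistance.

i_5v  = 5.0 / 120.0    # classic 5 V drive into the default 120 ohm
i_3v3 = 3.3 / 499.0    # 3.3 V drive into a raised 499 ohm termination

print(f"5 V / 120 ohm : {i_5v * 1000:.1f} mA")
print(f"3.3 V / 499 ohm: {i_3v3 * 1000:.1f} mA")
print(f"reduction: {i_5v / i_3v3:.1f}x")
```

The currents come out near 42mA versus 6.6mA, about a 6.3x reduction in the magnetic signature of the link.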
This design criterion is very susceptible to "rule of thumb" abuse. The default value most of us first learn is 120Ω applied differentially across the (+) and (-) data terminals at the far ends of the network. But 120Ω is not always the best choice; the original termination was selected to match the impedance of commercially available twisted-pair cable. Whatever your application, don't consider running without termination, even for short runs, as it provides noise immunity. Termination is required for two reasons:
Electrical length: the cable is "electrically long" when 2·tp ≥ tr/5, where tp is the one-way signal transit time across the cable and tr is the rise time (10%-90%) of the driver's signal (see below for computing transit time from the cable's velocity factor). If the cable is not electrically long, you have more flexibility in adjusting the termination value (Rt). This is another reason to use as slow a driver as will satisfy the application.
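The criterion is quick to evaluate once the transit time is computed from the velocity factor. In this sketch, the 0.66 velocity factor, the 100ns rise time, and the cable lengths are illustrative assumptions:

```python
def electrically_long(length_m: float, tr_s: float,
                      vf: float = 0.66, c: float = 3.0e8) -> bool:
    """Apply the 2*tp >= tr/5 criterion.

    tp is the one-way transit time: cable length divided by the
    propagation speed (velocity factor times the speed of light).
    """
    tp = length_m / (vf * c)
    return 2.0 * tp >= tr_s / 5.0

# 30 m run with a slow 100 ns driver: electrically long, match Zo
print(electrically_long(30.0, 100e-9))   # True

# 1 m run with the same driver: not electrically long, Rt may be raised
print(electrically_long(1.0, 100e-9))    # False
```

Note that even a slow 100ns edge makes a 30m run electrically long; the freedom to raise Rt really only appears on short runs or with very slow drivers.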
Susceptibility: without any termination, the single-ended input impedance of an SN75HVD12DR receiver is estimated at about 109kΩ (based on the maximum input-current spec with 12V on the pin). An impedance this high is susceptible to crosstalk from nearby signals on a PCB or within a cable (if more than one pair shares a shield). Lowering this impedance with a parallel terminating resistor minimizes crosstalk, but at the expense of power dissipation. A compromise is recommended, but never give away "free" noise immunity; always include some value of termination.
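A small sketch shows how decisively even a raised termination lowers the node impedance that coupled noise sees. The 109kΩ receiver estimate is from the text; treating crosstalk sensitivity as proportional to node impedance is a simplifying assumption.

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

r_in = 109e3   # estimated unterminated receiver input impedance, ohms

for rt in (None, 499.0, 120.0):
    z = r_in if rt is None else parallel(r_in, rt)
    label = "none" if rt is None else f"{rt:.0f} ohm"
    print(f"Rt = {label:>8}: node impedance ~ {z:,.0f} ohm")
```

Because the termination is so much smaller than the receiver's input impedance, it dominates the parallel combination: even 499Ω drops the node impedance by more than 200x, which is why some termination should always be fitted.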