
Choose the right DAQ board

Tue, 05/01/2012 - 10:33am

Data acquisition (DAQ) hardware serves as the interface between a computer and signals from the outside world. There are hundreds of different types of DAQ hardware with a range of sampling rates, resolutions, signal conditioning options, and form factors. Use these five questions as a guide to help you choose the right DAQ hardware for your application.

1. What types of signals do I need to measure or generate?
Different types of signals need to be measured or generated in different ways. DAQ devices can include several types of input and output (I/O) functions for measuring and generating different signal types:

• Analog inputs measure analog signals
• Analog outputs generate analog signals
• Digital inputs/outputs measure and generate digital signals
• Counter/timers measure digital events or generate digital pulses

Multifunction DAQ devices offer the best value and performance by combining analog inputs, analog outputs, digital inputs/outputs, and counters in a single package. Since the number of I/O channels per device is fixed, you should consider choosing a device with more channels than you currently need if you plan on expanding functionality in the future.

Modular DAQ systems consist of a chassis and interchangeable I/O modules. Chassis can hold anywhere from one to 18 modules. Modules are available for a variety of I/O types including specialized sensor measurements and high voltages. While modular systems are typically more expensive than multifunction devices, they give you the flexibility to configure the system to your exact requirements.

2. Do I need signal conditioning?
Some sensors generate signals that are too difficult or dangerous to measure directly with a standard ±10 V DAQ device. These sensors require signal conditioning before a DAQ device can effectively and accurately measure them. For example, thermocouples output small signals in the mV range and require amplification, low-pass filtering, and cold-junction compensation. Table 1 provides a summary of common signal conditioning for different types of sensors and measurements.


As an alternative to designing your own signal conditioning circuitry or using an external signal conditioning box, many DAQ devices offer built-in signal conditioning. This helps simplify the overall setup and improves measurement accuracy.

3. How fast do I need to acquire or generate samples of the signal?
The sampling rate is the speed at which a DAQ device’s analog-to-digital converter (ADC) takes samples of a signal. It is controlled by software or by a clock built into the DAQ hardware that can perform high-speed sampling up to millions of times per second.

When choosing a sampling rate you should consider the Nyquist Theorem, which states that to accurately reconstruct a signal, you must sample at two times the highest frequency component of interest. In practice, you should sample at least 10 times the maximum frequency in order to represent the shape of your signal.

For example, suppose you are measuring a sine wave with a frequency of 1 kHz. Figure 1 compares the 1 kHz signal measured at 2 kHz and 10 kHz. As you can see, sampling at 2 kHz satisfies the Nyquist Theorem and captures the frequency component, but sampling at 10 kHz much better represents the shape of the original signal.
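As a quick sanity check, the effect above can be reproduced numerically. The snippet below is a minimal sketch (the helper name and sample counts are illustrative, not from the article): it samples a 1 kHz sine at 2 kHz and at 10 kHz. Note that at exactly twice the signal frequency, an unlucky phase can place every sample on a zero crossing, which is why sampling well above the Nyquist minimum is recommended in practice.

```python
import math

def sample_sine(freq_hz, fs_hz, n_cycles=2):
    """Sample a sine wave of frequency freq_hz at sampling rate fs_hz."""
    n = int(n_cycles * fs_hz / freq_hz)  # samples spanning n_cycles periods
    return [math.sin(2 * math.pi * freq_hz * k / fs_hz) for k in range(n)]

f = 1000.0                          # 1 kHz signal
coarse = sample_sine(f, 2000.0)     # Nyquist minimum: 2 samples per cycle
fine = sample_sine(f, 10000.0)      # 10 samples per cycle preserves shape

# With zero starting phase, every 2 kHz sample lands on a zero crossing:
print(max(abs(x) for x in coarse))  # essentially 0
print(max(fine))                    # close to the true peak of 1.0
```

Running this shows the coarse samples carry no trace of the waveform, while the 10 kHz samples track its peaks, which is the behavior Figure 1 illustrates.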



4. What is the smallest change in the signal that I need to detect?
Resolution and input range are common specifications of a DAQ device that determine the smallest change in the signal that it can detect. Resolution refers to the number of binary levels an ADC can use to represent a signal. For example, a 3-bit ADC can represent eight (2³) discrete voltage levels, while a 16-bit ADC can represent 65,536 (2¹⁶).
 

These voltage levels are evenly distributed across the DAQ device's input range. For example, a DAQ device with a ±10 V range and 12 bits of resolution (2¹² or 4,096 evenly distributed levels) can detect a change of about 5 mV (20 V / 4,096 ≈ 4.88 mV).
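The step size in that example follows directly from the range and the bit count. A minimal sketch of the calculation (the function name is illustrative):

```python
def lsb_size(v_min, v_max, bits):
    """Smallest detectable voltage step for an ideal ADC:
    total input span divided by the number of binary levels."""
    return (v_max - v_min) / 2**bits

# ±10 V range, 12-bit ADC: 20 V spread across 4,096 levels
step = lsb_size(-10.0, 10.0, 12)
print(f"{step * 1000:.2f} mV")  # 4.88 mV, roughly the 5 mV quoted above
```

The same function shows why higher resolution matters: a 16-bit device over the same range resolves about 0.31 mV per step, 16 times finer than the 12-bit device.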

5. How much measurement error does my application allow?
Accuracy is a measure of an instrument's capability to faithfully indicate the value of a measured signal. Accuracy is not the same as resolution; however, an instrument's accuracy can never be better than its resolution.

In an ideal world, an instrument always measures the true value with 100 percent certainty. In reality, instruments report a value with an uncertainty specified by the manufacturer. The uncertainty can depend on many factors, such as system noise, gain error, offset error, and nonlinearity. A common specification for a manufacturer’s uncertainty is absolute accuracy. This specification provides the worst-case error of a DAQ device at a specific range. An example calculation for absolute accuracy is given below:

Absolute Accuracy = (Reading × Gain Error) + (Voltage Range × Offset Error) + Noise Uncertainty
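The formula above can be evaluated directly. The sketch below uses hypothetical spec-sheet numbers chosen purely for illustration (real gain/offset error and noise figures come from your device's data sheet):

```python
def absolute_accuracy(reading, gain_error, voltage_range, offset_error,
                      noise_uncertainty):
    """Worst-case error per the formula above.
    gain_error and offset_error are fractional (e.g. 100e-6 = 100 ppm);
    noise_uncertainty is in volts."""
    return (reading * gain_error
            + voltage_range * offset_error
            + noise_uncertainty)

# Hypothetical values: 5 V reading on a 10 V range,
# 100 ppm gain error, 50 ppm offset error, 200 µV noise uncertainty
err = absolute_accuracy(reading=5.0, gain_error=100e-6,
                        voltage_range=10.0, offset_error=50e-6,
                        noise_uncertainty=200e-6)
print(f"{err * 1e6:.0f} µV")  # 1200 µV worst-case error
```

Note how each term scales differently: gain error grows with the reading, offset error is fixed by the selected range, and noise uncertainty is independent of both.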

There is a wide range of DAQ devices with varying degrees of accuracy. A basic DAQ device may provide an absolute accuracy of over 100 mV, while a higher-performance device with self-calibration, isolation, and other special circuitry may provide an absolute accuracy of around 1 mV.

Summary
The most important specifications to consider when choosing a DAQ device are its I/O functions, number of channels, signal conditioning, sampling rate, resolution, and accuracy. While a DAQ device is an important part of a measurement system, there are several other components to consider, including sensors, the computer, and software. To learn more about questions you should ask yourself when selecting each component of a measurement system, download the Complete Guide to Building a Measurement System at www.ni.com/dataacquisition.

