What trend or new technology will drive the test instrument market in 2013?
Chris Armstrong, North America for Rigol Technologies USA, www.rigolna.com
Over the past decade, a number of technology trends have begun to significantly impact the design and function of new products in the test and measurement market. In 2013, expect that trend to continue accelerating. Further FPGA and DSP advances will enable manufacturers to quickly develop increasingly advanced products as processing power continues to grow. This enables new, lower-cost products to cover applications where more advanced and costly ASIC-based instruments were previously a requirement. Over the past few years this trend has coincided with a trend toward deeper-memory instruments. That trend should start to change in 2013, as the memory capabilities of oscilloscopes priced under $1,000 are now measured in tens of millions of points. With all this data, improvements in instruments will trend more toward advanced analysis and in-instrument formula creation. The types of calculations that were previously available only on Windows-based instruments will become simple and straightforward, without the cost overhead of including a full-blown PC and operating system. Examples include more advanced statistical pass/fail analysis of large data sets and customizable user formulas created and executed on the instruments themselves.
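The kind of on-instrument statistical pass/fail analysis described here reduces to checking a deep-memory record against user-defined limits. A minimal illustrative sketch (all names and limit values are hypothetical, not any vendor's firmware):

```python
import statistics

def pass_fail(samples, limit_low, limit_high):
    """Summarize a captured record against user-defined pass/fail limits."""
    failures = [s for s in samples if not (limit_low <= s <= limit_high)]
    return {
        "count": len(samples),
        "failures": len(failures),
        "mean": statistics.fmean(samples),
        "stdev": statistics.stdev(samples),
        "passed": len(failures) == 0,
    }

# A short record of peak-to-peak voltage measurements against 0.95-1.05 V limits.
result = pass_fail([1.02, 0.98, 1.00, 1.15, 0.97], 0.95, 1.05)
print(result["failures"], result["passed"])  # 1 False
```

A real instrument would run this over tens of millions of points per acquisition; the logic is the same.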
Charles Sweetser, OMICRON, www.omicron.at
The industry is always looking for better methods and techniques for confidently determining the condition of power transformers. Maintenance practices and philosophies are constantly scrutinized and re-evaluated in hopes of maximizing diagnostic value while balancing economic efficiency. Traditionally, our industry has practiced conventional off-line tests that depend on a single measurement at a single frequency, constant voltage, or constant current. Having only conventional test data for review has often resulted in inconclusive analysis that leads to more unanswered questions. The industry is demanding reliable diagnostic information that represents the best possible condition estimate. In 2013, the emerging trend is to extract as much additional diagnostic information as possible by applying smarter, more advanced methods and techniques to existing procedures. This will require multi-functional test instruments with advanced features. The idea is not to create new tests and increase testing overhead. Varying the parameters of conventional tests, such as frequency, provides a new avenue for analysis. Based on research, practical experience, and advances in measurement instruments, it is now possible to extract in-depth information that was not available in the past. These advanced diagnostic methods, or "extensions," provide new and critical information about transformer condition. This proliferation, along with modern instrumentation, has transformed diagnostic applications and the ability to obtain in-depth information for decision-making.
Mark Schrepferman, Peregrine Semiconductor Corporation, www.psemi.com
The trend toward LTE-Advanced and the use of smartphones is challenging test-and-measurement (T&M) systems to stay ahead of advances in RF component performance and innovation. As a result, T&M system designers will be challenged to provide faster, extremely repeatable, rugged, best-in-class testing environments. Test equipment will be expected to last over multiple generations of product introductions, meaning that the performance of the RFICs used in test equipment must be better than the device under test by several generations. Further, next-generation communication systems that use higher-order modulation schemes such as orthogonal frequency-division multiplexing (OFDM), with high peak-to-average ratios, are driving the need for higher linearity in the components of the test equipment's signal chain. Additionally, more frequency bands will be introduced, driving the need for broader bandwidths and higher operating frequencies. This new, crowded spectrum will require additional filtering, so filter-bank switching is expected to drive the need for lower-loss components. Finally, although test solutions are growing in complexity, end customers will continue to expect lower overall test cost per unit. To accomplish this, test equipment must enable shorter overall test time, which RF components with fast settling times make possible.
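The high peak-to-average ratio mentioned here is easy to quantify: PAPR = max|x|² / mean|x|², usually expressed in dB. A stdlib-only sketch (the toy OFDM symbol and 64-subcarrier count are illustrative assumptions, not from the text):

```python
import cmath
import math
import random

def papr_db(waveform):
    """Peak-to-average power ratio of a complex baseband waveform, in dB."""
    powers = [abs(s) ** 2 for s in waveform]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

# Toy OFDM symbol: inverse DFT of 64 random QPSK subcarriers.
random.seed(0)
N = 64
qpsk = [complex(random.choice((-1, 1)), random.choice((-1, 1))) for _ in range(N)]
symbol = [sum(qpsk[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
          for n in range(N)]

print(papr_db([1 + 0j] * 16))   # 0.0 -- a constant envelope has no peak excess
print(papr_db(symbol) > 3)      # summed subcarriers create occasional large peaks
```

It is exactly those occasional large peaks that force every amplifier and switch in the test equipment's signal chain to stay linear well above the average power.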
Jim McElroy, LDRA Technology, www.ldra.com
If we focus just on software, there continues to be an increasing reliance on software to differentiate products and provide flexibility for future expansion. This fundamental reliance drives device and system manufacturers to look for ways to shorten the software development lifecycle and reduce costs, while at the same time increasing functional complexity and improving software quality. Streamlining the development team and its processes through automation, from requirements engineering and traceability down through verification and deployment, is one way to achieve this. Other factors pushing better automation of test processes include: greater emphasis on safety- and security-critical market segments, including aerospace, defense, industrial controls, smart energy, medical, automotive, and rail transportation; increased focus on software quality and reliability; growing requirements for software certification and risk mitigation; and reduction in development and verification resources. Device and system manufacturers need to implement more mature, traceable development processes in which requirements, from concept through to code, automatically link to development and verification artifacts such as development plans, design documents, verification plans, test procedures, and test results. Manual methods will no longer be adequate to properly test, execute, and trace results throughout the development process, from both a reliability and a cost perspective.
Gina Bonini, Tektronix, www.tek.com
Inexpensive RF technologies are being integrated into everyday applications, from apparel tags to livestock monitoring clips, store price displays, and short-range wireless control of household objects. ABI Research says that wireless sensor networks will grow from 10.2 million chipsets in 2009 to 645 million in 2015, a 99.6 percent growth rate. With this growth, engineers face tough measurement challenges in ensuring that RF systems interact properly with the rest of the device electronics. Interference from RF sources in the environment and other electrical signals in the system must also be analyzed. From an instrument perspective, the growth of RF means that designers will need an oscilloscope, a logic analyzer, and a spectrum analyzer, adding significant expense. What's more, engineers must learn different instrument UIs and struggle with limited to no correlation across the analog, digital, and RF signals. It's clear that an affordable mixed-domain oscilloscope (MDO) that combines a mixed-signal oscilloscope with a spectrum analyzer is greater than the sum of its parts. Not only does it reduce pressure on the equipment budget, it adds the critical element of a single integrated view across domains. As embedded RF continues to grow, so will the market demand for MDOs.
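The cross-domain correlation an MDO offers comes down to transforming the same capture between time and frequency views. A deliberately naive DFT sketch (illustrative only; real instruments use hardware FFT engines and down-conversion, and all values below are made up):

```python
import math

def spectrum_peak(samples, fs):
    """Frequency (Hz) of the strongest bin in a real capture, via a naive DFT."""
    n = len(samples)
    best_k, best_mag = 0, 0.0
    for k in range(n // 2):
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n  # convert bin index to Hz

# A 100 Hz tone sampled at 1 kHz shows up at 100 Hz in the spectrum view.
fs = 1000.0
tone = [math.sin(2 * math.pi * 100 * t / fs) for t in range(200)]
print(spectrum_peak(tone, fs))  # 100.0
```

Because both views derive from one time-aligned acquisition, an RF event can be traced back to the analog or digital activity that caused it, which is the correlation separate instruments cannot provide.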
Alan Lowne, Saelig Company, Inc., www.saelig.com
First, a decline in entry-level instrument costs as high-quality instruments from overseas penetrate the market (e.g., Rigol, Owon, Siglent, Instek). Some of these manufacturers have been contract-manufacturing oscilloscopes, AWGs, and spectrum analyzers for big-name American test equipment companies for quite some time, and they've decided that now it's time to make their own name known. Second, an increasing use of built-in sensors in equipment and manufacturing environments. Precision sensor-based test and measurement technology helps ensure quality production as well as enabling remote equipment monitoring. Predictive and preventive maintenance, as well as increasing reliability demands, have resulted in increased use of accelerometers, ambient sensors, flow transmitters, and data-acquisition devices. It is even possible to build a tiny multifunction oscilloscope into equipment with the xMiniLab scope-on-a-chip. Lastly, an increase in test devices using available computer power for their intelligence, display, and networking capabilities. Pico Technology (picotech.com) pioneered this approach a number of years ago with their PC-based scope adapters, and many other vendors have followed suit (Link Instruments, National Instruments, Protek, etc.). Pico's product range now reaches up to GHz capability. Oscium has similarly pioneered add-ons for iPhones and iPads. Their products leverage iOS's intuitive features and touchscreen technology to make test tools easy to use and extremely compact, ideal for mobile service kits. The Oscium range includes spectrum analyzers, logic analyzers, and oscilloscopes.
Andy Botka, Microwave Communications Division, Agilent Technologies, www.agilent.com
One trend is inescapable: the deepening integration of RF, microwave, and high-speed digital technologies inside the devices. To satisfy end-user demand for always-on access, mobile devices need powerful processing, dependable connectivity, and long battery life. This requires faster chipsets, buses and memory; multiple radios running myriad standards; new antenna techniques; and low-power operation. For developers, isolated test tools are necessary, but insufficient. The ability to rapidly transform ideas into validated products requires solutions that support specific needs—and the bedrock is software and instrumentation working in harmony. Beyond design and simulation, the software should address signal-scenario creation, signal analysis, and EMI behavior. Connected hardware must provide scalable performance and essential capabilities—signal generation, waveform characterization, signal analysis, logic and protocol analysis—to address numerous communications formats and technologies.
Heiko Ehrenberg, GOEPEL Electronics, www.goepelusa.com
I believe 2013 will see the approval of the latest revision of IEEE Std 1149.1 and the approval of a new standard, IEEE 1687. Both of these standards will have a major influence on the way we test printed circuit board assemblies and systems. Both define means to access test instrumentation embedded in integrated circuits, which can then be utilized for board- and system-test applications, a methodology we at GOEPEL refer to as Embedded System Access (ESA). Considering the ever-diminishing test access through probe points, I see ESA playing a major role in how we test products throughout their entire life cycle, from design validation and prototype verification, through manufacturing tests, to field tests and upgrades. ESA strategies include well-known applications such as JTAG/boundary scan (as defined in IEEE Std 1149.1) and in-system programming of Flash, serial EEPROMs, and CPLDs, as well as more recent developments such as processor-emulation tests and tests utilizing chip-embedded instrumentation (for example, various built-in self-test [BIST] methodologies are already widely used). By utilizing the standardized JTAG TAP interface for access to embedded instrumentation, board and system tests can be minimally invasive and can cover a wide range of defects.
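At the heart of the boundary-scan access described above is a serial shift register threaded through every cell in the chain. This toy model (no TAP state machine, illustrative cell ordering, nothing vendor-specific) shows bits entering at TDI while the old chain contents emerge at TDO:

```python
def shift_chain(chain, bits):
    """Shift `bits` into a boundary-scan register chain, TDI first.

    Toy model of the IEEE 1149.1 Shift-DR behavior: each clock, one new bit
    enters at the TDI end and the last cell's value appears on TDO.
    Returns the new chain contents and the bits captured at TDO.
    """
    out = []
    for b in bits:
        out.append(chain[-1])      # last cell drives TDO this clock
        chain = [b] + chain[:-1]   # new bit enters at the TDI end
    return chain, out

chain, tdo = shift_chain([0, 0, 0, 0], [1, 0, 1, 1])
print(chain, tdo)  # chain now holds [1, 1, 0, 1]; the old [0, 0, 0, 0] came out on TDO
```

Shifting a known pattern in and comparing what comes out is exactly how boundary scan detects opens, shorts, and stuck-at faults without physical probe access.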
Robert Emerson, ZES ZIMMER Inc., www.zes.com
Quoting technologist and journalist Ben Hammersley: "Fortunately, for most of us the future arrives not in an explosion but gradually, layering or assimilating with the past." Looking toward the near future, it is important to know where we are and how we got here. Other than public policy, which may have the largest impact but is the hardest to predict, one of the trends driving the industry will continue to be energy production and efficiency. The individual technologies and applications are diverse: battery- and hybrid-powered devices and vehicles, utility-scale storage, green energy production, micro-grids, ultracapacitors, power supplies, LEDs, and on and on. For example, the federal government now requires third-party certification for products to qualify for the EnergyStar program. Additionally, every handheld device manufacturer seeks new ways to deliver the brightest screen and longest battery life at the lowest cost. Hybrid and electric vehicle efficiency gains will come from improved motors, drives, and generators. Solar inverters are above 95% efficient and rising. Wind turbines are increasing in size and efficiency. Bio-engineering research seeks to increase fuel yields from plants and to create better and more diverse feedstocks. Significant R&D investments from public and private sources are driving commercially viable results to satisfy market and environmental forces. New energy technology and applications will demand improved test equipment performance in the areas of sensitivity, accuracy, and range.
Peter Anderson, Measurement Computing, www.mccdaq.com
MCC conducted a survey of computer use in data acquisition, and to our surprise, over 50 percent of respondents stated that they expect to use data acquisition equipment with a tablet in the coming year. This response shows that tablet computers will have a huge impact on the test and measurement market. This will only be possible, however, as T&M companies incorporate technologies that serve this market, including wireless integration and small-footprint drivers that support iOS and Android. The switch to tablets in the lab and in the field will inevitably mirror the transition from stationary desktop computers to their more portable siblings, laptop computers. Just as USB instruments overcame the lack of PCI slots in laptops, the lack of widely available host-mode USB ports in today's tablets will necessitate wireless products that offer easy device pairing. While few wireless options are available today in the T&M marketplace, this will be an area of tremendous growth in the coming years. The bigger obstacle to using tablets in T&M applications, however, is porting device driver software to the tablet operating systems iOS and Android. Because of the dominance of Microsoft Windows over the years, many companies have invested heavily in bloated device drivers that require a large memory footprint. It will be challenging to adapt these drivers to the less powerful processors and limited memory of today's tablets. Although there will be challenges, this tectonic shift to tablet computing will drive T&M companies to modify their product offerings to support these devices in the coming years.
Eugene Cabanban, PLX Technology, www.plxtech.com
While the test-instrument market continues to play a vital role in the electronic components industry, a trend is emerging toward alternatives to costly equipment, helping to accelerate chip testing and verification. PCI Express interconnect devices, with their ultra-fast speeds and applicability across a wide range of cloud, storage, server, and consumer markets, create a particular need among designers for these highly efficient alternatives. And designers now have them in the form of on-chip testing capabilities coupled with complementary software tools. These tools provide system designers with instant logic analyzer support, effortless high-speed scope views, pattern generation, and error injection, all capabilities that work in conjunction with existing test equipment and in some cases eliminate the need for it. Used by OEMs worldwide in thousands of PCIe-based products, these on-chip diagnostics, such as those on PLX Technology PCI Express switches, and software solutions are not only bottom-line-friendly, they also speed time to market for a wide range of designs. Because PCI Express provides the interconnect backbone for high-speed designs serving virtually all market segments, these on-chip testing features let designers glean valuable performance assessments across an array of designs, both saving development costs and generating revenue from the devices that much faster.
Andreas Roessler, Rohde & Schwarz, www.rohde-schwarz.us/
Of course, there is more than one answer to that question. There is more than one trend that we are monitoring, and it is not so much a new technology as a set of advancements that have our attention. By adding features to the Release 10 versions of the relevant technical specifications, 3GPP ensured that LTE becomes a true 4G mobile communication technology, meeting the IMT-Advanced requirements set by the ITU. These enhancements are also known as LTE-Advanced. The most demanded feature out of the defined feature set is carrier aggregation (CA), with the goal of increasing peak data rates. The real driving force behind CA, however, is to obtain much more efficient use of the fragmented spectrum allocations available to network operators, especially in the US. The push by network operators to deploy this feature creates a heavy workload in the wireless industry, especially on the chipset and handset side, including a strong demand for test solutions. Another trend is smart traffic offloading. Due to the exponential increase in data traffic, operators are running into capacity issues. The countermeasure is a smart offload of data traffic from LTE to WiFi. Many different strategies are currently under discussion, using slightly different technical approaches, which leads to different requirements in terms of test and measurement solutions.
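The peak-rate benefit of carrier aggregation is simple arithmetic: the achievable rate scales with total aggregated bandwidth. A back-of-envelope sketch (the 3 bit/s/Hz spectral-efficiency figure is an assumption for illustration, not from the text; real LTE-Advanced rates also depend on MIMO order, modulation, and overhead):

```python
def aggregated_peak_rate_mbps(carriers_mhz, bps_per_hz=3.0):
    """Rough peak rate from aggregated component carriers.

    Model: rate = total bandwidth x assumed spectral efficiency.
    MHz * (bit/s/Hz) -> Mbit/s.
    """
    return sum(carriers_mhz) * bps_per_hz

# Two aggregated 10 MHz carriers vs. a single 10 MHz carrier.
print(aggregated_peak_rate_mbps([10, 10]))  # 60.0
print(aggregated_peak_rate_mbps([10]))      # 30.0
```

The test burden comes from the combinations: each supported pairing of bands and bandwidths is a separate configuration that chipsets and handsets must be verified against.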
Mike Fox, FLIR Test & Measurement, http://www.flir.com
A fresh new way of thinking about test and measurement is emerging. As we look to 2013, we are transforming testing and diagnostics with technologies that accelerate and improve the efficiency of diagnostic/repair/approval workflows and processes into what is a true "diagnostic ecosystem." In addition to the use of touchscreens with intuitive menu controls that emulate today's personal electronics, several of our thermal imagers use Wi-Fi technology and mobile apps to connect to Android or Apple iOS tablets and smartphones. Now a plant maintenance technician can quickly transfer images, generate inspection reports on the fly, and email them to managers. FLIR also uses Bluetooth to connect to Extech clamp meters to collect and cross-reference electrical readings, like voltage and current, with related IR images of problem components. Electrical readings can be stamped right on the image. And wireless data-streaming DMMs can share readings with PCs; look for smartphone/tablet connectivity as well. The goal of the diagnostic ecosystem is to improve communication not only among diagnostic devices but also among technicians, their customers, and managers, by leveraging accurate, coordinated readings from related tools as well as rapid and actionable communication.