Automated Test - Five Trends Shaping Its Future

Kirtesh Mistry, Technical Marketing Engineer at National Instruments, investigates five key trends affecting the way automated test is developing.

When I ask test engineers and managers what led them to attend one of our conferences or summits, I often get the reply, "to keep up with what's new in test."

This question often results in further discussions where managers come to realise that the latest technologies can help them optimise their processes, ensure that their team of engineers is as productive as it can be and, ultimately, give their business a competitive edge. The technology changes that occur in automated testing are driven by a number of factors, including the increasing complexity of devices under test (DUTs) and the need to meet market demands.

One of the biggest challenges for test engineers and their managers is keeping up to date on technology trends.

To help them stay up-to-date, NI annually publishes the Automated Test Outlook, drawing on relationships with over 35,000 companies worldwide to provide a comprehensive view of key technologies and methodologies impacting the test and measurement industry over the coming 1 to 3 years.

NI is well positioned to stay up to date on technology trends in automated test through internal research and development activities and close relationships with suppliers, giving the outlook visibility into technologies often far ahead of commercialisation.

The 2012 outlook identifies five key trends, each of which will be summarised in this article.

  1. Optimising Test Organisations

  2. Measurements and Simulation in the Design Flow

  3. PCI Express External Interfaces

  4. Proliferation of Mobile Devices

  5. Portable Measurement Algorithms

1. Optimising Test Organisations

"Test is a foundational activity in any development, manufacturing and maintenance endeavour. Not only must it be included when considering product quality, time to market and business objectives, but it must also be effective and affordable. At Lockheed Martin, we are investing in the people, process and technology aspects of automated test to ensure we meet our objectives." Tom Wissink, Director of Integration, Test and Evaluation, Lockheed Martin Corporate Engineering & Technology.

The business strategy for many organisations in the coming years will be to focus on optimisation.


In many leading organisations, information technology has evolved over two decades from a support function into a strategic asset. IT can now streamline critical line-of-business processes and help executives make real-time decisions. The strategic importance of IT was confirmed by CIO magazine's 2010 State of the CIO Survey, which revealed that 70 percent of CIOs are now members of their companies' executive committees.

Similarly, an emerging trend for electronics and manufacturing companies is the elevation of the test engineering function from a cost centre to a strategic asset for competitive differentiation.

This shift was confirmed by a recent global NI survey of test engineering leaders, who said their top goal over the next 1 to 2 years is to reorganise their test organisation structures for increased efficiency. This strategic realignment reduces the cost of quality and impacts a company's financials by getting better products to market faster.

Companies making this transformation must commit to a long-term strategy because, according to NI research, it generally takes 3 to 5 years to realise the full benefits. A company must have a disciplined and innovative investment strategy to transform the test organisation through four maturity levels: ad hoc, reactive, proactive and optimised. Each level includes people, process and technology elements.

The right people are required to develop and maintain a cohesive test strategy. Process improvements are required to streamline test development and reuse throughout product development. And finally, tracking and incorporating the latest technologies is required to improve system performance while lowering cost.

An organisation steadily builds a foundation for strategic transformation by sticking to a sequential approach and identifying short-term initiatives that help the company improve its maturity level and that map to annual operating objectives. As the foundation is built, test productivity and asset utilisation increase, paying dividends on the original investment. This phased approach enables organisations to realise the benefits early on - after the completion of just one or two projects.

Typically, optimised organisations develop standardised test architectures with strong component reuse from design to production, and provide systematic enterprise data management and analysis that results in company-level business impact.

2. Measurements and Simulation in the Design Flow

"Connectivity between our EDA tools and NI's test software allows engineers to develop a test bench simultaneously with product development, providing earlier test feedback into the design process and greatly shortening design cycles by making development and test parallel rather than serial" - Serge Leef, Vice President / General Manager of System Level Engineering Division, Mentor Graphics.

A key objective for many organisations is to shorten the product development cycle. This has long been the case in the automotive and aerospace industries, for which the end product is a highly complex "system of systems". It is also now a trend in the semiconductor and consumer electronics industries, where shorter life spans and increasing product complexity are fuelling the pressure to reduce product development time.


One approach to reducing development time is concurrent design and test, which is often represented by the V-diagram product development model. The left side of the V-diagram is considered "design" and the right side represents "test". The idea is to increase efficiency by validating and testing subsystems during the design phase, before development of the entire system is complete.

A key method to empower this practice is increasing the connectivity between electronic design automation (EDA) simulation software and test software.

During initial design and simulation, EDA software is used to model either the physical or the electrical behaviours of a simulated product. During the validation and verification stage of product development, engineers use software to automate measurements on a real prototype. However, much as in the design and simulation phase, the validation and verification process requires measurement algorithms similar to those used by EDA software tools.

With National Instruments' recent acquisition of AWR, a leading supplier of electronic design automation (EDA) software for designing RF and high-frequency components and systems, engineers can now benefit from tighter software and hardware integration between AWR's Microwave Office Design Suite and NI LabVIEW graphical system design software.

One benefit of the connectivity between design and test software environments is that it allows design engineers to use significantly richer measurement algorithms earlier in the design process. A second benefit is that it allows test engineers to develop working test code much sooner, which ultimately reduces time to market for complex products.

For example, the design of a cellular multimode RF power amplifier is traditionally modelled using RF EDA tools that allow the engineer to simulate RF characteristics such as efficiency, 1 dB compression point and gain.

However, the end product must meet additional RF measurement criteria explicitly established for cellular standards such as GSM/EDGE, WCDMA and LTE.

Historically, "standard specific" measurement data from metrics such as LTE error vector magnitude (EVM) and adjacent leakage channel ratio (ACLR) required instrumentation on a physical DUT, largely because of measurement complexity. Going forward, new connectivity between EDA and test automation software enables the use of these sophisticated measurement algorithms within the EDA environment on a simulated device. As a result, engineers will be able to identify system-related or complex product issues much earlier in the design cycle and therefore shorten design times.

3. PCI Express External Interfaces

"Due to the combination of its excellent performance and pervasiveness, PCI Express is the default choice for system buses. With new fibre-optic and copper cable technologies, it is emerging as the leading choice for high-performance external interfaces." - Mark Wetzel, Distinguished Engineer for Process Architectures, National Instruments.


PCs in various form factors, such as desktops, workstations, and industrial and embedded systems, have been used to provide central control of instrumentation and to automate test procedures since the invention of GPIB in the 1960s. These days, they offer a variety of interface buses, such as USB, Ethernet, serial, GPIB, PCI and PCI Express, to connect to instrumentation hardware in automated test systems. Because PCs play such a critical role in an automated test system, the test and measurement industry must track the progression of the PC industry and exploit any new technologies to increase capability and performance while lowering the cost of test.

Since PCI Express is a serial bus, it has a variety of inherent advantages over parallel buses such as PCI and VME. Technical challenges like timing skew, power consumption, electromagnetic interference and crosstalk across parallel buses become more and more difficult to circumvent when trying to increase data bandwidth. PCI has been the fundamental bus on the motherboard of many computers, and PCI Express, since its release in 2004, has seen continuous improvements in its data transfer capabilities. Furthermore, PCI Express uses the same software stack as PCI and provides full backward compatibility.

PCI and PCI Express offer better performance than other external interfaces like GPIB because they are directly available from the CPU inside a PC. A more recent implementation of PCI Express as an external interface is Thunderbolt, a technology Intel pioneered under the code name Light Peak that has the potential to be extremely pervasive. Thunderbolt combines PCI Express and the DisplayPort video protocol into a serial interface bus that can be driven over either copper or fibre-optic cables. Since PCs will natively offer Thunderbolt ports, it promises to be a high-performance, low-cost and ubiquitous solution.

Devices under test are increasing in complexity and require a multitude of measurements, such as RF, audio, vibration and even video analysis. The volume of data produced drives the demand for greater bus bandwidth. Platforms utilising PCI/PXI Express are commonly used to address this need for high bandwidth, whether for rapid streaming to a Redundant Array of Inexpensive Disks (RAID) for post-processing or for shuttling data between test systems. By eliminating the bottlenecks caused by insufficient bandwidth, test times can be reduced. These benefits will drive the adoption of the external PCI Express port in its various form factors.
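
A rough worked example shows the scale of the problem; the acquisition figures below are illustrative assumptions, not the specification of any particular instrument.

    # Back-of-the-envelope streaming budget for a hypothetical acquisition.
    channels = 2
    sample_rate = 250e6          # samples per second, per channel
    bytes_per_sample = 2         # 16-bit samples

    data_rate = channels * sample_rate * bytes_per_sample   # bytes per second
    print(f"Sustained data rate: {data_rate / 1e9:.1f} GB/s")       # 1.0 GB/s

    # PCI Express Gen 2 delivers roughly 500 MB/s of usable bandwidth
    # per lane, so this stream needs at least:
    pcie_gen2_lane = 500e6
    print(f"PCIe Gen 2 lanes required: {data_rate / pcie_gen2_lane:.0f}")  # 2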

Automated test systems that leverage PCI Express, in its various implementations, are positioned to offer the highest performance and greatest flexibility at low cost. For these reasons, PXI, as a modular instrumentation test platform, can easily become the default choice for many automated test and measurement applications.

4. The Proliferation of Mobile Devices

"Tablets and smartphones are becoming increasingly ubiquitous computing devices and we expect them to complement laptops and desktops when it comes to remote access to important data." - Jean-Claude Monney, Chief Technology Strategist for Microsoft US Discrete Industries.


As an avid reader of Radio-Electronics.com, you may already be familiar with the many articles that explain the development of mobile devices from RF component design through to functional testing. To spin this around, it is also worth now considering how the mobile device can be used by the test engineer as part of the test setup.

While tablets and smartphones cannot replace the ubiquitous PC or PC-based measurement platforms like PXI, they offer unique benefits when used as extensions to a test system. When the Nielsen Company surveyed consumers in 2011 to understand why they were using tablets instead of traditional PCs, the top reasons cited included user experience improvements such as superior portability, ease of use, faster start-up times and longer battery life.

For engineers, the PC is probably one of the most important tools, so as mobile device adoption increases, they will likely transition to tablets and smartphones for many of the same reasons consumers outlined in the Nielsen study.

The expected use cases for mobile devices within automated test include test system monitoring and control, and test data report viewing.

Perhaps, as an engineer, you need to remotely monitor and control your test rig so you can check how things are going and look for alarm states while you travel between two locations. Maybe you are monitoring from across a room, across a building or across the world. The mobile device provides a secondary user interface to a test system that may be located on the other side of the world: a tablet or smartphone can instantly display a wide variety of information related to a remote test system, or control its mode of operation.

Rather than interact with the test system directly, test engineers may want to consolidate test reports that characterise the results of previous tests and identify trends.

The explosion of mobile devices like tablets and smartphones provides compelling benefits to engineers, technicians and managers involved in automated test, who need remote access to test status information and results. Test organisations will need new expertise to unite the networking, web services and mobile app portions of the solution.
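
As a minimal sketch of the web-services piece, the Python snippet below exposes a test system's latest status over HTTP for a tablet or smartphone browser to poll. The /status endpoint, port and report fields are hypothetical placeholders, not part of any NI product.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def read_test_status():
        # Placeholder for a real query of the test executive or hardware.
        return {"uut": "board-0042", "step": "RF sweep", "alarms": [], "passed": 118}

    class StatusHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/status":
                body = json.dumps(read_test_status()).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404)

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()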

5. Portable Measurement Algorithms

"With business needs demanding computing platforms beyond the venerable microprocessor, our familiar programming paradigms are struggling to keep pace. Providing tools that offer efficient design capture through a variety of models of computation, combined with ability to target multiple types of processing hardware, is a key goal for NI investment." - David Fuller, Vice President of Application and Embedded Software, National Instruments.


Over the past 20 years, the concept of user-programmable, microprocessor-based measurement algorithms has become mainstream, allowing engineers to adapt rapidly to changing test requirements. This approach is called virtual instrumentation.

If the microprocessor initiated the virtual instrumentation revolution, then the field-programmable gate array (FPGA) is ushering in its next phase. FPGAs have been used in instruments for many years. For instance, today's high-bandwidth oscilloscopes collect so much data that it is impossible for users to analyse all of it quickly. Hardware-defined algorithms on these devices, often implemented on FPGAs, perform data analysis and reduction (averaging, waveform math and triggering), compute statistics (mean, standard deviation, maximum and minimum) and process the data for display, presenting the results to the user in a meaningful way. While these capabilities offer obvious value, there is lost potential in the closed nature of these FPGAs: in most cases, users cannot deploy their own custom measurement algorithms to this powerful processing hardware.
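
The data-reduction idea itself is easy to picture. The Python sketch below models, on a host processor, the kind of block averaging and summary statistics an instrument's FPGA performs in hardware; the block size and test signal are illustrative assumptions.

    import numpy as np

    def reduce_block(samples, decimation=64):
        """Model of FPGA-style reduction: decimate by averaging and summarise."""
        trimmed = samples[: len(samples) // decimation * decimation]
        averaged = trimmed.reshape(-1, decimation).mean(axis=1)
        stats = {
            "mean": float(samples.mean()),
            "std": float(samples.std()),
            "min": float(samples.min()),
            "max": float(samples.max()),
        }
        return averaged, stats

    # Illustrative noisy sine acquisition: a million raw samples reduce to
    # a few thousand display points plus four summary statistics.
    t = np.linspace(0, 1, 1_000_000)
    raw = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
    display_trace, stats = reduce_block(raw)
    print(len(display_trace), stats)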

Open, user-programmable FPGAs on measurement hardware offer many advantages over processor-only systems. Because of their immense computational capabilities, FPGAs can deliver higher test throughput and greater test coverage, which reduces test time and capital expenditures. The low latency of FPGA measurements also provides the ability to implement tests that are not possible on a microprocessor alone. Their inherent parallelism offers true multisite test, even more so than with multicore processors. And finally, FPGAs can play a key role in real-time hardware sequencing and DUT control.

An example is provided by ST-Ericsson, which manages communication protocols with FPGA-based RF instrumentation built on NI FlexRIO. NI FlexRIO, which contains a Xilinx Virtex-5 FPGA, requires an adapter module to access its digital I/O at high speed. The team used the NI 6581 digital adapter module for FlexRIO, which can access I/O at up to 100 Mbit/s, and customised the socketed component-level IP (CLIP) node that interfaces with the NI 6581 to include a digital clock manager (DCM). The DCM handles distribution of the external clock and provides protocol logic and derived clocks to the bench.

As the project advanced, they needed high-speed (1.4 Gbit/s) digital RF transfer, so they developed a custom adapter module using the NI FlexRIO Adapter Module Development Kit. This module replaces the NI 6581 and provides the required transfer speeds on differential RF channels (RX, TX and CLK).

By reusing existing VHDL code, they were able to implement a complex protocol quickly without having to rewrite it in a LabVIEW FPGA single-cycle Timed Loop.

Hardware description language abstraction, high-level synthesis and support for different models of computation in development software will provide greater levels of hardware abstraction and flexibility across execution targets, delivering higher performance, better cost effectiveness and shorter time to market.

The outlook

Developments in areas from business strategy to test architecture are shaping the evolution of automated testing. By adopting these strategies and technologies, businesses will be better positioned in the years to come, with optimised test processes that address the constraints and complexities of their DUTs and maintain a competitive edge over rivals. Don't get left behind.

