Automated Test Outlook 2012
A Comprehensive View of Key Technologies and Methodologies Impacting the Test and Measurement Industry

Business Strategy | Architecture | Computing | Software | I/O




Table of Contents

How We Arrived at the Trends
Review the automated test trends of the last six years and how they informed this year's topics.

Optimizing Test Organizations
Organizations are elevating test engineering to a strategic asset to gain a competitive edge.

Measurements and Simulation in the Design Flow
Using sophisticated models with real-world measurements improves product quality and reduces development time.

PCI Express External Interfaces
The high-speed, low-latency bus internal to the PC is enabling new system topologies through external interface enhancements.

The Proliferation of Mobile Devices
The smartphone in every pocket and the tablet in every bag are changing how you can control and monitor your test systems.

Portable Measurement Algorithms
New tools are helping you develop measurement IP once and then deploy it to a wide variety of processing elements.



Predicting the future is hard work. Fortunately, we cast a wide net in terms of the inputs we use to arrive at the trends. As a supplier of test technology to more than 35,000 companies worldwide each year, we receive a broad range of feedback across industries and geographies. This broad base creates a wealth of quantitative and qualitative data to draw on.

We stay up to date on technology trends through our internal research and development activities. As a technology-driven company, we invest more than 16 percent of our annual revenue in R&D. But as a company that focuses on moving commercial technology into the test and measurement industry, our R&D investment is leveraged many times over in the commercial technologies we adopt. Thus, we maintain close, strategic relationships with our suppliers. We conduct biannual technology exchanges with key suppliers that build PC technologies, data converters, and software components to get their outlook on upcoming technologies and the ways these suppliers are investing their research dollars. Then we integrate this with our own outlook. We also have an aggressive academic program that includes sponsored research across all engineering disciplines at universities around the world. These projects offer further insight into technology directions, often far ahead of commercialization.

How We Arrived at the Trends

[Figure: Trend timeline, 2007–2012, across the five categories (Business Strategy, Architecture, Computing, Software, I/O). Trends shown include FPGAs, Modular Hybrid Test, Enterprise Integration, Cost of Test, Parallel Testing, Virtualization, Standardization, Multichannel RF Test, Data Streaming, Multicore Processing, Peer-to-Peer Computing, Heterogeneous Computing, Wireless Standards, Wireless/RF Instrumentation, Models of Computation, Embedded Design and Test, Reconfigurable Instrumentation, IP to the PIN, System Software Stack, Organizational Test Integration, PCI Express External Interfaces, Proliferation of Mobile Devices, Portable Measurement Algorithms, Optimizing Test Organizations, and Measurements and Simulation in the Design Flow.]


And, finally, we facilitate advisory councils each year for which we bring together leaders from test engineering departments to discuss trends and share best practices. These councils include representatives from every major industry and application area—from fighter jets to the latest smartphones to implantable medical devices. The first of these forums, the Automated Test Customer Advisory Board, has a global focus and is in its 12th year. We also conduct regional meetings, called regional advisory councils, around the world. Annually, these events include well over 300 of the top thought leaders developing automated test systems.

We’ve organized this outlook into five categories (see above figure). In each of these categories, we highlight a major trend that we believe will significantly influence automated test in the coming one to three years. We update the trends in these categories each year to reflect changes in technology or other market dynamics. We will even switch categories if the changes happening are significant enough to warrant it.

As with our face-to-face conversations on these trends, we hope that the Automated Test Outlook will be a two-way discussion. We’d like to hear your thoughts on the industry’s technology changes so we can continue to integrate your feedback into this outlook as it evolves each year.



In tough economic conditions, companies look more diligently for opportunities to gain a competitive advantage while growing revenue, profits, and customer loyalty. This has led to strong adoption of business improvement strategies such as Six Sigma, Lean Manufacturing, Capability Maturity Model Integration (CMMI), and Agile Product Development. Increasingly, companies are also elevating support functions within the organization into marketplace differentiators.

For example, the role of information technology (IT) has changed dramatically over the last two decades. IT was originally a support function that provided standard computing applications, data storage, and routine task automation. In leading organizations, IT can now streamline critical line-of-business processes and help executives make real-time decisions at the core of a company’s business. The strategic importance of IT was confirmed by CIO magazine’s 2010 State of the CIO Survey, which revealed that 70 percent of chief information officers (CIOs) are now members of their companies’ executive committees.

Similar to IT, product testing has historically been viewed as a support function during the product development and manufacturing process—just a necessary cost center. Hence, many companies invest at much higher rates in other areas of “strategic” value such as product development and sales enablement. This leaves the test organization fragmented, ill-equipped to meet business requirements, and saddled with outdated technologies and test methodologies that often create bottlenecks for their organizations. However, as research has shown, test is critical because it validates a product’s performance, reduces development time, increases quality and reliability, and lowers return rates.

The Research Triangle Institute conducted a study for the National Institute of Standards and Technology in 2002 to estimate the impact of inadequate software testing on the US automotive and aerospace industries. It found that the industry-level cost of underinvesting in test was $1.47 billion. Another study, conducted by researchers at NASA Johnson Space Center in 2004, stated that finding a product defect during production was 21 to 78 times more expensive than during design. The primary recommendation from both studies was to increase testing during design because of the dramatic reduction in the relative cost of repairing defects. By catching defects earlier in product development and collecting the data to improve a design or process, test delivers tremendous value to the organization.

Optimizing Test Organizations

[Figure: Test organization maturity model. Four maturity levels—Ad-Hoc (Cost Center), Reactive (Contributor), Proactive (Business Enabler), and Optimized (Strategic Asset)—span five competency areas: Enterprise Alignment, Business Planning, Deployment Life Cycle, System Development, and Test Technologies and Architectures. An optimized organization exhibits monitored business objectives; a centralized strategy; standardized architectures, tools, and processes; strong reuse from design to production; dynamic resource usage; and systematic enterprise test data management. Caption: Transforming a test organization into a strategic asset requires commitment to a long-term phased approach.]


An emerging trend for electronics manufacturing companies is using product test for competitive differentiation. This has resulted in elevating the test engineering function from a cost center to a strategic asset. This shift was confirmed by a recent global NI survey of test engineering leaders who said their top goal over the next one to two years is to reorganize their test organization structures for increased efficiency. This strategic realignment reduces the cost of quality and impacts a company’s financials by getting better products to market faster. Research has revealed that “optimized” is the ideal maturity level—when a test engineering organization provides a centralized test strategy that spans the product life cycle. This optimized organization develops standardized test architectures with strong reuse components, enables dynamic resource utilization, and provides systematic enterprise data management and analysis that result in company-level business impact.

Companies making this transformation must commit to a long-term strategy because, according to NI research, it generally takes three to five years to realize the full benefit. A company must have a disciplined and innovative investment strategy to transform the test organization through the four maturity levels: ad-hoc, reactive, proactive, and optimized. Each level includes people, process, and technology elements. The right people are required to develop and maintain a cohesive test strategy. Process improvements are required to streamline test development and reuse throughout product development. And finally, tracking and incorporating the latest technologies are required to improve system performance while lowering cost.

When companies implement changes to process, people, or technology, they are sometimes tempted to bypass transition projects because they believe they can attain a higher level of maturity more quickly. However, before an organization can achieve an optimized level, it must first reach the proactive level in each major competency area: enterprise alignment, business planning, deployment life cycle, system development, and test technologies and architectures.

An organization steadily builds a foundation for strategic transformation by sticking to a sequential approach and identifying short-term initiatives that help the company improve its maturity level and that map to annual operating objectives. And as the foundation gets built, test productivity and asset utilization increase, paying dividends on the original investment. This phased approach enables organizations to realize benefits early on—after the completion of just one or two projects. Examples of these transition projects include the following:

> Standardized Test Architecture/Process (Ad-Hoc → Reactive)—Adopting standardized software and hardware architectures and test methodologies improves productivity with faster test code development and increased test asset utilization.

> Test Total Cost of Ownership (TCO) Financial Model (Reactive → Proactive)—Creating a TCO financial model for test helps companies calculate business productivity metrics and financial metrics (return on investment, payback period, net present value, internal rate of return, and so on) for test improvement initiatives.

> Enterprise Test Data Management (Proactive → Optimized)—Developing a comprehensive test data infrastructure that spans sites with universal access improves real-time decision making.
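The financial metrics named in the TCO bullet above can be sketched in a few lines. This is an illustrative calculation, not part of the original outlook; the project cost and savings figures are invented for the example:

```python
# Hypothetical sketch of TCO financial metrics for a test improvement
# initiative (ROI, payback period, NPV). All dollar figures are invented.

def roi(gain, cost):
    """Return on investment as a fraction of the initial cost."""
    return (gain - cost) / cost

def payback_period(initial_cost, annual_savings):
    """Years until cumulative savings cover the initial cost."""
    return initial_cost / annual_savings

def npv(rate, cash_flows):
    """Net present value of cash flows; cash_flows[0] is year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Example: a $500k test standardization project saving $200k/year for 5 years
flows = [-500_000] + [200_000] * 5
print(round(npv(0.10, flows)))           # NPV at a 10% discount rate
print(payback_period(500_000, 200_000))  # 2.5 years
```

A model like this lets a test organization state an initiative's value in the same financial vocabulary the rest of the business uses.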

This transformation requires a shift from only supporting ongoing operations to developing innovation-based initiatives alongside ongoing operations. The test industry is still early in its transformation. Using the IT industry as an external benchmark, IBM published in its 2010 global technology outlook that highly efficient companies that strategically transformed their IT organizations spend only 60 percent of their IT budgets on ongoing operations, leaving 40 percent for new and innovative initiatives, compared to an 85/15 split at organizations with legacy business models. Similarly for test, leading companies gain a competitive edge by keeping their test organizations agile and matching the level of innovation leveraged in other strategic departments.

When test engineering organizations become strategic assets, they create standard test platforms, develop valuable test-based intellectual property, deliver a more productive workforce while lowering operating costs, and align with the business objectives by continually contributing to better product margins, quality, and time to market.

“Test is a foundational activity in any development, manufacturing, and maintenance endeavor. Not only must it be included when considering product quality, time to market, and business objectives, but it must also be effective and affordable. At Lockheed Martin, we are investing in the people, process, and technology aspects of automated test to ensure we meet our objectives.”

—Tom Wissink, Director of Integration, Test, & Evaluation, Lockheed Martin Corporate Engineering & Technology



Measurements and Simulation in the Design Flow

Shortening the product development cycle has long been a key objective of R&D organizations. One method to reduce development time, especially in the automotive and aerospace industries, is concurrent design and test, which is often represented with the V-diagram product development model. In these industries, where the end product is a highly complex “system of systems,” the left side of the V-diagram is considered “design” and the right side represents “test.” The idea behind the V-diagram is that greater efficiency can be achieved by beginning the test and validation of subsystems before the development of the entire system is complete. While the use of concurrent design and test approaches such as the V-diagram is common in highly regulated industries, adoption of these practices is growing in other industries and for other types of devices. For example, in the semiconductor and consumer electronics industries, shorter product life spans and increasing product complexity are fueling the pressure to reduce product development time.

According to a 2009 McKinsey survey on fabless semiconductor design processes, the ratio of product life cycle to product development time is approximately one-third of what it is in the automotive industry. Furthermore, the survey estimates that the average development time of a new semiconductor design is approximately 19 months. For this reason, the authors claim that “R&D Excellence” is a key differentiating factor.

Given the business imperative for improving R&D excellence in the product development process, the goal of concurrent design and test has become widespread throughout the electronics industry. A key method to empower this practice is increasing the connectivity between electronic design automation (EDA) simulation software and test software.

> Software in the Design Process

To understand the role of simulation software in the product design flow, engineers must understand the role of software in both the design and test phases of product development. During initial design and simulation, EDA software is used to model either the physical or electrical behaviors of a simulated product. Essentially, EDA software is a tool that uses mathematical models to represent the output of a device under test (DUT) based on a series of inputs and then presents these metrics to the designer.

During the validation and verification stage of product development, engineers use software in a slightly different context—namely, to automate measurements on a real prototype. However, similar to the design and simulation phase, the validation and verification process requires measurement algorithms like those used by EDA software tools.

One emerging feature in today’s EDA software is the ability to provide increasing levels of software connectivity between the EDA environment and test software. More specifically, this connectivity enables (1) modern EDA software environments to drive measurement software and (2) measurement automation environments to automate the EDA design environment.

One benefit of the connectivity between design and test software environments is that it allows design engineers to use significantly richer measurement algorithms earlier in the design process. They gain not only more valuable knowledge of their designs earlier in the design process but also the opportunity to correlate simulations with measured data from the validation and verification process. A second benefit of increased connectivity between EDA and test environments is that it allows test engineers to develop working test code much sooner in the design process, which ultimately reduces time to market for complex products.


> Using EDA Software to Produce Richer Measurements

One way that EDA and test software connectivity improves the design process is through richer measurements. Fundamentally, EDA tools use behavioral models to predict the behavior of a new design. Unfortunately, the modeled design is often verified using measurement criteria that are ultimately different from those used to verify the final product, making it difficult to correlate simulated and measured data. One growing trend is to use a common toolchain from design through test—a trend that ultimately enables engineers to introduce measurements into the design flow earlier.

For example, consider the design of a cellular multimode RF power amplifier. Traditionally, this type of component is designed and modeled using RF EDA tools. Within the EDA environment, engineers typically “measure” RF characteristics such as efficiency, 1 dB compression point, and gain through simulation. However, the end product must meet additional RF measurement criteria explicitly established for cellular standards such as GSM/EDGE, WCDMA, and LTE.

Historically, “standard-specific” measurements such as LTE EVM and ACLR required instrumentation on a physical DUT, largely because of measurement complexity. Going forward, new connectivity between EDA and test automation software enables the use of these sophisticated measurement algorithms within the EDA environment on a simulated device. As a result, engineers will be able to identify system-related or complex product issues much earlier in the design cycle and, therefore, shorten design times.
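As an illustration of such a measurement algorithm, here is a minimal EVM (error vector magnitude) calculation of the kind named above. It is the generic textbook formulation—RMS error vector over RMS reference—not NI’s or any EDA vendor’s implementation, and the constellation values are invented; the point is that the same function can run on simulated waveforms or on instrument captures:

```python
# Minimal, generic EVM calculation; symbol values are illustrative QPSK
# points, not taken from the text or from any standard's test vectors.
import math

def evm_percent(measured, reference):
    """RMS error vector magnitude as a percentage of RMS reference power."""
    err = sum(abs(m - r) ** 2 for m, r in zip(measured, reference))
    ref = sum(abs(r) ** 2 for r in reference)
    return 100 * math.sqrt(err / ref)

# Ideal QPSK constellation points vs. slightly impaired "measured" symbols
ideal = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
measured = [1.05 + 0.98j, -0.97 + 1.02j, -1.01 - 1.04j, 0.99 - 0.95j]
print(f"EVM = {evm_percent(measured, ideal):.2f}%")
```

Whether `measured` comes from a behavioral model or from a digitizer, the metric is computed identically, which is what makes simulated and measured results directly comparable.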

> Using Models to Parallelize Test Development

A second trend toward integrating design and test practices is to use EDA-generated behavioral models to accelerate the development of product verification/validation and manufacturing test software. Traditionally, one source of inefficiency in the product design process is the delay of test code development for a particular product until after the first physical prototypes are available for testing. One way to accelerate this process is to use the software prototype of a given design as the DUT when writing either characterization or production test code. Using this approach, engineers can parallelize the development of both characterization and production test software with product design, which results in an overall improvement in time to market.
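The “software prototype as DUT” pattern described above can be sketched as test code written against a common interface, with a simulated implementation standing in until hardware exists. The class and method names here are hypothetical, not an actual NI or Mentor Graphics API:

```python
# Hypothetical sketch: test code targets an abstract DUT interface so a
# behavioral model can stand in for hardware during test development.
from abc import ABC, abstractmethod

class DUT(ABC):
    @abstractmethod
    def measure_gain_db(self, freq_hz: float) -> float: ...

class SimulatedDUT(DUT):
    """Stand-in driven by an EDA behavioral model (idealized here)."""
    def measure_gain_db(self, freq_hz: float) -> float:
        return 20.0 - freq_hz / 1e9  # toy frequency roll-off model

class PhysicalDUT(DUT):
    """Later swapped in to drive real instruments over the same interface."""
    def measure_gain_db(self, freq_hz: float) -> float:
        raise NotImplementedError("connect instrumentation here")

def gain_flatness_test(dut: DUT) -> bool:
    """Pass/fail: gain varies by no more than 3 dB across three frequencies."""
    gains = [dut.measure_gain_db(f) for f in (1e9, 2e9, 3e9)]
    return max(gains) - min(gains) <= 3.0

print(gain_flatness_test(SimulatedDUT()))  # test code runs before hardware exists
```

When prototypes arrive, `PhysicalDUT` is completed and the test logic itself needs no changes—that substitution is the source of the parallelism the text describes.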

For example, consider the development approach Medtronic engineers chose for a recent pacemaker design. Using a new software package specifically designed to connect the Mentor Graphics EDA environment to NI LabVIEW software, the engineers could begin developing a test bench well before physical hardware was ever produced. The inherent parallelism achieved by this design approach fundamentally enables engineers to deliver products to market more quickly than before.

Going forward, integrated design and test practices will be a major factor in improving engineering design excellence. Because of greater connectivity between EDA and test software, engineers will more effectively use EDA software to provide richer simulations and more effectively use EDA simulations to improve their validation and production test processes.

“Connectivity between our EDA tools and NI's test software allows engineers to develop a test bench simultaneously with product development, providing earlier test feedback into the design process and greatly shortening design cycles by making development and test parallel rather than serial.”

—Serge Leef, Vice President/General Manager of the System Level Engineering Division, Mentor Graphics

[Figure: Design and test sharing measurement software. On the design side, a behavioral model (virtual DUT) in the system simulation environment feeds measurement software; on the test side, a physical DUT on test equipment feeds the same measurement software, producing directly comparable I/O metrics. Caption: Integrating measurements and simulations improves product design and shortens test development.]


Since the invention of GPIB in the 1960s, automated test systems have relied on PCs to provide the central control for instrumentation hardware and to automate testing procedures. PCs in various form factors, such as desktops, workstations, and industrial and embedded systems, have been used for this purpose. They offer a variety of interface buses, such as USB, Ethernet, serial, GPIB, PCI, and PCI Express, to interface instrumentation hardware in automated test systems. Because PCs play such a critical role in an automated test system, the test and measurement industry must track the progression of the PC industry and exploit any new technologies for increasing capabilities and performance while lowering the cost of test.

Over the last 10 years, PCs have evolved rapidly in many different ways. As predicted by Moore’s law, CPU processing capabilities have increased by more than 75 times in the past decade. Besides the dramatic increase in processing capabilities, another significant trend has been the emergence of serial communication interfaces and the demise of parallel communication interfaces. PCI Express has replaced PCI, AT, and ISA as the default internal system bus for interfacing peripheral system devices to the CPU. The PCI Special Interest Group (PCI-SIG), the consortium that owns and manages the PCI specifications, announced in November 2011 that approximately 24 billion lanes of PCI Express have shipped since its introduction in 2004, which is a strong testament to its adoption. Similarly, for external interfaces, serial buses such as USB and Ethernet have replaced the parallel port, SCSI, and other parallel communication buses. A market research report published by In-Stat in 2010 projected that by 2012, the number of wired USB-enabled devices shipped would exceed 4 billion. With the proliferation of wireless communications standards such as Wi-Fi and Bluetooth, another recently emerging trend is the consolidation of external physical interfaces on PCs.

The PCI Express bus, used in different implementations, will likely become the interface of choice for automated test systems. Offering the ideal combination of high data bandwidth and low latency, PCI Express is an extremely pervasive technology since it is a fundamental element of every PC. It has also started to blur the boundaries between a system bus, used for interfacing local devices within a system, and an interface bus, used for interfacing external peripheral devices to the system, and will likely continue to dissolve this delineation.

> PCI Express: System Bus for Automated Test Platforms

Since PCI Express is a serial bus, it has a variety of inherent advantages over parallel buses such as PCI and VME.

Technical challenges like timing skew, power consumption, electromagnetic interference, and crosstalk across parallel buses become more and more difficult to circumvent when trying to increase data bandwidth. Besides being a technically superior bus, PCI Express has, since its release in 2004, seen continuous improvements in its data transfer capabilities. In 2007, the release of the PCI Express 2.0 specification doubled the data rate of PCI Express 1.0, and in 2010, the release of the PCI Express 3.0 specification doubled the data rate again, providing the ability to transfer data at 16 GB/s per direction over a x16 link. Although the PCI Express standard has been repeatedly revised, these improvements have not come at the cost of compatibility. PCI Express uses the same software stack as PCI and provides full backward compatibility.
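The per-generation doubling described above can be checked with a back-of-the-envelope calculation from the published per-lane signaling rates and encoding overheads; a x16 PCI Express 3.0 link works out to roughly the 16 GB/s per direction cited in the text:

```python
# Back-of-the-envelope PCI Express throughput per generation, from the
# published per-lane transfer rates and line-encoding efficiencies.

GENS = {
    # generation: (transfer rate in GT/s, encoding efficiency)
    "PCIe 1.0": (2.5, 8 / 10),     # 8b/10b encoding
    "PCIe 2.0": (5.0, 8 / 10),
    "PCIe 3.0": (8.0, 128 / 130),  # switch to 128b/130b encoding
}

for gen, (gt_per_s, eff) in GENS.items():
    per_lane = gt_per_s * eff / 8  # GB/s per lane, per direction
    print(f"{gen}: {per_lane:.2f} GB/s/lane, x16 ~ {per_lane * 16:.1f} GB/s")
```

Note that PCIe 3.0 doubles effective throughput with only a 2.5/8 = 1.6x raw-rate increase; the rest comes from the much more efficient 128b/130b encoding.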

Automated test and measurement platforms that use PCI Express as the internal system bus, such as PXI, can leverage all of these advancements to continue to offer more and more capabilities at a low cost. Such platforms, based on their technically superior capabilities, will likely become the central core of all automated test systems.

PCI Express External Interfaces

“Due to the combination of its excellent performance and pervasiveness, PCI Express is the default choice for system buses. With new fiber-optic and copper cable technologies, it is emerging as the leading choice for high-performance external interfaces.”

—Mark Wetzel, Distinguished Engineer for Processor Architectures, National Instruments


> PCI Express: External Interface Bus for Automated Test Systems

The high latency and low bandwidth of commonly used external interfaces for automated test, such as GPIB and Ethernet, present a barrier to lowering test times. These interfaces fundamentally constrain the overall efficiency of a test system by limiting the data transfer rate and increasing the time it takes for every transaction. Since CPUs do not natively provide access to these external interfaces, some form of conversion usually occurs inside the PC to translate these external interfaces into the internal system bus, which is PCI Express. PCI Express offers better performance than these other external interfaces and is directly available from the CPU in a PC. This removes the bottleneck imposed by other external interface buses and lowers test times significantly.

The concept of using PCI Express as an external interface bus is not new. The aforementioned PCI-SIG supports an external implementation of PCI Express, formally known as cabled PCI Express. Released in 2007, this implementation provides a transparent way to extend the system bus to interface external devices. Cabled PCI Express is already being used in modular instrumentation platforms such as PXI to provide flexible and low-cost control options. The cabled PCI Express specification formally supports only the use of copper cables, which limits the physical separation between the PC and the device to 7 m. However, when used with electro-optical transceivers, this technology can be extended over fiber cables to offer more than 200 m of physical separation and electrical isolation.

The use of cabled PCI Express technology has been reasonably successful in automated test environments. However, its adoption has been isolated to a few relatively niche industries, in contrast to the widespread adoption of general PCI Express technology. A more recent implementation of PCI Express as an external interface, Thunderbolt, is a technology Intel pioneered under the code name Light Peak that has the potential to be extremely pervasive. Thunderbolt combines PCI Express and the DisplayPort video protocol into a serial interface bus that can be driven over either copper or fiber-optic cables. Since PCs will natively offer Thunderbolt ports, it promises to be a high-performance, low-cost, and ubiquitous solution.

External PC interfaces based on PCI Express, along with other low-cost interfaces such as USB, will likely become the default interfaces for automated test systems. Applications such as high-volume production test or complex automated verification and validation, which require high data throughput and low latency to lower the overall cost of test, will naturally gravitate toward PCI Express-based interfaces. Applications without such requirements will likely continue using other interfaces.

Based on current technology trends in the PC industry, such as the dominance of serial communication interfaces, I/O consolidation, and the pervasiveness of wireless communication, PCI Express is the default choice for a system bus and is expected to emerge as the leading external interface bus. Automated test systems that leverage PCI Express, in its various implementations, are positioned to offer the highest performance and greatest flexibility at low cost. They will become the default choice for automated test and measurement applications.

[Figure: Example system topology using cabled PCI Express—a controller linking PXI chassis full of instruments, a boxed instrument, a computing node, and a peripheral device (external data storage) over PCI Express. Caption: PCI Express adds high-throughput, low-latency communication to external interfaces.]


One of the biggest trends in automated test over the last three decades has been the shift toward PC-based modular platforms that use the latest commercial off-the-shelf (COTS) computing technologies with increasingly powerful processors, new I/O buses, and more advanced OSs. While this trend is likely to continue, a completely new class of computing devices, namely tablets and smartphones, has emerged recently to offer new opportunities for forward-thinking organizations to take advantage of COTS technologies in automated test systems.

Intelligent handheld devices have been used for nearly a decade in the form of PDAs and the original smartphones. However, the introduction of the Apple iPhone and subsequent iPad, along with similar devices powered by software from Google, Microsoft, and others, has ushered in a new era of mobile computing, with hundreds of millions of smartphones and tens of millions of tablets sold to consumers and businesses.

> Mobile Devices for Automated Test
While tablets and smartphones cannot replace the ubiquitous PC or PC-based measurement platforms like PXI, they offer unique benefits when used as extensions to a test system. According to a Pew Research Center survey, most tablet owners use their tablets primarily for convenient content consumption. When The Nielsen Company surveyed consumers in 2011 to understand why they were using tablets instead of traditional PCs, the top reasons cited included user experience improvements such as superior portability, ease of use, faster startup time, and longer battery life. Given this information, the expected use cases for mobile devices within automated test include test system monitoring and control as well as test data and report viewing.

1. Test System Monitoring and Control—Test engineers, managers, and technicians benefit from the ability to access a test system directly from a tablet or smartphone. This is useful when the test system is nearby (same building or campus), but it is especially convenient when the mobile device provides a secondary user interface to a test system located on the other side of the world. From a tablet or smartphone, users can instantly view a wide variety of information about a remote test system or control its mode of operation. To enable this use case, the test system itself, or a proxy, needs access to either a local intranet or the public Internet. Intranet access allows remote monitoring from mobile devices on the same campus or with VPN access to the intranet, while a test system connected to the public Internet can, in principle, be accessed by a mobile device anywhere in the world.

2. Test Data and Report Viewing—Rather than interact with test systems directly, test engineering personnel may elect to view consolidated test reports that characterize the results of previous tests and identify trends. In this use case, the test systems themselves do not need to be connected to the network so long as their data is available on another computer with network access. This secondary machine houses test results, analyzes the data, and creates reports that can be delivered to remote users with mobile access.
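As one concrete illustration of the monitoring use case, the sketch below exposes a test station's status over HTTP so that any networked mobile device can poll it. The station name, the status fields, and the `/status` path are hypothetical examples, and a real system would read live instrument state rather than a static dictionary.

```python
# Minimal sketch of a test-system status endpoint (hypothetical fields and
# path); a production system would query live hardware or a results database.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical status record a test station might publish for remote monitoring.
STATION_STATUS = {
    "station": "ATE-01",
    "mode": "running",
    "units_tested": 128,
    "yield_percent": 97.7,
}


class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/status":
            body = json.dumps(STATION_STATUS).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep the sketch quiet


def serve(port: int = 8080) -> None:
    """Block forever serving status requests (call from the station's startup script)."""
    HTTPServer(("0.0.0.0", port), StatusHandler).serve_forever()
```

A mobile client on the intranet could then poll `http://<station>:8080/status` and render the fields in a dashboard.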

For both use cases, providing a test organization with mobile access to important information via tablets and smartphones involves two shared key challenges. The first is choosing the right approach for exchanging data across the network. Test organizations have many communication protocol options, including TCP, UDP, and HTTP. However, a noteworthy trend in the information technology (IT) world is the shift toward web services for data exchange between servers and clients.

> Web Services
A web service is an application programming interface (API) that can be accessed via HTTP by a wide range of clients. When called, web services return a human-readable response (typically XML). Calling a particular web service, which might represent the status of a test system, the latest test results for the day, or any other data that clients might want to view, simply involves programmatically making an HTTP request to a URL, parsing the response, and then displaying it to the user.
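The request-parse-display pattern described above can be sketched on the client side as follows. The XML element names and the sample payload are assumptions for illustration, not a real schema; in a deployed client, the text would come from an HTTP request to the service's URL.

```python
# Client-side sketch: parse a web-service response (hypothetical XML schema)
# and extract the fields a mobile dashboard would display.
import xml.etree.ElementTree as ET

# Example payload a test-status web service might return; the element and
# attribute names are assumptions made for this sketch.
SAMPLE_RESPONSE = """\
<testSystem>
  <name>ATE-01</name>
  <mode>running</mode>
  <lastResult serial="SN1042" passed="true" testTimeSeconds="42.5"/>
</testSystem>
"""


def summarize(xml_text: str) -> str:
    """Reduce a status response to a one-line summary for display."""
    root = ET.fromstring(xml_text)
    name = root.findtext("name")
    mode = root.findtext("mode")
    last = root.find("lastResult")
    verdict = "PASS" if last.get("passed") == "true" else "FAIL"
    return f"{name} [{mode}] last unit {last.get('serial')}: {verdict}"


# In a deployed client, xml_text would come from something like
# urllib.request.urlopen("https://example.com/api/status").read().
print(summarize(SAMPLE_RESPONSE))  # ATE-01 [running] last unit SN1042: PASS
```

Because the response is plain XML over HTTP, the same parsing logic works whether the client is a desktop app, a web page, or a native mobile app.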

The Proliferation of Mobile Devices

“Tablets and smartphones are becoming increasingly ubiquitous computing devices, and we expect them to complement laptops and desktops when it comes to remote access to important data.”

—Jean-Claude Monney, Chief Technology Strategist for Microsoft US Discrete Industries


Test organizations receive several benefits when using web services to communicate from a mobile client to a server application. First, web services are straightforward and simple to access from any programming language. Second, because web services sit on top of common protocols like HTTP, the communication is considered IT friendly and can easily be encrypted via industry-standard technologies like SSL. For these reasons, major Internet companies like Google, Yahoo!, Microsoft, and Amazon have largely exposed their functionality (for example, search, mapping, cloud computing, and so on) to remote clients through web services.

> Native Versus Web-Based Mobile Applications
Once the data is available to the client, an application on the mobile device can access and display it. Building a mobile application begins with one major decision: Should the application be native to the OS, or should it run in the device’s browser? This choice has an enormous impact on every aspect of the application, from the expertise needed to develop it to the features it can include to its distribution method.

A native application (or app) on tablets and smartphones is designed for a specific device like the iPad. It uses built-in APIs for accessing device features or resources, depends on that device’s OS, and must abide by the vendor’s rules (both technical and policy) on what an app can do. Native apps are typically richer, more interactive, and more tightly integrated with the device than web-based applications. However, they require mobile platform knowledge and development tools to create, they largely cannot be reused from one platform to the next, and they must be distributed via a vendor’s store (for example, the Apple App Store).

It is important to note that the mobile device landscape has rapidly changed over the last five years. While RIM and Nokia had enormous phone market share five years ago, according to recent data from PC Magazine, Google’s Android has taken 43 percent of the smartphone market today (Apple came in second at 28 percent). Additionally, while the tablet market was tiny and fragmented five years ago, the iPad dominates the nascent tablet market today, with fast-following competitors aggressively trying to catch up. At this rate of change, it is difficult to predict how long investments in native applications for a specific platform will provide value.

On the other hand, every tablet and smartphone includes a mobile web browser. Web-based applications written in HTML and JavaScript are largely portable to every device and are freely distributed without the need to interact with a vendor store. While it may sound like web applications offer tremendous benefits over native apps, there are clear trade-offs. Applications that run inside a mobile browser are “sandboxed,” or locked out from accessing built-in APIs on the device, so they typically provide a less interactive, polished, and immersive user experience when compared to native apps.

> A Complete Mobile Device Solution
The explosion of mobile devices like tablets and smartphones provides compelling benefits to engineers, technicians, and managers involved in automated test who need remote access to test status information and results. While today’s technology offers solutions for monitoring or remote reporting via mobile devices, test organizations will need new expertise to unite the networking, web services, and mobile app portions of the solution.

[Figure: Test systems (traditional instrumentation racks and PXI) answer HTTP requests with server responses consumed by client applications, including desktop apps, web-based UIs, and mobile apps.]

Using industry-standard web services simplifies communication with a variety of client applications.


Portable Measurement Algorithms

Over the past 20 years, the concept of user-programmable, microprocessor-based measurement algorithms has become mainstream, allowing test systems to rapidly adapt to custom and changing test requirements. This approach is called virtual instrumentation, and, given its success, vendors continue to look for opportunities to take its benefits further: increased user customizability, greater use of off-the-shelf technology, higher performance, and decreased test system cost.

If the microprocessor initiated the virtual instrumentation revolution, then the field-programmable gate array (FPGA) is ushering in its next phase. FPGAs have been used in instruments for many years. For instance, today’s high-bandwidth oscilloscopes collect so much data that it is impossible for users to quickly analyze all of it. Hardware-defined algorithms on these devices, often implemented on FPGAs, perform data analysis and reduction (averaging, waveform math, and triggering), compute statistics (mean, standard deviation, maximum, and minimum), and process the data for display, all to present the results to the user in a meaningful way. While these capabilities offer obvious value, there is lost potential in the closed nature of these FPGAs. In most cases, users cannot deploy their own custom measurement algorithms to this powerful processing hardware.
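The data-reduction stage described above can be modeled in host software. The pure-Python sketch below folds raw waveform blocks into running statistics so that only a small summary, not the full record, reaches the display; a real instrument FPGA would do the equivalent in fixed-point pipelines, and the class and field names here are illustrative only.

```python
# Software model of instrument-style data reduction: accumulate running
# statistics over streamed sample blocks instead of retaining every sample.
import math


class StreamingStats:
    """Running mean, standard deviation, minimum, and maximum over a sample stream."""

    def __init__(self):
        self.n = 0
        self.total = 0.0
        self.total_sq = 0.0
        self.minimum = math.inf
        self.maximum = -math.inf

    def update(self, block):
        """Fold one acquired block into the running summary."""
        for x in block:
            self.n += 1
            self.total += x
            self.total_sq += x * x
            self.minimum = min(self.minimum, x)
            self.maximum = max(self.maximum, x)

    @property
    def mean(self):
        return self.total / self.n

    @property
    def std(self):
        # Population standard deviation from the running sums.
        return math.sqrt(self.total_sq / self.n - self.mean ** 2)


stats = StreamingStats()
stats.update([1.0, 2.0, 3.0])  # first acquired block
stats.update([4.0, 5.0])       # next block; summary stays O(1) in memory
print(stats.mean, stats.minimum, stats.maximum)  # 3.0 1.0 5.0
```

The key property, on an FPGA as in this model, is that memory use is constant regardless of how many samples stream through.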

> Open FPGAs for Test
“Open,” user-programmable FPGAs on measurement hardware offer many advantages over processor-only systems. Because of their immense computational capabilities, FPGAs can deliver higher test throughput and greater test coverage, which reduces test time and capital expenditures. The low latency of FPGA measurements also provides the ability to implement tests that are not possible on a microprocessor alone. Their inherent parallelism offers true multisite test, even more so than with multicore processors. And finally, FPGAs can play a key role in real-time test hardware sequencing and DUT control.

A 2011 market study on modular instrumentation from the industry research firm Frost & Sullivan reported, “Advancements made by companies such as Altera and Xilinx in FPGA capabilities are extremely useful in test and measurement applications in which customers need highly deterministic and fast processing capabilities…” Also, a growing number of open FPGA products from PXI vendors is on the market today.

While hardware options continue to reach the market, most test and measurement algorithms, developed for execution on microprocessors as part of the virtual instrumentation revolution, are not easily portable to FPGAs due to differences in data types, programming models, and hardware-specific attributes such as timing constraints. It takes significant expertise and time to develop verified, trusted FPGA measurement IP. This is why, today, most FPGAs in instrumentation hardware use only fixed, vendor-defined algorithms and are not user programmable.

The 2011 Automated Test Outlook discussed heterogeneous computing, the distribution of algorithms across a variety of computing architectures (CPUs, GPUs, FPGAs, and the cloud) to select the optimal resource for each algorithm. While a powerful concept, heterogeneous computing presents unique challenges in programming each of these targets, and porting measurement algorithms between them can be difficult. To complicate matters, a recent global NI survey of test engineering leaders reported that 54 percent expect future technology advances to decrease their development time while also increasing test throughput and reducing system cost. To meet these simultaneous demands, the industry is addressing all of these challenges with advances in development tools that promise to provide algorithm portability across hardware targets and make the advantages of FPGAs available to all engineers developing test systems.

“With business needs demanding computing platforms beyond the venerable microprocessor, our familiar programming paradigms are struggling to keep pace. Providing tools that offer efficient design capture through a variety of models of computation, combined with the ability to target multiple types of processing hardware, is a key goal for NI software investments.”

—David Fuller, Vice President of Application and Embedded Software, National Instruments


> HDL Abstraction
The first set of such tools provides hardware description language (HDL) abstraction. HDLs describe gate- and signal-level behavior in a text-based manner, and HDL abstractions attempt to deliver higher-level design capture, often in a graphical or schematic representation. These tools include Xilinx System Generator for DSP, Mentor Graphics Visual Elite HDL, and the NI LabVIEW FPGA Module. While they do offer a much lower barrier to FPGA technology adoption than HDL, they do not completely abstract some of the hardware-specific attributes of FPGA design such as pipelining, resource arbitration, DSP slice architecture, and on-chip memories. As such, algorithms still require rework, and thus reverification, when ported to an FPGA, fueling future advances in development tools.

> High-Level Synthesis
High-level synthesis (HLS) tools provide the ability to capture algorithms at a high level and then independently specify performance attributes for a given implementation such as clock rate, throughput, latency, and resource utilization. This decoupling offers algorithm portability because the specific implementation is not part of the algorithm definition. Moreover, algorithm developers do not need to incorporate hardware-specific considerations into their designs (pipelining, resource arbitration, and so on). The concept of HLS has been around for more than 20 years, but the tools on the market are only just becoming mature enough to be viable. Offerings include Synopsys Synphony, Xilinx AutoESL, Cadence C-to-Silicon, and Mentor Graphics Catapult C. These tools do offer advantages over HDL abstractions, but they target only FPGAs or application-specific integrated circuits (ASICs) and not other computing platforms such as microprocessors and GPUs. Attempting to address some of the limitations of these HLS tools, NI recently released beta software that incorporates the familiar LabVIEW dataflow diagram with the advantages of HLS for FPGA design. This promises to provide a path for the large number of LabVIEW measurement and control algorithms to an FPGA implementation without compromising microprocessor execution or requiring significant algorithm redesign for FPGA deployment. Because it is still in beta, the software is not yet ready for mainstream adoption, but initial results are promising.

> Models of Computation
The last step in the evolution of development tools focuses on coupling measurement portability across hardware targets with multiple models of computation and design capture. These models of computation might include the LabVIEW dataflow diagram, DSP diagrams for multirate signal processing in RF and communication applications, textual math for capture of textbook-like formulas, or state machines for digital logic and protocols. Take, for instance, a future system on a chip (SOC) such as the Xilinx Zynq extensible processing platform, which combines a dual-core ARM microprocessor with an FPGA. This silicon offers tremendous potential for heterogeneous computing, yet programming it could be challenging because separate languages and models of computation are required for the microprocessor and the FPGA. Ideally, engineers would have a multitude of models of computation supported for all targets, to capture algorithms in the most efficient manner, and then deploy them to the best execution target for a given application. Depending on business needs, “best” could mean highest performance, greatest cost-effectiveness, or shortest time to market. Tools that work with hardware-agnostic models of computation are under development and are inevitable based on the needs of today’s test system developers.

[Figure: Abstraction and flexibility increase over time, progressing from processor-based measurements to hardware description language abstraction, high-level synthesis, and hardware-agnostic models of computation.]

Development software will provide greater levels of hardware abstraction and flexibility across execution targets to deliver higher performance, cost effectiveness, and shorter time to market.


ni.com/automatedtest

©2012 National Instruments. All rights reserved. LabVIEW, National Instruments, NI, and ni.com are trademarks of National Instruments. Other product and company names listed are trademarks or trade names of their respective companies. 351409B-01 03628