
Design for testability

Tue, 03/19/2013 - 9:41am
Holger Goepel, CEO, GOEPEL electronics

Since the very early days of electronic components, failures have continuously appeared. Despite enormous improvements in development and production, this situation has not changed: increasing circuit density and board complexity remain critical sources of faults. Several test technologies exist, each with its own requirements and effort, and of course its own advantages and disadvantages.

Most disadvantages originate in contacting problems, as the space available for test points on electronic boards has shrunk continuously in recent years. To mitigate the drawbacks of invasive test methods, it can be necessary to exploit component features that already exist in the circuits. The basic requirement is a test concept that places all required hardware and software elements in the unit to be tested. This approach is known as test-oriented layout or, more commonly, Design for Testability (DFT).

There are two central problems in the examination of digital circuits: test pattern generation and test verification.

Test pattern generation is the process of producing stimulus signals for a circuit in order to prove its correct function. Test verification determines the circuit's response behaviour. With the increasing complexity of boards, automatic test pattern generation and test verification have become more and more difficult. This is best illustrated by considering a functional test as an example, assuming that every digital circuit can be decomposed into sequential and combinatorial circuit parts.

 
According to Moore and McCluskey, the minimum number of test vectors Q for a 100% functional test (in other words: how many test vectors are necessary to test all possible functions of a circuit) is calculated as follows:
Q = 2^(x+y)
x = number of inputs
y = number of storage elements (sequential circuit parts)

For an assumed circuit with 25 inputs (x) and 50 internal latches (y), a test rate of 100 ns per test step requires a test time of more than 10^7 years.
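The arithmetic behind this figure can be checked directly; the following sketch simply evaluates the formula above for the assumed circuit:

```python
# Evaluating the Moore/McCluskey bound Q = 2^(x+y) for the example circuit.
x = 25        # number of inputs
y = 50        # number of internal latches (storage elements)

Q = 2 ** (x + y)              # minimum number of exhaustive test vectors
step_seconds = 100e-9         # 100 ns per test step
total_seconds = Q * step_seconds
years = total_seconds / (365.25 * 24 * 3600)

print(f"Q = 2^{x + y} = {Q:.2e} test vectors")
print(f"test time = {years:.2e} years")
```

Running this shows a test time on the order of a hundred million years, which is why exhaustive functional testing of sequential circuits is hopeless in practice.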

This testing problem can be avoided by designing circuits that are more efficiently testable, in particular with regard to their sequential parts. That means the circuit must be designed so that it can be tested with acceptable fault coverage in an acceptable time, and so that the problem of test access is overcome.

Design for Testability can be subdivided into two groups: Ad-Hoc Design and Structured Design.

Ad-Hoc-Design
Ad-Hoc Design comprises partitioning, the introduction of additional test points and the use of bus architectures. Partitioning means breaking the entire circuitry down into circuit parts that are easier to test. The total effort to test these parts is considerably less than the effort to test the entire circuit. Bus architectures simplify testability by selectively activating the individual bus participants.
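The benefit of partitioning follows directly from the exponential bound above. As a hypothetical illustration (the split into two equal halves is an assumption, not from the source), compare testing the example circuit whole against testing two independent halves:

```python
# Hypothetical partitioning of the example circuit (25 inputs, 50 latches)
# into two independently testable halves. Exhaustive vectors: Q = 2^(x+y).
whole = 2 ** (25 + 50)            # unpartitioned circuit

part_a = 2 ** (13 + 25)           # first half: 13 inputs, 25 latches
part_b = 2 ** (12 + 25)           # second half: 12 inputs, 25 latches
partitioned = part_a + part_b     # test the parts one after the other

print(f"whole:       {whole:.2e} vectors")
print(f"partitioned: {partitioned:.2e} vectors")
print(f"reduction factor: {whole / partitioned:.1e}")
```

Because the exponent is split, the sum of the two part tests is smaller than the whole-circuit test by a factor of roughly 10^10 in this assumed split.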

Structured Design
The aim of Structured Design is to reduce the sequential complexity of a network in order to simplify test pattern generation and test verification. This is achieved by creating ways to control and observe the states of the sequential machine. Methods that implement these two capabilities in a circuit are called "passive test aids".

One of these passive test aids, which became accepted as a systematic help and is meanwhile a standard among manufacturers of mainframe computers, is the Scan Path method. Using this method, circuits with sequential storage elements can be subdivided into observable and controllable combinatorial circuit parts. For that, the internal states of the storage elements must be controllable and observable. This is achieved by interconnecting the internal storage elements into shift registers, enabling the serial insertion of test stimuli and the serial read-out of test responses. The classic among these methods is Level Sensitive Scan Design (LSSD), developed by IBM for mainframe computers in the 1970s. It is based on extending functional storage elements into shift register elements, the so-called "shift register latches" (SRL).
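The scan idea can be sketched in a few lines. The following toy simulation (cell names and the feedback logic are illustrative assumptions, not LSSD's actual cell design) chains four latches into one shift register, scans a state in serially, applies one functional clock, and scans the captured response back out:

```python
# Toy Scan Path simulation: the circuit's latches form one shift chain,
# so internal state can be set and read through a single serial pin.

def scan_shift(chain, bits):
    """Shift `bits` in serially; the previous contents fall out the far end."""
    out = []
    for b in bits:
        out.append(chain[-1])        # serial output: last latch in the chain
        chain = [b] + chain[:-1]     # every latch moves one position along
    return chain, out

def next_state(state):
    # Assumed combinational next-state logic of a toy 4-latch circuit:
    # a shift ring with an XOR feedback (purely illustrative).
    return [state[-1] ^ state[0], state[0], state[1], state[2]]

chain = [0, 0, 0, 0]
chain, _ = scan_shift(chain, [1, 0, 1, 1])    # scan-in: set internal state
chain = next_state(chain)                      # one functional clock: capture
chain, response = scan_shift(chain, [0] * 4)   # scan-out: observe the result
print(response)
```

The point of the exercise is that the sequential state never has to be reached through a long functional input sequence: it is loaded and unloaded directly, leaving only the combinatorial logic to be exercised.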


The Scan Path method shows, using LSSD as an example, that complex sequential circuits decompose into manageable, purely combinatorial circuits. Since combinatorial circuits, in contrast to sequential circuits, can be tested with substantially fewer test vectors, test expenditure and testing time are significantly reduced.

The disadvantage of the sequential logic is thus turned into an advantage: it is employed as part of the test machinery.

It seems logical that these DFT principles, which have their roots in circuit technology, should also be applied to PCBs to overcome their test problems. If the technology works for circuits, it must be transferred to assemblies. What is needed?

Between the circuits, test cells are required that can be switched into shift chains and that are controllable and readable via a few lines. If these test points are placed in the nets, they can be used for the In-Circuit Test. But the increasing complexity of modern boards demands the integration of these test points into the components, to free additional space for nets.

This "test friendly" requirement is ideally met by Boundary Scan, also known as JTAG. Boundary Scan has evolved into a standardised test of components and their interconnection networks.


Boundary Scan is possibly the most versatile test technique: in a similar way to the In-Circuit Test (ICT), but without physical contact, it locates failures, sets thousands of test points (if necessary also under BGAs) and needs only four lines. While an ICT is only possible with specially constructed adapters, a Boundary Scan test is already useful if there is at least one Boundary Scan component on the board.

Boundary Scan essentially means "testing at the periphery (boundary)" of an IC. Besides the core logic and the contact points, some additional logic is implemented in the IC: test cells integrated between the core logic and the physical pins. The following image illustrates this graphically in comparison to the principle of the ICT.
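The principle of those cells can be sketched abstractly. In the following illustration (the class and function names are assumptions for this sketch, not IEEE 1149.1 cell logic), a cell on a driving device places a value on a net in test mode, and a cell on the receiving device captures it, so an interconnect fault is found without any probe touching the board:

```python
# Abstract sketch of boundary scan interconnect testing: cells between
# core logic and pins drive and capture net values in test mode.

class BoundaryCell:
    """One test cell sitting between the core logic and a physical pin."""
    def __init__(self):
        self.capture = 0   # value sampled from the pin
        self.update = 0    # value driven onto the pin in test mode

def interconnect_test(net_ok=True):
    """Drive 0 and 1 from device A's output cell, capture at device B."""
    driver, receiver = BoundaryCell(), BoundaryCell()
    for bit in (0, 1):
        driver.update = bit
        # The net carries the driven value; a stuck-at-0 fault loses the 1.
        receiver.capture = driver.update if net_ok else 0
        if receiver.capture != bit:
            return f"stuck-at fault detected (expected {bit})"
    return "net OK"

print(interconnect_test(net_ok=True))
print(interconnect_test(net_ok=False))
```

In a real device the update and capture values travel through the serial scan chain via the four-line test access port; the sketch only shows why driving and sampling at the pin boundary makes the nets observable.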

Boundary Scan is simple and universally applicable. In general, the technology supports a product throughout its entire life cycle. Tests are possible as early as the design stage by means of the CAD data, which can also be reused later, even up to the customer's application. That means test patterns created for design verification can be reused for prototype debug and fabrication test. This is an important advantage, since testability has to be considered especially during the design of highly complex assemblies.

Thus the time and effort required for testing is enormously reduced. Only a few days or even hours are required to generate test programmes, compared with the high effort that accompanies an In-Circuit Test or Functional Test. Furthermore, diagnosis times are minimised, not to mention the high production and storage costs of bed-of-nails adapters.
