How to improve the yield of integrated circuits
EDA technology for high yield design

Declining yield has become one of the biggest challenges in nanoscale integrated-circuit design. How to develop high-performance ICs while ensuring high yield has therefore become a hot topic in academia and industry in recent years.

I. Chip yield in electronic product manufacturing

The yield problem has received wide attention in industry because it is directly related to production cost and corporate profit. If yield is too low, production cost rises sharply, which not only reduces corporate profit but also weakens the market competitiveness of the product, and may even cause the failure of the whole product project. The importance of the yield problem is especially evident in the supporting industry of the electronics and IT sectors: the design and production of integrated circuits (ICs). Yield problems in IC design and production are particularly prominent, which is mainly related to the characteristics of IC design and manufacturing. First, the production process of integrated circuits is very complicated: fabricating a chip often involves dozens or even hundreds of process steps, the production cycle is long, and a deviation in any step of the manufacturing flow affects product yield. Second, the investment in IC production is huge: an ordinary production line often costs hundreds of millions of yuan, and advanced lines cost far more. If the yield of a chip is too low (below about 30%), it lacks market competitiveness and is difficult to bring into mass production. Yield has thus become one of the key factors in the investment risk of IC design and manufacturing enterprises, and many IC development projects even moderately relax IC performance targets in order to meet yield requirements, so that the product can at least enter the market and recover its investment.

In recent years, the rapid development of the IT industry has pursued ever higher product performance and convenience; IC scale keeps expanding while the feature line width keeps shrinking. Mainstream CMOS technology has moved from 0.25 μm a few years ago down to 0.10 μm and below, and 90 nm and 60 nm production lines are becoming the mainstream of the next generation, so declining yield has become one of the biggest challenges in nanoscale IC design. Moreover, the wide application of wireless products places higher demands on bandwidth and device response speed; the development of high-performance RF integrated circuits and microwave monolithic integrated circuits (RFIC, MMIC), with their many new materials, processes and devices, brings unprecedented challenges to IC design. These factors greatly increase the uncertainty of the IC manufacturing process and make product yield harder to control. Because of its importance, consideration of yield now penetrates every stage of IC design and manufacturing, and how to develop high-performance ICs while ensuring high yield has become a hot issue in academia and industry in recent years.

II. Improving yield with EDA techniques

There are many factors affecting IC yield, but they come mainly from two sources. The first is the influence of the process line, material characteristics and the environment. If the process line is unstable during manufacturing, the manufactured result deviates from the design and the yield drops.
At the same time, different materials require different processing techniques and present different processing difficulties, so material characteristics are also an important factor affecting yield; environmental factors such as temperature and humidity likewise affect product quality and reduce yield. On the process side, the most prominent issue is the influence of defects on yield: defects are departures from the ideal IC structure created by an unstable process line, such as deformed metal strips, dust particles and redundant material. These problems are addressed mainly by improving and adjusting the production line and implementing process control.

The second source is the design itself. Unreasonable parameter design leads to defects in IC performance and hence to low yield, and unreasonable structural design likewise causes yield problems. Such problems are addressed mainly by improving the parameter and structural design and by adding redundant structures. Apart from the adjustment and control of the process line, which must be handled in the manufacturing stage, the other yield problems can be solved or improved in the IC design stage. Because yield can be fully considered at design time, effectively avoiding the risks that yield problems bring, EDA technology for high-yield design is receiving more and more attention. At present many effective yield-oriented design methods have been proposed on both the process and the design side. These methods mainly aim to solve the following three problems.

1. Reducing the error between design and manufacture. This mainly refers to the errors caused by factors such as process, materials and environment.
By improving the process route, materials and environment, and by raising model accuracy (establishing component simulation models that account for these factors), the design parameters are made basically consistent with the processing parameters. For example, in ultra-deep-submicron processes, statistical techniques are used: through statistical analysis of test data and Monte Carlo simulation, a statistical model is established from the statistical distribution of parameter deviations and failure points (defects); on this basis, sensitivity analysis, yield analysis and optimization are carried out to effectively improve yield. Another example is OPC (optical proximity correction) technology, which corrects the irregular geometry produced in lithography that is inconsistent with the original design, thereby reducing the error relative to the original design. A further example concerns ultra-deep-submicron interconnect: as frequency increases and feature size decreases, various high-frequency effects of interconnect lines arise, leading to complex problems such as signal integrity and design-parameter deviation. Establishing an effective interconnect model and realizing fast simulation of interconnect networks is also an important problem for high-yield design.

2. Yield estimation. That is, before fabrication, EDA tools predict the yield according to the specifics of the process and the design. If the predicted yield cannot reach the target, further measures such as improving the design and adjusting the flow are taken to raise the yield and reduce investment risk. For example, in VLSI design, to avoid the influence of process defects on yield, the statistical distribution of defects is analyzed to obtain a yield estimate.

3. Yield optimization.
When the yield is low, tools are used to optimize it (mainly by optimizing the design). For example, the design-centering method adjusts the design parameter values toward the center of the acceptable region of parameter values, so that random process disturbances have less influence on circuit performance, thereby improving yield.

III. Common yield-design algorithms

At present, yield analysis and optimization methods fall roughly into two categories. One is the numerical approach, which estimates and optimizes yield from the characteristics of the circuit equations; it is fast and its estimates are accurate, but it is not flexible enough to apply to complex circuits. The other is the statistical approach, mainly the Monte Carlo method; it is simple and flexible and can be used for yield analysis and optimization of complex circuits, but its accuracy depends on the accuracy of the simulation model and the number of simulation runs, and its efficiency is likewise tied to model complexity and run count.

1. Numerical methods. Much research on yield analysis and optimization based on numerical algorithms (also called geometric algorithms in some foreign literature) was carried out as early as the 1960s and 1970s, at that time mainly for yield and tolerance analysis of circuits. With the appearance of integrated circuits, most of these algorithms were also applied to analyzing and optimizing IC yield. Numerical methods are efficient and computationally accurate and play an important role in design.
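The design-centering idea described above can be illustrated with a one-parameter toy example (the acceptable interval [9, 11] and the spread σ = 0.6 are invented for illustration, not taken from the text): the manufactured parameter scatters normally around the nominal value, and yield is the probability of landing inside the acceptable region.

```python
from math import erf, sqrt

def yield_estimate(nominal, sigma, lo, hi):
    """Yield = P(lo <= X <= hi) for X ~ Normal(nominal, sigma)."""
    cdf = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
    return cdf((hi - nominal) / sigma) - cdf((lo - nominal) / sigma)

# Hypothetical acceptable region for the parameter: [9.0, 11.0]
lo, hi, sigma = 9.0, 11.0, 0.6

off_center = yield_estimate(9.4, sigma, lo, hi)            # nominal near the edge
centered   = yield_estimate((lo + hi) / 2, sigma, lo, hi)  # design-centered nominal

print(f"off-center yield: {off_center:.3f}")
print(f"centered yield:   {centered:.3f}")
```

Moving the nominal value to the center of the acceptable interval raises the estimated yield, which is exactly the design-centering effect: the same random disturbance now has the most room on both sides before it causes a failure.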
The basic principle of numerical yield analysis is as follows: from the circuit's performance specifications and the circuit equations, compute the region of design-parameter space in which the circuit is acceptable (i.e. meets the product specifications), hereafter the acceptable region; then compare the acceptable region with the distribution region of design-parameter errors arising in manufacturing (hereafter the parameter-distribution region) to obtain the estimated yield under the current design parameters. If the yield is too low, the parameter-distribution region can be shifted by adjusting the design-parameter values. Although the principle is simple, actual circuit design raises many problems. First, there are dozens or even hundreds of circuit parameters, and the acceptable region and parameter-distribution region to be solved are hyperellipsoids; as the number of parameters grows, the workload of circuit analysis grows exponentially, which makes the final yield analysis very difficult. Second, the circuit equations are complex: as IC performance targets rise and new materials and devices are adopted, more and more factors such as coupling, dispersion and the skin effect must be considered; solving the circuit equations becomes much harder, and the yield problem may end up with no solution. Degenerate formulas and simplified methods can be used instead, but they greatly reduce the accuracy of the yield analysis and optimization results — accuracy here meaning the consistency of the results with reality rather than strict numerical precision. Third, the shape of the response function: at present Newton's method, the least-squares method and their improved variants are mainly used for yield optimization; when the response function is convex they converge quickly to an optimum, but they are not suitable when the response function is concave. Commonly used algorithms for yield analysis and optimization include the linear-cut method, the simplicial approximation method, simulated annealing, the Latin (hypercube) method and the ellipsoid technique.

In recent years, with the rapid development of IC technology, purely numerical methods for yield analysis and optimization have become inadequate, especially for solving higher-order differential equations and analyzing physical effects, and their application has been limited. With the development of computer technology and the wide application of modeling and simulation, yield analysis and optimization tools based on statistical techniques have gradually become the mainstream yield tools in EDA.

2. Statistical methods (statistical design). The core of statistics-based yield analysis and optimization algorithms (called statistical design methods in some literature) is the Monte Carlo method. The Monte Carlo method, also known as computer random simulation, is a class of computation based on "random numbers". It originated in the United States' "Manhattan Project" to develop the atomic bomb during the Second World War; one of the project's leaders, the mathematician von Neumann, named the method after the world-famous casino city of Monte Carlo, Monaco, lending it a mysterious color. In fact, the basic idea of the Monte Carlo method had been discovered and used much earlier: in the 17th century people already knew to use the "frequency" of events to determine their "probability", and in the 19th century the needle-throwing method was used to determine π. The appearance of electronic computers in the 1940s, and especially of high-speed computers in recent years, made it possible to simulate such experiments on computers rapidly and in large numbers by mathematical means.

Problems in scientific and technical computing are far more complicated than these examples. In the pricing of financial derivatives (options, futures, swaps, etc.) and the estimation of trading risk, the dimension of the problem (the number of variables) may reach hundreds or even thousands. For such problems the difficulty grows exponentially with dimension — the so-called "curse of dimensionality" — and traditional numerical methods can hardly cope, even with the fastest computers. The Monte Carlo method handles the curse of dimensionality well because its computational complexity no longer depends on dimension, making previously incomputable problems solvable. Many earlier circuit-yield methods were based on nonlinear programming, such as the linear-cut and simplicial approximation methods, which transform the yield problem into a constrained extremum problem; although the mathematical model is simple, the computation is very complicated, and as circuit scale grows, more and more circuit parameters enter the calculation and the constraint functions become ever more complex, so these methods are no longer applicable to circuit-yield calculation. With the rapid development of computer technology, a new circuit-yield analysis method — Monte Carlo yield analysis — appeared in the 1960s and 1970s. Following the basic idea of the Monte Carlo method, it computes the yield of circuit products by computer random simulation.
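A minimal sketch of Monte Carlo yield analysis (the RC filter, the tolerances and the specification window below are hypothetical, chosen only to make the procedure concrete): sample component values from their assumed distributions, "simulate" the circuit — here just a closed-form cutoff frequency — and count the fraction of samples meeting the specification.

```python
import math
import random

random.seed(0)

R_NOM, R_TOL = 1_000.0, 0.05   # 1 kΩ, 5% 1-sigma spread (assumed normal)
C_NOM, C_TOL = 100e-9, 0.10    # 100 nF, 10% 1-sigma spread (assumed normal)
F_LO, F_HI = 1_400.0, 1_800.0  # acceptable cutoff-frequency window, Hz

def simulate_cutoff(r, c):
    """'Circuit simulation': cutoff frequency of a first-order RC low-pass."""
    return 1.0 / (2.0 * math.pi * r * c)

n_runs, n_pass = 2_000, 0      # 200-2000 runs is the typical range cited below
for _ in range(n_runs):
    r = random.gauss(R_NOM, R_NOM * R_TOL)
    c = random.gauss(C_NOM, C_NOM * C_TOL)
    if F_LO <= simulate_cutoff(r, c) <= F_HI:
        n_pass += 1

yield_est = n_pass / n_runs    # fraction of passing samples = estimated yield
print(f"estimated yield: {yield_est:.3f}")
```

The estimate's standard error shrinks only as 1/√N, but it is independent of the number of circuit parameters — the dimension-independence property that makes the method attractive for large circuits.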
For large and complex circuits, analysis results can be obtained in a short time, which greatly improves the efficiency of circuit yield analysis. Monte Carlo yield analysis is still a widely used circuit-yield analysis method today.

The yield obtained by the Monte Carlo method is only an approximate statistical estimate of the actual yield, and the quality of this estimate depends on the sampling scale: the larger the sample, the more accurate the estimate. Generally, hundreds or even thousands of trials are needed to obtain a reasonable estimate. For large circuit networks the cost of each circuit analysis is considerable, which often limits the applicability of the Monte Carlo method; moreover, simply applying Monte Carlo does not by itself produce the optimal yield, nominal parameters or tolerances. Nevertheless, the Monte Carlo method remains the most basic method in statistical circuit design and has clear advantages: although its accuracy improves only with the square root of the sampling scale, the required sample size is independent of the number of parameters to be solved; the method itself is simple and easy to program; and it is independent of the shape of the acceptable region — convex or not — which is an undoubted advantage when applying it to yield optimization. For these reasons the Monte Carlo method is still a powerful tool widely used in statistical circuit design.

The basic principle of the Monte Carlo yield algorithm is as follows. First, according to the characteristics of the circuit parameters, a parameter distribution is assumed (usually a normal distribution for each parameter), and sample points obeying the assumed distribution are generated by a computer pseudo-random number algorithm. The sample values are substituted into the circuit simulation model for simulation; by comparing the simulation results with the preset product specifications, the passing sample points are counted, and the proportion of passing points among all sample points is the estimated yield.

Although the principle of the Monte Carlo method is simple, the following key problems must be solved in practical applications.

2.1 Consistency between the assumed and actual distributions. Because the actual distribution of circuit parameters can only be obtained through extensive testing, an assumed distribution is often used in its place in practice, so the deviation between the assumed and actual distributions becomes key to the accuracy of yield estimation. Moreover, improved algorithms are often used in practice, and most of them are derived from the assumed distribution in order to reduce the number of simulation runs; if the assumed distribution differs greatly from the actual one, the final yield estimate may be wrong.

2.2 Simulation time. The number of runs in a typical Monte Carlo yield analysis is 200 to 2000. Since the accuracy of the Monte Carlo method improves with the square root of the number of runs, more runs give a more accurate yield estimate; but with more runs the time of the whole yield analysis grows greatly, and for complex circuits long simulation times can make a single yield analysis take several days, which greatly inconveniences subsequent yield-improvement work. The number of runs is thus a key factor in the performance of a yield analysis algorithm, and it is mainly addressed in two ways.
One is to design sampling strategies that select sample points according to the parameter-distribution characteristics, such as systematic sampling and importance sampling, to reduce the number of runs. The other is to reduce the time of a single simulation, mainly by constructing a fast surrogate for the original circuit simulation model, for example using artificial neural networks, fuzzy logic or statistical models.

2.3 Model accuracy. EDA tools are built on models of circuit components, and model accuracy directly affects the accuracy of simulation results. Likewise, in yield analysis, poor model accuracy makes the analysis results inaccurate or even wrong.

Because statistical design is not limited by circuit characteristics and is simple, flexible and accurate, it has become an important part of EDA technology for high-yield design. Many internationally famous large EDA tools and software packages — Agilent ADS, Cadence, Synopsys — integrate dedicated statistical toolkits or statistical design modules to meet the requirements of high-yield design. With the development of IC technology and the increase of design difficulty, this EDA technology will occupy an ever more important position in IC design.

IV. Development prospects

As competition among IC R&D and manufacturing enterprises grows ever fiercer, yield, as a key factor in corporate economic benefit, has become an important lever for IC design and manufacturing enterprises to improve the market competitiveness of their products. At present many large IC design and manufacturing enterprises maintain dedicated yield teams, and many IC design-service companies specialize in solving yield problems, such as PDF Solutions.
PDF Solutions is a supplier specializing in yield-optimization solutions for fabs and foundries; it is currently expanding into the EDA field with a tool, pDfx, that can be used in the physical-synthesis stage of the digital IC design flow to improve the design and increase yield, with an estimated annual software usage fee of 150,000 dollars. EDA tool development in this area has been booming: since 2002, a new yield tool has been released almost every year. In 2003, for example, ChipMD launched the yield-optimization tool DesignMD, which adjusts the transistor sizes of analog/mixed-signal devices according to process statistics and operating conditions, improving yield by 30% and performance by 50%; the software runs on Unix and Linux platforms, and a one-year license is priced at 50,000 dollars. In recent years, established EDA companies such as Cadence and Synopsys have also launched yield-optimization toolkits, such as Cadence's Encounter Diagnostics tool and Silvaco's SPayn. It is worth mentioning that some small EDA companies offering only statistical DFY (design for yield) products have achieved very gratifying market results, for example ZKOM's Crystal Yield and ChipMD's design tools. It can be seen that statistics-based DFY technology is highly regarded by the industry, statistical DFY-EDA has a good market prospect, and EDA tools for high-yield design have become a new growth point of the EDA software industry. In China, many related studies have been carried out — for example, Xidian University's research on defect-induced functional yield, and Zhejiang University's use of optical proximity correction (OPC) technology to improve IC yield — and have achieved good results.
However, because China's domestic EDA software industry started late, there are still few commercialized domestic tools with independent intellectual property rights for high-yield design. China's integrated circuit industry is in a stage of high-speed development, and EDA technology for high-yield design is developing further. Research on this technology and development of the corresponding EDA tool software are of great significance for raising the technical level of integrated circuits and the competitiveness of IC design and manufacturing enterprises, and will also give great impetus to the development of China's EDA software industry.

-

The effect of chip layout on wafer yield

Generally, when laying out chips on a wafer, designers try to fit the largest possible number of chips on each wafer in order to maximize chip productivity. However, chip output is affected by many other factors, notably the exposure time of the step-and-repeat exposure tool and the number of test touchdowns on the probe station, which means that this layout strategy does not necessarily achieve the highest output. WaferYield Inc. surveyed the production situation of 16 integrated-circuit manufacturers and, through research, devised a better method of laying out chips on the wafer that raises chip output and hence overall yield — by 6%.
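The conventional objective — maximizing chips per wafer — starts from the gross die-per-wafer estimate. A commonly used industry approximation (a generic rule of thumb, not WaferYield's proprietary method; the wafer and die sizes below are hypothetical) subtracts an edge-loss term from the pure area ratio:

```python
import math

def gross_die_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Common approximation: circle area over die area, minus a correction
    for the partial dice lost around the wafer circumference."""
    d, s = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

# Hypothetical example: 300 mm wafer, 100 mm^2 die
print(gross_die_per_wafer(300, 100))
```

The article's point is that two layouts with equal gross die counts from a formula like this can still differ substantially in stepper and prober time, so die count alone is the wrong thing to maximize.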

Ron Sigura, president and CEO of WaferYield, said: "We found that two different chip layout methods can place the same number of chips on a wafer, yet differ in stepper throughput by as much as 18%." He explained that, on average, 7% of the capacity of stepper or scanner equipment is spent producing chips located at the wafer edge, which account for only 1% of the total chip count and have very low yield. Their WAMA (wafer map) exposure-field/die layout system comprehensively considers yield, the throughput of the exposure and test equipment, and investment cost and return, optimizing all parameters as a whole to arrive at the optimal chip layout. "This balanced layout method may not maximize the number of chips per wafer, but it maximizes overall output and production efficiency."

The research shows that about half of the companies lay out dies manually and the other half use in-house software, in both cases aiming to maximize the number of dies per wafer. In a few cases a strategy of minimizing the total number of exposure fields is adopted, on the assumption that all exposure fields use the same number of masks. However, as Eitan Cadouri, chairman and chief technology officer of WaferYield, points out, this assumption is no longer correct today, because some reticle fields contain only CMP layers (3 to 7 masks) while others contain a complete mask set (16 to 30 masks), and the exposure time required for the CMP fields is much shorter. In addition, Cadouri notes that not all fields have the same exposure time: "In some cases we need to use blading, and the field regions crossed by the blade take longer than normal regions. Our simulations of step-and-repeat exposure time show that, even with exactly the same number of dies, different layouts can differ by 4% to 18% in the process time required."

In stepper exposure, re-evaluating the exposure of certain dies at the wafer edge was found to improve production efficiency somewhat. For example, if one exposure field of the step-and-repeat tool exposes four dies at a time, then when exposing at the wafer edge the alignment may take longer, and one or two of the dies may contribute nothing to output because only part of the reticle pattern lies on the wafer.
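The edge effect can be sketched by classifying exposure fields on a grid: a field whose four corners all lie on the wafer is fully productive, while a field that only partially overlaps the wafer still costs exposure time but yields fewer (or no) good dies. The field and wafer sizes below are hypothetical; this is an illustration, not the WAMA algorithm.

```python
import math

WAFER_R = 150.0   # 300 mm wafer, radius in mm
FIELD = 20.0      # exposure-field edge length in mm (e.g. 2x2 dice of 10 mm)

def classify_fields(radius, field):
    """Count fields fully on the wafer vs. partially overlapping its edge,
    classified by corner containment (a convex field with all four corners
    on the wafer lies entirely on it)."""
    full = partial = 0
    n = int(math.ceil(radius / field))
    for i in range(-n, n):
        for j in range(-n, n):
            xs = (i * field, (i + 1) * field)
            ys = (j * field, (j + 1) * field)
            inside = [math.hypot(x, y) <= radius for x in xs for y in ys]
            if all(inside):
                full += 1
            elif any(inside):
                partial += 1
    return full, partial

full, partial = classify_fields(WAFER_R, FIELD)
print(f"full fields: {full}, partial edge fields: {partial}")
```

Each partial field consumes a full exposure shot; shifting the grid offset changes the full/partial split even though the wafer and field sizes are unchanged, which is one of the degrees of freedom a layout optimizer can exploit.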

As for testing, the user usually completes the layout of dies on the wafer first and then generates the corresponding wafer test map. WAMA software, by contrast, can take testing constraints into account in advance when generating the wafer test layout.

Perhaps the biggest advantage of this layout strategy is that it requires no change to any production process. It supports all chip manufacturers using steppers and scanners, and can help engineers in design, manufacturing, packaging, testing and other respects.

-

Yield improvement method based on morphology and linear programming method

Grayscale dilation takes, at each point, the maximum of the signal values plus the corresponding structuring-element values; grayscale erosion is the trajectory traced by the origin of the structuring element as it "slides" along beneath the signal. They are denoted f⊕g and f⊖g respectively. Dilation (or erosion) of a grayscale image has two effects: if the structuring-element values are all positive, the output image is brighter (or darker) than the input; and dark (or bright) details in the input are reduced or removed, depending on their gray values and on how their shapes relate to the structuring element. The opening and closing operations of grayscale morphology can be used to extract features or smooth images: opening a grayscale image removes convex structures that do not match the shape of the structuring function while retaining those that do, whereas closing fills concave structures in the image that do not match the structuring function while retaining those that do.
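The opening/closing behaviour described above can be demonstrated on a 1-D signal with a flat structuring element (a self-contained sketch, not tied to any particular image library):

```python
def dilate(f, k):
    """Grayscale dilation with a flat structuring element of width k (local max)."""
    h = k // 2
    return [max(f[max(0, i - h):i + h + 1]) for i in range(len(f))]

def erode(f, k):
    """Grayscale erosion with a flat structuring element of width k (local min)."""
    h = k // 2
    return [min(f[max(0, i - h):i + h + 1]) for i in range(len(f))]

def opening(f, k):   # erosion followed by dilation: removes narrow peaks
    return dilate(erode(f, k), k)

def closing(f, k):   # dilation followed by erosion: fills narrow pits
    return erode(dilate(f, k), k)

spike = [0, 0, 0, 5, 0, 0, 0]   # narrow bright (convex) detail
pit   = [5, 5, 5, 0, 5, 5, 5]   # narrow dark (concave) detail

print(opening(spike, 3))   # the 1-sample-wide peak is removed
print(closing(pit, 3))     # the 1-sample-wide pit is filled
```

Because the spike and the pit are narrower than the structuring element, opening flattens the former and closing fills the latter; features wider than the structuring element would survive both operations.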


Chapter 5 Research on the Critical Area Method

This chapter first discusses the concept of the critical area and its significance for yield research; it then studies the existing basic models of the open-circuit and short-circuit critical areas, analyzes their shortcomings, proposes an improved applied model of the critical area, and on this basis designs an algorithm for extracting critical areas. Finally, the method of fault-sensitivity analysis is studied, and the unity of the MC method and the critical-area method in fault-sensitivity analysis is discussed.

5.1 Overview of the critical area method

The sensitivity of an integrated circuit to manufacturing defects can be measured by the critical area. The critical area is generally defined as the area of the special region such that, when a defect appears there on the IC chip, it inevitably causes circuit failure. Using the concept of the critical area, the average number of failures caused by a given type of manufacturing defect on the chip can be expressed as:

λ = Aav · D   (3.1)

where λ is the average number of failures caused by this type of manufacturing defect on the chip, Aav is the average critical area of this type of defect, and D is the average defect density. Aav can be expressed as:

Aav = ∫ from R0 to RM of A(R) h(R) dR   (3.2)

where A(R) is the critical area for a defect of size R, h(R) is the size-distribution function of this type of defect, R0 is the minimum line width of the layout, and RM is the maximum defect size. The notion of the critical area embodies an important concept: when a defect of size R appears on the chip during manufacturing, it does not necessarily cause circuit failure; whether it does depends on whether its position lies within the special region that constitutes the critical area.

(a) A defect falling in the critical area causes a fault. (b) A defect outside the critical area does not cause a fault.

Fig. 3.1 Schematic diagram of the critical area causing circuit failure

Whether a defect leads to failure is determined by the critical area, as shown in Fig. 3.1.
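Equations (3.1) and (3.2) can be evaluated numerically. In the sketch below, A(R) uses the long-wire open-circuit model developed later in this chapter, and the defect-size distribution h(R) ∝ 1/R³ is a common modelling assumption, not something given in the text; all parameter values are hypothetical.

```python
# Numerical evaluation of Aav = ∫ A(R) h(R) dR over [R0, RM]  (eq. 3.2)
# and λ = Aav · D  (eq. 3.1), for a single-wire open-circuit model.
W, L_WIRE, X, Y = 0.5, 100.0, 100.0, 100.0   # line width, wire length, chip X, Y (µm)
R0, RM = 0.5, 50.0                           # minimum line width, maximum defect size
D = 1e-4                                     # assumed average defect density per µm²

def A(R):
    """Critical area of a defect of size R (single long wire, cf. eq. 3.6)."""
    if R <= W:
        return 0.0
    if R <= W + X:
        return L_WIRE * (R - W)
    return X * Y

c = 2.0 / (1.0 / R0**2 - 1.0 / RM**2)   # normalizes h(R) = c / R^3 on [R0, RM]
h = lambda R: c / R**3

n = 100_000                              # midpoint-rule integration
dr = (RM - R0) / n
Aav = sum(A(R0 + (i + 0.5) * dr) * h(R0 + (i + 0.5) * dr) * dr for i in range(n))
lam = Aav * D
print(f"Aav = {Aav:.2f} µm², average failures per chip λ = {lam:.4f}")
```

Because h(R) falls off steeply with defect size, most of the contribution to Aav comes from defects only slightly larger than the line width, which is why shrinking the minimum feature size makes yield so sensitive to small defects.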


5.2 Research on the basic models of the critical area

There are many kinds of manufacturing defects, but the functional faults they cause in a circuit fall mainly into line open circuits, same-layer conductor short circuits, and conductor-to-conductor (interlayer) short circuits. Open-circuit faults are mainly caused by missing-material defects in conductors, short-circuit faults mainly by redundant-material defects, and interlayer shorts mainly by pinhole defects. According to the failure mechanism of each defect type, a corresponding critical-area model must be established for each.

Fig. 3.2 A metal wire of length L and width W on a Y × X chip

5.2.1 Basic model of the open-circuit critical area

Consider a simple layout pattern, as shown in Fig. 3.2: a deposited metal line of length L and width W (L > W).


Figure 3.3 Critical area for an open circuit in a long metal wire


Consider a metal wire of length Y (Y = L) on an insulating substrate of width X, and the influence of missing-material defects on metal-line open circuits. Two conditions must be met for a missing-material defect to break the metal wire: first, the diameter of the defect circle must be greater than or equal to the line width; second, the center of the defect circle must fall in the shaded region shown in Fig. 3.3. When these two conditions are met, the center of the defect lies in a region of length L and width Rc, and the metal wire is completely broken. In this case Rc and Ac(R) can be expressed as:

Rc = R − W
Ac(R) = Rc · L = (R − W) · L   (3.3)

The ratio of the fault-region width Rc to the chip width X is defined as the fault kernel (equivalent to a normalized fault rate), denoted K(R − W). In these terms, Ac(R) can be expressed as:

Ac(R) = Achip · K(R − W)   (3.4)

Figure 3.4 Fault kernel of the long-metal-wire open circuit

where Achip denotes the chip area. The fault kernel of the long metal wire is shown in Figure 3.4 and can be expressed as:

K(R − W) = 0,            for 0 ≤ R ≤ W
K(R − W) = (R − W) / X,  for W ≤ R ≤ W + X
K(R − W) = 1,            for R ≥ W + X   (3.5)


Figure 3.5 Characteristic curve of the fault kernel of the long-metal-wire open circuit

According to the fault kernel, when R < W a missing-material defect cannot open the metal wire, i.e. the fault rate is 0; when R ≥ W + X, the defect size exceeds the chip width and the circuit fault rate reaches its maximum. Substituting the fault kernel into equation (3.4) gives the critical area:

Ac(R) = 0,           for 0 ≤ R ≤ W
Ac(R) = L (R − W),   for W ≤ R ≤ W + X
Ac(R) = X · Y,       for R ≥ W + X   (3.6)

For open circuits on multiple metal wires, as shown in Figure 3.6, when the defect size is smaller than (2W + S), where S is the spacing between the two adjacent wires, the critical area equals the sum of the critical areas of the two wires; but when the defect size is greater than (2W + S), the fault regions overlap, as shown in Figure 3.8. Denoting the width of the overlap region by xov = R − (2W + S), the width of the combined fault region is:

Rc = 2 (R − W) − xov   (3.7)
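A sketch of the single-wire expressions (3.4)–(3.6) and the two-wire overlap correction (3.7), taking Y = L as in the text (all parameter values are hypothetical, chosen only to exercise the formulas):

```python
W, X, S = 0.5, 100.0, 1.0   # line width, chip width, wire spacing (hypothetical)
L = Y = 100.0               # wire length = chip length (Y = L)

def K(R):
    """Fault kernel, eq. (3.5)."""
    if R <= W:
        return 0.0
    if R <= W + X:
        return (R - W) / X
    return 1.0

def Ac_single(R):
    """Single-wire critical area via eq. (3.4), with Achip = X * Y."""
    return (X * Y) * K(R)

def Rc_two_wires(R):
    """Combined fault-region width for two adjacent wires, eqs. (3.3)/(3.7)."""
    if R <= W:
        return 0.0
    if R <= 2 * W + S:
        return 2 * (R - W)        # two disjoint fault bands, widths summed
    xov = R - (2 * W + S)         # overlap width once the bands merge
    return 2 * (R - W) - xov

print(Ac_single(W), Ac_single(W + X))   # 0 at R = W, full chip area at R = W + X
print(Rc_two_wires(2 * W + S), Rc_two_wires(2 * W + S + 0.5))
```

With Y = L the middle branch of (3.4)–(3.5) reproduces L(R − W) from (3.6) exactly, and Rc_two_wires is continuous at R = 2W + S, where the two fault bands first touch.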

Figure 3.6 Wiring unit diagram of two wires