Using Machine Learning To Increase Yield And Lower Packaging Costs – SemiEngineering

Posted: April 17, 2023 at 12:13 am



Packaging is becoming increasingly challenging and costly. Whether the reason is substrate shortages or the increased complexity of the packages themselves, outsourced semiconductor assembly and test (OSAT) houses have to spend more money, time, and resources on assembly and test. As such, one of the more important challenges facing OSATs today is managing die that pass testing at the fab level but fail during the final package test.

But first, let's take a step back in the process and talk about the front end. A semiconductor fab will produce hundreds of wafers per week, and these wafers are verified by product testing programs. The ones that pass are sent to an OSAT for packaging and final testing. Any units that fail at the final testing stage are discarded, and the money and time the OSAT spent dicing, packaging, and testing the failed units is wasted (Fig. 1).

Fig. 1: The process from fab to OSAT.

According to one estimate, based on the price of a 5nm wafer for a high-end smartphone, the cost of package assembly and testing is close to 30% of the total chip cost (Table 1). Given such a high percentage, it is considerably more cost-effective for an OSAT to receive only wafers that are predicted to pass the final package test. This means fewer rejects at the final package testing step, lower costs, and more product shipped. Machine learning could offer manufacturers a way to accomplish this.

Table 1: Estimated breakdown of the cost of a chip for a high-end smartphone.
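To make the stakes concrete, here is a back-of-the-envelope calculation of the spend wasted on units that fail final package test. All numbers are illustrative assumptions, not figures from the estimate above:

```python
# Rough economics of packaging and testing a die that fails final test.
# Every number here is an assumed, illustrative value.
total_chip_cost = 100.0        # normalized cost of one packaged, tested chip
assembly_test_share = 0.30     # ~30% of total cost, per the estimate above

assembly_test_cost = total_chip_cost * assembly_test_share  # 30.0

# If packaged units fail at final test, that assembly/test spend is wasted.
final_test_fallout = 0.05      # assume 5% of packaged units fail final test
wasted_per_1000_units = 1000 * final_test_fallout * assembly_test_cost

print(f"Assembly + test cost per unit: {assembly_test_cost:.1f}")
print(f"Wasted spend per 1,000 packaged units: {wasted_per_1000_units:.0f}")
```

At an assumed 5% fallout, every thousand packaged units burns the equivalent of 15 fully finished chips in assembly and test spend alone, which is why screening wafers before they ship matters.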

Using traditional methods, an engineer obtains inline metrology/wafer electrical test results for known-good wafers that pass the final package test. The engineer then conducts a correlation analysis using a yield management software statistics package to determine which parameters and factors have the highest correlation to final test yield. Using these parameters, the engineer performs a regression fit, and a linear/non-linear model is generated. The resulting model is then validated with new data. However, this is not a hands-off process; the model requires periodic manual review.
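A minimal sketch of this traditional workflow, assuming the inline metrology parameters and final test yields are already joined per wafer in a CSV file (the file and column names are hypothetical):

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# One row per known-good wafer: inline metrology / wafer electrical test
# parameters plus the final package test yield. Column names are assumed.
df = pd.read_csv("wafer_history.csv")
params = [c for c in df.columns if c != "final_test_yield"]

# Rank parameters by absolute Pearson correlation to final test yield.
corr = df[params].corrwith(df["final_test_yield"]).abs()
top_params = corr.sort_values(ascending=False).head(5).index.tolist()
print("Highest-correlation parameters:", top_params)

# Fit a linear model on the top parameters, then validate it on held-out
# wafers, mimicking the periodic manual review of the model with new data.
X_train, X_val, y_train, y_val = train_test_split(
    df[top_params], df["final_test_yield"], test_size=0.2, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("Validation R^2:", model.score(X_val, y_val))
```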

Machine learning takes a different approach. In contrast to the previously mentioned method, which places greater emphasis on finding the model that best explains the final package test data, an approach utilizing machine learning capabilities emphasizes a model's predictive ability. Given the limited capacity of OSATs, a machine learning model trained with metrology and product testing data at the fab level and final package test data at the OSAT level can produce representative predictions of final package test results.
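As a sketch of that distinction, the snippet below trains a classifier on historical fab data labeled with OSAT final test outcomes and judges it strictly on held-out wafers, i.e., on predictive ability rather than goodness of fit. The file, column, and label names are assumptions:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# One row per wafer: fab metrology and wafer-test features, plus a label
# derived from OSAT data (1 = wafer met the final package test criteria).
df = pd.read_csv("fab_and_osat_history.csv")
X = df.drop(columns=["wafer_id", "passed_final_test"])
y = df["passed_final_test"]

# Hold out wafers the model never saw, to measure prediction, not fit.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```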

With the deployment of a machine learning model predicting the final test yield of wafers at the OSAT, bad wafers will be automatically tagged at the fab in a manufacturing execution system (MES) and given an assigned wafer grade of last-to-ship (LTS). Fab real-time dispatching will move wafers with the assigned wafer grade to an LTS wafer bank, while wafers that meet the passing criteria of the machine learning model will be shipped to the OSAT, thus ensuring only good parts are sent to the packaging house for dicing and packaging. Moreover, additional production data would be used to validate the machine learning model's predictions, with the end result being increased confidence in the model. A blind test can even examine specific critical parts of a wafer.
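The dispatch decision itself can be simple once the model is in place. Below is a sketch of the tagging logic; the MES method names and the 0.90 threshold are illustrative assumptions, not a real MES API:

```python
# Hypothetical dispatch hook: grade each wafer from the model's predicted
# probability of passing final package test. The threshold and the MES
# method names are illustrative assumptions.
PASS_THRESHOLD = 0.90

def dispatch_wafer(mes, model, wafer_id, features):
    """Tag a wafer in the MES based on the model's pass probability."""
    p_pass = model.predict_proba([features])[0][1]
    if p_pass >= PASS_THRESHOLD:
        mes.set_wafer_grade(wafer_id, "SHIP_TO_OSAT")
    else:
        # Tag as last-to-ship and route it to the LTS wafer bank.
        mes.set_wafer_grade(wafer_id, "LTS")
        mes.move_to_bank(wafer_id, "LTS_WAFER_BANK")
    return p_pass
```

Raising or lowering the threshold trades packaging waste against the number of good wafers parked in the LTS bank, so in practice it would be tuned against the validation data mentioned above.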

The machine learning approach also offers several advantages over more traditional approaches. The model is inherently tolerant of out-of-control conditions, trends and patterns are easily identified, the results improve with more data, and, perhaps most significantly, no human intervention is needed.

Unfortunately, there are downsides. A large volume of data is needed for a machine learning model to make accurate predictions, and while more data is always welcome, that requirement makes this approach a poor fit for new products or R&D scenarios where little history exists. In addition, this machine learning approach requires significant allocations of time and resources, meaning more compute power and more time to process complete datasets.

Furthermore, questions will need to be asked about the quality of the algorithm being used. Perhaps it is not the right model and, as a result, will not deliver correct results. Or perhaps the reasoning behind the algorithm's predictions is difficult to understand. Simply put: how does the algorithm decide which wafers are, in fact, good and which will be marked last-to-ship? And then there is the matter that incorrect or incomplete data will deliver poor results. Or, as the saying goes, garbage in, garbage out.
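One partial answer to the interpretability question is to inspect which fab parameters the model actually relies on. Continuing the classifier sketch above (`model` and `X` are assumed to come from it), scikit-learn exposes per-feature importances:

```python
import pandas as pd

# Rank the fab parameters the trained gradient-boosting model leans on
# most when grading wafers. `model` and `X` come from the earlier sketch.
importances = pd.Series(model.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False).head(10))
```

This does not fully explain individual wafer decisions, but it gives engineers a starting point for sanity-checking the model against known process physics.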

The early detection and prediction of only good products shipping to OSATs has become increasingly critical, in part because the testing of semiconductor parts is the most expensive part of the manufacturing flow. By combining a yield/operations management platform with machine learning so that only predicted-good parts are packaged and tested, OSAT houses can increase capital utilization and return on investment, ensuring cost-effectiveness and a continuous supply of finished goods to end customers. While this is one example of the effectiveness of machine learning models, there is much more to learn about how such approaches can increase yield and lower costs for OSATs.
