
Demystifying data and AI for manufacturing: case studies from a major computer maker

Published online by Cambridge University Press:  08 March 2021

Yi-Chun Chen
Affiliation:
AI Center, Inventec Corp., Taipei, Taiwan
Bo-Huei He
Affiliation:
Skywatch Innovation Inc., Taipei, Taiwan
Shih-Sung Lin
Affiliation:
Skywatch Innovation Inc., Taipei, Taiwan
Jonathan Hans Soeseno
Affiliation:
AI Center, Inventec Corp., Taipei, Taiwan
Daniel Stanley Tan
Affiliation:
AI Center, Inventec Corp., Taipei, Taiwan
Trista Pei-Chun Chen
Affiliation:
AI Center, Inventec Corp., Taipei, Taiwan
Wei-Chao Chen*
Affiliation:
AI Center, Inventec Corp., Taipei, Taiwan; Skywatch Innovation Inc., Taipei, Taiwan
*
Corresponding author: W.-C. Chen Email: chen.wei-chao@inventec.com

Abstract

In this article, we discuss the background and technical details of several smart manufacturing projects at a tier-one electronics manufacturing facility. First, we devise a process for logistics forecasting and inventory preparation for electronic parts that uses historical data and a recurrent neural network, achieving a significant improvement over current methods. Second, we present a system that automatically qualifies laptop software for mass production through computer vision and automation technology; the result is a reliable system that can save hundreds of man-years in the qualification process. Finally, we create a deep learning-based algorithm for visual inspection of product appearance, which requires significantly less defect training data than traditional approaches. For production needs, we design an automatic optical inspection machine suited to our algorithm and process. We also discuss the issues of collecting data and of enabling smart manufacturing projects in a factory setting, where projects operate on a delicate balance between process innovation and cost-saving measures.
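The abstract's forecasting approach feeds two signals into a recurrent network at each time step: the historical sales figure and the customer's own forecast for that step. A minimal sketch of this input pairing, using a plain Elman recurrence with illustrative sizes and randomly initialized weights (none of these choices come from the paper; the authors' actual architecture and training procedure are not specified here):

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 8
INPUT = 2  # per time step: [historical sales, customer forecast]

# Randomly initialized weights stand in for a trained model.
W_in = rng.normal(scale=0.1, size=(HIDDEN, INPUT))
W_h = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
w_out = rng.normal(scale=0.1, size=HIDDEN)

def forecast(sales, customer_forecast):
    """Run the recurrence over the paired series; emit one demand estimate."""
    h = np.zeros(HIDDEN)
    for s, f in zip(sales, customer_forecast):
        x = np.array([s, f], dtype=float)
        h = np.tanh(W_in @ x + W_h @ h)  # simple Elman update
    return float(w_out @ h)  # linear readout of the final hidden state

# Toy 12-period history: actual sales vs. what the customer predicted.
sales = [100, 120, 90, 110, 130, 95, 105, 115, 125, 100, 98, 140]
cust = [110, 115, 100, 105, 125, 100, 100, 120, 120, 105, 100, 135]
pred = forecast(sales, cust)
print(pred)
```

The point of the sketch is the input construction: by consuming the customer forecast alongside realized sales, the model can learn the customer's systematic bias rather than treating the forecast as ground truth.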

Information

Type
Industrial Technology Advances
Creative Commons
CC BY-NC-ND
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is unaltered and is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use or in order to create a derivative work.
Copyright
Copyright © The Author(s), 2021. Published by Cambridge University Press

Fig. 1. Our roles within the company. Manufacturing data tend to be scattered across various IT systems, and part of our mission is to normalize the data while enabling AI projects with clear impact for the business units. The arrow indicates data flow.


Fig. 2. The data-driven forecasting model uses both the historical sales data and the forecasts from customers as input for each time step.


Fig. 3. Our system automatically updates the parameters of the data-driven forecasting model through a RESTful API.


Fig. 4. Error comparison between our data-driven forecast model, customer's forecast, and ARIMA.


Fig. 5. A laptop system assembly facility. Top: an assembly line for a higher production rate. Bottom: an assembly cell for lower yield but higher product variety.


Fig. 6. The overview of the laptop functional testing system.


Fig. 7. The production hardware for the functional testing system. (a) Keyboard actuator. (b) Camera (side view). (c) Camera (back view).


Fig. 8. Several typical test steps and their screenshots.


Fig. 9. Some examples of defects. (a) Defects that are clearly defined and easily recognizable. (b) Ambiguous or unexpected defects that are confusing to annotate. (c) Defect sample that illustrates the confusion in grouping defects. (d) Varying tightness of the bounding box.


Fig. 10. Overview of the Faster-RCNN pipeline.


Fig. 11. Overview of the Auto-Encoder pipeline for defect detection.
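The auto-encoder pipeline flags defects as regions the model fails to reconstruct: trained only on defect-free surfaces, it reproduces normal texture well, so a defect surfaces as a large pixel-wise reconstruction error. A minimal sketch of that detection logic, in which a mean filter stands in for the trained decoder purely to keep the example self-contained (the real system would use a learned convolutional auto-encoder, and the threshold value here is arbitrary):

```python
import numpy as np

def mean_filter(img, k=5):
    """Stand-in for the auto-encoder: smooths away small anomalies."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def defect_mask(img, threshold=0.2):
    recon = mean_filter(img)      # reconstruction of the "normal" surface
    error = np.abs(img - recon)   # pixel-wise reconstruction error
    return error > threshold      # binary defect segmentation

# Flat "normal" surface with one bright, scratch-like defect.
surface = np.full((32, 32), 0.5)
surface[10, 5:15] = 1.0
mask = defect_mask(surface)
print(mask.sum())
```

Because the reconstruction error already segments the defect pixel by pixel, this kind of pipeline yields segmentation masks, which the article converts to bounding boxes only to compare against Faster-RCNN.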


Fig. 12. Snapshot of our machine for automatic inspection of laptops.


Fig. 13. Visual results of Faster-RCNN compared to the Auto-Encoder for defect detection on laptop surfaces. We converted the segmentation results from the auto-encoder into bounding boxes for easier comparison. Red boxes show the ground truths; green boxes show the predictions.


Fig. 14. Visual results of Faster-RCNN compared to the Auto-Encoder for defect detection on the DAGM (top) and MVTec (bottom) datasets, given only five defective image samples per product for training. We converted the segmentation results from the auto-encoder into bounding boxes for easier comparison. Red boxes show the ground truths; green boxes show the predictions.


Table 1. Results on the DAGM dataset in terms of average precision (AP).


Table 2. Results on the MVTec dataset in terms of average precision (AP).


Table 3. Results on the proprietary laptop dataset compared to the Faster-RCNN baseline.