Abstract
Gas tungsten arc welding (GTAW) is the primary joining process for critical applications where joining precision is crucial. However, variations in manufacturing conditions adversely affect this precision. The dynamic joining process must be monitored and adaptively controlled to ensure that the specified weld quality is produced despite such variations. Among the required weld qualities, weld joint penetration is often the most critical: incomplete penetration can cause explosions under high temperature/pressure, while excessive penetration/heat input affects the flow of fluids and degrades material properties. Unfortunately, detecting its development, i.e., how the melted metal has developed within the workpiece, is challenging because it occurs underneath the surface and is not directly observable. The key to solving the problem is to find, or design, measurable physical phenomena that are fully determined by the weld penetration and then correlate those phenomena to the penetration. Analysis shows that the weld pool surface, which is directly observable using an innovative active vision method developed at the University of Kentucky, is correlated to the thermal expansion of the melted metal and thus to the weld penetration. However, the surface is also affected by prior conditions. We therefore propose to form a composite image from an image of the initial pool, which reflects the prior conditions, and an image of the real-time developing pool, such that this single composite image is determined only by the development of the weld penetration. To further correlate the measurable phenomena to the weld penetration, conventional methods analyze the data/images and propose features intended to fundamentally characterize the phenomena. Such hand engineering is tedious and does not guarantee success.
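The composite-image idea can be sketched as follows. This is a minimal illustration, not the paper's exact construction: it assumes the initial-pool and real-time pool images are grayscale arrays of equal size and simply stacks them as channels, so a single input carries both the prior condition and the current pool state; the function and variable names are assumptions.

```python
import numpy as np

def make_composite(initial_pool: np.ndarray, current_pool: np.ndarray) -> np.ndarray:
    """Stack the initial-pool image (prior conditions) and the real-time
    developing-pool image into one two-channel composite, so a single
    input reflects the penetration development alone."""
    assert initial_pool.shape == current_pool.shape
    return np.stack([initial_pool, current_pool], axis=-1)

# Toy 4x4 grayscale frames standing in for real pool-surface images.
initial = np.zeros((4, 4), dtype=np.float32)
current = np.ones((4, 4), dtype=np.float32)
composite = make_composite(initial, current)
print(composite.shape)  # (4, 4, 2)
```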
To address this challenge, a convolutional neural network (CNN) is adopted so that the raw composite images can be used directly as input, without hand engineering to manually extract features. The CNN model is trained, validated, and tested on the datasets, and the resulting model is used to identify the penetration state, so that the welding current can be reduced from the peak to the base level once the desired penetration state is achieved, despite manufacturing condition variations. The results show that the accuracy of the CNN model is approximately 97.5%.
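The forward pass of such an image-to-penetration-state classifier can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's network: random weights stand in for trained ones, the image is reduced to a single channel for brevity, and the layer sizes and three-state labeling are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Valid 2-D convolution, single channel in and out."""
    h, w = x.shape
    kh, kw = k.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relu(x: np.ndarray) -> np.ndarray:
    return np.maximum(x, 0.0)

def softmax(z: np.ndarray) -> np.ndarray:
    e = np.exp(z - z.max())
    return e / e.sum()

image = rng.random((8, 8))             # toy stand-in for a composite image
kernel = rng.random((3, 3)) - 0.5      # hypothetical learned filter
features = relu(conv2d(image, kernel)) # 6x6 feature map
pooled = features[::2, ::2]            # stride-2 subsampling -> 3x3
flat = pooled.reshape(-1)              # 9 features
W = rng.random((3, flat.size)) - 0.5   # fully connected layer, 3 states
probs = softmax(W @ flat)              # class probabilities
state = int(np.argmax(probs))          # predicted penetration state
print(probs.shape, state)
```

In the paper's setup, the predicted state would drive the controller's decision to switch the welding current from the peak to the base level once the desired penetration is reached.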
Original language | English |
---|---|
Pages (from-to) | 908-915 |
Number of pages | 8 |
Journal | Journal of Manufacturing Processes |
Volume | 56 |
DOIs | |
State | Published - Aug 2020 |
Bibliographical note
Publisher Copyright: © 2020 The Society of Manufacturing Engineers
Funding
This work was supported by the International Research Cooperation Seed Fund of Beijing University of Technology; the National Natural Science Foundation of China [grant numbers 51775007, 51975014]; and the Beijing Municipal Natural Science Foundation [grant number 3192004].
Funders | Funder number |
---|---|
National Natural Science Foundation of China (NSFC) | 51775007, 51975014 |
Beijing University of Technology | |
Beijing Municipal Natural Science Foundation | 3192004 |
Keywords
- Active vision
- CNN
- Composite image design
- GTAW-P
- Penetration mode
ASJC Scopus subject areas
- Strategy and Management
- Management Science and Operations Research
- Industrial and Manufacturing Engineering