Abstract
While industrial robots are widely deployed in manufacturing, their reliance on pre-programmed rules limits adaptability to process variability and unexpected disturbances. To robotize complex processes like gas tungsten arc welding (GTAW), human-like cognitive abilities in perceiving sensory feedback and making dynamic decisions are essential. Learning from demonstration (LfD) provides a promising framework to equip robots with human expertise. This work learns human response patterns to visual weld pool feedback for welding speed adjustment, enabling effective GTAW robotization through LfD. First, human demonstrations are collected by recording operators counteracting intentionally induced welding disturbances. A human response model is then developed, comprising three components: a convolutional neural network (CNN) for visual feature extraction, SHapley Additive exPlanations (SHAP) for selecting the most informative features, and long short-term memory (LSTM) regression for capturing the temporal mapping from visual features to human actions. This architecture correlates visual cues with control actions through the CNN–LSTM design, capturing dynamic human cognition, while SHAP-based feature selection improves model efficiency, enabling effective training with limited demonstration data. Experimental validation demonstrates that the proposed model successfully extracts human process intelligence for adaptive GTAW control.
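The three-stage pipeline described above (per-frame feature extraction, importance-based feature selection, temporal regression to the operator's action) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the flattening step stands in for the CNN, an absolute-correlation score stands in for SHAP values, and a least-squares fit over a sliding window stands in for the LSTM; all function names and the synthetic data are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1 (stand-in for the CNN): map each weld-pool frame to a
# fixed-length feature vector. A real model would use learned filters.
def extract_features(frames):
    # frames: (T, H, W) grayscale sequence -> (T, F) feature matrix
    return frames.reshape(frames.shape[0], -1)

# Stage 2 (stand-in for SHAP): rank features by absolute correlation
# with the recorded human action and keep the top k indices.
def select_top_k(features, actions, k):
    cf = features - features.mean(axis=0)
    ca = actions - actions.mean()
    denom = cf.std(axis=0) * ca.std() + 1e-12
    corr = np.abs((cf * ca[:, None]).mean(axis=0) / denom)
    return np.argsort(corr)[-k:]

# Stage 3 (stand-in for the LSTM): regress the action at time t from a
# sliding window of the selected features, via ordinary least squares.
def fit_temporal_regressor(features, actions, window):
    T = features.shape[0]
    X = np.stack([features[t - window:t].ravel() for t in range(window, T)])
    y = actions[window:]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Synthetic "demonstration" data: 200 frames of 8x8 images and a
# welding-speed adjustment driven mostly by one pixel region.
frames = rng.normal(size=(200, 8, 8))
actions = 0.5 * frames[:, 3, 3] + 0.1 * rng.normal(size=200)

feats = extract_features(frames)
idx = select_top_k(feats, actions, k=4)
w = fit_temporal_regressor(feats[:, idx], actions, window=5)
print(w.shape)  # 4 selected features * 5-step window -> (20,)
```

The same structure carries over when each stand-in is replaced by its real counterpart: the CNN supplies the feature matrix, SHAP supplies the importance ranking, and the LSTM replaces the linear window regressor.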
| Field | Value |
|---|---|
| Original language | English |
| Journal | Welding in the World, Le Soudage Dans Le Monde |
| State | Accepted/In press, 2025 |
Bibliographical note
Publisher Copyright: © International Institute of Welding 2025.
Funding
This study was supported by NSF-CMMI 2024614 (PI: Yuming Zhang).
| Funders | Funder number |
|---|---|
| Division of Civil, Mechanical and Manufacturing Innovation | 2024614 |
Keywords
- CNN
- GTAW
- Learning from demonstration (LfD)
- LSTM
- SHAP
ASJC Scopus subject areas
- Mechanics of Materials
- Mechanical Engineering
- Metals and Alloys