Welding is a fundamental process in manufacturing and construction, playing a pivotal role in joining materials by melting and fusing them. This versatile technique finds widespread application across diverse industries, producing robust and structurally sound components. Regular inspection and continual oversight of the welding process are therefore imperative, yet manual inspection methods prove inadequate for the ever-growing demands of safety and precision. In recent years, the integration of deep learning and computer vision technologies has transformed the welding field by enhancing and automating inspection and safety protocols.
Moreover, sensor-based structural health monitoring (SHM) methods pose reliability issues, while vision-based methods struggle to monitor weld defects in intricate structures. Automated detection of weld defects from weld images remains an evolving research area. It involves several stages, including assessment of weld-image quality, segmentation of the weld region, and identification of defects. Deep learning models have shown considerable potential to automate weld image segmentation, enabling accurate delineation of weld areas, which is crucial for determining the severity of defects.
This paper proposes DeepFuse WeldNet, an automated weld defect detection framework that employs a novel Gated Attention Squeeze-and-Excitation Fusion U-Net (GASEUNet) for weld defect segmentation. Our research focuses on enhancing model accuracy for more precise identification and measurement of weld porosity in images. The model delivers a significant boost in segmentation accuracy over existing methods: experimental results demonstrate the effectiveness of GASEUNet, with an impressive Jaccard Coefficient (JC) of 98.12%. DeepFuse WeldNet also includes a fully automated weld inspection stage that classifies weld defects as safe or unsafe by comparing their measurements against the standards of the American Welding Society (AWS).
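The Jaccard Coefficient used to evaluate the segmentation is the intersection-over-union of the predicted and ground-truth defect masks. As a minimal illustrative sketch (not the paper's evaluation code), assuming binary NumPy masks of the same shape:

```python
import numpy as np

def jaccard_coefficient(pred, target):
    """Intersection-over-union (Jaccard Coefficient) of two binary masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    # Convention: two empty masks agree perfectly.
    return intersection / union if union > 0 else 1.0

# Toy 2x3 masks: 2 pixels agree, 4 pixels are marked in either mask.
pred = np.array([[1, 1, 0], [0, 1, 0]])
target = np.array([[1, 0, 0], [0, 1, 1]])
print(jaccard_coefficient(pred, target))  # 2 / 4 = 0.5
```

A JC of 98.12% therefore means the predicted weld-porosity regions overlap the annotated regions almost completely, with very few spurious or missed pixels.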
Read the full article at Nature.com