Application of Deep Learning on Hyperspectral Data for Classification of Shoe Upper Materials

Abstract
Growing environmental awareness is driving increasing demand for second-hand clothing and footwear in Europe. However, high labor costs in EU countries make manual sorting of clothing economically unviable, and effective classification of garment and footwear materials with traditional solutions, such as mono/RGB cameras, is not possible. In this article, we describe the development and deployment of an automated system for classifying shoe upper materials on a sorting line, which uses artificial neural networks to analyze images from hyperspectral cameras in the NIR-SWIR range (900–1700 nm).
Keywords
HSI, HSI-CNN, hyperspectral imaging, image analysis, neural networks
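
The abstract names the ingredients of the system (a convolutional network fed with NIR-SWIR hyperspectral images) but not its architecture. As a minimal sketch of that kind of classifier, assuming a 224-band hyperspectral cube and a hypothetical four-class material list (neither value comes from the paper), a PyTorch model could look like this:

```python
# Minimal sketch, not the authors' implementation: a CNN that classifies
# the shoe-upper material of an NIR-SWIR hyperspectral patch. The band
# count (224) and the class list are illustrative assumptions.
import torch
import torch.nn as nn

NUM_BANDS = 224    # assumed number of spectral channels across 900-1700 nm
NUM_CLASSES = 4    # e.g. leather, textile, synthetic, rubber (hypothetical)

class ShoeUpperCNN(nn.Module):
    def __init__(self, bands: int = NUM_BANDS, classes: int = NUM_CLASSES):
        super().__init__()
        # The spectral bands enter as input channels of an ordinary 2-D CNN.
        self.features = nn.Sequential(
            nn.Conv2d(bands, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),   # pool to a single spatial position
        )
        self.classifier = nn.Linear(128, classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, bands, height, width) hyperspectral patch
        return self.classifier(self.features(x).flatten(1))

if __name__ == "__main__":
    model = ShoeUpperCNN()
    patch = torch.randn(1, NUM_BANDS, 32, 32)  # one synthetic 32x32 patch
    print(model(patch).shape)                  # -> torch.Size([1, 4])
```

Treating the spectral bands as input channels of a 2-D CNN is the simplest formulation; 3-D convolutions over the spectral axis are a common alternative for hyperspectral data when inter-band spatial-spectral structure matters.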