22. 01. 2025

SOMA algorithm enhances the performance of neural networks for image segmentation

Evolutionary optimization algorithms, particularly the self-organizing migrating algorithm (SOMA), have proven highly effective at enhancing the performance of neural networks for skin segmentation tasks. This is the finding of a paper published in Scientific Reports, authored by Ivan Zelinka from the Faculty of Electrical Engineering and Computer Science at VSB-TUO, along with collaborators from Vietnam. The research, funded by the REFRESH project, indicates that the SOMA method outperforms traditional optimization techniques and other evolutionary algorithms, showcasing its potential application in various domains of computer vision.

The authors of the study conducted a comparative analysis of SOMA against well-known gradient-based optimization methods such as ADAM and SGDM, as well as another evolutionary algorithm, differential evolution (DE). The experiments utilized a dataset comprising 245,057 samples. For each optimization method, the researchers assessed both performance metrics and perceptual quality—the subjective evaluation of image quality by human observers. The findings reveal that the neural network trained by SOMA achieves the highest accuracy at 93.18%, surpassing ADAM (84.87%), SGDM (84.79%), and DE (91.32%).
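To illustrate the kind of technique being compared, the sketch below applies SOMA's classic "AllToOne" migration strategy to train a toy one-layer classifier. This is a minimal illustration only: the synthetic dataset, the logistic model, and all control parameters (population size, `path_length`, `step`, `prt`) are placeholder assumptions, not the setup or hyperparameters used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, linearly separable binary data (a stand-in for the
# skin-segmentation samples used in the study).
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

def predict(w, X):
    """One-layer logistic model; w holds 3 weights plus a bias."""
    z = X @ w[:3] + w[3]
    return 1.0 / (1.0 + np.exp(-z))

def loss(w):
    """Binary cross-entropy over the whole toy dataset."""
    p = predict(w, X)
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def soma_all_to_one(pop_size=20, dim=4, migrations=30,
                    path_length=3.0, step=0.11, prt=0.3):
    """SOMA AllToOne: every individual migrates toward the current leader
    in discrete steps, perturbed by a random PRT mask, keeping the best
    position found along its path. No gradients are used."""
    pop = rng.normal(size=(pop_size, dim))
    costs = np.array([loss(ind) for ind in pop])
    for _ in range(migrations):
        leader = pop[np.argmin(costs)].copy()
        for i in range(pop_size):
            if np.allclose(pop[i], leader):
                continue  # the leader does not migrate
            best_pos, best_cost = pop[i].copy(), costs[i]
            for t in np.arange(step, path_length + step, step):
                # PRT vector: randomly switch off some dimensions this step.
                prt_vec = (rng.random(dim) < prt).astype(float)
                cand = pop[i] + (leader - pop[i]) * t * prt_vec
                c = loss(cand)
                if c < best_cost:
                    best_pos, best_cost = cand, c
            pop[i], costs[i] = best_pos, best_cost
    return pop[np.argmin(costs)]

w = soma_all_to_one()
acc = np.mean((predict(w, X) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

Unlike ADAM or SGDM, no derivative of the loss is ever computed: the population explores the weight space purely by evaluating candidate solutions, which is what allows such methods to double as tools for designing network structure as well as weights.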

“I highlight two key contributions of this research. First, we have demonstrated the potential of integrating evolutionary optimization algorithms, such as SOMA, into the training process of artificial neural networks. While there are other learning algorithms available, these techniques, rooted in differential evolution and SOMA, not only facilitate learning but also aid in designing the network’s structure and various parameters. Consequently, we can effectively simulate the evolution of networks on a computer to achieve the desired outcomes. This integration significantly enhances performance in tasks like image processing,” stated Professor Zelinka, who is also involved with the Industry 4.0 & Automotive Lab of the REFRESH project.

“Second, our proposed method demonstrates novelty and efficiency compared to traditional gradient-based optimization techniques and other evolutionary algorithms. Again, it is not about the victory of one algorithm or the other, but about their innovative use,” he added.

The authors believe that this method holds significant potential for use in various fields of machine learning and classification. “Incorporating emerging technologies like quantum computers or their hardware simulators could enhance and broaden this approach, thereby advancing developments in these areas,” stated Zelinka.

Artificial neural networks (ANNs) and related systems, such as GPT, have become a key component in a variety of contemporary technologies across fields like image recognition and text processing. They also play a significant role in medicine, aiding in the diagnosis of specific diseases and the management of healthcare. These networks draw inspiration from the structure of the human brain, comprising interconnected ‘nodes’ or neurons that collaborate and learn from data. Nonetheless, training these networks remains a considerable challenge due to the complexity of optimizing numerous parameters.