
Showing 2 results for Khishe

M. R. Mosavi, M. Khishe, Y. Hatam Khani, M. Shabani,
Volume 13, Issue 1 (March 2017)

Radial Basis Function Neural Networks (RBF NNs) are among the most widely applied NNs for the classification of real targets. Despite the use of recursive methods and gradient descent for training RBF NNs, poor classification accuracy, entrapment in local minima, and low convergence speed remain defects of this type of network. To overcome these defects, heuristic and meta-heuristic algorithms have become conventional for training RBF networks in recent years. This study uses the Stochastic Fractal Search Algorithm (SFSA) to train RBF NNs. The particles in the new algorithm explore the search space more efficiently by using the diffusion property, which is observed regularly in arbitrary fractals. To assess the performance of the proposed classifier, the network is evaluated on two benchmark datasets and a high-dimensional practical dataset (i.e., sonar). Results indicate that the new classifier classifies the sonar dataset six percent more accurately than the best competing algorithm and converges faster than the other algorithms. It also outperforms the classic benchmark algorithms on all datasets.
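The two ingredients of this abstract can be illustrated with a minimal sketch: a Gaussian RBF forward pass, and an SFSA-style diffusion move that samples new candidate weights around the best-known solution. All names and the specific update rule here are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def rbf_output(x, centers, widths, weights):
    """RBF-NN forward pass: weighted sum of Gaussian basis functions."""
    total = 0.0
    for c, s, w in zip(centers, widths, weights):
        dist2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        total += w * math.exp(-dist2 / (2.0 * s ** 2))
    return total

def diffusion_step(point, best_point, sigma):
    """One Gaussian-walk diffusion move (SFSA-style, simplified):
    sample around the best particle so the population spreads out
    the way a randomly growing fractal does."""
    return [random.gauss(b, sigma) for b in best_point]

# Toy usage: a single RBF unit centered at the origin.
y = rbf_output([0.0, 0.0], centers=[[0.0, 0.0]], widths=[1.0], weights=[2.0])
print(y)  # 2.0 at the center, since exp(0) = 1
```

In a full trainer, the flattened vector of centers, widths, and output weights would be the "point" that diffusion perturbs, with classification error as the fitness.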

A. Saffari, S. H. Zahiri, M. Khishe,
Volume 18, Issue 1 (March 2022)

In this paper, a multilayer perceptron neural network (MLP-NN) is trained by the grasshopper optimization algorithm (GOA), with its control parameters tuned by a fuzzy system, for the big-data sonar classification problem. With proper tuning of these parameters, the exploration and exploitation stages are balanced and the boundary between them is determined correctly; therefore, the algorithm does not get stuck in local optima, and the convergence rate increases. The main aim is thus to take a set of real sonar data and distinguish real sonar targets from false targets, including noise, clutter, and reverberation, using the GOA-trained MLP-NN enhanced by the fuzzy system. To enable accurate comparisons and demonstrate the performance of the GOA enhanced with fuzzy logic (called FGOA), nine benchmark algorithms (GOA, GA, PSO, GSA, GWO, BBO, PBIL, ES, and ACO) and the standard backpropagation (BP) algorithm were used. The measured criteria are convergence speed, ability to avoid local optima, and accuracy. The results show that FGOA performs best on both the training and generalization datasets, with 96.43% and 92.03% accuracy, respectively.
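The exploration/exploitation balancing described above can be sketched as a fuzzy adaptation of GOA's shrinking coefficient c, which in standard GOA decays linearly from c_max to c_min over the run. The tiny rule base below (two triangular-membership rules defuzzified by weighted average) is only an assumed illustration of the FGOA idea, not the paper's actual fuzzy system.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_c(progress, diversity, c_max=1.0, c_min=0.00004):
    """Adapt GOA's coefficient c with two fuzzy rules:
    - explore (c near c_max) when the run is young or the swarm is spread out;
    - exploit (c near c_min) when the run is late or the swarm has converged.
    progress and diversity are assumed to be normalized to [0, 1]."""
    explore = max(tri(progress, -0.5, 0.0, 0.6), tri(diversity, 0.4, 1.0, 1.5))
    exploit = max(tri(progress, 0.4, 1.0, 1.5), tri(diversity, -0.5, 0.0, 0.6))
    if explore + exploit == 0.0:
        # Fall back to the standard linear decay if no rule fires.
        return c_min + (c_max - c_min) * (1.0 - progress)
    # Weighted-average defuzzification.
    return (explore * c_max + exploit * c_min) / (explore + exploit)

# Early run with a spread-out swarm stays exploratory; a late, converged
# swarm switches to exploitation.
print(fuzzy_c(progress=0.1, diversity=0.9) > fuzzy_c(progress=0.9, diversity=0.1))
```

The returned c would then scale the grasshopper position update exactly where the linearly decayed c appears in standard GOA.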


© 2022 by the authors. Licensee IUST, Tehran, Iran. This is an open access journal distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license.