Improving the Performance of a Polynomial Neural Network Classifier Using the Whale Optimization Algorithm
Subject area: Electrical and Computer Engineering
Mahsa Memari 1, Abbas Harifi 2*, Abdollah Khalili 3
1 - University of Hormozgan, Department of Electrical and Computer Engineering
2 - University of Hormozgan, Department of Electrical and Computer Engineering
3 - University of Hormozgan, Department of Electrical and Computer Engineering
Keywords: metaheuristic algorithms, cloud computing, absorbing Markov chain, energy consumption reduction
Abstract:
Polynomial neural network (PNN) is a supervised learning algorithm and one of the most popular models used in real-world applications. The more complex the PNN architecture becomes in terms of the number of partial descriptions (PDs) and the number of layers, the more computation time and storage space it requires. In this research, a novel approach called PNN-WOA is proposed to improve the classification performance of the polynomial neural network using the Whale Optimization Algorithm (WOA); in addition to increasing the accuracy of the PNN, it keeps the computation time and memory requirements manageable. In the proposed approach, the PDs are generated in the first layer from pairwise combinations of the features of the training samples. The neurons of the second layer consist of the PDs generated in the first layer, the input variables, and a bias. Finally, the output of the polynomial neural network is obtained as the weighted sum of the second-layer outputs. Using WOA, the best vector of weighting coefficients is found such that the PNN reaches its highest classification accuracy. Eleven datasets from the UCI repository were used to evaluate PNN-WOA. The results show that PNN-WOA outperforms previous methods such as PNN-RCGA, PNN-MOPPSO, RCPNN-PSO, and S-TWSVM in most cases, with accuracy improvements between 0.18% and 10.33% across the datasets. The results of the Friedman test also indicate that, overall, the proposed PNN-WOA method is statistically superior to the compared methods (p-value of 0.039).
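To make the described pipeline concrete, the following Python sketch illustrates the general idea rather than the paper's exact implementation: first-layer PDs are built from pairwise feature combinations (here assumed to be quadratic polynomials fitted by least squares, a common PNN choice that the abstract does not specify), the second layer stacks the PDs, the original input variables, and a bias, and the standard WOA of Mirjalili and Lewis [32] searches for the output weight vector that maximizes training accuracy. All function names, hyperparameter values, and the toy data below are illustrative assumptions.

```python
# Minimal PNN-WOA sketch (binary classification), assuming quadratic PDs fitted
# by least squares; hyperparameters and data are illustrative, not the paper's.
import numpy as np
from itertools import combinations

def fit_pd(xi, xj, y):
    """Fit one PD, z = c0 + c1*xi + c2*xj + c3*xi*xj + c4*xi^2 + c5*xj^2,
    to the targets by least squares (assumed PD form)."""
    A = np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi**2, xj**2])
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    return c

def pd_outputs(X, coeffs):
    """Evaluate every first-layer PD on the samples in X."""
    outs = []
    for (i, j), c in coeffs.items():
        xi, xj = X[:, i], X[:, j]
        A = np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi**2, xj**2])
        outs.append(A @ c)
    return np.column_stack(outs)

def second_layer(X, coeffs):
    """Second-layer inputs: first-layer PDs, the original features, and a bias term."""
    return np.column_stack([pd_outputs(X, coeffs), X, np.ones(len(X))])

def accuracy(w, Z, y):
    """Classification accuracy of the weighted-sum output thresholded by sign."""
    return np.mean(np.sign(Z @ w) == y)

def woa_optimize(fitness, dim, n_whales=30, n_iter=200, seed=0):
    """Standard Whale Optimization Algorithm (Mirjalili and Lewis, 2016),
    maximizing `fitness` over a real-valued vector of length `dim`."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-1.0, 1.0, size=(n_whales, dim))
    fit = np.array([fitness(w) for w in pop])
    best, best_fit = pop[np.argmax(fit)].copy(), fit.max()
    for t in range(n_iter):
        a = 2.0 - 2.0 * t / n_iter                 # decreases linearly from 2 toward 0
        for i in range(n_whales):
            A = 2.0 * a * rng.random() - a
            C = 2.0 * rng.random()
            if rng.random() < 0.5:
                if abs(A) < 1:                     # exploitation: encircle the best whale
                    pop[i] = best - A * np.abs(C * best - pop[i])
                else:                              # exploration: move toward a random whale
                    x_rand = pop[rng.integers(n_whales)]
                    pop[i] = x_rand - A * np.abs(C * x_rand - pop[i])
            else:                                  # bubble-net spiral around the best whale (b = 1)
                l = rng.uniform(-1.0, 1.0)
                pop[i] = np.abs(best - pop[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            f = fitness(pop[i])
            if f > best_fit:
                best, best_fit = pop[i].copy(), f
    return best, best_fit

# Toy usage on synthetic data with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = np.sign(X[:, 0] * X[:, 1] + 0.5 * X[:, 2])
coeffs = {(i, j): fit_pd(X[:, i], X[:, j], y) for i, j in combinations(range(X.shape[1]), 2)}
Z = second_layer(X, coeffs)
w, train_acc = woa_optimize(lambda w: accuracy(w, Z, y), dim=Z.shape[1])
print(f"training accuracy: {train_acc:.3f}")
```

In practice the weight vector found on the training data would then be evaluated on a held-out test set, as in the UCI experiments summarized in the abstract.

References: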
[1] O. I. Abiodun, A. Jantan, A. E. Omolara, K. V. Dada, N. A. Mohamed, and H. Arshad, "State-of-the-art in artificial neural network applications: a survey," Heliyon, vol. 4, no. 11, Article ID: e00938, Nov. 2018.
[2] N. L. da Costa, M. D. de Lima, and R. Barbosa, "Evaluation of feature selection methods based on artificial neural network weights," Expert Syst. Appl., vol. 168, no. 7, Article ID: 114312, Apr. 2021.
[3] J. Zhou, A. H. Gandomi, F. Chen, and A. Holzinger, "Evaluating the quality of machine learning explanations: a survey on methods and metrics," Electronics, vol. 10, no. 5, Article ID: 593, 19 pp., Mar. 2021.
[4] M. Kantardzic, Data Mining: Concepts, Models, Methods, and Algorithms, John Wiley, 2020.
[5] J. Beyerer, M. Richter, and M. Nagel, Pattern Recognition: Introduction, Features, Classifiers and Principles. Walter de Gruyter GmbH & Co KG, 2017.
[6] T. A. Al-Asadi, O. J. Ahmed, R. Hidayat, and A. A. Ramli, "A survey on web mining techniques and applications," Int. J. on Adv. Sci. Eng. and Inf. Tech., vol. 7, no. 4, pp. 1178-1184, 2017.
[7] Y. Cao, T. A. Geddes, J. Y. H. Yang, and P. Yang, "Ensemble deep learning in bioinformatics," Nat. Mach. Intell., vol. 2, no. 9, pp. 500-508, Aug. 2020.
[8] I. Sadgali, N. Sael, and F. Benabbou, "Performance of machine learning techniques in the detection of financial frauds," Procedia Comput. Sci., vol. 148, no. 6, pp. 45-54, 2019.
[9] J. Henriquez and W. Kristjanpoller, "A combined independent component analysis-neural network model for forecasting exchange rate variation," Appl. Soft Comput., vol. 83, no. C, Article ID: 105654, Oct. 2019.
[10] I. N. Da Silva, D. H. Spatti, R. A. Flauzino, L. H. B. Liboni, and S. F. dos Reis Alves, "Artificial neural network architectures and training processes," Artif. Neural Netw., Springer, Cham, pp. 21-28, 2017.
[11] C. Singh, W. J. Murdoch, and B. Yu, "Hierarchical interpretations for neural network predictions," arXiv preprint arXiv:1806.05337, 2018.
[12] F. Ros, M. Pintore, and J. R. Chrétien, "Automatic design of growing radial basis function neural networks based on neighboorhood concepts," Chemometrics and Intelligent Laboratory Systems, vol. 87, no. 2, pp. 231-240, Jun. 2007.
[13] A. G. Ivakhnenko, "Polynomial theory of complex systems," IEEE Trans. on Systems, Man, and Cybernetics, vol. 1, no. 4, pp. 364-378, Oct. 1971.
[14] C. T. Lin, M. Prasad, and A. Saxena, "An improved polynomial neural network classifier using real-coded genetic algorithm," IEEE Trans. on Systems, Man, and Cybernetics: Systems, vol. 45, no. 11, pp. 1389-1401, Nov. 2015.
[15] L. L. Huang, A. Shimizu, Y. Hagihara, and H. Kobatake, "Face detection from cluttered images using a polynomial neural network," Neurocomputing, vol. 51, no. 12, pp. 197-211, Apr. 2003.
[16] S. B. Roh, T. C. Ahn, and W. Pedrycz, "Fuzzy linear regression based on Polynomial Neural Networks," Expert Systems with Applications, vol. 39, no. 10, pp. 8909-8928, Aug. 2012.
[17] C. C. Huang, W. C. Chen, C. Y. Shen, Y. J. Chen, C. Y. Chang, and R. C. Hwang, "Signal processing by polynomial NN and equivalent polynomial function," in Proc. 1st Int. Conf. on Pervasive Computing, Signal Processing and Applications, PCSPA’10, pp. 460-463, Harbin, China, 17-19 Sept. 2010.
[18] L. Zjavka and W. Pedrycz, "Constructing general partial differential equations using polynomial and neural networks," Neural Networks, vol. 73, no. 1, pp. 58-69, Jan. 2016.
[19] M. Mehrabi, M. Sharifpur, and J. P. Meyer, "Application of the FCM-based neuro-fuzzy inference system and genetic algorithm-polynomial neural network approaches to modelling the thermal conductivity of alumina-water nanofluids," International Communications in Heat and Mass Transfer, vol. 39, no. 7, pp. 971-977, Aug. 2012.
[20] M. F. Zarandi, I. B. Türksen, J. Sobhani, and A. A. Ramezanianpour, "Fuzzy polynomial neural networks for approximation of the compressive strength of concrete," Applied Soft Computing, vol. 8, no. 1, pp. 488-498, Jan. 2008.
[21] S. Dehuri, B. B. Misra, A. Ghosh, and S. B. Cho, "A condensed polynomial neural network for classification using swarm intelligence," Applied Soft Computing, vol. 11, no. 3, pp. 3106-3113, Apr. 2011.
[22] M. Li, J. Tian, and F. Chen, "Improving multiclass pattern recognition with a co-evolutionary RBFNN," Pattern Recognition Letters, vol. 29, no. 4, pp. 392-406, Mar. 2008.
[23] B. B. Misra, S. Dehuri, P. K. Dash, and G. Panda, "A reduced and comprehensible polynomial neural network for classification," Pattern Recognition Letters, vol. 29, no. 12, pp. 1705-1712, Sept. 2008.
[24] Z. Qi, Y. Tian, and Y. Shi, "Laplacian twin support vector machine for semi-supervised classification," Neural Networks, vol. 35, no. 4, pp. 46-53, Nov. 2012.
[25] S. Dehuri and S. B. Cho, "Multi-criterion Pareto based particle swarm optimized polynomial neural network for classification: a review and state-of-the-art," Computer Science Review, vol. 3, no. 1, pp. 19-40, Feb. 2009.
[26] Z. Qi, Y. Tian, and Y. Shi, "Structural twin support vector machine for classification," Knowledge-Based Systems, vol. 43, no. 7, pp. 74-81, May 2013.
[27] Q. Hou, L. Liu, L. Zhen, and L. Jing, "A novel projection nonparallel support vector machine for pattern classification," Eng. Appl. Artif. Intell., vol. 75, no. 7, pp. 64-75, Oct. 2018.
[28] W. J. Chen, Y. H. Shao, C. N. Li, Y. Q. Wang, M. Z. Liu, and Z. Wang, "NPrSVM: nonparallel sparse projection support vector machine with efficient algorithm," Appl. Soft Comput., vol. 90, no. 3, Article ID: 106142, May 2020.
[29] H. Pant, M. Sharma, and S. Soman, "Twin neural networks for the classification of large unbalanced datasets," Neurocomputing, vol. 343, no. 4, pp. 34-49, May 2019.
[30] R. Rastogi, P. Saigal, and S. Chandra, "Angle-based twin parametric-margin support vector machine for pattern classification," Knowl. Based. Syst., vol. 139, no. 6, pp. 64-77, Jan. 2018.
[31] S. K. Oh, W. Pedrycz, and B. J. Park, "Polynomial neural networks architecture: analysis and design," Computers & Electrical Engineering, vol. 29, no. 6, pp. 703-725, May 2003.
[32] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, no. 6, pp. 51-67, May 2016.
[33] W. A. Watkins and W. E. Schevill, "Aerial observation of feeding behavior in four baleen whales: Eubalaena glacialis, Balaenoptera borealis, Megaptera novaeangliae, and Balaenoptera physalus," J. of Mammalogy, vol. 60, no. 1, pp. 155-163, Feb. 1979.