Please use this identifier to cite or link to this item: https://dspace.ncfu.ru/handle/20.500.12258/24231
Full metadata record
DC Field | Value | Language
dc.contributor.author | Abdulkadirov, R. I. | -
dc.contributor.author | Абдулкадиров, Р. И. | -
dc.contributor.author | Lyakhov, P. A. | -
dc.contributor.author | Ляхов, П. А. | -
dc.contributor.author | Nagornov, N. N. | -
dc.contributor.author | Нагорнов, Н. Н. | -
dc.date.accessioned | 2023-08-03T08:22:31Z | -
dc.date.available | 2023-08-03T08:22:31Z | -
dc.date.issued | 2023 | -
dc.identifier.citation | Abdulkadirov, R., Lyakhov, P., Nagornov, N. Survey of Optimization Algorithms in Modern Neural Networks // Mathematics. - 2023. - 11 (11), article no. 2466. - DOI: 10.3390/math11112466 | ru
dc.identifier.uri | http://hdl.handle.net/20.500.12258/24231 | -
dc.description.abstract | The main goal of machine learning is the creation of self-learning algorithms in many areas of human activity. It allows human labor to be replaced by artificial intelligence in efforts to expand production. The theory of artificial neural networks, which have already replaced humans in many problems, remains the most well-utilized branch of machine learning. Thus, one must select appropriate neural network architectures, data processing, and advanced applied mathematics tools. A common challenge for these networks is achieving the highest accuracy in a short time. This problem is usually addressed by modifying networks and improving data pre-processing, where accuracy increases along with training time. By using optimization methods, one can improve the accuracy without increasing the time. In this review, we consider the existing optimization algorithms encountered in neural networks. We present modifications of optimization algorithms of the first, second, and information-geometric order, the latter being related to information geometry for Fisher–Rao and Bregman metrics. These optimizers have significantly influenced the development of neural networks through geometric and probabilistic tools. We present applications of all the given optimization algorithms, considering the types of neural networks. After that, we show ways to develop optimization algorithms in further research using modern neural networks. Fractional order, bilevel, and gradient-free optimizers can replace classical gradient-based optimizers. Such approaches are applied in graph, spiking, complex-valued, quantum, and wavelet neural networks. Besides pattern recognition, time series prediction, and object detection, there are many other applications in machine learning: quantum computations, partial differential and integro-differential equations, and stochastic processes. | ru
dc.language.iso | en | ru
dc.relation.ispartofseries | Mathematics | -
dc.subject | Approximation | ru
dc.subject | Spiking neural networks | ru
dc.subject | Bilevel optimization | ru
dc.subject | Quasi-Newton methods | ru
dc.subject | Gradient-free optimization | ru
dc.subject | Graph neural networks | ru
dc.subject | Quantum neural networks | ru
dc.subject | Quantum computations | ru
dc.subject | Physics-informed neural networks | ru
dc.subject | Optimization methods | ru
dc.subject | Fractional order optimization | ru
dc.subject | Information geometry | ru
dc.title | Survey of Optimization Algorithms in Modern Neural Networks | ru
dc.type | Article | ru
vkr.inst | Faculty of Mathematics and Computer Science named after Professor N.I. Chervyakov | ru
vkr.inst | North Caucasus Center for Mathematical Research | ru
Appears in Collections: Articles indexed in SCOPUS, WOS

Files in This Item:
File | Description | Size | Format | Access
scopusresults 2646 .pdf | - | 132.06 kB | Adobe PDF | Restricted Access
WoS 1669 .pdf | - | 123.49 kB | Adobe PDF | Restricted Access


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.