Please use this identifier to cite or link to this item:
http://hdl.handle.net/20.500.12258/19637
Title: | Improving extreme search with natural gradient descent using Dirichlet distribution |
Authors: | Abdulkadirov, R. I.; Lyakhov, P. A. |
Keywords: | Adam algorithm;Dirichlet distribution;Fisher information matrix;Kullback-Leibler divergence;Natural gradient descent |
Issue Date: | 2022 |
Publisher: | Springer Science and Business Media Deutschland GmbH |
Citation: | Abdulkadirov, R. I., Lyakhov, P. A. Improving extreme search with natural gradient descent using Dirichlet distribution // Lecture Notes in Networks and Systems. - 2022. - Vol. 424. - P. 19-28. - DOI: 10.1007/978-3-030-97020-8_3 |
Series/Report no.: | Lecture Notes in Networks and Systems |
Abstract: | Natural gradient descent is an optimization algorithm proposed to replace stochastic gradient descent and its modifications. Its key advantage is reaching the extremum in a small number of iterations with the required accuracy, which is of high value in machine learning and statistics. The goal of this article is to propose a natural gradient descent algorithm with the Dirichlet distribution that includes step-size adaptation. We experimentally demonstrate the advantage of natural gradient descent over stochastic gradient descent and the Adam algorithm. Additionally, the calculation of the Fisher information matrix of the Dirichlet distribution is shown. |
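The central quantity in the abstract is the Fisher information matrix of the Dirichlet distribution, which for Dirichlet(α) has the known closed form F_ij = ψ₁(α_i)·δ_ij − ψ₁(Σ_k α_k), where ψ₁ is the trigamma function. Below is a minimal illustrative sketch (not the authors' code) of this matrix and a plain natural gradient step; the function names, learning rate, and gradient vector are assumptions for illustration, and the paper's step-size adaptation is not reproduced.

```python
import numpy as np
from scipy.special import polygamma


def dirichlet_fisher(alpha):
    """Fisher information matrix of a Dirichlet(alpha) distribution.

    F_ij = psi_1(alpha_i) * delta_ij - psi_1(sum(alpha)),
    where psi_1 is the trigamma function (polygamma of order 1).
    """
    trigamma = polygamma(1, alpha)                 # psi_1 applied elementwise
    return np.diag(trigamma) - polygamma(1, alpha.sum())


def natural_gradient_step(alpha, grad, lr=0.1):
    """One natural gradient step: alpha <- alpha - lr * F(alpha)^{-1} @ grad.

    lr is a fixed illustrative step size, not the paper's adaptive scheme.
    """
    F = dirichlet_fisher(alpha)
    nat_grad = np.linalg.solve(F, grad)            # avoids forming F^{-1} explicitly
    return alpha - lr * nat_grad


# Illustrative usage with an arbitrary gradient vector (hypothetical values)
alpha = np.array([2.0, 3.0, 5.0])
grad = np.array([0.1, -0.2, 0.05])
print(natural_gradient_step(alpha, grad))
```

Preconditioning the gradient by F⁻¹ is what distinguishes natural gradient descent from stochastic gradient descent: the update follows the steepest direction under the Kullback-Leibler geometry of the distribution rather than the Euclidean geometry of the parameters.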
URI: | http://hdl.handle.net/20.500.12258/19637 |
Appears in Collections: | Articles indexed in SCOPUS, WOS
Files in This Item:
File | Size | Format
---|---|---
scopusresults 2205 .pdf (Restricted Access) | 64.03 kB | Adobe PDF