|Title:||Accelerating extreme search based on natural gradient descent with beta distribution|
|Authors:||Lyakhov, P. A.; Abdulkadirov, R. I.|
|Keywords:||Distribution;Natural gradient descent;Hessian;Fisher matrix;Extreme search|
|Publisher:||Institute of Electrical and Electronics Engineers Inc.|
|Citation:||Lyakhov, P., Abdulkadirov, R. Accelerating extreme search based on natural gradient descent with beta distribution // 2021 International Conference Engineering and Telecommunication, En and T 2021. - 2021. - DOI: 10.1109/EnT50460.2021.9681769|
|Series/Report no.:||2021 International Conference Engineering and Telecommunication, En and T 2021|
|Abstract:||Natural gradient descent is an optimization method derived from information geometry. Owing to its faster convergence, it works well for many applications and can serve as a good alternative to gradient descent and stochastic gradient descent in machine learning and statistics. The goal of this work is to propose a natural gradient descent algorithm with the beta distribution and stepsize adaptation. We compare the minimization process of gradient descent with that of natural gradient descent with respect to the Gauss and beta distributions. Additionally, the calculation of the Fisher matrix for computing the natural gradient is presented.|
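The abstract's core idea — preconditioning the gradient with the inverse Fisher information matrix of a beta distribution — can be illustrated with a small sketch. This is not the paper's algorithm (in particular, it omits the stepsize adaptation the authors propose); it is a minimal example that fits Beta(a, b) parameters to data by natural-gradient ascent on the log-likelihood, using the standard closed-form Fisher matrix of the beta distribution, expressed through trigamma functions.

```python
import numpy as np
from scipy.special import digamma, polygamma

def beta_fisher(a, b):
    """Fisher information matrix of Beta(a, b) (standard closed form)."""
    t = polygamma(1, a + b)  # trigamma of (a + b)
    return np.array([[polygamma(1, a) - t, -t],
                     [-t, polygamma(1, b) - t]])

def loglik_grad(a, b, x):
    """Gradient of the mean log-likelihood of Beta(a, b) over samples x."""
    g_a = digamma(a + b) - digamma(a) + np.mean(np.log(x))
    g_b = digamma(a + b) - digamma(b) + np.mean(np.log(1 - x))
    return np.array([g_a, g_b])

def natural_gradient_fit(x, steps=200, lr=0.5):
    """Fit (a, b) by natural-gradient ascent: theta += lr * F^{-1} grad."""
    theta = np.array([1.0, 1.0])  # initial (a, b)
    for _ in range(steps):
        g = loglik_grad(theta[0], theta[1], x)
        F = beta_fisher(theta[0], theta[1])
        theta = theta + lr * np.linalg.solve(F, g)  # natural-gradient step
        theta = np.clip(theta, 1e-3, None)  # keep parameters positive
    return theta

rng = np.random.default_rng(0)
x = rng.beta(2.0, 5.0, size=5000)
a_hat, b_hat = natural_gradient_fit(x)
print(a_hat, b_hat)  # should approach the true parameters (2, 5)
```

Because the beta distribution is an exponential family, the Fisher matrix equals the negative Hessian of the expected log-likelihood, so the natural-gradient step behaves like a damped Newton step and typically converges much faster than plain gradient ascent — the effect the abstract refers to.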
|Appears in Collections:||Articles indexed in SCOPUS, WOS|
Files in This Item:
|scopusresults 2104 .pdf|
|989.58 kB||Adobe PDF||View/Open|
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.