Please use this identifier to cite or link to this item: http://hdl.handle.net/20.500.12258/19288
Title: Accelerating extreme search based on natural gradient descent with beta distribution
Authors: Lyakhov, P. A.
Ляхов, П. А.
Abdulkadirov, R. I.
Абдулкадиров, Р. И.
Keywords: Distribution;Natural gradient descent;Hessian;Fisher matrix;Extreme search
Issue Date: 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Citation: Lyakhov, P., Abdulkadirov, R. Accelerating extreme search based on natural gradient descent with beta distribution // 2021 International Conference Engineering and Telecommunication, En and T 2021. - 2021. - DOI: 10.1109/EnT50460.2021.9681769
Series/Report no.: 2021 International Conference Engineering and Telecommunication, En and T 2021
Abstract: Natural gradient descent is an optimization method developed from information geometry. It works well in many applications due to its better convergence and can be a good alternative to gradient descent and stochastic gradient descent in machine learning and statistics. The goal of this work is to propose a natural gradient descent algorithm with the beta distribution and step-size adaptation. We compare the minimization process of gradient descent with that of natural gradient descent with respect to the Gaussian and beta distributions. Additionally, the calculation of the Fisher matrix for computing the natural gradient is presented.
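
The record does not include code, so the following is a minimal illustrative sketch (not the authors' implementation) of natural gradient descent with respect to the beta distribution: the ordinary gradient of the average log-likelihood is preconditioned by the inverse Fisher information matrix of Beta(alpha, beta), whose entries are built from the trigamma function. The synthetic data, the starting point, the fixed step size (the paper's step-size adaptation is not reproduced) and the helper names beta_fisher, log_lik_grad and natural_gradient_fit are assumptions made for illustration only.

# Sketch only: natural gradient ascent on the average Beta log-likelihood.
import numpy as np
from scipy.special import digamma, polygamma  # polygamma(1, x) is the trigamma function


def beta_fisher(alpha, beta):
    """Fisher information matrix (per observation) of Beta(alpha, beta)."""
    t_a, t_b, t_ab = polygamma(1, alpha), polygamma(1, beta), polygamma(1, alpha + beta)
    return np.array([[t_a - t_ab, -t_ab],
                     [-t_ab, t_b - t_ab]])


def log_lik_grad(params, x):
    """Gradient of the average log-likelihood with respect to (alpha, beta)."""
    alpha, beta = params
    g_a = np.mean(np.log(x)) - (digamma(alpha) - digamma(alpha + beta))
    g_b = np.mean(np.log1p(-x)) - (digamma(beta) - digamma(alpha + beta))
    return np.array([g_a, g_b])


def natural_gradient_fit(x, params=(1.0, 1.0), lr=1.0, n_steps=50):
    """Natural gradient update: theta <- theta + lr * F(theta)^{-1} * grad."""
    params = np.asarray(params, dtype=float)
    for _ in range(n_steps):
        nat_grad = np.linalg.solve(beta_fisher(*params), log_lik_grad(params, x))
        params = np.maximum(params + lr * nat_grad, 1e-3)  # keep parameters positive
    return params


rng = np.random.default_rng(0)
x = rng.beta(2.0, 5.0, size=5000)          # synthetic data, assumed for the example
print(natural_gradient_fit(x))             # estimates close to the generating parameters (2, 5)

With lr = 1 this update coincides with Fisher scoring; a smaller step, or an adaptation scheme such as the one proposed in the paper, would be used when a full natural-gradient step overshoots.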
URI: http://hdl.handle.net/20.500.12258/19288
Appears in Collections: Articles indexed in SCOPUS, WOS

Files in This Item:
File: scopusresults 2104 .pdf (Restricted Access)
Size: 989.58 kB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.