Please use this identifier to cite or link to this item: https://dspace.ncfu.ru/handle/123456789/31995

| Title: | Transformer operator network with fast difference gradient positive–negative momentum for solving Navier–Stokes equations |
| Authors: | Abdulkadirov, R. I.; Lyakhov, P. A.; Bergerman, M. V.; Nagornov, N. N. |
| Keywords: | Deep neural operator;Numerical methods;Navier–Stokes equations;Physics-informed neural networks;Optimization;Transformers;Mathematical operators;Mean square error;Global optimization;Learning systems |
| Issue Date: | 2025 |
| Publisher: | Elsevier Ltd |
| Citation: | Abdulkadirov, R. I., Lyakhov, P. A., Bergerman, M. V., Nagornov, N. N. Transformer operator network with fast difference gradient positive–negative momentum for solving Navier–Stokes equations // Chaos, Solitons and Fractals. - 2025. - 200. - art. no. 116964. - DOI: 10.1016/j.chaos.2025.116964 |
| Series/Report no.: | Chaos, Solitons and Fractals |
| Abstract: | Modern machine learning approaches solve many problems in human activity. Recent neural network models find applications in solving the equations of mathematical physics. Alongside traditional numerical methods, physics-informed learning builds approximate solutions with relatively small L2 and mean squared errors. In this paper, we propose a fast difference-gradient positive–negative momentum optimizer that achieves a global minimum of the loss function with a convergence rate of O(log T) and stability of O(T). This optimization algorithm addresses loss-minimization problems such as gradient discontinuity, vanishing gradients, and local minimum traversal. Analysis on a set of test functions confirms the superiority of the proposed optimizer over state-of-the-art methods. Through the modified positive–negative moment estimation, the proposed optimizer gives more appropriate weight updates in physics-informed, deep operator, and transformer operator neural networks when solving the Navier–Stokes equations. In particular, the proposed optimizer minimizes the loss function better than known analogs in solving the 2d Kovasznay, (2d+t) Taylor–Green, (3d+t) Beltrami, and (2d+t) circular cylinder flows. Fast difference-gradient positive–negative moment estimation allows the physics-informed model to reduce the L2 and mean squared errors of the solution by 14.6–53.9 percentage points. The proposed fast difference-gradient positive–negative momentum can improve the accuracy of approximate solutions of partial differential equations in physics-informed neural, deep operator, and transformer operator networks. |
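The abstract builds on positive–negative momentum estimation. As a rough illustration only (not the authors' exact algorithm, whose fast-difference-gradient term is specific to the paper), a generic positive–negative momentum step can be sketched as below; the function name, buffer handling, and hyperparameters are illustrative assumptions:

```python
import numpy as np

def pnm_step(params, grad, m_pos, m_neg, t, lr=1e-3, beta1=0.9, beta0=1.0):
    """One generic positive-negative momentum update (illustrative sketch).

    Two momentum buffers are refreshed on alternating steps; the parameter
    update combines them with a positive weight on the freshest buffer and
    a negative weight on the stale one, which adds controlled noise that
    can help the iterate leave sharp local minima.
    """
    if t % 2 == 0:
        m_pos = beta1**2 * m_pos + (1 - beta1**2) * grad
        m = (1 + beta0) * m_pos - beta0 * m_neg
    else:
        m_neg = beta1**2 * m_neg + (1 - beta1**2) * grad
        m = (1 + beta0) * m_neg - beta0 * m_pos
    # normalize the combined momentum so its scale matches plain momentum
    norm = np.sqrt((1 + beta0)**2 + beta0**2)
    params = params - lr * m / norm
    return params, m_pos, m_neg
```

As a usage example, iterating this step on the quadratic f(x) = x² (gradient 2x) drives x toward the minimum at 0 over a few hundred iterations.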
| URI: | https://dspace.ncfu.ru/handle/123456789/31995 |
| Appears in Collections: | Articles indexed in SCOPUS, WoS |
Files in This Item:
| File | Description | Size | Format | |
|---|---|---|---|---|
| scopusresults 3673.pdf | Restricted Access | 127.64 kB | Adobe PDF | View/Open |
| WoS 2195.pdf | Restricted Access | 113.23 kB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.