The work has been published in the top-ranked journal “IEEE Transactions on Neural Networks and Learning Systems.”
The paper is available at the link below:
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10479223
Title: On Model of Recurrent Neural Network on a Time Scale: Exponential Convergence and Stability Research
Abstract:
Most results on modeling recurrent neural networks (RNNs) are obtained using delayed differential equations, which imply a continuous-time representation. In practice, however, these models must be discrete in time, given their implementation in computer systems, which calls for a formulation valid on arbitrary time scales. Hence, the goal of this research is to model and investigate the architecture of a delayed RNN using delayed differential equations on a time scale. Internal memory can be used to describe the computation of future states via discrete and distributed delays, which represents the deep-learning architecture of artificial RNNs. We focus on the qualitative behavior and stability of the system, paying special attention to the effect of the time-scale parameters on the neural network dynamics. In particular, we explore exponential stability of RNN models on a time scale incorporating multiple discrete and distributed delays. Two approaches to constructing exponential estimates, based on the Hilger and the usual exponential functions, are considered and compared. The Lyapunov–Krasovskii (L–K) functional method is employed to study stability on a time scale in both cases. The established stability criteria, which yield an exponential-like estimate, involve a tuple of positive definite matrices, the decay rate, and the graininess of the time scale. The RNN models for a two-neuron network with four discrete and distributed delays, as well as a ring lattice delayed network of seven identical neurons, are investigated numerically. The results show how the time scale (graininess) and model characteristics (weights) influence the qualitative behavior, leading to a transition from a stable focus to quasiperiodic limit cycles.
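As a rough illustration of the two exponential estimates compared in the abstract (this is not code from the paper), consider the Hilger exponential on a uniform time scale T = μZ. For a constant graininess μ > 0 and constant decay rate p with 1 + μp ≠ 0, it reduces to the product form (1 + μp)^((t − t0)/μ), and it recovers the usual exponential exp(p(t − t0)) as μ → 0. The function name `hilger_exp` and the parameter values below are illustrative assumptions:

```python
import math

def hilger_exp(p: float, t: float, t0: float, mu: float) -> float:
    """Hilger exponential e_p(t, t0) on the uniform time scale T = mu*Z.

    Assumes constant graininess mu >= 0 and constant p with 1 + mu*p != 0.
    For mu > 0: e_p(t, t0) = (1 + mu*p) ** ((t - t0) / mu).
    For mu == 0 (continuous time): the usual exponential exp(p*(t - t0)).
    """
    if mu == 0.0:
        return math.exp(p * (t - t0))
    steps = round((t - t0) / mu)  # number of graininess steps from t0 to t
    return (1.0 + mu * p) ** steps

# Decay rate p = -0.5 over the interval [0, 2]:
usual = hilger_exp(-0.5, 2.0, 0.0, 0.0)    # continuous time: e^{-1}
coarse = hilger_exp(-0.5, 2.0, 0.0, 0.5)   # coarse graininess: (1 - 0.25)^4
fine = hilger_exp(-0.5, 2.0, 0.0, 1e-4)    # small graininess: close to e^{-1}
```

The gap between `coarse` and `usual` is the kind of graininess effect on decay estimates that the paper's stability criteria account for.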
This work was supported in part by the Erasmus+ Programme of the European Union through the Key Action 2 Grant (The Future Is in Applied Artificial Intelligence) under Grant 2022-1-PL01-KA220-HED000088359 (work package 5: “Piloting,” activity A5.6 “Project deliverables”).