Keynote Speaker
Prof. Predrag Stanimirović
University of Niš, Serbia
Speech Title: Gradient Dynamical Systems for Solving Matrix Equations and Computing Generalized Inverses
Biography: Prof. Predrag Stanimirović received his doctoral degree in Mathematics from the Faculty of Mathematics, University of Niš, Niš, Serbia. He is currently a full Professor at the Department of Computer Science, Faculty of Sciences and Mathematics, University of Niš, Niš, Serbia. He has thirty-four years of experience in scientific research across diverse fields of mathematics and computer science, spanning numerical linear algebra, recurrent neural networks, linear algebra, symbolic computation, nonlinear optimization, and others. His main research topics include Numerical Linear Algebra, Operations Research, Recurrent Neural Networks, and Symbolic Computation. In recent years, he has published 245 publications in scientific journals, including 5 research monographs, 6 textbooks, and 75 peer-reviewed research articles in conference proceedings and book chapters. He is Editor-in-Chief of the scientific journal Facta Universitatis, Series: Mathematics and Informatics, and a section editor of Filomat and several other journals.
Abstract: The approach based on dynamical systems is a powerful tool for solving many kinds of matrix algebra problems because of: the possibility of defining the evolution so as to ensure a response within a predefined time frame in real-time applications; the parallel distributed nature of the resulting neural networks; the convenience of hardware implementation; global convergence without restrictions; and applicability to online computation with time-varying matrices.
As confirmation, various dynamical systems based on the Gradient Neural Network (GNN) for solving matrix equations and computing generalized inverses have been investigated. In addition, different types of dynamic state equations corresponding to various kinds of generalized inverses have been proposed and analyzed. Recurrent Neural Network (RNN) models arising from appropriate simplifications of GNN models are also proposed. Convergence properties and exact solutions of the considered GNN and RNN models are investigated.
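To give a flavor of the GNN approach described above, the following is a minimal sketch (not taken from the talk): for the matrix equation AX = B, a standard gradient neural network evolves the state X along the negative gradient of the energy E(X) = ||AX - B||_F^2 / 2, giving the flow dX/dt = -γ Aᵀ(AX - B). The function name, step size, and gain γ below are illustrative assumptions, with the continuous dynamics discretized by a simple Euler scheme.

```python
import numpy as np

def gnn_solve(A, B, gamma=10.0, dt=1e-3, steps=20000):
    """Sketch of a GNN for A X = B: Euler-integrate dX/dt = -gamma * A^T (A X - B).

    Parameter names and values are illustrative, not from the talk.
    """
    X = np.zeros((A.shape[1], B.shape[1]))  # zero initial state
    for _ in range(steps):
        # Negative gradient of E(X) = ||A X - B||_F^2 / 2
        X -= dt * gamma * A.T @ (A @ X - B)
    return X

# Example: with B = I, the equilibrium of the flow is the inverse of A
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
X = gnn_solve(A, np.eye(2))
```

For a well-conditioned A and a sufficiently small effective step γ·dt, the iteration converges to the solution of AX = B; for rank-deficient A, variants of such flows converge to least-squares solutions, which connects to the generalized-inverse computations mentioned in the abstract.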