Russian Scientists Reconstruct Dynamics of Brain Neuron Model Using Neural Network


Researchers from HSE University in Nizhny Novgorod have shown that a neural network can reconstruct the dynamics of a brain neuron model using just a single set of measurements, such as recordings of its electrical activity. The developed neural network was trained to reconstruct the system's full dynamics and predict its behaviour under changing conditions. This method enables the investigation of complex biological processes, even when not all necessary measurements are available. The study has been published in Chaos, Solitons & Fractals.

Neurons are cells that enable the brain to process information and transmit signals. They communicate through electrical impulses, which either activate neighbouring neurons or slow them down. Each neuron has a membrane that allows charged particles, known as ions, to pass through channels in the membrane, generating electrical impulses.

Figure 1. Diagram showing an electrically active cell in a neuronal culture and the process of recording its transmembrane potential for further analysis
© Natalya Stankevich

Mathematical models are used to study the function of neurons. These models are often based on the Hodgkin-Huxley approach, which allows for the construction of relatively simple models but requires a large number of parameters and calculations. To predict a neuron's behaviour, several parameters and characteristics are typically measured, including membrane voltage, ion currents, and the state of the cell channels. Researchers from HSE University and the Saratov Branch of the Kotelnikov Institute of Radioengineering and Electronics of the Russian Academy of Sciences have demonstrated the possibility of considering changes in a single control parameter—the neuron's membrane electrical potential—and using a neural network to reconstruct the missing data. 
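
As an illustration, the sketch below shows how a Hodgkin-Huxley-type neuron is typically simulated: a differential equation for the membrane potential coupled to ion-channel gating variables. The parameter values, external current, and solver settings here are standard textbook choices, not the configuration used in the study.

```python
# Minimal Hodgkin-Huxley-style simulation (illustrative, textbook parameters only).
import numpy as np
from scipy.integrate import solve_ivp

# Classic squid-axon constants (mS/cm^2, mV, uF/cm^2); the study's model may differ.
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L, I_ext = 50.0, -77.0, -54.4, 10.0

# Standard gating-rate functions for the potassium (n) and sodium (m, h) channels.
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))

def hh(t, y):
    V, m, h, n = y
    I_Na = g_Na * m**3 * h * (V - E_Na)   # sodium current
    I_K  = g_K * n**4 * (V - E_K)         # potassium current
    I_L  = g_L * (V - E_L)                # leak current
    dV = (I_ext - I_Na - I_K - I_L) / C_m
    dm = alpha_m(V) * (1 - m) - beta_m(V) * m
    dh = alpha_h(V) * (1 - h) - beta_h(V) * h
    dn = alpha_n(V) * (1 - n) - beta_n(V) * n
    return [dV, dm, dh, dn]

sol = solve_ivp(hh, (0.0, 100.0), [-65.0, 0.05, 0.6, 0.32], max_step=0.05)
V_trace = sol.y[0]  # membrane potential: the single observable used for reconstruction
```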

The proposed method consisted of two steps. First, changes in a neuron's potential over time were analysed. This data was then fed into a neural network—a variational autoencoder—that identified key patterns, discarded irrelevant information, and generated a set of characteristics describing the neuron's state. Second, a different type of neural network—a neural network map—used these characteristics to predict the neuron's future behaviour. The neural network effectively took on the functions of a Hodgkin-Huxley model, but instead of relying on complex equations, it was trained on the data.

Figure 2. Reconstruction of neuronal characteristics using a variational autoencoder. The original time series (measurement record) R is compressed by the encoder and transformed into dynamic characteristics μ. The decoder then decompresses μ as accurately as possible back into the original time series—R’. The process is similar to making it through a bottleneck: only the most important information can pass through, while all unnecessary data is discarded. To obtain μ, the autoencoder must identify the most relevant information about the neuron.
© Pavel V. Kuptsov, Nataliya V. Stankevich, Reconstruction of neuromorphic dynamics from a single scalar time series using variational autoencoder and neural network map, Chaos, Solitons & Fractals, Volume 191, 2025, 115818, ISSN 0960-0779.
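
A rough sketch of this two-step pipeline is given below. It is a minimal PyTorch illustration: the window length, latent dimension, layer sizes, and names (VAE, NeuralMap) are assumptions made for clarity, not the architecture published by the authors.

```python
# Sketch of the two-step idea: (1) a variational autoencoder compresses a window
# of the voltage recording R into latent characteristics mu; (2) a separate
# network maps the current latent state (plus a control parameter) forward in time.
import torch
import torch.nn as nn

WINDOW = 256   # length of the time-series window fed to the encoder (assumed)
LATENT = 8     # number of latent characteristics mu (assumed)

class VAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(WINDOW, 64), nn.ReLU())
        self.to_mu = nn.Linear(64, LATENT)
        self.to_logvar = nn.Linear(64, LATENT)
        self.decoder = nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(), nn.Linear(64, WINDOW))

    def forward(self, r):
        h = self.encoder(r)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # the "bottleneck"
        return self.decoder(z), mu, logvar                       # R', mu, spread

class NeuralMap(nn.Module):
    """Predicts the next latent state from the current one and a control parameter."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT + 1, 64), nn.Sigmoid(), nn.Linear(64, LATENT))

    def forward(self, mu, control):
        return self.net(torch.cat([mu, control], dim=-1))

vae, step = VAE(), NeuralMap()
r = torch.randn(1, WINDOW)                  # stand-in for a real voltage window
r_rec, mu, logvar = vae(r)                  # compress and reconstruct
mu_next = step(mu, torch.tensor([[0.5]]))   # advance the latent state at one parameter value
```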

'With the advancement of mathematical and computational methods, traditional approaches are being revisited, which not only helps improve them but can also lead to new discoveries. Models reconstructed from data are typically based on low-order polynomial equations, such as the 4th or 5th order. These models have limited nonlinearity, meaning they cannot describe highly complex dependencies without increasing the error,' explains Pavel Kuptsov, Leading Research Fellow at the Faculty of Informatics, Mathematics, and Computer Science of HSE University in Nizhny Novgorod. 'The new method uses neural networks in place of polynomials. Their nonlinearity is governed by sigmoids, smooth functions ranging from 0 to 1, which correspond to polynomial equations (Taylor series) of infinite order. This makes the modelling process more flexible and accurate.'
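
To illustrate the contrast described here: a low-order polynomial has a fixed, limited nonlinearity, whereas a network is built from sigmoids such as sigma(x) = 1/(1 + exp(-x)), smooth functions whose Taylor expansion does not terminate. The brief example below uses an arbitrary signal and is only meant to show the two ingredients side by side, not anything from the study itself.

```python
# A 5th-order polynomial fit vs. the sigmoid building block of a neural network.
import numpy as np

x = np.linspace(-5, 5, 200)
target = np.tanh(3 * x) * np.sin(x)     # an arbitrary, strongly nonlinear signal

# Fixed-order nonlinearity: the best 5th-order polynomial still leaves a large error.
poly = np.polynomial.Polynomial.fit(x, target, deg=5)
print("max |error| of the 5th-order fit:", float(np.max(np.abs(poly(x) - target))))

# Smooth, effectively infinite-order nonlinearity that network layers compose.
sigmoid = 1.0 / (1.0 + np.exp(-x))
```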

Typically, a complete set of parameters is required to simulate a complex system, but obtaining such data in real-world conditions can be challenging. In experiments, especially in biology and medicine, data is often incomplete or noisy. The scientists demonstrated that their neural network approach makes it possible to reconstruct missing values and predict the system's behaviour, even with a limited amount of data.

'We take just one row of data, a single example of behaviour, train a model on it, and incorporate a control parameter into it. Imagine it as a rotating switch that can be turned to observe different behaviours. After training, if we start adjusting the switch—i.e., changing this parameter—we will observe that the model reproduces various types of behaviour that are characteristic of the original system,' explains Pavel Kuptsov.
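
In terms of the earlier sketch, 'turning the switch' would mean iterating the trained map at different fixed values of the control parameter and decoding the resulting latent trajectories back into voltage-like traces. The snippet below continues the assumed names (step, vae, mu) from the previous example and is again only illustrative.

```python
# Sweep the control parameter of the trained map and generate latent trajectories.
# `step`, `vae`, and `mu` are the (assumed) objects from the sketch above.
import torch

def rollout(step, mu0, control_value, n_steps=500):
    """Iterate the learned map from an initial latent state at a fixed parameter value."""
    control = torch.tensor([[control_value]])
    mu, trajectory = mu0, []
    for _ in range(n_steps):
        mu = step(mu, control)
        trajectory.append(mu.detach())
    return torch.cat(trajectory)          # (n_steps, LATENT) latent trajectory

# Different switch positions may yield different dynamical regimes
# (e.g. regular spiking vs. bursting), decoded back to voltage windows.
for value in (0.1, 0.5, 0.9):
    traj = rollout(step, mu, value)
    voltage_like = vae.decoder(traj)
```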

During the simulation, the neural network not only replicated the system modes it was trained on but also identified new ones. One of these involves the transition from a series of frequent pulses to single bursts. Such transitions occur when the parameters change, yet the neural network detected them independently, without having seen such examples in the data it was trained on. This means that the neural network does not just memorise examples; it actually recognises hidden patterns.

'It is important that the neural network can identify new patterns in the data,' says Natalya Stankevich, Leading Research Fellow at the Faculty of Informatics, Mathematics, and Computer Science of HSE University in Nizhny Novgorod. 'It identifies connections that are not explicitly represented in the training sample and draws conclusions about the system's behaviour under new conditions.'

The neural network is currently operating on computer-generated data. In the future, the researchers plan to apply it to real experimental data. This opens up opportunities for studying complex dynamic processes where it is impossible to anticipate all potential scenarios in advance.

The study was carried out as part of HSE University's Mirror Laboratories project and supported by a grant from the Russian Science Foundation.
