Artificial neuron and its model
In this study, published in the journal Science on August 6, researchers from the French National Center for Scientific Research (CNRS) in Paris built a computer model of artificial neurons.
Like neurons in the brain, the model transmits information with electrical signals: the researchers moved ions across thin layers of water to mimic real ion channels.
In this way, the artificial neurons can produce the same kind of electrical discharges as neurons in the brain.
Scientists reason that if a computer could work more like a human brain, its energy consumption would drop dramatically. One way to reproduce the brain's biological mechanism is to generate electrical signals with ions, just as the brain does.
At a more detailed level, the researchers built a system that simulates the generation of action potentials.
Before a neuron fires an action potential, its cell membrane sits at a resting potential. When a stimulus causes extracellular cations to flow in, the membrane depolarizes.
Once depolarization reaches a threshold, voltage-gated ion channels open, letting more cations into the cell until the membrane potential peaks; the membrane then repolarizes, and the whole cycle takes a few milliseconds.
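The depolarize-to-threshold-then-repolarize cycle described above can be sketched with a minimal leaky integrate-and-fire model. This is a standard textbook simplification, not the researchers' own model; all parameter values here are illustrative.

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential decays
# toward rest, input current depolarizes it, and crossing the threshold
# triggers a spike followed by a reset (repolarization).
def simulate_lif(current, v_rest=-70.0, v_thresh=-55.0, v_reset=-75.0,
                 tau=10.0, dt=1.0):
    v = v_rest
    spikes = []
    for t, i in enumerate(current):
        v += dt * ((v_rest - v) / tau + i)   # leak pulls v to rest, input drives it up
        if v >= v_thresh:                    # threshold reached: fire
            spikes.append(t)
            v = v_reset                      # repolarize below resting potential
    return spikes

# A sustained input drives repeated firing; no input means no spikes.
print(simulate_lif([2.0] * 50))   # -> [13, 29, 45]
print(simulate_lif([0.0] * 50))   # -> []
```

With zero input the potential never leaves rest, while a constant drive produces a regular spike train, mirroring the stimulus-dependent firing described above.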
To simulate voltage-gated ion channels, the researchers modeled a thin layer of water confined between two sheets of graphene.
In the simulation, they modeled water layers one, two, and three water molecules thick, characterizing each as a quasi-two-dimensional slit.
In computer simulations of this model, the researchers found that when a stronger electric field is applied, the ion structures that form in the slit break up slowly enough to leave behind a trace of memory.
In real neurons, the action potential serves as a form of cellular memory: our brains create such memories by opening and closing ion channels.
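This kind of history-dependent behavior, where the device's present conductance depends on what has flowed through it before, is the defining property of a memristor. A minimal sketch of an idealized memristor follows; the linear-drift rule and all constants are illustrative assumptions, not the ionic device from the study.

```python
# Idealized memristor: conductance g drifts with the charge that has
# passed through the device, so past voltage pulses leave a "memory".
class Memristor:
    def __init__(self, g=1.0, g_min=0.1, g_max=10.0, k=0.5):
        self.g, self.g_min, self.g_max, self.k = g, g_min, g_max, k

    def apply(self, voltage, dt=1.0):
        current = self.g * voltage           # Ohm's law at this instant
        self.g += self.k * current * dt      # state drifts with charge q = I * dt
        self.g = max(self.g_min, min(self.g_max, self.g))  # physical bounds
        return current

m = Memristor()
print(m.apply(1.0))   # -> 1.0  (first pulse)
print(m.apply(1.0))   # -> 1.5  (same voltage now draws more current)
```

Because the second identical pulse draws a different current than the first, the device "remembers" its history, which is the property the ionic water-layer model exhibits.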
The model is based on the memristor
According to earlier research, some scientists estimate the storage capacity of the human brain at about 1 TB, while others put it closer to 100 TB.
Although the human brain is not the largest in nature, it is the most developed: among all mammals, the human brain accounts for the largest proportion of body mass.
Although the human brain makes up only about 2% of body weight, it consumes about 20% of the body's energy.
The number of neurons corresponds roughly to the capacity of the brain. The human brain contains about 86 billion nerve cells, of which the cerebral cortex accounts for about 14 billion.
These neurons are like the logic gates in a computer: basic logic units, wired together in complex, large-scale series and parallel arrangements. Together with glial cells, they form the complex central nervous system of the human brain.
Research suggests that each neuron forms about 1,000 synapses; multiplying 86 billion neurons by 1,000 synapses gives a rough potential storage capacity for the human brain of about 86 TB.
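The estimate above is a back-of-the-envelope multiplication. The one-byte-per-synapse figure is an assumption chosen to reproduce the article's 86 TB number, not a measured quantity:

```python
neurons = 86e9            # ~86 billion neurons in the human brain
synapses_per_neuron = 1000  # ~1,000 synapses per neuron (rough figure)
bytes_per_synapse = 1       # assumption: one byte of storage per synapse

total_bytes = neurons * synapses_per_neuron * bytes_per_synapse
print(total_bytes / 1e12)   # -> 86.0 (terabytes)
```

Changing the assumed storage per synapse scales the result directly, which is one reason published estimates of brain capacity vary so widely.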
An artificial neural network works on a similar model: it assigns different weights to different inputs and performs fuzzy computation, simulating the human brain's ability to process nonlinear, imprecise data.
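The weighted-input idea can be shown in a few lines: a single artificial neuron computes a weighted sum of its inputs and passes it through a nonlinearity. This is the textbook formulation, not tied to any specific study, and the weights here are arbitrary examples.

```python
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs, squashed by a sigmoid nonlinearity."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))   # output in (0, 1)

# The same inputs produce different outputs under different weights,
# which is how a network encodes what it has learned.
print(neuron([1.0, 0.5], [2.0, -1.0], 0.0))   # strong positive drive, output near 1
print(neuron([1.0, 0.5], [-2.0, 1.0], 0.0))   # strong negative drive, output near 0
```

Training a network amounts to adjusting those weights so the outputs match the desired behavior.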
Make more accurate predictions with fewer neurons
In reservoir computing, researchers feed data from a dynamical system into a "reservoir" of randomly connected artificial neurons in the neural network.
The larger and more complex the system, the more accurate the prediction can be, but also the larger the required network, and the more computing resources and time the task consumes.
The reservoir of artificial neurons is a black box: scientists do not know exactly what happens inside it, only that it works.
The artificial neural network at the core of reservoir computing is grounded in mathematics. Researchers found that the entire reservoir computing system can be greatly simplified, significantly reducing its demand for computing resources and saving a great deal of time.
The next-generation reservoir computing technique the researchers propose significantly outperforms the current state of the art: in relatively simple simulations on a desktop computer, the new system ran 33 to 163 times faster than existing models.
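The "black box" reservoir idea can be sketched as an echo state network: a fixed, random, untrained recurrent network whose states are read out by a single trained linear layer. This is a generic textbook formulation of reservoir computing, not the researchers' next-generation variant (which replaces the random reservoir with simpler features of past inputs); the sine-wave task and all sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed, random, untrained reservoir: n recurrent units driven by the input.
n, steps = 200, 500
W_in = rng.uniform(-0.5, 0.5, size=n)
W = rng.uniform(-0.5, 0.5, size=(n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

u = np.sin(0.1 * np.arange(steps + 1))      # toy input signal to predict
x = np.zeros(n)
states = []
for t in range(steps):
    x = np.tanh(W @ x + W_in * u[t])        # reservoir update (never trained)
    states.append(x.copy())
X = np.array(states)

# Train ONLY the linear readout (ridge regression) to predict u one step ahead.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n), X.T @ u[1:steps + 1])
pred = X @ W_out
print(float(np.mean((pred - u[1:steps + 1]) ** 2)))  # one-step prediction error
```

Because only the readout is fit, training reduces to one linear solve, which is why reservoir computing is so much cheaper than training a full recurrent network, and why shrinking or simplifying the reservoir pays off so directly.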
Realizing artificial intelligence in the true sense
Traditional electronic components consume a great deal of energy, whereas the human brain, for all its complexity, consumes very little.
The difference lies in how neurons communicate: they use electrical currents, but the charge carriers are ions rather than electrons.
By letting electronic components transmit signals with ions, the newly developed artificial neurons can communicate with human brain neurons through the same medium.
In general terms, this means the human brain could achieve two-way communication with computers.
Today, intelligent algorithms can recognize faces and even drive autonomous cars, and these capabilities rest mainly on deep learning, which is loosely modeled on the structure of the human brain.
Deep networks are composed of many artificial nerve cells, connected by artificial synapses through which they transmit signals to one another.
Real neurons have enough computational "depth" of their own to support learning that is more complex, more efficient, and closer to how the human brain learns.
In other words, such an artificial network would need fewer examples to learn to recognize a cat, and could internalize the meaning of language.
Such a system could not only vary the behavior of individual neurons within each artificial network, but also combine the characteristics of different neuron types, just as the human brain does.
The ultimate goal is to build an electronic replica of the brain that imitates the functions, capabilities, and diversity of the human brain, realizing artificial intelligence in the true sense.
If machine learning methods that faithfully imitate the brain are applied to computer models, developing those models will in turn deepen our understanding of the brain itself.
Scientists have found a way to imitate our own learning ability with artificial networks, and that in turn allows us to better understand the brain, and ourselves.
Data references: "Artificial Neurons Go Further, Memory Storage Has Become a Reality"; Lei: "Can Artificial Neural Networks Match the 'Depth' of Biological Neurons? 5 to 8 Layers May Not Be the Limit"; Heart of Machine Pro: "New Breakthrough in Reservoir Computing: Fewer Neurons, Up to a Million-Fold Increase in Computing Speed"; Science and Technology Talk: "Who Is Stronger Than the 'Artificial Neurons' in China?"