Network of artificial neurons learns to use language

By Hugo Angel,


A network of artificial neurons has learned how to use language.
Researchers from the universities of Sassari and Plymouth found that their cognitive model, made up of two million interconnected artificial neurons, was able to learn to use language without any prior knowledge.
The model is called the Artificial Neural Network with Adaptive Behaviour Exploited for Language Learning — or the slightly catchier Annabell for short. Researchers hope Annabell will help shed light on the cognitive processes that underpin language development. 
Annabell has no pre-coded knowledge of language, and learned through communication with a human interlocutor. 
“The system is capable of learning to communicate through natural language starting from a tabula rasa, without any prior knowledge of the structure of phrases, meaning of words [or] role of the different classes of words, and only by interacting with a human through a text-based interface,” researchers said.
“It is also able to learn nouns, verbs, adjectives, pronouns and other word classes and to use them in expressive language.”
Annabell was able to learn due to two functional mechanisms — synaptic plasticity and neural gating, both of which are present in the human brain.

  • Synaptic plasticity: the brain’s ability to increase the efficiency of a connection when the two neurons it links are activated simultaneously; it is linked to learning and memory.
  • Neural gating mechanisms: play an important role in the cortex by modulating neurons, behaving like ‘switches’ that turn particular behaviours on and off. When turned on, they transmit a signal; when off, they block it. Annabell is able to learn using these mechanisms because the flow of information fed into the system is controlled in different areas.
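The two mechanisms above can be sketched in a toy NumPy example. This is a loose illustration of the general principles only, not Annabell’s actual architecture; every name, size and value here is illustrative:

```python
import numpy as np

# Toy sketch of the two mechanisms described above (illustrative only,
# not Annabell's actual implementation).

def hebbian_update(w, pre, post, lr=0.1):
    """Synaptic plasticity (Hebbian rule): when a pre-synaptic and a
    post-synaptic unit are active at the same time, strengthen the
    weight that connects them."""
    return w + lr * np.outer(post, pre)

def gated_forward(x, w, gate):
    """Neural gating: a binary 'switch' that either transmits the
    signal between two areas (gate=1) or blocks it entirely (gate=0)."""
    return gate * (w @ x)

# Start from zero weights (a "tabula rasa" in miniature).
w = np.zeros((3, 4))
pre = np.array([1.0, 0.0, 1.0, 0.0])   # active input units
post = np.array([0.0, 1.0, 1.0])       # active output units

# Co-active unit pairs get a strengthened connection.
w = hebbian_update(w, pre, post)

open_signal = gated_forward(pre, w, gate=1)    # gate on: signal flows
closed_signal = gated_forward(pre, w, gate=0)  # gate off: signal blocked
```

With the gate open, the strengthened connections carry the input forward; with the gate closed, the same input produces no output at all, which is how gating can route information through some areas of the system and not others.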
“The results show that, compared to previous cognitive neural models of language, the Annabell model is able to develop a broad range of functionalities, starting from a tabula rasa condition,” researchers said in their conclusion.
The current version of the system sets the scene for subsequent experiments on the fluidity of the brain and its robustness. It could lead to the extension of the model for handling the developmental stages in the grounding and acquisition of language.
