
Artificial Neural Networks (ANN)

October 09, 2024

Why in News?

The 2024 Nobel Prize in Physics has been awarded to John Hopfield and Geoffrey Hinton for foundational discoveries and inventions that enable machine learning with artificial neural networks.

What are Artificial Neural Networks (ANN)?

John Hopfield created an associative memory that can store and reconstruct images and other types of patterns in data.

Geoffrey Hinton invented a method that can autonomously find properties in data, and so perform tasks such as identifying specific elements in pictures.

  • Artificial Intelligence – It is the ability of machines to perform cognitive functions such as learning, analysing, interpreting and making decisions, akin to human intelligence, through artificial neural networks.
    • For example, interpreting a picture to identify the objects in it.
  • Artificial Neural Networks – It is a network of connected nodes similar in structure to the brain.


  • Each node is a site where input data is processed according to fixed rules to produce an output, and the connections between nodes allow them to pass input and output signals to each other.
  • Function - These nodes influence each other through connections that can be likened to synapses in the brain, and these connections can be made stronger or weaker.

A synapse is the site of transmission of electric nerve impulses between two nerve cells (neurons), or between a neuron and a gland or muscle cell (effector).


  • Training - The network is trained by developing stronger connections between nodes that have simultaneously high values, as sketched below.
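The node and training ideas above can be illustrated with a minimal Python sketch. The weights, inputs, threshold and learning rate below are hypothetical values chosen for illustration, not part of the laureates' work:

```python
import numpy as np

# A node combines its inputs through weighted connections ("synapses")
# and applies a fixed rule (here, a step threshold) to produce an output.
def node_output(inputs, weights, threshold=0.5):
    return 1 if np.dot(inputs, weights) > threshold else 0

# Hebbian-style training: connections between nodes that are active
# at the same time are made stronger; others are left unchanged.
def hebbian_update(weights, pre_activity, post_activity, rate=0.1):
    return weights + rate * pre_activity * post_activity

# Hypothetical example: two input nodes feeding one output node.
weights = np.array([0.2, 0.4])
inputs = np.array([1, 1])          # both input nodes are active

out = node_output(inputs, weights)
weights = hebbian_update(weights, inputs, out)
print(out, weights)                # connections strengthen when nodes fire together
```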

What is Hopfield Network?

  • Hopfield Network – An artificial neural network can be constructed as a Hopfield network, a type of recurrent neural network in which neurons learn and process information through Hebbian learning.

Hebbian learning is an idea in neuropsychology that if one neuron repeatedly triggers a second, the connection between the two becomes stronger.

  • Hopfield’s mapping allowed researchers to translate ideas from statistical physics, neuropsychology and biology into a model of cognition.
  • The Hopfield network draws on the atomic spin behaviour of magnetic materials to store patterns and to recreate them.

Atomic Spin

  • It is a special characteristic of magnetic materials.
  • The atomic spin property makes each atom a tiny magnet.
  • The spins of neighbouring atoms affect each other.
  • This causes domains with spins in the same direction to form within the material, as the sketch below illustrates.
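A minimal sketch of the spin analogy, assuming a toy one-dimensional chain of spins with a made-up coupling strength: pairs of neighbouring spins that point the same way lower the total energy, which is why aligned domains form.

```python
import numpy as np

def spin_energy(spins, coupling=1.0):
    """Energy of a 1-D chain of spins (+1 or -1).

    Each neighbouring pair pointing the same way contributes -coupling
    (lower energy); opposite neighbours contribute +coupling.
    """
    spins = np.asarray(spins)
    return -coupling * np.sum(spins[:-1] * spins[1:])

aligned = [+1, +1, +1, +1]    # one domain, all spins in the same direction
mixed   = [+1, -1, +1, -1]    # alternating spins

print(spin_energy(aligned))   # -3.0 : lower energy, preferred
print(spin_energy(mixed))     #  3.0 : higher energy
```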


  • When the network is given an incomplete or slightly distorted pattern, the method can find the stored pattern that is most similar.
  • When the network is ‘taught’ an image, it stores the visual in a ‘low-energy state’ created by adjusting the strengths of the nodes’ connections.
  • When the network encounters a noisy version of the image, it produces the denoised version by progressively moving it to the same low-energy state.
  • Training - It is the process of updating the artificial network using Donald Hebb’s hypothesis as one of its basic rules.

Donald Hebb’s hypothesis holds that learning occurs because connections between neurons are reinforced when they work together.

  • Associative Memory - It is a special type of memory that stores sets of patterns as memories for performing searches through data.
  • When the associative memory is presented with a key pattern, it responds by producing the stored pattern that most closely resembles or relates to the key pattern, as the sketch below illustrates.
  • If a network is exposed to many texts, one set in English and the other their Tamil translations, it could use Hebbian learning to conclude that “hand” and “kai” are synonymous because they appear together most often.
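A minimal sketch of the store-and-recall behaviour described above, assuming a small binary Hopfield network that stores a pattern with the Hebbian (outer-product) rule and recalls it from a noisy version; the pattern values are made up for illustration:

```python
import numpy as np

class Hopfield:
    def __init__(self, n):
        self.w = np.zeros((n, n))

    def store(self, patterns):
        # Hebbian rule: connections between nodes that are active together
        # (same sign) are strengthened, lowering the energy of stored patterns.
        for p in patterns:
            p = np.asarray(p)
            self.w += np.outer(p, p)
        np.fill_diagonal(self.w, 0)

    def recall(self, pattern, steps=10):
        # Repeatedly update each node from its weighted inputs; the state
        # slides "downhill" in energy towards the closest stored pattern.
        s = np.array(pattern, dtype=float)
        for _ in range(steps):
            s = np.sign(self.w @ s)
            s[s == 0] = 1
        return s

# Store one pattern, then present a noisy version of it.
stored = [1, -1, 1, -1, 1, -1]
noisy  = [1, -1, 1,  1, 1, -1]   # one element flipped

net = Hopfield(6)
net.store([stored])
print(net.recall(noisy))          # recovers the stored pattern
```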

What is a Boltzmann Machine?

  • Boltzmann Machine - It was one of the first simple deep-learning machines, developed using a technique from statistical physics.
  • It is applied in the context of cognitive science to perform cognitive tasks, building on the principles of the Hopfield network.


  • Probability - Boltzmann’s equation assigns each state of a system a probability based on its energy, so some states are more probable than others because the system’s energy favours them.
  • Hinton developed an ANN with a tendency to move towards some outcomes over others by using Boltzmann’s equation to process its inputs.
  • Hidden & Visible Nodes - Their network had a set of visible nodes, which could input and output information, and a set of hidden nodes that only interacted with other nodes.
  • Dawn of Generative AI - The visible nodes worked like a Hopfield network whereas the hidden nodes modelled new possibilities using Boltzmann’s equation.
  • Restricted Boltzmann Machines - In this machine, hidden nodes are connected only to visible nodes, and vice versa, which enabled more efficient learning (see the sketch below).
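A minimal sketch of the probability idea, assuming a tiny machine with two visible nodes and one hidden node and made-up weights: every joint state gets a probability proportional to exp(-energy), so low-energy states are the ones the machine tends to settle into.

```python
import itertools
import numpy as np

# Hypothetical weights between 2 visible nodes (v0, v1) and 1 hidden node (h).
# In a restricted Boltzmann machine, connections run only between the
# visible and the hidden layer, never within a layer.
W = np.array([[ 1.5],    # v0 -- h
              [-1.0]])   # v1 -- h

def energy(v, h):
    return -float(np.asarray(v) @ W @ np.asarray(h))

# Boltzmann's rule: p(state) is proportional to exp(-energy(state)).
states = [(v, h) for v in itertools.product([0, 1], repeat=2)
                 for h in itertools.product([0, 1], repeat=1)]
raw = np.array([np.exp(-energy(v, h)) for v, h in states])
probs = raw / raw.sum()

for (v, h), p in zip(states, probs):
    print(f"visible={v} hidden={h}  p={p:.3f}")
# The lowest-energy state, v=(1, 0) with h=(1,), gets the highest probability.
```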

What are the recent developments?

  • Transformer – It is a two-part neural network that encodes and then decodes information, with valuable applications in object (including facial) detection and recognition.
  • Backpropagation – It is a technique that allows ANNs to update themselves as they learn by propagating output errors backwards through the network to adjust the connection weights (see the sketch below).
  • Long short-term Memory – It enables ANNs to retain some information across many steps of a sequence.
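A minimal sketch of backpropagation, assuming a tiny two-layer network trained on the XOR problem with made-up data and learning rate; the output error is passed backwards through the chain rule to update every weight and bias:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up toy task: learn XOR with a 2-4-1 network (illustration only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)           # hidden-layer activations
    out = sigmoid(h @ W2 + b2)         # network output

    # Backward pass: the output error is propagated back through the
    # chain rule to obtain a gradient for every weight and bias.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates
    W2 -= 0.5 * h.T @ d_out
    b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h
    b1 -= 0.5 * d_h.sum(axis=0)

# Predictions approach [[0], [1], [1], [0]]; exact values depend on initialisation.
print(out.round(2))
```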

Reference

  1. The Hindu | Artificial Neural Networks
  2. Nobel Prize | The Nobel Prize in Physics 2024