

The influence of human anatomy in the history of Artificial Intelligence

Written by Alana Team
on October 21, 2020

Artificial intelligence is still a puzzle for some people and businesses. By learning more about its history and evolution, and understanding how studies of human anatomy and biology have helped shape the technology, we can see that it can be an ally of humans in many respects.


In the second season of the Inside Alana Podcast, we discussed how AI, working alongside humans, is changing the experience of consumers and companies, and why it is necessary to understand this constant technological evolution and its impact on different sectors (health, the economy, education, and so on).

Next, learn:

  1. The current scenario of AI
  2. The evolutionary history of artificial intelligence
  3. The influence of human anatomy on technology
  4. How humans and machines think

The current scenario of AI

The main fear about artificial intelligence is that it will replace humans and that humanoid robots will take over the world. However, this imagery is far from reality and exists only in films and series about AI.

Although AI advances with every decade, and machines can understand human communication to a certain level and perform advanced tasks, a fully independent machine (what the field calls a Strong AI) may never be developed.


Growth of companies in the sector

The last decade has seen an increase in the number of companies competing for space in the artificial intelligence market, reflecting the high demand for solutions and creating a need to train professionals and build qualified teams.

Also, large companies like Google, Amazon, and Apple have invested millions of dollars in developing proprietary technology to be used in their digital ecosystems.

According to Gartner, the number of companies implementing artificial intelligence grew by 270% worldwide between 2015 and 2019.

Countries that invest in AI

Leadership in the development of artificial intelligence remains with the United States and China, countries with a large volume of research into scalable technologies and sophisticated algorithms.

Other countries, such as France, the United Kingdom, and Canada, have joined an initiative called The Global Partnership on AI, which aims to establish artificial intelligence guidelines that respect the following pillars:

  • Human rights
  • Inclusion
  • Diversity
  • Innovation
  • Economic growth

According to Dr. Alexandre Chiavegatto, a guest on the Inside Alana Podcast episode The role of Artificial Intelligence in the fight against Covid-19, the race for artificial intelligence is much bigger than the space race.

A brief history of artificial intelligence

Artificial intelligence is not a product of the evolution of the internet, but it has been enhanced by this phenomenon, and also by Big Data.

AI was already being imagined in antiquity: around 390 BC, Socrates asked whether an "external artifact" could classify human behaviors.

Socrates' premise was that humans themselves act based on predefined patterns, just like machines, so the "artifact" could identify the patterns and classify them.

World War II

In the 1940s, the mathematician Alan Turing was part of the team responsible for building a decoder to handle messages encrypted by the German Enigma machine.

Although the decoder did not use artificial intelligence, that work brought Turing closer to the study of computing, which culminated in the creation of the Turing Test in the 1950s.

The 1950s and 1960s

In the early 1950s, Turing proposed a test, now famous as the Turing Test, to verify whether a machine can simulate human thinking. At the time, the field of study still had no name; in 1956, another renowned scientist, John McCarthy, suggested the name Artificial Intelligence.

In the 1960s, the chatbot ELIZA marked a major advance for artificial intelligence, as it was an early chatbot built on natural language processing (NLP) technology.

Contemporary AI

Artificial intelligence as we know it today is the result of growth in data mining, web applications, and natural language processing during the 1990s and early 2000s.

Over the past decade, there has been a significant increase in conversational products driven by artificial intelligence, such as the personal assistants Siri, Google Assistant, and Alexa.

How artificial intelligence was inspired by human anatomy

Studies in the field of biology, specifically in neuroanatomy, influence the way artificial intelligence researchers design artificial neural networks, which are used in the machine learning process.

Artificial neural networks are representations of biological neural networks and seek to replicate the way neurons act, that is, how they process and transmit information. Therefore, understanding the anatomy of the human brain is essential to creating something that resembles the natural process.

This concept is the basis of deep learning, which stacks several layers of artificial neural networks to train a given machine.
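To make the analogy concrete, here is a minimal, purely illustrative sketch in Python (using NumPy; the layer sizes and random weights are arbitrary assumptions made only for the example) of how artificial "neurons" receive signals, combine them, and pass the result on to the next layer:

```python
import numpy as np

def relu(x):
    # Simple activation: a neuron only "fires" for positive input
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)

# Toy network: 3 input signals -> 4 hidden neurons -> 1 output neuron.
# Each weight plays the role of a synapse strength between two neurons.
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def forward(signals):
    hidden = relu(signals @ W1 + b1)  # each hidden neuron sums its weighted inputs
    return hidden @ W2 + b2           # the output neuron combines the hidden activations

print(forward(np.array([0.2, -1.0, 0.5])))
```

In a real deep learning model, many such layers are stacked on top of each other, and the weights are adjusted during training rather than left at random values.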

Charles Darwin's theory

Another inspiration for the AI field is Darwin's Theory of Evolution, which takes natural selection as its evolutionary principle. Based on this premise, computer scientists have developed so-called genetic algorithms, which are capable of evolving.

The evolutionary process of genetic algorithms is simple and follows the logic of natural selection: only the fittest candidate solutions within a given population survive and pass their "genes" to the next generation.

This type of algorithm is also known as an evolutionary algorithm, since it improves over successive generations.
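As a hedged illustration of this logic (the bit-string problem, population size, and mutation rate below are arbitrary choices made only for the example), a toy genetic algorithm in Python could look like this:

```python
import random

GENES = 20           # length of each candidate "chromosome"
POPULATION = 30      # number of candidates per generation
GENERATIONS = 50
MUTATION_RATE = 0.02

def fitness(individual):
    # Toy fitness: the more 1-bits a candidate carries, the "fitter" it is
    return sum(individual)

def crossover(parent_a, parent_b):
    # A child inherits the first part of one parent and the rest of the other
    cut = random.randint(1, GENES - 1)
    return parent_a[:cut] + parent_b[cut:]

def mutate(individual):
    # Occasionally flip a bit, introducing variation into the population
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in individual]

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POPULATION)]

for _ in range(GENERATIONS):
    # Natural selection: only the fittest half survives to reproduce
    population.sort(key=fitness, reverse=True)
    survivors = population[: POPULATION // 2]
    children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                for _ in range(POPULATION - len(survivors))]
    population = survivors + children

print("Best fitness after evolution:", max(fitness(ind) for ind in population))
```

Each generation, the fitter candidates survive and recombine, so the population as a whole "evolves" toward better solutions.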

 

Human thinking and machine thinking

Although some algorithms can replicate certain processes of the human brain, machines do not have the same thinking capacity that we do. Their way of learning is different, and so is their memorization process.

Moreover, only human beings are conscious and able to understand the context they are in. A machine may "think" based on the data available to it, but it cannot bring feelings into a thought.

Human memory

Humans acquire information and store it in the following forms:

  • Procedural memory: stores data from repeated patterns, such as learning motor skills
  • Declarative memory: linked to episodic memory, covering facts that were lived, seen, or read

Human memory can also be classified by how long information is stored, as in the well-known short-term and long-term memory.

Machine memory

A machine's memory is directly linked to its algorithms, such as LSTM (Long Short-Term Memory), an architecture designed specifically to handle short- and long-term memory.

LSTM is a recurrent artificial neural network architecture, meaning it can retain values over arbitrary time intervals. This type of algorithm is well suited to classifying, processing, and predicting time series whose time intervals have unknown duration.

This architecture is also widely used in:

  • Language modeling applications
  • Language translation
  • Text generation and chatbots
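To make this concrete, here is a minimal sketch of an LSTM used to predict the next value of a sequence. PyTorch is assumed here only as a convenient deep learning library, and the layer sizes and random data are placeholders, not part of any specific product:

```python
import torch
import torch.nn as nn

class NextValuePredictor(nn.Module):
    """Toy model: read a sequence of values and predict the next one."""

    def __init__(self, input_size=1, hidden_size=32):
        super().__init__()
        # The LSTM keeps a cell state (its "long-term memory")
        # and a hidden state (its "short-term memory").
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x has shape (batch, sequence_length, input_size)
        outputs, _ = self.lstm(x)
        # Use the hidden state at the last time step to make the prediction
        return self.head(outputs[:, -1, :])

model = NextValuePredictor()
toy_batch = torch.randn(8, 20, 1)   # 8 random sequences of length 20
print(model(toy_batch).shape)       # torch.Size([8, 1])
```

The cell state acts as the network's long-term memory, while the hidden state plays the role of short-term memory, which is what makes the architecture suitable for sequences with gaps of unknown duration.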
