The Evolution of Language and the Emergence of Natural Language Processing: Challenges and Applications of NLP

Transformers In Depth — Chapter 1: Introduction to Language Models Part I

Gabriel Furnieles
8 min read · Oct 16, 2023
Photo by SIMON LEE on Unsplash

In commencing this first chapter of the series, I wanted to first provide a brief historical overview of human language and its subsequent evolution in computing with Natural Language Processing (NLP). Embracing the past is a cornerstone for understanding why and how things have evolved, and it sets a context that allows us to learn from some of the brightest minds that have ever existed.

However, for those whose interests are firmly fixed on the specifics of NLP and are inclined to forgo the historical narrative, you can skip to Part II of this chapter Introduction to NLP. Text preprocessing, Tokenizers, Probabilistic Models, and Word embeddings, where I delve into the intricacies of the concepts and models that make up the broad landscape of this deep field.

If anyone has any comments, ideas, suggestions, or questions, please do not hesitate to leave a comment and it will be a pleasure to answer and take them into account to improve the reading and contents of the article. Thanks for reading! — The author

Contents

  1. A Brief History of Language
    Why more than 6,900 languages?
    Language in our Brains
  2. Natural Language Processing (NLP)
    Why NLP?
    A Historical Journey through NLP
    Challenges and Applications of NLP
  3. Summary

A Brief History of Language

A language is not just words. It’s a culture, a tradition, a unification of a community, a whole history that creates what a community is. It’s all embodied in a language. — Noam Chomsky

Language allows us to communicate our emotions, ideas, and desires, organize in society, imagine and create, and ultimately evolve as a species. Nowadays there exist more than 6,900 different languages [source], each mutually unintelligible with the others (i.e. it is impossible to understand Spanish by just knowing English). But when and how did language arise? Why do so many languages exist? And how did they evolve over time?

According to scientists [source], human language most likely emerged between 50,000 and 100,000 years ago, at the apogee of the Homo erectus species. Homo erectus is considered the differentiating link between primates and humans; they had greater cranial capacity than their predecessors and faster cognitive development, and they were the first to walk upright on two legs (bipedalism).

According to Michael Corballis, bipedalism freed the hands of Homo erectus and further exposed the face, allowing for richer and more complex body expressions (before Homo erectus, hominids are believed to have communicated through repetitive, automatic, and emotional gestures, much like other animal species). Later, when Homo erectus began to create the first tools, their hands were occupied again and communication shifted to the face and eventually the mouth.

Facial reconstruction of a Homo erectus woman, from John Gurche, in a display at the Smithsonian National Museum of Natural History, via Google Arts and Culture.

From that point, language kept evolving, becoming more complex and adapting to new necessities (cultural, emotional, technical, …). But what is even more important is the emergence of new forms of communication:

↪ First with cave art and religious icons in prehistory (64,000 years ago)
↪ Then by pictograms in ancient Mesopotamia (9000 B.C.)
↪ Followed by the first writing systems: Sumerian cuneiform and Egyptian hieroglyphs (3400 B.C.)
↪ Evolving in many different ways around the world during the following millennia (Latin, Vedic Sanskrit, Proto-Chinese, …)
↪ The first letter was sent by the Persian Queen Atossa (500 B.C.)
↪ Gutenberg created the printing press (1450), enabling the exchange of written information on a massive scale.
↪ Technology evolved in leaps and bounds and new communication devices were invented: the landline telephone (1876), the radio (1920), television and fax (1926), the internet with modern computers (1983), and finally mobile phones (1984)

Until finally reaching the present day, when we produce 328.77 million terabytes of data per day [source] in many different forms (messages, pictures, blogs, …).

Why more than 6,900 languages?

Back to Homo erectus: there is fossil evidence from 1.7 million years ago of the migratory journeys of this species. Homo erectus dispersed across Europe and Asia at a rapid pace, and with them their languages adapted to geography, culture, and religion.
Furthermore, language was also used as a security barrier and as the identifying mark of a community; only those who belonged to the group knew the language and could communicate.

Language in our Brains

In his talk, The Origins and Evolution of Language, Michael Corballis explains the fundamental difference between human and animal language:

  • Human language: Infinite capacity to say something different and to create meanings.
  • Animal language: Repetitive, automatic, and emotional with no option to create new meanings.

But at some point in history, this difference did not yet exist, as human language had not yet been invented.
Most theories suggest that the root of human language lies in primate mirror mechanisms. Looking at the human brain, neuroscientists found that the regions dedicated to language in humans evolved from the mirror mechanisms of primates [1][2][3]. Furthermore, many studies have shown that apes and humans can communicate by gestures and signs.

Figure 1. (Left) Location of brain regions dedicated to language in the human brain. (Right) Mirror mechanisms in apes’ brain. Source NTU.

In simple words, we learn language by imitation.*

*This statement will be more relevant in future chapters and sections

Natural Language Processing (NLP)

Natural language processing (NLP) is the discipline of building machines that can manipulate human language — or data that resembles human language — in the way that it is written, spoken, and organized. — DeepLearning AI

Why NLP?

One of computer science’s fundamental goals is to create machines that can communicate with humans seamlessly and intelligently.

As Alan Turing, father of computation and artificial intelligence, stated in his well-known Turing test (1950): intelligent behavior can be attributed to a machine when a human evaluator, holding a dialogue without knowing the identity of their conversational partner (whether human or machine), is unable to discern whether they are conversing with a machine or a fellow human being.

Thus began the race to build computers capable of dialoguing coherently with humans.

A Historical Journey through NLP

Figure 2. NLP timeline. Image by the author.

1950–1970: Symbolic NLP
Scientists begin to explore the possibilities of teaching languages by abstracting a set of predefined rules (rule-based systems).

  • The Georgetown experiment successfully translates more than 60 short sentences from Russian into English (1954)
  • Chomsky tries to encompass all language in a formal structure and builds his hierarchy of grammars (1956)
  • Computer programs like ELIZA (1966) or SHRDLU (1970) show promising results as the first chatbots.
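
To give a flavor of what these rule-based systems looked like, here is a minimal ELIZA-style responder sketched in Python. The patterns and replies are illustrative inventions, not Weizenbaum's original script:

```python
import re

# Illustrative pattern-response rules, in the spirit of ELIZA's scripts.
# Each rule maps a regex over the user's input to a templated reply.
RULES = [
    (re.compile(r"i need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (\w+)", re.I), "Tell me more about your {0}."),
]

def respond(text: str) -> str:
    """Return the reply of the first rule whose pattern matches."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(*match.groups())
    return "Please, go on."  # default when no rule fires

print(respond("I am worried about my exams"))
```

Systems like this have no understanding of meaning; they only reflect the user's words back through hand-written templates, which is precisely why the rule-based approach eventually hit a ceiling.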

1970–1980: AI winter
Due to a lack of funding and a loss of interest from researchers, AI research went through a difficult period in which the rate of progress was drastically reduced.

1980–2010: Statistical and Machine Learning NLP
Statistical methods start gaining momentum among researchers and new innovative approaches are used:

  • IBM Research achieves great results in machine translation with the IBM alignment models (1990)
  • Hidden Markov Models (HMMs) show potential when processing short sentences (B. H. Juang & L. R. Rabiner, 1991)
  • Previous work during the 70s led to Microsoft's Autocorrect algorithm (1994), the precursor of the automatic proofreading algorithms we use every day.
  • Some years later, Microsoft announces its Autocomplete algorithm (2002), present in every smartphone and search engine nowadays.
  • Google launches Google Translate in 2006, the statistical machine translation service that revolutionized the world.
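
The core idea behind these statistical approaches is to count how often things co-occur in data rather than hand-write rules. As a toy illustration (the corpus is made up, and real systems train on millions of sentences with smoothing and longer contexts), here is a bigram model for autocomplete-style next-word prediction:

```python
from collections import Counter, defaultdict

# Tiny toy corpus; a real model would be trained on a huge text collection.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count bigram frequencies: how often each word follows another.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def autocomplete(word: str) -> str:
    """Suggest the word most frequently observed after `word`."""
    if word not in bigrams:
        return ""
    return bigrams[word].most_common(1)[0][0]

print(autocomplete("the"))  # "cat" follows "the" more often than any other word
```

The same counting principle, generalized to probabilities over word sequences, underlies n-gram language models, HMM taggers, and early statistical machine translation.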

2000-today: Neural NLP

Challenges and Applications of NLP

As a field in constant growth and evolution, NLP faces many open challenges and supports a wide range of applications. Some of these applications are:

Figure 3. NLP applications and challenges (some of them). Image by the author
  • Sentiment analysis
    Classify whether a text has positive, negative or neutral meaning
  • Machine Translation
    Translate texts from one language A to another language B
  • Autocorrect
    Automatically correct misspelled words
  • Part Of Speech Tagging
    Classify words in a text into part-of-speech classes (nouns, verbs, adjectives, …)
  • Autocomplete
    Automatically complete words/parts of a text
  • Question Answering
    Given a question respond with the correct answer
  • Text generation
    Generate text the same way a human would

However, because text is sequential data (i.e. a sequence of sentences, words, subwords, characters…) most of the advances in NLP also have repercussions in other disciplines such as biology and medicine (gene sequencing), chemistry (molecular structures) or economics (time series forecasting).

Summary

A summary of the main takeaways of this article is presented:

  • Human language emerged between 50,000 and 100,000 years ago and evolved from gestures to verbal communication and on to many other forms, such as art and text, using the technology available at the time.
  • The human brain regions in charge of language evolved from the mirror mechanisms of primates. We learn language by imitation!
  • Natural language processing (NLP) is the discipline of building machines that can manipulate human language — or data that resembles human language — in the way that it is written, spoken, and organized. — DeepLearning AI
  • NLP was born from the desire of scientists to build intelligent machines capable of communicating with humans and its history up to the present day goes through 4 phases:
    - 1950–1970: Symbolic NLP
    - 1970–1980: AI winter
    - 1980–2010: Statistical and Machine Learning NLP
    - 2000–today: Neural NLP
  • NLP applications range from text processing and linguistics to many other different areas of science. Because text data can be seen as sequential data, NLP advancements can be also applied in Biology, Chemistry, or Economics among others.

Hi, I’m Gabriel Furnieles, a Mathematical Engineering student specializing in Artificial Intelligence. I write about AI and Data Science topics. I hope you enjoyed the article and found it helpful, if so, please consider following me Gabriel Furnieles and subscribing to my newsletter so stories will be sent directly to you 👇
