
NLP Chatbots: An Overview of Natural Language Processing in Chatbot Technology


Every organization, in every domain, wants users to feel comfortable enough to reach out to its team for all kinds of reasons. Enterprises are therefore looking for and implementing AI solutions that let users express themselves seamlessly. Integrating chatbots into the website – the first point of contact between the user and the product – has undoubtedly made a mark in this journey. Natural Language Processing (NLP)-based chatbots, the latest, state-of-the-art version of these tools, have taken the game to the next level.

In this blog post, we will explore the fascinating world of NLP chatbots and take a look at how exactly they work under the hood.

What is a Chatbot?

Before diving into natural language processing chatbots, let’s briefly examine how the previous generation of chatbots worked, and also take a look at how they have evolved over time.

A chatbot is a tool that allows users to interact with a company and receive immediate responses. It eliminates the need for a human team member to sit in front of their machine and respond to everyone individually.

The earliest chatbots were rule-based. All they did was answer a fixed set of questions whose answers were written manually into the code as a series of if-else statements. Technically, they used pattern-matching algorithms to match the user's sentence against a set of predefined questions and replied with the corresponding predefined answer – the predefined texts were essentially FAQs.

There are a couple of issues with this model:

  • It failed to understand the sentiment behind the user's words. It could not tell whether the user was satisfied or disappointed – and sometimes users became even more disappointed after reading its responses. It also failed to recognize when to notify a human to take over, or when to end a conversation.
  • It failed to answer queries whose wording didn't match the predefined patterns, and sometimes it matched a query to the wrong question and returned an entirely different response.

A Brief History of Chatbots

Before NLP existed, there was a classic research anecdote in which scientists tried to translate between Russian and English. Given the English sentence "The spirit is willing, but the flesh is weak," a system that translated it into Russian and then back into English reportedly returned "The vodka is good, but the meat is rotten." Thankfully, we've been able to improve these systems exponentially since then.

The timeline of chatbot development can be traced back to the early days of rule-based chatbots:

  • 1966 - Eliza: Developed at MIT, Eliza was the very first chatbot, and a completely rule-based model. It gave users a false sense that the program understood them, even though neither side actually comprehended what was being said – a phenomenon later termed the "Eliza effect."
The original chatbot, Eliza.
The Eliza Effect refers to the phenomenon where people tend to attribute human-like qualities or intentions to AI systems, even when they are simply responding based on pre-programmed rules.
  • 2000 - ALICE: The Artificial Linguistic Internet Computer Entity, or ALICE, was inspired by Eliza. Interaction with ALICE worked by applying heuristic pattern-matching rules to the user's input. ALICE serves as a foundation for many modern-day NLP chatbots.
The Alice Chatbot.
  • 2010 - Personal/home assistants: Apple introduced Siri, a personal assistant available across its devices. It was the first mainstream natural-language assistant driven by human voice input, and the first NLP chatbot to come built into the product itself.
    Around the same time, smart speakers began entering the market, and products like Amazon's Alexa and Google Home were adopted as home assistants.
  • 2016 - Modern-day NLP chatbots: Around this year came the "AI boom," when companies started integrating AI – including NLP chatbots – into their products. These chatbots have since been utilized in a wide range of domains, from package tracking to monitoring transaction status.

What is an NLP Chatbot? How Does It Work?

A Natural Language Processing chatbot is an AI-powered chatbot that simulates human-to-human conversation so convincingly that often the only giveaway that a bot is at the other end is its lightning-fast responses – no human could ever reply at that speed.

Let’s look at how exactly these NLP chatbots work under the hood through a simple example.

Say you’ve had a great year so far (I hope you have) and want to celebrate 🎅🏻 Christmas 🎄 by organizing a party, when you realize you don’t have enough lights ✨🌟 to decorate your house. So you go online and find a store that sells Christmas decorations, along with the lights you were looking for. Since it’s the Christmas season, the employees are busy helping customers in the offline store and managing deliveries. But you don’t need to worry, because the store was smart enough to put an NLP chatbot on its website – let’s say they called it “Fairie.” You click on Fairie and type, “Hey, I have a huge party this weekend and I need some lights.” It responds, “Great, what colors and how many of each do you need?” You reply, “I need 20 green ones, 15 red ones and 10 blue ones.”

Here’s what Fairie does next: it sends your response to the cloud, where the natural language is processed by AI models (we will look at what exactly happens in this stage later). The resulting query is sent to the database, where the backend software checks whether enough lights are in stock; if so, it updates the database by subtracting the order quantity from the total. The database sends a confirmation back to the AI in the cloud, which composes a response and passes it down to Fairie. Fairie replies, “Hey, we can make that work – please share your address and card details,” and BOOM 💥 you just bought a product at its peak season, at the very last minute, with no human interaction.
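The backend half of this exchange can be sketched in a few lines of Python. This is a minimal illustration, not the store's actual system: the names `inventory` and `place_order` are hypothetical, the structured order is assumed to come from the NLU stage, and a real store would query an actual database rather than a dictionary.

```python
# Hypothetical in-memory "database" of lights in stock.
inventory = {"green": 50, "red": 30, "blue": 25}

def place_order(requested):
    """Check stock, update the inventory, and build Fairie's reply."""
    # Reject the whole order if any color is short on stock.
    for color, qty in requested.items():
        if inventory.get(color, 0) < qty:
            return f"Sorry, we only have {inventory.get(color, 0)} {color} lights left."
    # Subtract the order quantities from the totals.
    for color, qty in requested.items():
        inventory[color] -= qty
    return "Hey, we can make that work! Please share your address and card details."

print(place_order({"green": 20, "red": 15, "blue": 10}))
print(inventory)  # stock reduced by the order quantities
```

The check-then-update split mirrors the flow above: the stock check corresponds to the backend query, and the subtraction corresponds to the database update that triggers the confirmation back to the chatbot.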

A process map of purchasing products via NLP chatbot.

What are NLP Chatbots Even Made of?

Let's start by understanding the different components that make an NLP chatbot a complete application.

When I say to my home assistant, “Alexa, add eggs and milk to my shopping cart,” it inserts eggs and milk, in list form, into a page called the shopping cart and confirms with “Added eggs and milk to the shopping cart.” The input we provide is in an unstructured format, but the machine only accepts input in a structured format.

So it converts the unstructured data, i.e., “Add eggs and milk to my shopping cart,” into structured data that looks something like this:

<SHOPPING_CART>
  <ITEM> EGGS </ITEM>
  <ITEM> MILK </ITEM>
</SHOPPING_CART>
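A toy version of this conversion step can be written in plain Python. This sketch assumes a fixed "add to cart" intent and a small vocabulary of known items (`KNOWN_ITEMS` and `parse_shopping_command` are illustrative names, not a real assistant's API); production systems use trained NLU models instead of keyword lookup.

```python
# A tiny vocabulary of items the assistant knows about (illustrative).
KNOWN_ITEMS = {"eggs", "milk", "bread"}

def parse_shopping_command(utterance):
    """Turn an unstructured sentence into a structured intent + item list."""
    words = utterance.lower().replace(",", " ").split()
    return {"intent": "add_to_cart",
            "items": [w for w in words if w in KNOWN_ITEMS]}

print(parse_shopping_command("Add eggs and milk to my shopping cart"))
# {'intent': 'add_to_cart', 'items': ['eggs', 'milk']}
```

The returned dictionary plays the same role as the structured markup above: a machine-readable record the backend can act on.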

The part of Natural Language Processing that converts unstructured data into structured data is called Natural Language Understanding, or NLU, and the process that converts structured data back into an unstructured format is called Natural Language Generation, or NLG.

A diagram of how NLP turns unstructured data into structured data.

If that didn’t entirely land, don’t worry – let me explain it in a simpler way.

“A quick brown fox jumped over the lazy dog”

In writing the sentence above, I was performing Natural Language Generation. Meanwhile you, the reader peering at the screen – reading it, or even just trying to make sense of what I wrote – are performing Natural Language Understanding. Together, we are each carrying out a subset of the overall process of Natural Language Processing.

NLU is what improves a computer’s reading comprehension, whereas NLG is what allows a computer to write.

Both of these processes are trained by considering the rules of the language, including morphology, lexicons, syntax, and semantics. This enables them to make appropriate choices on how to process the data or phrase responses.

What Exactly Do NLP Models Do Under the Hood?

Earlier we briefly saw that these models live in the cloud and do all the heavy lifting. In this section, we will look at how exactly they convert unstructured data into a computer-readable format at each stage.

Input

The input to these NLP models is written text, or spoken audio that has been converted into written text by a speech-to-text algorithm.

Tokenization

In the first stage, the sentence is split into tokens, where each token is a word of the sentence.

Stemming

In the next stage, the model derives the “word stem” for a given token. For example, the words “running,” “runs” and “ran” ideally all map to the stem “run.” The stem is derived by stripping prefixes and suffixes and normalizing tense.
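A toy suffix-stripping stemmer shows the mechanics. This is a deliberately tiny sketch in the spirit of algorithms like Porter's; real stemmers apply many more rules, and an irregular form like "ran" needs an explicit lookup rather than suffix stripping.

```python
# Irregular forms that suffix rules cannot reach (illustrative sample).
IRREGULAR = {"ran": "run"}

def stem(word):
    """Derive a crude word stem by stripping common suffixes."""
    word = word.lower()
    if word in IRREGULAR:
        return IRREGULAR[word]
    if word.endswith("ing"):
        word = word[:-3]
    elif word.endswith("s"):
        word = word[:-1]
    # Collapse a doubled final consonant left by stripping ("runn" -> "run").
    if len(word) > 2 and word[-1] == word[-2] and word[-1] not in "aeiou":
        word = word[:-1]
    return word

print([stem(w) for w in ["running", "runs", "ran"]])  # ['run', 'run', 'run']
```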

Lemmatization

If we have tokens like “university” and “universe,” a stemmer might reduce both to the same stem, “univers,” which is obviously wrong – “university” and “universe” have completely different meanings and cannot share a stem. Hence we use lemmatization, an alternative stage the NLP model can take. Here the model takes a token and derives its meaning from dictionary definitions, producing the “root” or “lemma” of the token. For example, the lemma of the token “better” is “good,” whereas an aggressive stemmer might reduce the same token to “bet.”
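The key difference from stemming is that lemmatization is a lookup against known word forms rather than mechanical suffix chopping. A dictionary-based sketch makes this concrete; the `LEMMAS` table here is a tiny illustrative sample, whereas real lemmatizers consult full lexicons (and the token's part of speech).

```python
# A tiny sample of (word form -> lemma) mappings (illustrative only).
LEMMAS = {
    "better": "good",
    "ran": "run",
    "universities": "university",
    "am": "be", "is": "be", "are": "be",
}

def lemmatize(word):
    """Look up a token's lemma, falling back to the word itself."""
    word = word.lower()
    return LEMMAS.get(word, word)

print(lemmatize("better"))      # good
print(lemmatize("university"))  # university (left intact, unlike a stemmer)
```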

Hence, teaching the model when to choose stemming versus lemmatization for a given token is a very significant step in the training process.

Part-of-speech tagging

In the next stage, the NLP model determines the grammatical role a token plays within the context of its sentence. For example, take the two sentences “I am going to make dinner” and “What make is your laptop,” with “make” as the token being processed.

In the first sentence, the word "make" functions as a verb, whereas in the second sentence, the same word functions as a noun. Therefore, the usage of the token matters and part-of-speech tagging helps determine the context in which it is used.
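A toy disambiguator for this exact "make" example can use the preceding word as context. This is only a sketch of the intuition – real POS taggers use statistical or neural models over the whole sentence, and `tag_make` is a hypothetical helper, not a real library function.

```python
def tag_make(sentence):
    """Return 'VERB' or 'NOUN' for the word 'make' based on the word before it."""
    words = sentence.lower().rstrip("?.").split()
    i = words.index("make")
    prev = words[i - 1] if i > 0 else ""
    # After "to" (infinitive marker) or a pronoun, "make" acts as a verb.
    if prev in {"to", "i", "you", "we", "they"}:
        return "VERB"
    # Otherwise (e.g. after "what"), treat it as a noun.
    return "NOUN"

print(tag_make("I am going to make dinner"))  # VERB
print(tag_make("What make is your laptop"))   # NOUN
```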

Named Entity Recognition (NER)

In this final stage, the model checks whether an entity is associated with a given token. For example, the token “New Jersey” carries the entity type U.S. state, whereas the token “James” carries the entity type person’s name.
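The simplest form of NER is a gazetteer lookup: match tokens against lists of known names. The sketch below does exactly that (with an illustrative `GAZETTEER` table); real NER models also use sentence context, which is how they recognize names they have never seen before.

```python
# Known entity names mapped to their entity types (illustrative sample).
GAZETTEER = {
    "new jersey": "US_STATE",
    "new york": "US_STATE",
    "james": "PERSON",
    "alice": "PERSON",
}

def recognize_entities(sentence):
    """Return (phrase, entity_type) pairs found in the sentence."""
    text = sentence.lower()
    return [(phrase, label) for phrase, label in GAZETTEER.items()
            if phrase in text]

print(recognize_entities("James moved to New Jersey last year"))
# [('new jersey', 'US_STATE'), ('james', 'PERSON')]
```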

The stages of natural language processing.

These are some of the basic steps that every NLP chatbot uses to process the user’s input, and a similar process runs when it needs to generate a response back to the user. Depending on the use case, additional processing may be applied to get the required data into a structured format.

Conclusion

NLP chatbots have revolutionized the field of conversational AI by bringing more natural and meaningful language understanding to machines. With their ability to personalize responses based on user preferences and historical data, to be set up with little to no programming knowledge, and to respond in a variety of languages, they have truly enhanced the customer experience.

NLP is being used in a variety of applications, such as sentiment analysis, spam detection, machine translation, and more. So what does the future hold? Where will NLP take humanity?
