The bots: The new wave of the 4th industrial revolution

What if, beyond the new organisation of production assets, the fourth industrial revolution also implies a significant evolution in the knowledge management intrinsic to each domain? What if new digital technologies allow operational actors simple access to this knowledge, mostly built up through empirical methods, thanks to analytics and machine learning?

These questions come naturally in a period marked by a profusion of change management methods, when every innovation must be disruptive. Freeing operational staff from ancillary, repetitive, time-consuming and low-value tasks through automation has become a high priority.

The philosopher Jacques Derrida wrote, "What cannot be said above all must not be silenced but written." And indeed, industrial companies hold massive amounts of unstructured textual data. These often scattered and long-neglected reports, guides, manuals, emails, patents and other technical specifications are now in the limelight. According to IDC’s Global DataSphere model, 88% of the data created worldwide is unstructured, and in 2019 the richness and usefulness of this content is well established.

"What cannot be said above all must not be silenced but written. Jacques Derrida

Thus, even if Derrida would have "deconstructed" these writings to understand the relationship between text and meaning, in this article we will focus on restructuring what, by essence, is not structured. The aim is to restore lexical semantics so that these writings can be better exploited and reused through augmented search.

NLP & machine learning on conversational agents 

But how do we accomplish this restructuring? With automated natural language processing. NLP enables the understanding and interpretation of human language through a conversational agent, the so-called "bot," which navigates a restructured form of the documents made available to ease decision making.

A rules-based approach, by itself, will not be sufficient. We need an industrial approach, where reliable decision support boosts productivity and agility. If you set up a bot without machine learning, you must stick to a limited script and a finite set of rules and utterances. You will not get very far.

Machine learning helps the bot understand the intent of a question such as: "What are the root causes of a rotating machine triggering under high pressure?" The bot can then propose more sophisticated, more flexible conversation scenarios and contextualise the answer from, for instance, a knowledge base: "The rotating machine seems to have triggered due to a safety bypass valve change causing an unexpectedly high level of steam in the pipeline, which resulted in higher pressure for associated equipment."
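
To make this concrete, here is a minimal sketch in Python (not the implementation of any particular product) of how a detected intent and its entities might be matched against a toy knowledge base. The intent name, entity values and knowledge base content are invented for illustration.

```python
# Minimal sketch: contextualising an answer from a toy knowledge base.
# The intent name, entities and knowledge base content are hypothetical.
KNOWLEDGE_BASE = {
    ("find_root_cause", "rotating machine"): (
        "The rotating machine seems to have triggered due to a safety bypass "
        "valve change causing an unexpectedly high level of steam in the "
        "pipeline, which resulted in higher pressure for associated equipment."
    ),
}

def answer(intent: str, entities: list) -> str:
    """Return the best-matching knowledge base entry for a detected intent."""
    for entity in entities:
        if (intent, entity) in KNOWLEDGE_BASE:
            return KNOWLEDGE_BASE[(intent, entity)]
    return "Which piece of equipment are you asking about?"

print(answer("find_root_cause", ["rotating machine", "pressure"]))
```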

Which intelligence does an AI-based bot use?

Let’s take a peek under the hood to understand the process involved.

At the very beginning, you will find the contact channel. Business applications, intranets, websites or popular messaging apps could serve as a contact point with the customer. 

The speech-to-text unit transcribes the speech input to text and provides a clear statement.
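
As an illustration only, the open-source SpeechRecognition package can play the role of such a unit; the audio file name and the choice of recognition engine below are assumptions, not a statement of what any given platform uses.

```python
# Sketch of a speech-to-text unit with the open-source SpeechRecognition
# package (an illustrative engine choice; "question.wav" is a hypothetical file).
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.AudioFile("question.wav") as source:
    audio = recognizer.record(source)  # read the whole audio file

# Transcribe the audio; the resulting text is what flows on to the NLU service.
text = recognizer.recognize_google(audio)
print(text)
```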

Then the text will be sent, with any retained information from the previous utterances, via a request to the natural language understanding (NLU) service. The NLU service provides the ability to extract meaning and intent from natural language. The language processing is mainly about semantic parsing, sentence tokenisation, stemming, part-of-speech tagging, disambiguation or named entity recognition. There is no need to be proficient in linguistics or programming languages within an end-to-end enterprise AI platform. All the NLP will be done behind the scenes for you.
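
For readers curious about what happens behind the scenes, here is a minimal sketch using the open-source spaCy library (one possible toolkit, not necessarily the one an enterprise platform uses) showing tokenisation, lemmatisation, part-of-speech tagging and named entity recognition on an invented sentence.

```python
# Sketch of the language-processing steps with spaCy (one possible open-source
# toolkit; an end-to-end enterprise platform performs these steps for you).
import spacy

nlp = spacy.load("en_core_web_sm")  # small English pipeline
doc = nlp("The rotating machine triggered under high pressure on Monday.")

for token in doc:
    # surface form, lemma (a normalisation akin to stemming), part-of-speech tag
    print(token.text, token.lemma_, token.pos_)

# Named entity recognition
print([(ent.text, ent.label_) for ent in doc.ents])
```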

A powerful dependency parser analyses the grammatical structure of each sentence and establishes semantic relationships between words. By leveraging both business rules and machine learning (through training samples provided by the customer), the NLU service can extract the predicted intents (e.g., identify the root causes) and entities (e.g., a rotating machine, compressor, pressure). 
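
As a hedged sketch of what a dependency parse looks like, the snippet below runs spaCy on the earlier example question; a production NLU service would combine such structures with business rules and trained models rather than print them out.

```python
# Sketch: inspecting the dependency parse of the earlier example question with
# spaCy. A production NLU service combines such structures with business rules
# and trained models to extract intents and entities.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("What are the root causes of a rotating machine "
          "triggering under high pressure?")

for token in doc:
    # word, its grammatical relation, and the head word it depends on
    print(f"{token.text:<12} {token.dep_:<10} head={token.head.text}")

# Noun chunks are simple candidates for entities such as "a rotating machine".
print([chunk.text for chunk in doc.noun_chunks])
```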

The machine learning layer is mainly used to detect those intents and entities.

Nevertheless, for less sophisticated chatbots, you may go straight from the NLU service to the rules unit. That means everything will be hardcoded. This is the crux of the matter: machine learning is a huge jump-start and a real game-changer compared to simple rules-based Q&A systems.

So what’s the point of machine learning? 

Thanks to supervised learning capabilities, the system can learn to detect intents from training examples.
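
As a minimal, hypothetical example of such supervised intent detection, a scikit-learn pipeline can be trained on a handful of labelled utterances; the utterances and intent labels below are invented for illustration.

```python
# Minimal sketch of supervised intent detection with scikit-learn.
# The training utterances and intent labels are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

utterances = [
    "Why did the rotating machine trip under high pressure?",
    "What caused the compressor to shut down?",
    "Show me the maintenance manual for the pump",
    "Where can I find the operating guide for the turbine?",
]
intents = ["find_root_cause", "find_root_cause", "find_document", "find_document"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(utterances, intents)

print(model.predict(["What are the root causes of the pressure trip?"]))
```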

But what if no prebuilt knowledge base is available from which to fetch the entities? You can then apply advanced text analytics to existing knowledge repositories: call centre records or an existing Q&A list, for example. Here, contextual entity extraction is helpful. In an unsupervised approach, topic modelling – the automatic extraction of the different topics or intent types from an unknown operational data set – can uncover hidden topics in the data that enrich the intent and entity lists.
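
For illustration, the sketch below runs scikit-learn's LDA implementation of topic modelling on a tiny, invented set of maintenance records; the word groups it discovers could seed new intents or enrich entity lists.

```python
# Sketch of unsupervised topic modelling with scikit-learn's LDA implementation,
# applied to a tiny, invented set of maintenance records.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

records = [
    "pressure trip on rotating machine after bypass valve change",
    "high steam level in pipeline caused pressure alarm",
    "replaced seal on compressor during planned maintenance",
    "lubrication and seal inspection on pump completed",
]

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(records)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(counts)

# The most characteristic words of each discovered topic can seed new intents
# or enrich the entity lists.
terms = vectorizer.get_feature_names_out()
for idx, topic in enumerate(lda.components_):
    print(f"topic {idx}:", [terms[i] for i in topic.argsort()[-4:]])
```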

You can also feed the automatically detected topics into existing subject-matter dictionaries or industry taxonomies. How does it work? Going beyond plain machine learning, active learning lets automatic rule builders promote topics to categories with supervised machine learning and generate linguistic rules automatically. This is a strong accelerator and resource optimiser.

The rules unit acts as the backbone. It furnishes the mandatory business logic, ensures the context of the user's intent is correctly identified, and ultimately makes sure the bot gives the right answers back to the end user.

The execution unit is responsible for carrying out the actions needed to respond to the user's intent. To find the correct answer, the bot needs seamless integration with existing systems, technologies and databases, including data visualisation, streaming, image recognition and much more.
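
As a rough sketch (not a product feature), the rules and execution units can be pictured as a routing table from intents to handler functions that call back-end systems; the handler names and the returned values below are hypothetical placeholders.

```python
# Rough sketch of the rules and execution units: business rules route a
# detected intent to a handler that would call an existing back-end system.
# Handler names and the returned values are hypothetical placeholders.

def find_root_cause(entities: dict) -> str:
    # In a real deployment this would query a knowledge base or data historian.
    equipment = entities.get("equipment", "the equipment")
    return f"Retrieving likely root causes for {equipment}..."

def find_document(entities: dict) -> str:
    equipment = entities.get("equipment", "the equipment")
    return f"Fetching the manual for {equipment}..."

RULES = {  # intent -> execution handler
    "find_root_cause": find_root_cause,
    "find_document": find_document,
}

def handle(intent: str, entities: dict) -> str:
    """Apply the business logic for the detected intent."""
    return RULES[intent](entities)

print(handle("find_root_cause", {"equipment": "rotating machine"}))
```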

The API comes into play 

Here is where the API comes into play. An API, or application programming interface, is a set of rules and mechanisms that lets applications talk to each other. These APIs open the bot up to the outside world.
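
Here is a minimal example, assuming the FastAPI framework as one possible choice, of how such an API could expose the bot over HTTP; the endpoint name and the placeholder reply are invented.

```python
# Sketch of exposing the bot over HTTP with FastAPI (an illustrative framework
# choice). The commented-out calls stand in for the hypothetical NLU and
# rules/execution layers sketched earlier.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")
def chat(request: ChatRequest) -> dict:
    # intent, entities = nlu(request.message)   # NLU service
    # reply = handle(intent, entities)          # rules + execution units
    reply = f"You said: {request.message}"      # placeholder echo
    return {"reply": reply}

# Run with, for example: uvicorn bot_api:app --reload  (if this file is bot_api.py)
```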

The natural language generation (NLG) service formulates the final answer, respecting grammatical rules. If the user's question did not contain all the expected entities or mandatory context, the NLG service answers back with a prompt for the missing information.
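
As a simple, hypothetical illustration of this behaviour, a template-based NLG step can fill a grammatical answer when all expected entities are present and prompt for whatever is missing; the template and entity names are invented.

```python
# Simple, hypothetical illustration of template-based answer generation with a
# prompt for missing information; the template and entity names are invented.
TEMPLATE = ("The {equipment} seems to have triggered due to {cause}, which "
            "resulted in higher pressure for associated equipment.")

def generate(entities: dict) -> str:
    missing = {"equipment", "cause"} - entities.keys()
    if missing:
        return f"Could you give me the {' and the '.join(sorted(missing))}?"
    return TEMPLATE.format(**entities)

print(generate({"equipment": "rotating machine"}))
print(generate({"equipment": "rotating machine",
                "cause": "a safety bypass valve change"}))
```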

NLP, machine learning and APIs have paved the way for chatbot maturity.

Thus, “augmented” with the cognitive services we’ve discussed previously, 2019’s chatbot is no longer a simple rules-based one. 

Thanks to a conversational agent enhanced with text analytics tools, there is no longer any need to read and analyse huge amounts of unstructured textual data manually. You can focus instead on delivering high-order business value. Only advanced text analytics tools paired with AI-based chatbots can automate enterprise search in a way that is agile for end users and accelerate the data-to-decision timeline.

This article is co-written by Sylvie Jacquet-Faucillon, Artificial Intelligence Customer Adviser, and Samuel Blanquet, Head of Manufacturing, SAS France.

Discover more Insights about Connected Manufacturing and AIoT.

About Author

Samuel Blanquet

Pre-Sales Senior Consultant - Manufacturing Leader

Samuel joined SAS in June 2016 after a period working on product lifecycle management, customer relationship management and pre-sales for other companies. His work has generally involved strategic development, and his roles have been largely customer-facing. Samuel's focus is firmly on helping his customers understand the business issues they face, in the past particularly those connected to product lifecycle management, and enabling them to introduce appropriate solutions to manage these issues. His current work builds on his previous experience in both pre-sales and manufacturing: he now provides sales and pre-sales support on targeted major manufacturing accounts for SAS. His strategic development work focuses on building business value for his customers, including considering how analytics can help manufacturers obtain value from data from connected equipment and sensors to enhance productivity and enrich their business model.
