AI and Cognitive Innovation

By: Praful Krishna

Artificial Intelligence is all the rage right now. Executives across sectors and across all parts of the value chain are looking to harness its promise and potential. Service Providers are no exception.

However, there is one area where the story is more nuanced: the application of AI to natural language. This article discusses the troubles enterprises face when using traditional AI technologies for language, and examines emerging alternatives such as Calibrated Quantum Mesh.

The Frustration of a Promise

Many of the algorithms that are popular today have been around for a long time; the first artificial neural networks were developed in the 1960s. It is only now that enough data is being captured digitally to make these algorithms useful, and that enough computing power exists to process all that data.

Of all the algorithms based on neural networks, Deep Learning is showing the most promise. It is a technique that involves complex, multi-layered neural networks. It can recognize objects in images and video with greater precision than the human eye, transcribe speech to text faster than a journalist, predict crop yield better than the USDA, and diagnose cancer more accurately than the world’s best physicians.
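To make "multi-layered" concrete, here is a minimal sketch of such a network in Python using PyTorch. The layer sizes and the ten-class image task are illustrative assumptions, not a description of any product mentioned in this article.

    import torch
    import torch.nn as nn

    # A minimal "deep" network: several layers stacked so that each
    # learns progressively more abstract features of its input.
    model = nn.Sequential(
        nn.Linear(784, 256),  # e.g., a 28x28 image flattened to 784 values
        nn.ReLU(),
        nn.Linear(256, 64),   # hidden layer
        nn.ReLU(),
        nn.Linear(64, 10),    # scores for 10 possible classes
    )

    scores = model(torch.rand(1, 784))  # one random "image" in, 10 scores out

Production systems stack far more layers than this toy example, which is what drives the data and compute appetite discussed below.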

Many players are trying to build solutions based on Deep Learning to automate natural-language workflows as well. IBM, Microsoft, and Nuance are some of the larger ones. Many others, like Amazon, Apple, Google, and Facebook, are using similar techniques to power their own products. While there are many successes (who has not asked Siri for a joke and chuckled, if only at its naivete?), the business community is getting frustrated.

Take MD Anderson's recent announcement that, after investing $62 million over three years, it will discontinue its project with IBM Watson. The project was meant to automate cancer diagnosis but failed to deliver reliable results.

Similarly, independent analysts found in February 2017 that 70% of chatbot messages on Facebook's Messenger platform gave wrong answers. The company announced significant changes to its platform shortly after that.

The publicized successes of this approach for language are out of reach for mainstream enterprises. Google uses Deep Learning for Google Translate, a product that translates text between more than 100 languages. The data necessary for such translation is crowd-sourced using Google's enormous reach, for example through Captcha. Some unconventional methods are also used: two million data points were crowd-sourced for the English-to-Kazakh translator after the Kazakh President's office issued an appeal.

Unfortunately, most enterprises do not have this kind of reach, budget, or time. They recognize the potential of such technology but are often unsure of the best next steps.

Another source of frustration is the need for annotation. For a Deep Learning system to train, all of the training data must be properly labeled in terms of its key features. For example, say a Deep Learning system is being trained on a set of tickets raised in some context. Someone needs to go through the text of each ticket and manually mark the words that indicate the problem, the location, the severity, the urgency, the access level, or any other feature the machine should consider. In many situations, qualified subject matter experts are needed to complete such annotation. This need for annotation increases costs tremendously. It also makes it very difficult for data scientists to iterate on their models, which leads to lower accuracy.
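To illustrate what that annotation work looks like, here is a hedged sketch in Python of a single labeled ticket; the ticket text, field names, and character spans are invented for this example.

    # One hypothetical support ticket, annotated by a subject matter expert.
    # Each entry marks the character span of a feature the machine should learn.
    ticket = {
        "text": "Urgent: broadband outage at the Denver office since 6am.",
        "annotations": [
            {"start": 0,  "end": 6,  "label": "URGENCY"},   # "Urgent"
            {"start": 8,  "end": 24, "label": "PROBLEM"},   # "broadband outage"
            {"start": 32, "end": 38, "label": "LOCATION"},  # "Denver"
        ],
    }

    # Thousands of tickets must be labeled this way before training can begin.
    for span in ticket["annotations"]:
        print(span["label"], "->", ticket["text"][span["start"]:span["end"]])

Multiply this effort across tens of thousands of tickets, and across every revision of the labeling scheme, and the cost and iteration problems become clear.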

Alternatives to Deep Learning

Automation of natural-language workflows could be very lucrative for mainstream enterprises if it were more affordable. Technologists in Silicon Valley and around the world are coming up with multiple alternatives to Deep Learning to address its shortcomings.

Frequently, companies focus on a specific application or vertical to solve the problem within narrow confines. They use venture capital to overcome the barriers listed above: collecting sufficient data, annotating it, and training a machine for a specific problem. Such approaches can be relatively successful. However, enterprises and service providers that pride themselves on differentiated solutions tend to avoid them.

Others are inventing fundamentally different algorithms.

The perfect algorithm for English must handle the inherently probabilistic nature of language. Consider the phrase "Good Day". In San Francisco, it implies certain levels of solar insolation, wind chill, and humidity, and a temperature of roughly 65 degrees Fahrenheit. If we assumed the same conditions at the South Pole, however, they would signal something closer to the end of the world than a good day.
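One way to picture this probabilistic view: the same phrase maps to different physical conditions with different likelihoods depending on context. The Python sketch below uses invented numbers purely for illustration.

    # Toy model: the meaning of "Good Day" is a probability distribution
    # that shifts with context (here, location). All numbers are invented.
    meaning_given_location = {
        "San Francisco": {
            "sunny, light wind, ~65F": 0.70,
            "mild and foggy": 0.25,
            "unseasonably warm": 0.05,
        },
        "South Pole": {
            "clear, calm, -30F": 0.80,
            "survivable wind chill": 0.19,
            "sunny, ~65F (catastrophic)": 0.01,
        },
    }

    def most_likely_meaning(location):
        # Pick the interpretation with the highest probability in this context.
        dist = meaning_given_location[location]
        return max(dist, key=dist.get)

    print(most_likely_meaning("San Francisco"))  # sunny, light wind, ~65F
    print(most_likely_meaning("South Pole"))     # clear, calm, -30F

An algorithm that hard-codes one interpretation fails the moment the context changes; one that carries the whole distribution can adapt.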