Look who's talking

In the future, computers will be able to perform more useful tasks using the natural languages that humans speak and write

LEADING technology companies such as Google, Apple and Microsoft have successfully commercialised technology that came out of Natural Language Processing (NLP), a field of computer science concerned with how people speak and write. Essentially, NLP is the ability of a computer program to understand human language as it is spoken or written. It is a component of artificial intelligence (AI).

NLP will thus become an important technology in bridging the gap between human communication and digital data. Future applications include machine translation, information extraction and question answering.

One renowned expert in the field of NLP is Dr Alyona Medelyan. She has provided consulting services internationally ever since completing her PhD in 2009. Her key areas of expertise are keyword extraction, text categorisation and semantic search. She is the author of the popular open-source tools Kea and Maui. Alyona has published over 20 research papers and journal articles in various AI and NLP venues, and she frequently speaks at tech conferences and trade shows. Alyona is also the CEO and co-founder of Thematic, a customer insight startup.

Alyona was a speaker at the recent Wrangle Conference Asia 2016 event in Kuala Lumpur. The event, co-organised by Malaysia Digital Economy Corporation (MDeC), Cloudera and Big Data Malaysia, was a single-day, single-track industry event about the principles, practice and application of Data Science across multiple data-rich industries.

Here, Dr Alyona explains more about NLP and how it will impact the future.

WHAT INSPIRED YOU TO CHOOSE THIS FIELD OF WORK?

Originally, I chose to study linguistics. But after learning about Natural Language Processing (NLP), where people create programs to analyse human language, I knew that this would be one of the key fields of the future. Making sense of language using computers is incredibly difficult, but I am inspired by the constant advances in this field and by seeing how these advances change our lives. Think about how being able to search on Google has changed our lives, or the growing number of personal assistants like Alexa that make our lives easier.

YOU SPECIALISE IN EXTRACTING MEANING FROM TEXT. CAN YOU ELABORATE? ARE THERE ELEMENTS OF AI IN NLP?

Yes, NLP is a subfield of AI. I personally specialise in detecting themes in customer feedback, and I use some of the AI techniques developed over the past five years for modelling the meaning of words and phrases.

For example, customers may be leaving comments like “I was put on hold for too long”, “the wait was incredibly long”, “I’m not happy with the waiting time”. They use different words, but what they mean is the same thing.

Traditionally, companies would employ people to manually code such comments. Now, computers can take care of this task automatically. Understanding nuances of language such as sarcasm would require more advanced elements of AI, such as modelling of common sense knowledge, but in reality, sarcasm is not that common.
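The kind of automatic coding described above can be illustrated with a toy sketch. This is not Thematic's actual method, and the themes and keywords below are invented for illustration; real systems model the meaning of words (for example with learned embeddings) so that "hold", "wait" and "waiting" end up close together rather than being listed by hand.

```python
# Toy theme-coding sketch: tag each customer comment with a theme
# by matching its words against hand-picked keyword sets.
THEMES = {
    "waiting time": {"hold", "wait", "waiting", "queue"},
    "staff attitude": {"rude", "friendly", "helpful"},
}

def tag_comment(comment):
    # Normalise: lowercase and strip surrounding punctuation from each word.
    words = {w.strip(".,!?\"'\u2019m").lower() for w in comment.split()}
    return [theme for theme, keys in THEMES.items() if words & keys]

comments = [
    "I was put on hold for too long",
    "the wait was incredibly long",
    "I'm not happy with the waiting time",
]
for c in comments:
    print(c, "->", tag_comment(c))
```

All three comments, though worded differently, receive the same "waiting time" tag, which is exactly the manual coding task the computer takes over.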

WHAT ABOUT CHAT BOTS? CAN YOU ELABORATE?

Google, Apple, Amazon and Microsoft are all simultaneously building personal assistants. But more and more companies are now creating chat bots, which assist people in performing one specific task, e.g. shopping, scheduling meetings or setting reminders.

Personal assistants and chat bots respond to language commands. Just as in my earlier example of customer feedback, people can express the same thing in different ways. So, many chat bots need some level of Natural Language Understanding, a component of NLP, to infer what a person needs and what additional information they require. Some chat bots don't need this and can be implemented simply.

WHERE CAN CHAT BOTS BE USEFUL?

By using chat bots, companies can save resources while still providing an experience of a natural interaction to customers. For example, customers often ask support or call centres the same questions, like “Is the store open on Sundays?”, or “Does it stock a particular product?”. For a person, it’s much easier to quickly ask somebody rather than search the site. Such interactions can be automated and if done right, both customers and companies can benefit from it.

WHAT ABOUT VOICE-ACTIVATED PERSONAL ASSISTANTS? HOW ARE THEY DIFFERENT FROM CHAT BOTS? AND WHERE DO YOU THINK SUCH TECHNOLOGY WILL TAKE US?

There is a difference between personal assistants and the devices they live on. Personal assistants such as Siri, Cortana, Google Now and Alexa are basically personalised chat bots, which help us solve tasks by having access to our personal data, such as our location and past search history, or to tools such as our calendar.

On some devices, such as our phones, home devices like the Amazon Echo or in-car navigation systems, they can be activated by voice. As the quality of voice recognition and Natural Language Understanding improves, our lives will gradually become easier. Most of us already don't carry paper maps. Some of us already speak to the personal assistants in our phones, asking them for directions or opening times, or asking them to send a message to a frequent contact. Gradually, this will become more mainstream.
