Case Study

Emotech: Olly the Robot

Posted on August 08, 2018

Emotech, a company specialising in AI technology, wanted to develop the world's first robot with personality: one that can identify specific users and cater to their individual preferences and needs, even going as far as anticipating them.

To make that a reality, Emotech needed to undertake further research to equip the robot with natural language processing abilities. Emotech turned to The Data Lab for support and secured funding that allowed it to collaborate with Heriot-Watt University on the project, undertake the required research, and develop the test models.

The resulting robot, named ‘Olly’, can perform the standard tasks associated with general home assistants: playing music, dimming lights, writing emails and setting alarms. Olly takes these to another level through its ability to understand and generate language, which allows it to remember specific user requests and anticipate needs.

Project Detail

Emotech sought support from The Data Lab after Dr Pawel Swietojanski, research scientist at Emotech, became familiar with the innovation centre while pursuing his PhD at the University of Edinburgh.

The Data Lab went on to fund 50% of the project costs, enabling the tie-up with Heriot-Watt University’s Interaction Lab and Natural Language Processing (NLP) Lab, led by Dr Liu and Dr Rieser. This collaboration allowed research to be undertaken leading to the development of Olly’s unique framework and test model.

Olly is a showcase of Emotech’s AI technology and is the first step in demonstrating how conversation interfaces can be applied across various industry uses.

The project officially began in December 2016, and the first shipment of Olly robots is available to consumers for $549 (approx. £415) on Indiegogo from September 2018.


Olly uses a camera sensor to detect how many users are in the room. Once Olly knows who is present, it adapts its responses to the individual user or group.

After a user speaks to Olly, the robot uses Automatic Speech Recognition (ASR) to convert the user's speech into text utterances.

Natural Language Understanding (NLU) follows ASR, and is how Olly learns each user’s preferences. Heriot-Watt University developed Olly’s NLU by using open source software, Rasa_NLU, and Emotech collected real user data to train the NLU component.

For instance, if a user says, “Please set my alarm for 6am”, the NLU component will understand the user’s intention to set an alarm.
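To make the NLU step concrete, here is a minimal sketch of the kind of structured output (an intent plus extracted entities) that an NLU component such as Rasa_NLU produces. The toy keyword rules, intent names and entity layout below are illustrative assumptions, not Olly's actual trained model.

```python
# Illustrative sketch only: a toy keyword-based intent classifier that
# mimics the structured output a trained NLU component would produce.
# Intent names and rules here are assumptions, not Olly's actual model.
import re

def understand(utterance):
    """Map a text utterance to an intent plus extracted entities."""
    text = utterance.lower()
    if "alarm" in text:
        # Pull out a time expression like "6am" if one is present.
        match = re.search(r"\b(\d{1,2}(?::\d{2})?\s*(?:am|pm))\b", text)
        return {
            "intent": "set_alarm",
            "entities": {"time": match.group(1) if match else None},
        }
    if "music" in text or "play" in text:
        return {"intent": "play_music", "entities": {}}
    return {"intent": "unknown", "entities": {}}

print(understand("Please set my alarm for 6am"))
# → {'intent': 'set_alarm', 'entities': {'time': '6am'}}
```

A data-driven component replaces the hand-written rules above with a classifier trained on real user utterances, but the input and output shapes stay the same.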

Data-driven NLU has advantages over rule-based NLU: because the robot learns from real user data, it can handle real situations better. The data-driven approach is also easily extended and adapted to new domains.

Once NLU is complete, Olly's Dialogue Manager (DM) determines what to do or say, and learns from conversations to remember each user's intentions and preferences.
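A dialogue manager of this kind can be sketched as a component that picks an action for each recognised intent and keeps per-user memory across turns. The state layout, action names and preference keys below are assumptions for illustration, not Olly's implementation.

```python
# Illustrative sketch: a minimal dialogue manager that chooses an action
# for each intent and remembers user preferences across turns.
# Action names and state layout are assumptions, not Olly's actual design.
class DialogueManager:
    def __init__(self):
        self.preferences = {}  # per-user memory, e.g. a usual alarm time

    def decide(self, user, nlu_result):
        intent = nlu_result["intent"]
        entities = nlu_result["entities"]
        if intent == "set_alarm":
            # Fall back to a remembered preference if no time was given.
            time = entities.get("time") or self.preferences.get(user, {}).get("alarm_time")
            if time is None:
                return {"action": "ask", "prompt": "What time should I set the alarm for?"}
            # Remember this preference so it can be anticipated next time.
            self.preferences.setdefault(user, {})["alarm_time"] = time
            return {"action": "set_alarm", "time": time}
        return {"action": "fallback"}
```

For example, after one turn of `dm.decide("alice", {"intent": "set_alarm", "entities": {"time": "6am"}})`, a later request with no time specified can reuse the remembered 6am preference — the "anticipating needs" behaviour described above, in miniature.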

The next step is Natural Language Generation (NLG), which turns the DM's chosen response into natural language text; currently this is template-based.

Finally, Olly uses Text-To-Speech (TTS) to convert the natural language text to speech which the user will hear.

The pipeline for Olly’s conversational system is:

User > Speech > ASR > NLU > DM > NLG > TTS > User
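The pipeline above can be sketched end to end as a chain of functions, with the ASR and TTS stages stubbed out (real systems would use acoustic and speech-synthesis models). Every component here is a deliberately simplified assumption, intended only to show how the stages compose.

```python
# Illustrative end-to-end sketch of the User > ASR > NLU > DM > NLG > TTS
# pipeline. ASR and TTS are stubs; all behaviour is a simplified assumption.
def asr(audio):
    return audio  # stub: pretend the audio is already transcribed text

def nlu(text):
    return {"intent": "set_alarm" if "alarm" in text else "unknown"}

def dm(nlu_result):
    return {"action": nlu_result["intent"]}

def nlg(action):
    return "Alarm set." if action["action"] == "set_alarm" else "Sorry?"

def tts(text):
    return text  # stub: a real TTS stage would synthesise audio

def pipeline(audio):
    # Each stage's output is the next stage's input, as in the diagram.
    return tts(nlg(dm(nlu(asr(audio)))))

print(pipeline("please set my alarm for 6am"))
# → Alarm set.
```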

When I met Dr Rieser and Dr Liu from Heriot-Watt's School of Mathematical and Computer Sciences at a natural language generation conference, we realised that there was an opportunity to work on this project together. The Data Lab was instrumental in making that happen by helping fund Emotech's collaboration with Heriot-Watt, and therefore making Olly a reality.

Dr Pawel Swietojanski, research scientist at Emotech

AI is an increasingly popular topic but often people are unclear on how it can be used. This project demonstrates how the technology can be applied across different industries. For instance, we are currently researching how the research-based, data-driven NLG can be adapted to industrial products such as Olly. It’s great that The Data Lab is facilitating this type of research in Scotland.

Dr Liu of Heriot-Watt University

Mailing List

To receive updates from The Data Lab, please enter your details below.