Chatbots demonstrate unexpected range of uses, from mental health to education

Chatbots, software that uses natural language to communicate with people through text or voice, are proving their value in a surprising variety of ways – from accountancy support to mental health

Chatbots have become a common alternative to web browsing for finding information or carrying out routine tasks. But for some users, they are having a more profound impact.

In her 2017 book, To Siri with Love, Judith Newman writes about how her teenage son Gus, who is autistic, built a close relationship with Apple’s Siri voice chatbot. It tirelessly and politely answered his questions about weather conditions and public transport, which Newman believes improved his conversations with humans and made him happier in general. At one point, her son proposed marriage to Siri, receiving the graceful turn-down: “My end user agreement does not include marriage.”

Through talking to Siri’s creators, Newman came to realise the vast amount of work that goes into allowing general-use chatbots to respond with apparent intelligence.

“Language is one of the hardest problems in artificial intelligence,” says Daniel Polani, professor of artificial intelligence at the University of Hertfordshire. “If you see language as an iceberg, with humans there is a small percentage above water and the rest is underwater. With chatbots, it’s made of Styrofoam – it’s all above water.”

In other words, chatbots are good at handling questions they have been programmed to answer, not at understanding knotty problems. This means they lend themselves to specific applications. 
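One way to picture Polani’s iceberg: many task-focused chatbots amount to little more than a lookup from programmed keywords to canned answers, with nothing underneath. The toy sketch below (the intents and replies are invented for illustration) handles the questions it was built for and falls straight through on anything else.

```python
# Toy illustration of keyword-based intent matching. Everything the bot
# "knows" sits in this table; the whole iceberg is above the water.
# The intents and replies are invented for illustration.
INTENTS = {
    ("weather", "rain", "forecast"): "Today's forecast: light rain, 14C.",
    ("bus", "train", "timetable"): "The next bus leaves at 10:15.",
}

def reply(message: str) -> str:
    words = set(message.lower().replace("?", "").split())
    for keywords, answer in INTENTS.items():
        if words & set(keywords):  # any programmed keyword present?
            return answer
    # Anything outside the programmed set falls straight through.
    return "Sorry, I don't understand."

print(reply("Will it rain today?"))        # matches the weather intent
print(reply("Why do humans tell jokes?"))  # knotty problem: no match
```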

Accounting chatbots

Software provider Intuit has introduced a question-answering chatbot to help users of its QuickBooks accountancy software.

QuickBooks Assistant was primarily intended to provide an easy route to users’ own data, such as their expected tax bill. But Shaun Shirazian, UK head of product management, says he has been surprised by the volume of questions on how to use the product – as well as people asking odd things such as whether Assistant loves them. Siri isn’t the only chatbot getting chatted up.

Shirazian says QuickBooks Assistant was based on the problems customers reported through other support methods, such as email and its call centre. Intuit measures how long it takes to resolve problems: “It’s seconds with the assistant, versus minutes through alternative ways,” he says. Users are asked whether the response answered their question, and can pose follow-up questions if it did not. The company uses machine learning to analyse this feedback and make improvements.

The company, which has made QuickBooks Assistant available to users of its software for self-employed workers in Canada, the UK and the US, is considering making the chatbot proactive.

“We could reach out when the customer needs some help,” says Shirazian, adding that the company is aware of the need to do so without being intrusive or annoying. The next step is for chatbots to understand the full context in which queries are made. “We’re not there yet, but it’s soon to come,” he adds.

Education chatbots

Chatbots can be used to support education. For the past three years, BI (Bedriftsøkonomisk Institutt) Norwegian Business School has been developing its use of a chatbot from Norwegian startup Differ to help students engage with their courses.

“A study in 2014 showed that teachers at BI struggled to find ways of engaging students in large classes,” says Anne Swanberg, director of the school’s LearningLab. “The bot has been an important part in testing how increased student engagement can be automated.” This includes introducing students to each other when they first log in, to encourage them to collaborate.

Overall, Swanberg says the chatbot has led to 19 times more engagement in terms of the number of participating students, compared with its previous solution.

“I think that the main advantage is automatic and personalised follow-ups of students,” says Swanberg, such as answering questions immediately at any time and being able to react to specific behaviour, or lack of it, by each student. “This market is still in its infancy, but together with Differ we have found clear indications when a bot can and should be used for educational purposes.”

Another Differ user, NKI Online Studies, found its use increased the completion rate of one health and care course from 22.5% to 67.6%.

Chatbots for mental health

The potential for chatbots in mental health was first considered more than half a century ago, when Joseph Weizenbaum, a professor at the Massachusetts Institute of Technology, built the pioneering natural language program Eliza.

This reframed a human user’s statements as questions and otherwise prompted the speaker to continue, in the style of person-centred therapy developed by psychologist Carl Rogers. Confiding to a web version of Eliza that “I am having problems with my chatbot” produces the reply “Did you come to me because you are having problems with your chatbot?”
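Eliza’s trick can be sketched in a few lines of code: match the statement against a pattern, swap first-person words for second-person ones, and hand the text back as a question. The sketch below is a minimal illustration in the spirit of Weizenbaum’s program, not his original implementation.

```python
import re

# Minimal Eliza-style reflection: match a statement pattern, swap first-
# and second-person words, and return the text as a question.
REFLECTIONS = {
    "i": "you", "my": "your", "am": "are", "me": "you", "mine": "yours",
}

def reflect(fragment: str) -> str:
    # Swap pronouns word by word so "my chatbot" becomes "your chatbot".
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(statement: str) -> str:
    # One rule from the person-centred repertoire: reframe "I am X" as a question.
    match = re.match(r"i am (.*)", statement.strip().rstrip("."), re.IGNORECASE)
    if match:
        return f"Did you come to me because you are {reflect(match.group(1))}?"
    # Otherwise, a neutral prompt that keeps the speaker talking.
    return "Please tell me more."

print(respond("I am having problems with my chatbot"))
# Did you come to me because you are having problems with your chatbot?
```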

Woebot is a more sophisticated mental health chatbot, based on cognitive behavioural therapy (CBT), a talking therapy widely used for depression, anxiety and other disorders. Available through Apple and Android devices or Facebook Messenger, it focuses on coaching and tracking users’ moods over time.

“This is a chatbot built by psychologists rather than technologists,” says Woebot Labs’ founder and chief executive, Alison Darcy, adding that about 80% of what it says is scripted. It does not aim to provide diagnoses. “We’re much more on the helpful coaching side. Woebot is an automated guide to a self-guided programme,” she says.
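Darcy’s description suggests a largely scripted flow with mood tracking layered on top. The sketch below illustrates that general idea only; the prompts, the 1-to-5 scale and the storage are invented and bear no relation to Woebot’s actual design.

```python
import datetime as dt

# Hypothetical sketch of a scripted check-in that tracks mood over time.
# The prompts and scale are invented for illustration.
SCRIPT = [
    "Hi! Time for a quick check-in.",
    "On a scale of 1 (low) to 5 (great), how is your mood right now?",
]

mood_log: list[tuple[dt.datetime, int]] = []

def record_mood(score: int) -> str:
    # Store each score with a timestamp so trends can be reviewed later.
    mood_log.append((dt.datetime.now(), score))
    recent = [s for _, s in mood_log[-7:]]
    average = sum(recent) / len(recent)
    return f"Logged. Your average over your last {len(recent)} check-ins is {average:.1f}."

for prompt in SCRIPT:
    print(prompt)
print(record_mood(4))
```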

From Woebot’s robotic logo and name, to references made in chat, users are reminded that they are talking to software rather than a human. “There’s a huge section of the population that is put off by talking to another person,” says Darcy. “People deserve an anonymous place where they can share things, which is sensitive but where they are not judged by another person. People can get something off their chests, without managing the impression.”


She sees potential in providing advice on other sensitive subjects, such as drug abuse and sexually transmitted diseases.

Woebot Labs is backed by venture capital from the Andrew Ng AI Fund and New Enterprise Associates. Access to the chatbot is currently free, after an early experiment with charging ended, although Darcy says the business ultimately has to be sustainable and there has been interest from health systems, employers and pharmaceutical companies.

She points to the rising volume of mental health problems, including in poorer countries, saying they are growing at such a pace that fully automated tools are going to be the only way to provide first-line treatment, a gateway that could lead some people on to human therapists.

John Torous, co-director of the digital psychiatry programme at Beth Israel Deaconess Medical Center in Boston, says there is potential for chatbots in mental healthcare, and he has seen some impressive efforts. But he thinks there is still work to do.

“Sometimes, getting it to tell you the weather can be a struggle,” he says of the likes of Siri or Amazon’s Alexa. “Making it have a meaningful, deep connection that helps you work through psychological issues in a relatable way that’s personalised to you is a really hard challenge. You could say it’s the hardest challenge for any conversational agent or chatbot.”

Torous says there are some impressive efforts in asking routine questions with limited personalisation, but so far little scientific evidence of benefits that can be reproduced. “We have a lot of exciting claims,” he says. “We have all the pieces, but assembling the pieces is going to take some research, some time, some effort, some failure, some successes.

“Anyone that doesn’t have a direct conflict of interest, who isn’t trying to sell you or pitch you something, will admit that we’ve a lot to learn about how these things work,” he adds.

#MeToo

One option is for chatbots to focus on very specific tasks. University of Hertfordshire’s Daniel Polani says that helping users to record their thoughts and feelings is an area with potential. “As long as it’s an honest system that doesn’t pretend to be intelligent, it can be helpful as a kind of structured diary,” he says. “It can help formulate what you want to say.”

This is the aim of Spot, a chatbot developed to record workplace harassment and discrimination. It takes users through a structured cognitive interview, a process developed to help interviewees remember more about what they are reporting. Apart from a free-recall section at the start, where the system can add supplementary questions based on keywords in what the interviewee types, it uses a standard script.
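That flow, free recall first and a fixed script after, can be sketched as follows. The keywords and questions below are invented for illustration; this is not Spot’s implementation.

```python
# Sketch of a structured interview in the style the article describes:
# free recall, keyword-triggered follow-ups, then a standard script.
# The keywords and questions are invented for illustration.
FOLLOW_UPS = {
    "meeting": "Who else was present at the meeting?",
    "email": "Do you still have a copy of the email?",
}
SCRIPT = [
    "When did this happen?",
    "Where did it take place?",
    "Has anything similar happened before?",
]

def interview(free_recall: str) -> list[str]:
    questions = []
    # Supplementary questions keyed off words in the free-recall answer.
    for keyword, question in FOLLOW_UPS.items():
        if keyword in free_recall.lower():
            questions.append(question)
    # Then the standard script, which a bot never deviates from.
    questions.extend(SCRIPT)
    return questions

for q in interview("It happened after a team meeting last week."):
    print(q)
```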

Julia Shaw, co-founder of Spot and a research associate at University College London, says chatbots are better than humans at sticking to this kind of script, but there are other advantages. “By having a bot rather than a person, people may be more likely to open up, as they’re not being judged,” she says.

It also provides a greater degree of anonymity, as the person reporting does not have to meet anyone, with the resulting reports being sent directly to employers from Spot’s servers. Additionally, a chatbot is available at any time of the day or night and is scalable; Spot can collect more reports without having to train more interviewers.

Chatbots also have advantages over online forms, says Shaw. “For important emotional experiences like harassment or discrimination at work, it’s important to make sure that people feel comfortable and safe discussing an issue,” she says. “A chatbot is probably more likely to be able to generate a basic sense of an interaction, rather than just filling out a bunch of questions.” Also, people filling out a form tend to provide the shortest answers possible, particularly if they can read later questions. With Spot, Shaw says, people at times give very extensive answers.

The resulting report, which is compiled from what is said to the chatbot, can be downloaded by the individual and is offered to the employer. The report is deleted after 30 days.

Spot, which was set up with San Francisco AI startup studio All Turtles, plans to continue to offer interviews and reports free to individuals, but is about to launch an optional paid-for service for employers. This will help them organise reports and look for trends through a dashboard, as well as allow them to respond to the individual while preserving anonymity.

Shaw says similar chatbots could be used for some kinds of crime reporting and by children being bullied in schools. They could also be used to help people remember more pleasant events. “We help people remember the negative parts of their lives, but what if people wanted to remember the positive parts? There’s no reason why a cognitive bot couldn’t do that as well,” she says.
