'Alexa, how is NASA using chatbots?'
- By Matt Leonard
- Sep 27, 2018
The Technology Evaluation and Infusion Team at NASA's Jet Propulsion Laboratory is looking to understand conversational interfaces like Alexa, Cortana, Siri and Google Home, along with various text-based chatbots. NASA is testing these tools to see how well they can automate internal processes like scheduling a meeting, looking up contract information or answering other task-specific questions.
JPL's IoT team functions like a consultancy within the agency, exploring how commercial technologies can be tapped for internal use, explained team lead Michael Cox.
"We're on the cutting edge when it comes to landing rovers on Mars, but we're not on the cutting edge … in how we get our work done on the IT-side sometimes," Cox said in an interview.
These devices are becoming increasingly common, with one report estimating that more than 47 million Americans have a smart speaker in their home, and many smartphones come with AI-powered assistants. This penetration in the marketplace means people are more comfortable with the technology and ready to use it when they get to work, too, Cox said.
JPL started out by primarily using Alexa for these projects, but has moved a majority of its virtual assistants onto Amazon Lex, a cloud-based backend for building chatbots.
"In addition to all of the stuff that Alexa gives you, Lex gives you the ability to do text-based queries instead of just audio," Cox said.
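Under the hood, Lex-style bots work by matching an utterance to an "intent" and extracting "slots" (parameters) from it, whether the input arrives as transcribed speech or plain text. A minimal, self-contained sketch of that idea in Python — the intent names and patterns here are illustrative, not JPL's actual bot configuration:

```python
import re

# Illustrative intent definitions: each regex maps an utterance to an
# intent name, with named groups acting as the intent's "slots".
INTENTS = {
    "BookRoom": re.compile(
        r"(?:book|reserve) a (?:conference )?room at "
        r"(?P<time>\d{1,2}(?::\d{2})?\s?(?:am|pm))", re.I),
    "LookupContract": re.compile(
        r"look up contract (?P<contract_id>\w+)", re.I),
}

def classify(utterance: str):
    """Return (intent_name, slots) for the first matching intent, or (None, {})."""
    for name, pattern in INTENTS.items():
        match = pattern.search(utterance)
        if match:
            return name, match.groupdict()
    return None, {}

print(classify("Please book a conference room at 3pm"))
# → ('BookRoom', {'time': '3pm'})
```

A production bot would hand the recognized intent and slots to backend code (a calendar API, a contracts database) rather than to regexes, but the text-in, structured-intent-out flow is the same.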
While voice interfaces are very popular in consumer devices, they are often not the best option, whether because someone doesn't want their coworkers to hear them or because they're looking for information while in a meeting. This is where chatbots come in handy.
Chatbots have proven useful for simple productivity tasks like booking a meeting or remembering phone numbers. JPL is also piloting Slack and has used Lex to create a room-bot for it: an employee tells the bot they need a conference room at a specific time, and the bot responds with what is available.
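The room-bot pattern described above — tell the bot a time, get back the free rooms — reduces to a lookup against a booking table. A small sketch of that logic in Python, with made-up room names and bookings standing in for a real calendar backend:

```python
from datetime import time

# Hypothetical booking table: room name -> list of (start, end) busy windows.
BOOKINGS = {
    "180-101": [(time(9), time(10)), (time(14), time(15))],
    "321-B20": [(time(13), time(16))],
    "198-C2":  [],
}

def available_rooms(start: time, end: time):
    """Return rooms whose busy windows don't overlap the requested slot."""
    return [
        room
        for room, busy in BOOKINGS.items()
        # A room is free if every busy window ends before the request
        # starts or begins after it ends.
        if all(end <= b_start or start >= b_end for b_start, b_end in busy)
    ]

# The Slack handler would format this list as the bot's reply.
print(available_rooms(time(14), time(15)))
# → ['198-C2']
```

In a real deployment the Lex intent's time slot would feed `start`/`end`, and the reply would be posted back through Slack's messaging API.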
The JPL acquisition team is currently beta testing another automation tool. People within acquisition came to the Technology Evaluation and Infusion Team with a set of 11 questions they must answer repeatedly each day. Cox and his team built an Alexa skill to answer these questions, which saved "around 70 percent of the time they previously spent asking and answering those questions."
The tool is so much faster that other acquisition staff have started asking the 10 beta testers to run queries on their behalf.
"It's so much more practical and efficient … that other members of acquisition are actually calling these [beta testers] with their questions so that they can put them into this intelligent assistant, get the answer out, tell them an answer over the phone and then they have the answer," Cox said. "And that is still faster than the old way of doing things: logging into two or three systems and getting your answer out."
Many of the text-based assistants are accessed through a web portal that leads to a large chat screen. But JPL is also looking at using text messages as the channel for interacting with the virtual assistants. The goal is to have a backend, like Lex, that can stay consistent even as the user interfaces in front of it evolve.
Right now, though, these are all just proof of concept projects. The next step, Cox said, is making them into final products to be used across JPL.
"We want to make these chatbots and [Intelligent Digital Assistants] ubiquitous and a part of the everyday workflow for all JPLers," Cox said.
Matt Leonard is a former reporter for GCN.