Update (11/05/2018): Google’s unbelievably real demonstration of its Duplex AI talking like a human made people’s jaws drop. But at the same time, it raised ethical concerns. It was an example of how easy it has become to deceive the person at the other end; the human wouldn’t suspect they’re talking to a computer, because it’s not every day some AI calls you to book a haircut appointment.
Google has already acknowledged that transparency is a key part of technologies like these. Now, amid rising controversy, the search giant has made it clear that they’re working to introduce a disclosure and “make sure the system is appropriately identified”. What they showed at I/O was an early demo.
The original post continues below.
One of the biggest issues with our artificially intelligent digital assistants is their computerized voice, along with their inability to comprehend natural human conversation in real time. Hopefully, that might change in the near future.
Many among the audience of over 7,000 people at Google I/O were on the edge of their seats when Google Assistant made a real phone call to book a haircut appointment during Sundar Pichai’s keynote.
Google’s AI-powered digital assistant is now more natural than ever during conversations. What’s behind its new superpower is a technology called Google Duplex, which Google has been developing for many years. It enables Assistant to be a helping hand in real-life situations, like making restaurant reservations, checking holiday hours, and so on.
It’s better if a business has an online booking system. But if it doesn’t, Assistant can make a call in the background and talk to the human on the other side like it’s no big deal. All you need to do is provide the date and time.
Assistant also adds a reminder to your calendar after a successful booking is made. From there, you can cancel the reservation later if you change your mind.
To talk like humans, Assistant uses Google Duplex to understand complex sentences, fast speech, and long remarks. It makes sure that the intent is clear so that the other person can understand the context of the conversation.
We’ll have to wait for some time until Google brings the feature to Assistant; testing starts later this summer, and there is scope for improvement before it ends up on consumer devices. During training, there are times when Assistant makes a phone call but struggles when the conversation becomes unusually complex. Thanks to a built-in self-monitoring capability, it recognizes this and signals a human operator to take over the task.
The reason Google Duplex is so efficient at natural conversations is that it’s designed to “complete specific tasks, such as scheduling certain types of appointments.”
According to Google, one of the key aspects was constraining Duplex to closed domains which are narrow enough to explore extensively. It’s only after rigorous training in such domains that Duplex can have natural conversations. It can’t have general discussions in the same way.
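To make the closed-domain idea concrete, here is a minimal, purely illustrative sketch (not Google’s actual implementation; the intents and keywords are invented for this example). A closed domain like haircut booking only needs to recognize a narrow, enumerable set of intents, which is what makes exhaustive training and testing feasible; anything outside that set is treated as out-of-domain.

```python
# Illustrative sketch of a closed-domain dialogue handler.
# The domain: haircut appointment booking. Every utterance the system
# must handle maps to one of a few known intents.
INTENT_KEYWORDS = {
    "propose_time": ("what time", "when", "available"),
    "confirm": ("yes", "sure", "that works"),
    "ask_service": ("what service", "which service", "what kind"),
    "closed": ("we're closed", "closed that day"),
}

def classify(utterance: str) -> str:
    """Map an utterance to a domain intent, or flag it as out-of-domain."""
    lowered = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in lowered for keyword in keywords):
            return intent
    # Out-of-domain input: in a system like Duplex, this is roughly
    # where self-monitoring would hand the call to a human operator.
    return "handoff_to_human"

print(classify("What time were you thinking?"))      # propose_time
print(classify("Do you take walk-ins on Sundays?"))  # handoff_to_human
```

The point of the sketch is the shape of the problem, not the (deliberately naive) keyword matching: because the set of intents is small and fixed, the system can be trained and evaluated exhaustively within the domain, while anything unexpected falls through to a human.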
The tech may appear useful on the user’s side, but it sounds even handier (and like an example of AI taking human jobs) for businesses that rely on phone calls to book appointments. They receive numerous calls every day, and if Google Assistant is on the line, things may be handled more efficiently.
Check out more interesting stuff from Google I/O.