Last week at Google’s annual I/O conference, CEO Sundar Pichai demoed Google Duplex, a new feature of Google Assistant. The feature allows the Artificial Intelligence (AI) to make calls on behalf of the owner — though the AI sounds more like your older sister than Siri.
The demo featured two calls. The first call was to a beauty salon, where the AI was instructed to make an appointment. During the call, the voice assistant scheduled an appointment without the receptionist knowing she was not conversing with a human. The AI displayed human-like speech inflections during the call, as well as insertions of common filler words such as “um” and “mm-hmm.”
The second call, however, displayed more of the technology’s reach. The AI called a restaurant to make a reservation, but the call didn’t go as planned. The hostess at the restaurant didn’t seem to understand the AI’s request. The AI responded impressively to the hostess’s repeated questions and grasped the overall direction of the call: the restaurant didn’t require reservations for parties of four. The voice assistant even asked the hostess whether the restaurant was typically busy during the hours the group planned to dine, and whether they would incur a wait.
The technology displayed by Pichai was without a doubt astounding; however, many are calling into question the ethics of Google Duplex. The first issue, widely criticized by the public, was that the call recipients were unaware they were not speaking with an actual person. Many called out Google for not including a prompt at the start of the call to make the business aware it was speaking with Google Assistant.
The second ethical issue was the uncanny human-like quality of the AI’s voice. When Pichai was describing the technology, I expected to hear a distant cousin of Siri, not my best friend. The AI voice was almost indistinguishable from a human voice. The inclusion of filler words such as “um,” “mm-hmm,” and “ah” made the voice assistant sound less formal and therefore more human-like. After all, what human is overly formal while scheduling a hair appointment?
The AI calls prompted such a controversial response that Google released a statement following the demonstration stating:
We understand and value the discussion around Google Duplex — as we’ve said from the beginning, transparency in the technology is important. We are designing this feature with disclosure built-in, and we’ll make sure the system is appropriately identified. What we showed at I/O was an early technology demo, and we look forward to incorporating feedback as we develop this into a product.
The question still remains: if the AI will identify itself at the beginning of the call, why must the technology sound like a human? If the AI was not built to deceive humans, why have it portray human-like behavior during the calls?
Google is marketing the technology as an asset for both consumers and businesses that do not have an online scheduling system. Pichai explained that the AI technology will allow Google to update local business hours more efficiently, especially around holidays when businesses tend to receive an influx of calls from customers inquiring about their holiday hours. This aspect of Duplex will be extremely helpful to businesses that do not update their online hours regularly.
I am interested to see how Google reprograms Duplex after the feedback it received from the demonstration. I believe it would be in Google’s best interest to remove the filler words and human-like inflections from the AI voice, to calm the public so they can appreciate the technology for what it is.
Contact The TNS Group today to stay up to date on the latest tech trends.