Have more natural conversations with Google Assistant

Looking ahead: more natural conversation

In everyday conversation, we all naturally say “um,” correct ourselves and pause occasionally to find the right words. But others can still understand us, because people are active listeners and can react to conversational cues in under 200 milliseconds. We believe your Google Assistant should be able to listen and understand you just as well.

To make this happen, we’re building new, more powerful speech and language models that can understand the nuances of human speech — like when someone is pausing, but not finished speaking. And we’re getting closer to the fluidity of real-time conversation with the Tensor chip, which is custom-engineered to handle on-device machine learning tasks super fast. Looking ahead, Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, “umms” and interruptions — making your interactions feel much closer to a natural conversation.
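
To make the idea of "pausing, but not finished speaking" concrete, here is a rough, illustrative sketch in Python of pause-aware endpointing — the decision of whether a silence means the person is done talking or just thinking. It is not Google's implementation; the `PartialResult` structure, the `completeness_score` stand-in for a language model, and the thresholds are all hypothetical, chosen only to show the shape of the logic.

```python
# Illustrative sketch of pause-aware endpointing — not Assistant's actual models.
# Assumes a streaming recognizer that yields the partial transcript so far and
# the silence duration since the last spoken word (hypothetical interface).

from dataclasses import dataclass


@dataclass
class PartialResult:
    transcript: str    # text recognized so far, e.g. "set a timer for umm"
    silence_ms: float  # milliseconds of silence since the last word


def completeness_score(transcript: str) -> float:
    """Hypothetical stand-in for a language model that rates how finished the
    utterance sounds (1.0 = sounds complete, 0.0 = clearly mid-thought)."""
    trailing_fillers = ("umm", "um", "uh", "for", "to", "the")
    words = transcript.strip().split()
    last_word = words[-1].lower() if words else ""
    return 0.1 if last_word in trailing_fillers else 0.9


def should_respond(result: PartialResult,
                   short_pause_ms: float = 200.0,
                   long_pause_ms: float = 1500.0) -> bool:
    """Decide whether a pause means 'done talking' or 'still thinking'.

    A short pause only ends the turn if the words so far sound complete;
    a very long pause ends it regardless, so the Assistant never hangs."""
    if result.silence_ms >= long_pause_ms:
        return True
    if result.silence_ms >= short_pause_ms:
        return completeness_score(result.transcript) > 0.5
    return False


# Example: a trailing "umm" keeps the mic open through a short pause.
print(should_respond(PartialResult("set a timer for umm", silence_ms=400)))        # False
print(should_respond(PartialResult("set a timer for ten minutes", silence_ms=400)))  # True
```

In practice this decision would come from learned speech and language models running on-device rather than hand-written rules, but the trade-off is the same: react quickly when someone is clearly finished, and wait patiently when the words suggest more is coming.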

We’re working hard to make Google Assistant the easiest way to get everyday tasks done at home, in the car and on the go. And with these latest improvements, we’re getting closer to a world where you can spend less time thinking about technology — and more time staying present in the moment.
