The first thing Android developers say to me when they learn that I give talks about the Google Assistant is: "OH! I have this awesome app with this use case, and I totally want to integrate it with the Google Assistant so I can do hands-free actions inside my app!" This is a genuine request with a complicated answer. There are several tools out there that you can use to address this problem: Dialogflow, TensorFlow (Mobile, Lite), ML Kit, Cloud APIs, AutoML. The most impressive thing? They're all Google solutions! If you're not confused yet, take a look at Google Maps, which "integrates" with the Google Assistant when you're in navigation mode, and at App Actions, which are supposed to help Android developers take advantage of Actions on Google. In this talk, I'm *not* talking about integrating your app with the Google Assistant, but rather about how Android developers can add voice actions to their app (without the Google Assistant). I'll then talk about both ecosystems (Android and Assistant) and how they interact with each other.
Elaine has been working in mobile app development for the past 6 years. Since the launch of the Google Assistant, she has been following developments in that area. She truly believes that interacting with technology using natural language will define the future of computing. Born and raised in Brazil, she has been living in France since 2004 and loves everything multicultural. She's a GDE for the Google Assistant.