NEXT-GEN ASSISTANT

Defining the future of Google's Assistant on Android


Background

Project: Android & Assistant
Role: UX Lead
Year: 2018-2020

Much like our search experience, Android and the Assistant were treated as separate teams and thus separate, disjointed experiences. As Android's ecosystem of devices grew, it became necessary to evolve the Assistant's visual language and provide deeper integration.


I served as a UX lead on Android and worked closely with talented designers from the Android & Assistant team to craft the vision for Google's next-generation Assistant.

I led the UX for Android's native on-device & web search.

Traditionally, the Google search experience was solely a web experience that acted only as a text-based auto-complete. Android had a separate on-device search that let users search only for apps on their phone. We sought to unify and improve these two disjointed experiences.

A Pivotal Moment


We wanted to create a vision for Google's Assistant that could not only scale across our ecosystem of devices but also provide deeper system-level integration. The release of Google's own on-device chip allowed us to do just that.


Users could now interact with the Assistant more naturally through continued conversations and control their devices right from the Assistant.


Goal

Create an integrated Assistant that can scale across Android’s growing ecosystem while enabling users to engage in more natural & continuous conversation.


Principles


I worked closely with several designers from the Assistant team (Adrian, Woonji, Remington, Ye, Johnathan) to help establish the vision and design language for Google's Assistant.


We set out three core design goals when thinking about the Assistant:

  1. Adaptable (Scalable to different devices)

  2. Preserve context (Lightweight and enable follow-up conversations)

  3. Easy to access (Consistent way of engaging with the Assistant)


Establishing a coherent spatial model was critical to understanding how users engage with the Assistant. Prior to this, Android treated the Assistant as an app that appeared in the Overview.

Process


Early on, we organized a cross-functional, cross-org sprint to brainstorm ideas in this space. Although we considered several candidates, many of them restricted the Assistant to a fixed region (background layer, status bar, etc.), which ultimately clashed with the direction we saw the Assistant moving toward.


I built a prototype to demonstrate how the Assistant could be implemented at the system level across Android. That prototype was demoed to leadership, which ultimately gave the green light to launch the new minimal Assistant design language.

Invoking the Assistant


One of our core principles was to make the experience as minimal as possible on Android, ensuring users could have the Assistant interact with on-screen content when needed. We matched the Assistant's appearance (light or dark) to the background of the context users were in.


In the past, users invoked the Assistant by holding down the home button on the 3-button navigation bar. As we continued to give more screen real estate back to users, we had to provide an alternative way to invoke the Assistant. We explored several gestures (such as a corner swipe) but ultimately landed on a long press of the power button.


This also required working with our OEM partners to align on our overall hardware-button strategy (not just for the Assistant), since every Android manufacturer took a different approach.

Outcome


The new Google Assistant launched alongside the Pixel 4 and continues to be a priority for the company. The design has since scaled to the Pixel Watch, foldables, tablets, TV, Nest Hubs, and more, and remains the foundation for Google's long-term strategy of a deeper, more personal Assistant.