Hi, I'm Dan Aminzade, an engineer on the Android Wear team. If you build Android apps, you may be wondering how to update your app to work better on a wearable form factor. To give you a better idea of the changes you might want to make, in this talk I'll tell you how we updated Google's own apps to better support Android Wear. I'm going to focus on four apps that showcase different aspects of the Android Wear API: Gmail, Hangouts, Camera, and Google Maps. Each of these apps illustrates a different set of features.

Let's start with Gmail. Gmail generates two types of notifications on your phone: a notification like this, when you have one email, and a notification like this, when you have multiple unread emails.
The notification for one email has a snippet of the email and two actions: archive and reply. Without changing Gmail at all, the notification would be bridged from phone to wearable and would show up as a card in the stream. You could swipe on the card to select reply or archive. The problem is that the reply action would do exactly what it did when selected on the phone: it would open up the Gmail app on your phone so you could type a response.

So the first thing we wanted to do was to allow people to speak quick voice replies to their watch, so they can answer emails directly on their wrist. The Android notification API now lets you annotate a notification action with a remote input, which tells Android Wear that you'd like to collect a bit of free-form text via speech before performing the action. When we build a notification in Gmail, we attach a remote input to the reply action. Android Wear sees this remote input, and instead of firing the action immediately, it first launches a transcription UI to collect a spoken response. Then it puts the transcribed text into the intent before firing the intent on the phone.
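With the support library's NotificationCompat, attaching a remote input to a reply action looks roughly like this. This is a sketch, not Gmail's actual code; KEY_QUICK_REPLY, replyPendingIntent, and the resource IDs are illustrative names.

```java
// Sketch: annotate a reply action with a RemoteInput
// (android.support.v4.app.RemoteInput / NotificationCompat).
RemoteInput remoteInput = new RemoteInput.Builder(KEY_QUICK_REPLY)
        .setLabel("Reply")
        .build();

NotificationCompat.Action replyAction =
        new NotificationCompat.Action.Builder(
                R.drawable.ic_reply, "Reply", replyPendingIntent)
        .addRemoteInput(remoteInput)
        .build();

Notification notification = new NotificationCompat.Builder(context)
        .setSmallIcon(R.drawable.ic_mail)
        .setContentTitle(subject)
        .setContentText(snippet)
        .addAction(replyAction)
        .build();

// Later, in the activity that handles the reply intent:
Bundle results = RemoteInput.getResultsFromIntent(intent);
if (results != null) {
    CharSequence reply = results.getCharSequence(KEY_QUICK_REPLY);
    // Spoken reply available: send the email without showing any UI.
} else {
    // No remote input populated: show the compose window as usual.
}
```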
When Gmail starts up and sees this text, it knows that it has what it needs to complete the reply action, and it can go ahead and send the email without bringing up any UI on the phone. So adding voice replies just required two small code changes: first, changing the reply action to include a remote input; and second, modifying the activity that received the intent from this action to see whether the intent included a text response. What's cool is that the notification experience on the phone is unchanged. If you select the reply action in the phone's notification shade, Gmail starts the same activity. But since the remote input isn't populated, it displays the compose window as usual to let you type a response.

Now, let's look at the multiple message notification, which uses the multi-line inbox style. Instead of putting a bunch of short lines of text onto a single card, we wanted to have one card for each email, in an expandable stack. These bundles, or stacks of notification cards, are a new feature in the notification API. Instead of posting one notification with all of the email threads coalesced, we post multiple notifications, one per email. But all of these notifications are marked with the same group key, indicating that they're related and should be grouped as a card bundle on the wearable. Now we get a nice stack of cards, which you can fan out by tapping the bundle, and read the cards individually. Notice that each of the cards in the bundle can also have its own actions, so we can now reply to or archive a single email from within the group, just as we did with a single email. Notification bundles also have a sort key, which you can set to control the ordering of cards within the bundle. And you can mark one notification in the bundle as a summary notification, which is representative of the group as a whole. For Gmail, we flag the original inbox-style notification as the summary.
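In NotificationCompat terms, the bundle setup might look like this. It's a sketch: the group key, sort key, and text fields are illustrative, not Gmail's real values.

```java
// All notifications with the same group key form one bundle on the wearable.
static final String GROUP_KEY_EMAILS = "group_key_emails"; // illustrative

// One notification per email.
Notification perEmail = new NotificationCompat.Builder(context)
        .setSmallIcon(R.drawable.ic_mail)
        .setContentTitle(sender)
        .setContentText(snippet)
        .setGroup(GROUP_KEY_EMAILS)
        .setSortKey(sortKey)          // controls ordering within the bundle
        .build();

// The inbox-style notification is flagged as the group summary;
// the summary is what the phone's notification shade displays.
Notification summary = new NotificationCompat.Builder(context)
        .setSmallIcon(R.drawable.ic_mail)
        .setContentTitle(unreadCount + " new messages")
        .setStyle(inboxStyle)
        .setGroup(GROUP_KEY_EMAILS)
        .setGroupSummary(true)
        .build();
```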
The summary notification is displayed in the phone's notification shade, and the bundle without the summary is displayed on the wearable.

Next, let's look at Hangouts. Hangouts messages are automatically bridged to the wearable too, but we wanted to make some changes to fine-tune the experience for Android Wear devices. As with Gmail, we wanted voice replies. But the Hangouts notifications are a little bit different: they don't have a reply action, just a content intent that opens up the app so you can type a response. This shows up as an "open on phone" action on the wearable. Fortunately, the notification API now lets you specify a distinct set of actions on phone and wearable. The phone actions will only be displayed on the phone, and the wearable actions will only be displayed on the wearable. This allowed us to add a wearable-only reply action, which includes a remote input, without changing the phone behavior.

Hangouts also uses another new notification feature: notification pages. We thought it would be useful if you could swipe sideways on the main message card to see a second page with the recent conversation history. This isn't really necessary on the phone, because you can just open the app to see the conversation history. But on the wearable, it's nice to get a little extra context before speaking a response. To do this, we use the addPage method of WearableExtender, which allows you to add extra pages of content to the primary notification. We put the chat history into a second big-text-style notification and add it to the primary notification as a second page. Once again, the notification experience on the phone is unchanged, but on the wearable we get the second page of content showing the chat history. Here's a full example of Hangouts on the wearable.
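Putting the wearable-only reply action and the second page together, the Hangouts-style setup might look roughly like this. It's a sketch; names like KEY_REPLY, replyPendingIntent, and the text variables are illustrative.

```java
// A reply action with a RemoteInput, added only for the wearable.
NotificationCompat.Action wearReply =
        new NotificationCompat.Action.Builder(
                R.drawable.ic_reply, "Reply", replyPendingIntent)
        .addRemoteInput(new RemoteInput.Builder(KEY_REPLY).build())
        .build();

// A second notification carrying the recent conversation history,
// used purely as an extra page on the wearable.
Notification historyPage = new NotificationCompat.Builder(context)
        .setStyle(new NotificationCompat.BigTextStyle()
                .bigText(conversationHistory))
        .build();

Notification notification = new NotificationCompat.Builder(context)
        .setSmallIcon(R.drawable.ic_chat)
        .setContentTitle(senderName)
        .setContentText(latestMessage)
        .extend(new NotificationCompat.WearableExtender()
                .addAction(wearReply)    // shown only on the wearable
                .addPage(historyPage))   // extra page on the wearable
        .build();
```

Actions added through WearableExtender don't appear in the phone's notification shade, which is what lets the phone behavior stay unchanged.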
The full example is a picture message, with one page for just the photo, one page with the chat history, and a reply action for recording a quick voice reply.

Let's move on to Camera. We wanted to add a fun little feature to Google's camera app that would let you trigger a shutter release from your wrist. You've probably seen high-end cameras with remote controls, and the idea here is the same.
You put your phone on a tripod, or maybe you lean it against a wall, or you give it to someone to hold, and then you capture a photo by tapping a button on your watch. With Gmail and Hangouts, we did entirely phone-side integrations, using only the notification APIs. But for this use case, it made sense to build a wearable app, for a couple of reasons. First, it wouldn't really make sense for the camera app to post notifications to the phone's shade when it was already running full screen, so we knew that the behavior on the phone and the wearable had to be asymmetric. Also, we thought that for this use case it would be appropriate to take over the entire screen of the wearable with the shutter button, instead of confining the button to a card in the stream.

So we built a simple app to run on the wearable that communicates with the main camera app using Google Play services. When the camera app is ready to take a picture, it sets a data item indicating that it's ready to accept remote shutter messages. This data item is read by a service within the wearable app on the watch, which displays the shutter button. Tapping the button sends a message back to the phone to trigger a shutter press.
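With the Google Play services Wearable APIs, that data-item-and-message exchange might be sketched like this. The /shutter paths, the key names, and the googleApiClient / phoneNodeId variables are illustrative, not the camera app's actual protocol.

```java
// Phone side: publish readiness as a data item.
PutDataMapRequest request = PutDataMapRequest.create("/shutter");
request.getDataMap().putBoolean("ready", true);
Wearable.DataApi.putDataItem(googleApiClient, request.asPutDataRequest());

// Wearable side: a WearableListenerService observes the data item and
// shows the shutter button. When the button is tapped, it sends a
// message back to the phone to fire the shutter.
Wearable.MessageApi.sendMessage(
        googleApiClient, phoneNodeId, "/shutter/trigger", null);
```

Data items are for synchronized state (is the camera ready?), while messages suit one-shot events (fire the shutter now), which is why the two halves use different APIs.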
Finally, to preview the photo you just took, the phone app creates a thumbnail and sends it back across to the watch as an asset within a data item. The wearable reads the asset and displays it full screen for preview.

Here's what it looks like end to end. I open the camera app on my phone, and I see a card in the stream. I can tap the card to launch a full-screen activity with the shutter button. I tap the shutter button, and after a countdown, my phone takes a photo and I see a preview on the wearable. Not a bad selfie.

Let's move on to Google Maps. During turn-by-turn navigation, we wanted to show direction prompts on your wrist. These can be especially useful when you're walking, since it's awkward holding up your phone,
and it can be more convenient to leave it in your pocket and refer to the turn descriptions on your watch. Because we wanted fine-grained control over the layout and presentation of the directions displayed on the wearable, we decided to build a Google Maps Wear app that renders custom-drawn cards as local-only notifications. We modified the Google Maps phone app to update a data item with the description and icon for the next maneuver, plus info about the navigation state. The Maps Wear app listens for changes to this data item; with each change, it reads the new data and updates the card on the wearable. To draw the card, the wearable app uses the new display intent feature of WearableExtender: you specify an activity to draw the content within a notification card.
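Using WearableExtender's setDisplayIntent might look roughly like this. It's a sketch; NavigationCardActivity is an illustrative name, not the real Maps class.

```java
// PendingIntent for the activity that draws the custom card content.
Intent displayIntent = new Intent(context, NavigationCardActivity.class);
PendingIntent displayPendingIntent = PendingIntent.getActivity(
        context, 0, displayIntent, PendingIntent.FLAG_UPDATE_CURRENT);

// Local-only, so the phone never bridges its own copy of this card.
Notification notification = new NotificationCompat.Builder(context)
        .setSmallIcon(R.drawable.ic_navigation)
        .setLocalOnly(true)
        .extend(new NotificationCompat.WearableExtender()
                .setDisplayIntent(displayPendingIntent))
        .build();
```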
This lets us draw whatever we want on the card, instead of being constrained to the standard notification styles. Cards in ambient low-power mode, or in this peeking state, still have to use standard notification templates. But when you swipe the card up into full view, the system cross-fades the card into a custom view drawn by the display activity. It's a bit of a bummer that you have to use standard styles in peek mode and in ambient mode, but Android Wear does introduce some new wearable-specific notification templates, like big action style and content icon style.

One last feature: we also wanted to let you start a navigation session by speaking a voice command, like "navigate to donuts." To do this, the Maps Wear app has an activity with an intent filter for the "navigate" voice action, which creates an intent like this. The Wear app receives this intent and sends a message to Google Maps on the phone that contains a destination and a travel mode. The Maps phone app receives this message and starts a navigation session to the destination. And off you go.

That's how we made Gmail, Hangouts, Camera, and Google Maps all ready for Android Wear.
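The Maps voice-action handling described above might be wired up roughly like this. It's a sketch: the activity name, the /navigate message path, and the plain-text payload are assumptions, and googleApiClient / phoneNodeId are assumed to be set up elsewhere.

```java
// Activity registered (in the manifest) with an intent filter for the
// "navigate" voice action: android.intent.action.VIEW with the
// google.navigation data scheme.
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    // The voice command arrives as a URI like google.navigation:q=donuts
    Uri data = getIntent().getData();
    String ssp = data.getSchemeSpecificPart();               // e.g. "q=donuts"
    String destination = ssp.startsWith("q=") ? ssp.substring(2) : ssp;

    // Forward the destination to Maps on the phone; the path and payload
    // encoding here are illustrative, not Maps' real protocol.
    Wearable.MessageApi.sendMessage(
            googleApiClient, phoneNodeId, "/navigate", destination.getBytes());
    finish();
}
```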

I hope this talk gave you a good overview of what you can do on the Android Wear platform, and I hope it inspires you to customize your own application to deliver a better wearable experience.