


t. v. raman: hello, everyone. thank you for coming to this session on android accessibility. i have three of my colleagues with me. i am t. v. raman.


i lead our work on android access. with me, i have alan viverette, charles chen, and peter lundblad. and what we are talking to you about is accessibility on the


android platform in terms of what android does for you with respect to increasing the breadth of users that you can cover with your applications. so you heard talks all day today in terms of how android does a lot of heavy lifting for you so that you, as an application developer, can focus on the logic and functionality of your applications. so today you don't need to worry about, am i writing for a small phone, big phone, tablet?


android does all of that work for you. what this talk is about is giving you yet another perspective on the huge variety of devices and users that you address when you build an android application. i use speech for all my work. i cannot see. my colleague, peter, uses braille for all his work. and what we are showing you is apis in the platform that actually allow us to build accessibility services that


let us use all of the applications you guys are building in manners that you would never have thought we would use them. so that's really our motivation here with android accessibility, is to increase the reach of these applications to users with different needs, with special needs. we've come a long way since our access work on android started in 2008.


a quick history to offer you in terms of how we've evolved and what the platform does. so our access work actually started in android 1.6. in 1.5, we released the text to speech api. and 1.6, basically, allowed users to navigate using system focus. and anything that you navigated to would speak. and you could use a lot of simple applications that way. we have since come a long way.


and in ice cream sandwich, at the end of last year, we released a feature we called touch exploration, which then allows me, as a blind user, to touch different parts of the screen and hear what is under my finger. this basically then completely opens up all aspects of the android user interface to a blind user. because now, with touch exploration, you could actually read everything on the screen, not just things that were focusable.


and that was a huge step for us. but what we had as a gap in ics, which we are now covering today, and we'll talk about this in greater detail, is the next step. so that was the short history lesson. let's talk about jelly bean. so there are significant access api enhancements in jelly bean. some of you have heard a quick overview of this during romain


guy's talk this morning. we've done a series of things that actually enable both spoken feedback access for blind users as well as braille, which peter will show you in a bit. but at a high level, here are the three things we are introducing today. we have introduced the notion of accessibility focus in android, which peter will talk about in detail, that allows blind users to reliably step through the interface.


so accessibility focus is something that can be put on any part of the user interface and, as a result, what is there will speak. we have also introduced this notion we call accessibility actions that you can then trigger from accessibility services, like talkback and brailleback, which peter will show you. what those actions let us do is we can then perform clicks and selections and things like that programmatically, which


then allows us to hook up esoteric devices like braille displays and other keyboards and whatever you can imagine. and finally, we have introduced a set of gestures-- this wasn't mentioned during the keynote today-- that allow us to then bring together accessibility focus and access actions to allow a blind user to very effectively use the android user interface. and last but not least, the other major enhancement in jelly bean is the coming of age of chrome.


and the chrome browser is completely accessible in jelly bean. and it all works seamlessly, as you shall see during the various demos we have for you. so with that, i'll hand off to peter, who's the lead engineer behind brailleback, which is the braille enablement of the platform. so brailleback is an accessibility service that uses these same apis that i've been talking about to


basically provide a braille interface on android. so peter, go for it. peter lundblad: thank you, raman. hi, everyone. i'm peter lundblad. and i'm going to, as raman said, talk about a few things. i'm going to start talking about one of the things that we've added in jelly bean that enables things like brailleback and other things for the user.


and this is called accessibility focus. it allows a user to interact with anything on the screen, like any view. it behaves similar to a cursor, in the sense that you can move it around on the screen. it is also similar to input focus. but the difference is, as i said, that it can interact with all views, which is great. because input focus can only reach views that are meant to,


for example, take input. so a good example of that is an edit text or a push button that you can press enter on. and then it will activate the button. but for a blind user, you want to be able to read anything on the screen, including static text and similar, which is normally not focused by input focus. so therefore, we added this accessibility focus that is shown on screen as a yellow rectangle.


accessibility focus can be placed anywhere by the accessibility service. and this allows multiple modes of interaction. it's handled by the system, which means that there is one global source of truth. but again, accessibility services get notified when it moves. and they also have control over it. so what are examples of these different modes of operation


and interaction by the user? how can we move accessibility focus? one example, as alan is showing, is by swiping on the screen. alan will actually show you later on more about these swipe gestures that we've added. another example is that the touch exploration that was already added in ice cream sandwich now moves accessibility focus.


we are also able to move accessibility focus by accessibility services using, for example, a braille display, which i'm going to show you in a bit. the great thing is that, when one accessibility service moves it, the accessibility focus, it gets broadcast to all accessibility services. so, for example, if you have talkback and brailleback running on the system at the same time, they get synchronized.


and the user knows where he is by both braille and speech. now, being able to only read what's on the screen is great. but life is pretty boring if you cannot interact with anything. so that moves us over to the next part, which is accessibility actions. accessibility actions allow a user to interact with nodes in different ways.


and that's very important. because, if we now have the different ways of moving around a focus, then you don't really know where on the screen the different views are. so therefore, you also need to actually do things with them. so what accessibility actions do we support? the obvious thing is moving accessibility focus, as i already mentioned a few times. we can also move input focus, which is useful.


because then we can, at will, as an accessibility service, synchronize the two focuses when that's good for the user. we can click on views, which, of course, is the most used action so that we can activate buttons, activate input fields, and so on. another important thing is that we can scroll on the screen. so if the view has more than one page, we can move between them.


that's already possible using swipe gestures on the screen. but a problem for blind users is that it's hard to scroll exactly by one page. and then, if you don't do that correctly, then you lose where you are in the list box, for example. so the new actions allow us to scroll discretely. there are also actions to move inside of text. so if you have a text field with lots of text, maybe you want to, as a user, read it by


character, by word, or paragraph. or there are actions for handling moving inside web views. and all these are called movement by different granularities. we also added something, which i'm not going to talk much about. it's called global actions. they allow things like activating the home screen,


and pressing the back button, and going to notifications, and so on. this is something that is purely handled by the system, and application developers don't have to worry about it at all.
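as a rough sketch of what these actions look like from the accessibility service side in jelly bean (the service class below and the way it reacts to events are hypothetical illustrations, not code shown in the talk):

    import android.accessibilityservice.AccessibilityService;
    import android.os.Bundle;
    import android.view.accessibility.AccessibilityEvent;
    import android.view.accessibility.AccessibilityNodeInfo;

    // hypothetical service showing the jelly bean action apis described above
    public class DemoAccessibilityService extends AccessibilityService {
        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) {
            AccessibilityNodeInfo node = event.getSource();
            if (node == null) {
                return;
            }
            // put accessibility focus on the node, then activate it
            node.performAction(AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS);
            node.performAction(AccessibilityNodeInfo.ACTION_CLICK);

            // scroll exactly one page forward when the node supports scrolling
            node.performAction(AccessibilityNodeInfo.ACTION_SCROLL_FORWARD);

            // move inside text by a chosen granularity (character, word, ...)
            Bundle args = new Bundle();
            args.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_MOVEMENT_GRANULARITY_INT,
                    AccessibilityNodeInfo.MOVEMENT_GRANULARITY_WORD);
            node.performAction(AccessibilityNodeInfo.ACTION_NEXT_AT_MOVEMENT_GRANULARITY, args);
            node.recycle();

            // global actions (home, back, notifications) go through the service itself
            performGlobalAction(GLOBAL_ACTION_HOME);
        }

        @Override
        public void onInterrupt() {
            // nothing to clean up in this sketch
        }
    }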


so the great thing with accessibility focus and actions is that, most of the time, they are handled by the system as well. but there are occasions when the applications need to worry. i'm going to show you some code examples before we are going into the demo about how these can look. if you have a very simple view, like on the first code example, it just adds an onclicklistener. i think this should be familiar to anyone who has done any android app development. if you have this kind of view, that is already totally handled by the system. because the default action, onclick, that allows us to perform clicks from braille displays, for example, is calling the onclicklistener, and that's it.
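a minimal sketch of that first case (the activity, layout, and ids here are made up for illustration); with a plain onclicklistener, the framework's default click handling already calls through to it, so talkback and brailleback can activate the button with no extra work:

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.View;
    import android.widget.Button;
    import android.widget.Toast;

    public class SimpleViewActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);  // hypothetical layout

            // a regular listener is all the framework needs: the default
            // handling of the click action simply calls through to it
            Button send = (Button) findViewById(R.id.send_button);
            send.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View v) {
                    Toast.makeText(SimpleViewActivity.this, "sent", Toast.LENGTH_SHORT).show();
                }
            });
        }
    }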


now, let's look at the second code example. and there we have a different kind of view. this view is a bit more low level. it's a custom view where we handle touch events directly. so we have an ontouchevent handler. and that will be called whenever


there is a touch event. and the view will directly respond to the touch event and call some internal function of itself. so it's not going through the onclicklistener. this, obviously, won't work because the system won't know what function to call when an accessibility service invokes an action onclick. so we have to fix that issue. because, otherwise, the user won't be able to


click on this view. one way is to refactor the view, of course, by invoking the onclicklistener and then letting the default action handling handle the situation. but if that's not possible, sometimes you don't have access to your view or maybe you can't easily change the code, then it's possible to [inaudible] something that we call an accessibility delegate. that allows you to, external from the view, handle the


accessibility actions and other calls that are related to accessibility. and in this last code example, we are doing that to take care of the onclick call by ourselves and call our internal function. and that will fix the problem we had on the previous slides. now, depending on what kind of view you are creating, we might have to add handling for other actions as well. for example, if there is scrolling support in your view, you might need to handle the scroll action. it's also important to remember to call back to the super class if you don't handle the action yourself.
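a minimal sketch of that delegate fix (CustomView and its internal handleClick() method are hypothetical stand-ins for the view described above, not the code from the slides):

    import android.os.Bundle;
    import android.view.View;
    import android.view.accessibility.AccessibilityNodeInfo;

    final class AccessibilityFixes {
        // attach a delegate to a custom view whose ontouchevent calls an internal
        // handleClick() instead of an onclicklistener, so that the click action
        // coming from an accessibility service still reaches the same code path
        static void installClickDelegate(final CustomView view) {
            view.setAccessibilityDelegate(new View.AccessibilityDelegate() {
                @Override
                public boolean performAccessibilityAction(View host, int action, Bundle args) {
                    if (action == AccessibilityNodeInfo.ACTION_CLICK) {
                        view.handleClick();  // hypothetical internal method
                        return true;
                    }
                    // always fall back to the super class for actions we don't handle
                    return super.performAccessibilityAction(host, action, args);
                }
            });
        }
    }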


so what can all these additions be used for? we are adding something new. the new features of the platform enable us to add braille support. sometimes blind users prefer to use something called a refreshable braille display.


and this is an alternative way of interacting with the system. we are enabling a number of different bluetooth-enabled braille displays to connect to android phones. a braille display has a line of so-called braille cells on it with dots that can be raised. and the user can read on that line. there are navigation keys on a braille display so that, since it's only one line, it's possible to move it around on


the screen and move around in the user interface, activate things, and so on. another thing that is usually on a braille display is a chorded keyboard so that you can also input braille. and this makes it much easier to type text than if you're using a touchscreen. of course, sometimes using the on-screen keyboard is the only alternative. but if you have the braille keyboard, it


makes life much easier. so we are adding this accessibility service, which is connected via bluetooth to the braille displays. it uses the accessibility events and nodes in the node tree to know what is on the screen and present that to the user. it synchronizes with talkback using accessibility focus. so you can actually interact with the system in many ways at the same time.


you can use talkback, or you can use braille. but they will both be synchronized. in addition, we are adding an input method so that you can actually enter text using the braille display. this new accessibility service is called brailleback. it's available on the google play store today if you are running jelly bean. so please try it out if you happen to have access to one of these hardware braille displays. i am now going to do a little demo.


and as we all know, sometimes wireless technology can be difficult. so let's see how this works. the first thing that can happen is that a braille display is disconnected from the android device. so we wanted to make it as simple as possible to reconnect to the braille device or, actually, if you have disconnected it before for battery reasons. so what the user then does is locks the


screen and unlocks it again. and i'm going to ask alan to do this for me today so that we get the display connected. computer speaker: screen off, 12:30 am, ringer seven, slide home, home screen three. peter lundblad: and there, we heard a little sound. and that made the braille display actually connect. unfortunately, though, it is not-- as i mentioned, the wireless is not--


computer speaker: screen on. peter lundblad: --always our friend. so i'm going to try this once more. computer speaker: 12:31 am, ringer 70%. peter lundblad: we might have too many [inaudible]. alan viverette: to minimize interference, if you have bluetooth turned on, if you could turn it off, we would appreciate that. peter lundblad: so this is now working.


computer speaker: i/o 2012 web view tutorial. peter lundblad: ok, so let's look at what i can do here. so as you can hear, the speech is talking. i can now use the keys on the braille display to move around the screen. so let me do that. computer speaker: home screen three. peter lundblad: it says home screen three. and as you can see, the accessibility focus is moving


to focus the whole screen. and the speech is also talking. computer speaker: i/o 2012 web view tutorial, 031, google i slash o 2012. peter lundblad: let me invoke one of the actions i talked briefly about before. and that's the action to open the notification window. i do that by pressing a key combination on the braille display.


computer speaker: 12:31, thursday, june 28, 20. screen will rotate automatically. check box orientation set to portrait. clear notification button. peter lundblad: here i go to the-- computer speaker: alan viverette, june 27, 12. hey, want to get a coffee? peter lundblad: ok, so alan is asking if i want to get a coffee.


that's great. let me respond to that chat. what i also have on the braille display is a row of small buttons that easily lets me click on anything that i have focused on the display. so i'm going to click on alan's chat message. computer speaker: edit box. type message. peter lundblad: that takes me right into the edit box in


google talk as we expect when i click on this notification. i'm going to move upwards on the screen to see what he actually said. computer speaker: this chat is off the record. peter lundblad: as you can see, he's concerned about privacy. but he still wants to have some coffee. computer speaker: this chat edit box, type message. peter lundblad: i'm going to type a response.


computer speaker: y-e-s, yes, i, i, l-o-v-3. peter lundblad: ok, i made a typo, but i can easily fix that. computer speaker: three deleted. e, love, [rapidly spelling letters] peter lundblad: all right, so i've typed a response. i can use the small keys i mentioned earlier to move around on the screen. computer speaker: this edit box, yes, i love coffee.


peter lundblad: ok, so i have now-- i'm going to press the button to actually send-- computer speaker: this chat-- peter lundblad: --this message. computer speaker: --is off the record. edit box. yes, i love coffee. peter lundblad: and there you see that i can actually send a message.


and it now appears in the chat list before. i'm going to invoke another global action that we have that's very convenient. and that is a key combination, again, on the braille display. computer speaker: home, home screen three. peter lundblad: that takes us back to the home screen. and with that, i'm going to hand it over to alan, who's going to talk about touch exploration. alan viverette: thank you, peter.


i look forward to getting that coffee later. so as we showed, you can use your finger to explore the screen. so you can set accessibility focus by touching your finger to the screen, as you just heard. this provides random access to on-screen content, which is really great if you are familiar with what the screen looks like. so somewhere like the all apps screen you


have a lot of buttons. and you can find things pretty quickly. you can now double-tap to activate the item that has focus with absolute certainty that what you just heard is what's going to be launched. now, this is great. but having some deterministic way to access items on screen is even better. so let's say you have a really big screen


with one little button. i can move my finger around for a long time and never find that button. but if i can touch my finger to the screen and just swipe to the right to go to the next item, i can find that button very quickly. and in fact, i can just keep swiping right to go through every single item on screen. so we've added these swipe gestures that i demoed earlier


when peter was talking. and we've also added gestures for global actions. so peter showed you home on the braille display. you can also draw a shape on the screen to go home. we also support back, recent applications, and notifications. an accessibility service like talkback or brailleback can also use gestures to manage internal state. so in talkback, we have a gesture that you can use to


start reading the screen by word or character instead of by object. so here's a quick overview of the gesture mapping that we have in talkback. you'll see there are a lot of gestures. and in fact, these aren't all of the gestures. we've left a little bit of room for experimentation later on. so let's do a quick demo for explore by touch.


all right, so first i'm going to start with touch exploration. computer speaker: apps. home. showing item three of five. alan viverette: so i can look through my apps, random access by moving my finger. or i can swipe left and right if i know that what i want to find is probably a little bit past maps.


computer speaker: messenger. navigation. people. alan viverette: ok, so if i want to launch people, i can just double-tap anywhere on the screen. computer speaker: full contacts drop-down list. alan viverette: and i get contacts. so if i'd like to go back, i can draw a back gesture. computer speaker: clear apps.


alan viverette: let's go back again. computer speaker: clear home screen three. alan viverette: and let's take a quick look at the google i/o app. computer speaker: i/o zero, google i slash o 2012. google i slash o 2012. list showing one items. alan viverette: here i have a list of items. computer speaker: showing items one to three of 21.


alan viverette: and i can tap-- computer speaker: 8:00 pm browse sessions empty slot. alan viverette: --on an item within that list. and if i want to move an entire page at a time, there's a gesture for-- computer speaker: 10:00 am. alan viverette: --moving up an entire page. computer speaker: wednesday, june 27. alan viverette: so these are the same accessibility actions


that we use in brailleback. and they're something that you, as a developer, generally won't have to worry about. t. v. raman: so notice that-- alan viverette: sorry. t. v. raman: notice what alan is showing there is a very, very powerful interaction model for completing tasks. so you use the play store all the time. so you know that the install button is


approximately somewhere. so touching the screen and doing one flick is pretty much all it takes. whereas in ics, you would explore. and then before ics, you would have used the trackball. so it makes the user interaction model really, really effective. and also, notice that with what we have done, the access guidelines also change.


where in the past, only things that could take system focus were visible to talkback before ics. and we used to say, make things focusable. now your life as a developer is a lot easier. alan viverette: so as i mentioned, as a developer, you generally won't have to worry about this, except when it doesn't work. so you might wonder what receives focus when i'm moving my finger around the screen.


now, obviously, as a developer who's probably made layouts before, you may know that you'll have a lot of nested layouts. obviously, these aren't all being read. so we're picking actionable groups. so actionable means clickable or focusable. and if you have a group, like this folder icon that we're showing in the image, this is actually a group that contains an image and a piece of text.


and because the group itself is clickable, it gets read aloud. now, if it has a content description, then, instead of its children being read aloud, the content description is read. and if you have an item that isn't actionable, and it doesn't have any actionable predecessors, obviously, the only way that can be read is if somehow you can put accessibility focus on it.


and fortunately, it will receive accessibility focus. so here's a hierarchy viewer view of what that folder icon looks like. so you can see that there's a folder icon, which is a view group. and that contains an image view and a bubble text view, in xml that looks like this. so you can see the folder icon is clickable, thus making an actionable group. and its children have two pieces of text.


so the image view says folder. the bubble text view says google. and when talkback puts focus on this actionable group, it will say, folder google. so some tips when you're designing your application: make sure you use built-in widgets whenever possible. these things will just work. because people have put a lot of thought into them. make sure your app works with a keyboard or d-pad.


so what we always used to say was make sure your app works with a d-pad. and if your app did work with a d-pad, fortunately, it will just work. you may have to make some changes if you were doing very special things. but for an application using built-in widgets, it will most likely just work. make sure that you have readable content


on actionable items. so if you have an image button, give it some text in a content description. and if you have a view group that's focusable or clickable, put some text in it or put some content descriptions on some of its child items.
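a small sketch of those two tips in code (the ids and string resources here are invented for illustration; the xml equivalent of the decorative case is android:contentDescription="@null"):

    import android.app.Activity;
    import android.widget.ImageButton;
    import android.widget.ImageView;

    final class LabelingTips {
        static void labelViews(Activity activity) {
            // an actionable image needs a spoken label
            ImageButton call = (ImageButton) activity.findViewById(R.id.call_button);
            call.setContentDescription(activity.getString(R.string.call_contact));

            // a purely decorative image should stay silent so talkback skips it
            ImageView divider = (ImageView) activity.findViewById(R.id.divider);
            divider.setContentDescription(null);
        }
    }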


so here's an example of bad design and a way to fix that. so this is an orphaned actionable item. you have a frame layout that contains a view. this view is clickable. and it fills the entire frame layout. this text view, which has text, also fills the entire frame layout. and for a sighted user, this looks like a big button with a text label that you can click on. and that's how it performs. but to a service like talkback or brailleback, this looks like two separate items.


so if you make the frame layout clickable, and you put text inside of it, you have an actionable group and a child that will be read out loud. all right. so sometimes it gets a little bit more complicated than that. so let's say you've made this really awesome keyboard. and so here's what it looks like on screen. you've got a bunch of cool little buttons that you're


rendering in code. so instead of having actual views for each one of these buttons, you're just drawing them onto the screen. here's what the xml for that would look like. and as you might be able to guess, there's not much of a hierarchy there, not many actionable groups nor many readable children. and you can fix that by providing more information to services like brailleback and talkback.


so this is what it looks like without any changes to a blind user and to an accessibility service. it's just a big blank area. so fortunately, we have three steps that you can take. one is, in your custom view, handle incoming hover events. when accessibility is turned on and explore by touch is turned on, when you touch the screen, your view receives hover events. if you take a look at the android source code for view,


you'll notice that hover events get handled a little bit specially if accessibility is turned on. so here, because i know where i'm rendering the keys on screen, i can map the xy-coordinates of a motion event to a key. i can say, was i just touching this key? if not, then i know that i need to send the appropriate accessibility events. so as i touch the key, i'll get a hover exit from the


previous key, a hover enter for the new key, and talkback or brailleback will say the appropriate speech for the key. step two, you need to populate that outgoing event. so you just sent a hover enter event. you need to tell it what key you were just pressing. so here, i've made this send hover enter event for a key method that takes a key, takes an event type, and populates the event with the information for the key.


so that would obviously include things like text. and here i'm also setting the source. because to get this great jelly bean functionality of being able to swipe and being able to double-click to activate things, i need a node hierarchy. so after i send my accessibility event, that gets to talkback or brailleback. and it has a virtual key id. it can then query my application for the node info


that's associated with that key id. so here i'm using a node provider, which i've taken out due to space constraints. but it has this createaccessibilitynodeinfo method that takes a key id. i map that key id to an actual key. and then i populate the node info with the key's properties. for consistency's sake, i'm also setting the parent of the node


to be the keyboard that it belongs to. and i'm setting its source to be its own virtual key id and, of course, its parent. so after taking these three steps, my keyboard looks the same to every user. so if somebody's using talkback or brailleback, if they put their finger on it, they'll receive the appropriate spoken or braille feedback.
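pulled together, the three steps might look roughly like the sketch below; KeyboardView, Key, and commitKey() are hypothetical names that compress what the slides showed, so treat this as an illustration of the jelly bean apis rather than the talk's exact code:

    import android.content.Context;
    import android.graphics.Rect;
    import android.os.Bundle;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.accessibility.AccessibilityEvent;
    import android.view.accessibility.AccessibilityNodeInfo;
    import android.view.accessibility.AccessibilityNodeProvider;
    import java.util.ArrayList;
    import java.util.List;

    // hypothetical keyboard that draws its keys itself instead of using child views
    public class KeyboardView extends View {
        static class Key {
            final int id; final String label; final Rect bounds;
            Key(int id, String label, Rect bounds) { this.id = id; this.label = label; this.bounds = bounds; }
        }

        private final List<Key> keys = new ArrayList<Key>();
        private Key lastHoveredKey;

        public KeyboardView(Context context) { super(context); }

        // step 1: map hover events (sent while explore by touch is on) to keys
        @Override
        public boolean dispatchHoverEvent(MotionEvent event) {
            Key key = findKeyAt((int) event.getX(), (int) event.getY());
            if (key != lastHoveredKey) {
                if (lastHoveredKey != null) {
                    sendEventForKey(lastHoveredKey, AccessibilityEvent.TYPE_VIEW_HOVER_EXIT);
                }
                if (key != null) {
                    sendEventForKey(key, AccessibilityEvent.TYPE_VIEW_HOVER_ENTER);
                }
                lastHoveredKey = key;
            }
            return super.dispatchHoverEvent(event);
        }

        // step 2: populate the outgoing event and point its source at the virtual key
        private void sendEventForKey(Key key, int eventType) {
            if (getParent() == null) return;
            AccessibilityEvent event = AccessibilityEvent.obtain(eventType);
            event.setPackageName(getContext().getPackageName());
            event.getText().add(key.label);
            event.setSource(this, key.id);
            getParent().requestSendAccessibilityEvent(this, event);
        }

        // step 3: expose a virtual node per key so services can query it and act on it
        @Override
        public AccessibilityNodeProvider getAccessibilityNodeProvider() {
            return new AccessibilityNodeProvider() {
                @Override
                public AccessibilityNodeInfo createAccessibilityNodeInfo(int virtualViewId) {
                    Key key = findKeyById(virtualViewId);
                    if (key == null) {
                        return null;  // a complete version would also describe the keyboard itself
                    }
                    AccessibilityNodeInfo info =
                            AccessibilityNodeInfo.obtain(KeyboardView.this, virtualViewId);
                    info.setPackageName(getContext().getPackageName());
                    info.setText(key.label);
                    info.setBoundsInParent(key.bounds);
                    info.setClickable(true);
                    info.addAction(AccessibilityNodeInfo.ACTION_CLICK);
                    info.setParent(KeyboardView.this);  // keep the virtual hierarchy consistent
                    return info;
                }

                @Override
                public boolean performAction(int virtualViewId, int action, Bundle args) {
                    Key key = findKeyById(virtualViewId);
                    if (key != null && action == AccessibilityNodeInfo.ACTION_CLICK) {
                        commitKey(key);  // hypothetical: same code path the touch handler uses
                        return true;
                    }
                    return false;
                }
            };
        }

        private Key findKeyAt(int x, int y) {
            for (Key k : keys) { if (k.bounds.contains(x, y)) return k; }
            return null;
        }

        private Key findKeyById(int id) {
            for (Key k : keys) { if (k.id == id) return k; }
            return null;
        }

        private void commitKey(Key key) { /* hypothetical: type the key into the editor */ }
    }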


and we don't just handle native android views. we also handle web views really well. and charles will tell you more about that. charles chen: thank you, alan. hey, so i'm charles chen. i'm here to talk about web accessibility on android. so alan just gave you some really great advice on how to make native android apps accessible, what you need to do, and how to do it. you can do the same thing if you're building a hybrid app.


so if you're building a hybrid app that's a mixture of web content and native java controls, you can make that accessible and make it work really well for users with visual impairments. so if you're using a web view just to do something really simple or really basic-- for example, maybe display a terms of service or display instructions to the user-- then that case is pretty straightforward.


you have a web view. you put the text in there. and we'll just process it like regular plain text. everything works, no problem, simple. on the other hand, if you are going to build something that's a little bit more dynamic, a little bit more ajaxy, if you're going to use javascript and html5 as part of your ui, you can still make that accessible. you can still do a great job here.


all you really have to do is to follow the same best practices that you would do for a web app on the desktop. and the reason you do that is because on android, we're running androvox. so androvox is a part of chromevox. chromevox is our screen reading solution for chrome os. it runs on chrome os and chrome. by the way, for those of you who are interested in hearing


more about chromevox, please come to the talk on friday, advancing web accessibility. it's going to be a really good talk. and i'm saying that not just because i'm one of the presenters. rachel there is also going to be presenting, as well as one of our other colleagues, dominick. so please attend that talk friday, 11:30. hope to see you guys there, please.


anyways, getting back to android, so androvox is a part of chromevox. and this gives you a lot of benefits. so all of the hard work that we've put into making chromevox work really well on ajax content, making it support w3c standards, such as aria and html5. all of that goodness comes into android. and it just works. and we've integrated this with android so that the two


experiences, both web content and native android controls, blend seamlessly. and the users can just use your app. and they won't even really know the difference. and it will all just work for them in a single simple experience. and so with that, i'm going to switch over to demo. and so it's usually helpful to get onto the demo app that i intend to show with web content.


computer speaker: home. charles chen: so let me do that. computer speaker: i/o 2012 web view tutorial. web content. google i slash o 2012 web view tutorial. charles chen: ok, so as you can see here, i have a hybrid application. this has web content near the top. and it also has android controls near the bottom.


and i'm going to touch the web content. and you'll see that touch exploration works the same way in a web view as it does for any native control. so i'm going to start touch exploring. computer speaker: accessibility for android's web views is handled by androvox, a port of chromevox for android. charles chen: ok, now, alan earlier was showing you gestures where you could do swipes to do linear


navigation. the same exact thing works here in web views. so i'm just going to do that. computer speaker: the same best practices for building accessible websites apply for making web views accessible. charles chen: ok, so now, i've actually reached the end of this web content. and i want to move forward. i shouldn't really have to care about


that as an end user. so i'm just going to keep navigating. computer speaker: previous button disabled. charles chen: ok, so i've actually jumped out of the web content now. and i'm in the native android control. now i'm going to go to the next button and click it. computer speaker: next button. charles chen: ok, and now i'm going to click.


computer speaker: google i slash o 2012 web view tutorial. charles chen: ok, so as you can guess from the heading here in the web content, this is probably not going to be a good slide. this is going to be something that's really bad. so here's an example of what you should never, ever do when you're making a hybrid application. this is an application that doesn't follow best practices,


doesn't do the right things. so let's kind of go through it and see what's wrong with it. computer speaker: web content bad. heading one. here is an example of a poorly authored page. this button is made up of div and span elements. it has no roles set. charles chen: ok, so i'm about to go to a button. it doesn't have the right settings on.


so even though visually it looks like it's just a green button there and it looks really pretty with 3d css, it's really just divs and spans in there. it's not labeled with aria. it doesn't have any roles labeled for it. so the user isn't going to know that it's a button. there are no semantics backing it. it's just simple divs and spans. so let's see what happens when i go there.


computer speaker: ok, clickable. charles chen: ok, so you know it's clickable because we could detect that there was a click handler there. but aside from that, you didn't really know if that's a button, a check box. what is this thing, right? you don't get any additional feedback. and that's because it wasn't properly set with a role attribute.


let's try clicking on this, though. because, hey, it says clickable. so what could happen, right? let's see. computer speaker: clicked. charles chen: huh? ok, i just clicked something. and i know i clicked it. but i totally missed that alert that came up.


so let's see why that happened. computer speaker: when this button is clicked, the alert region that is shown does not have an alert role set, nor is it marked as a live region. t. v. raman: so there are these simple html attributes you can add. and you can look these up. there is a w3c spec around it. but as java developers and android developers, what you


need to know is you need to annotate your divs and spans with attributes that give semantics. and when dynamic changes happen, you need to annotate those regions as being dynamically changeable. at which point, whatever adaptive technology the blind user is using then knows to speak. in this case, the technology, like charles explained, is chromevox. but this is not android specific.


this is basically good accessibility practice on the web. charles chen: thanks, raman. so that was some great advice. and again, if you want to hear more, please come to our talk on friday. ok, let's move on to an example of where this is fixed. charles chen: ok.


computer speaker: web content. good heading one. charles chen: ok, that heading actually sounds a lot more promising. because it said good. so this should hopefully work. so again, i'm just doing swipe gestures to navigate. and these are the same swipe gestures as anywhere on android.


computer speaker: here is the same page but with the problems fixed. the button now has its role set to button. charles chen: ok, so now we've set the correct role for this button. let's listen to what it sounds like now. computer speaker: ok button. charles chen: ok, so now this is working as intended. now the user knows that this is an ok button.


and they hear it. and it says, ok button. so great. let's try clicking it. computer speaker: the ok button has been pressed, alert. charles chen: cool, so now i know that an alert has popped up. and it tells me the contents of that alert. great.


so it's working correctly. and that's because-- computer speaker: the alert region now has its role set to alert, which is treated as a live region by default. charles chen: and so with that, i'm going to switch over to testing on android. ok, so testing for accessibility on android is something that's really easy. it's as easy as one, two, three.


and there's really no excuse for not doing it. because it's built into the system. so alan here is going to help demonstrate just how easy it is to get accessibility up and running. and we know you all have devices now. so you really should get this done. so alan? alan viverette: feel free to try this at home or in the audience.


computer speaker: home screen apps. settings. charles chen: so what alan is doing here is he went into settings. he's going into accessibility. and he's going to turn on talkback. computer speaker: showing items seven to 24. accessibility. navigate up.


talkback on. charles chen: ok, well, so normally you would turn on talkback and make sure explore by touch is enabled and also enable additional scripts for web accessibility to ensure you get all of the-- computer speaker: [inaudible] accessibility allowed. charles chen: --to ensure you get all the androvox goodness. but since we already have this enabled, we're ready to go. so let's start testing it.


let's start experiencing what the user would experience. so the best way to do it is to just use your app. computer speaker: double-tap to activate. home screen three. charles chen: and we are going to go into the google i/o app-- computer speaker: google i slash o 2012. charles chen: --and just try to use it the way a user who's using talkback would be doing it.


so use touch exploration to feel around the screen. computer speaker: 1:45 pm, 2:44 pm. browse code labs empty, start now. computer speaker: [inaudible] enter for cache i/o 2012 brainpower ftw, check us out in after hours, stream. charles chen: and take advantage of the great new gestures that we've added for accessibility options. use linear navigation. try to move around your app.


try to scroll lists. see if it works, if it's going to do the right thing. computer speaker: show android chrome, code labs. charles chen: cool. so what we're really checking for here is to make sure that everything in our app can be done eyes-free. the user should be able to use your app whether or not they're looking at it. all critical information needs to be conveyed to the user.


anytime the user does some action, they need feedback. they need to know that they actually did the action, that it's working, that something is going on. now, android linting tools here are your friend. so earlier today, romain guy mentioned that android linting has gotten a lot of good new features. and i think there's a talk on it later as well for android tools. and one of these new features is the ability to do some


really simple checks for accessibility. now, this will not catch everything. but this is a really good starting point. and this catches a very annoying but simple to fix error, which is missing content descriptions. so if you run the lint tool with this command, as you see here, lint will actually catch cases where you have image buttons that are missing content descriptions.


for a blind user that's using talkback or brailleback, all they're going to get is that there's an image here. but they'll have no clue what it is. so this is a really bad thing if that happens. and this will catch it for you. now, you shouldn't be afraid to use this tool. it's not going to interfere if you have images that are just decorative. because if you have something that's purely decorative,


that's not meant to convey any information, that's not actionable, it's not really meant to do anything, you can always tag that with the null notation. so if you set the content description to @null, then it will tell the tool to ignore it. it's only decorative. it's just eye candy. it doesn't really do anything. now, if you do what we talked about here today, then you're


going to build an app that's usable. but i would really like to challenge everyone here to go further, to go the next mile. because it's not about just making things usable so you can kind of struggle through it. it's really about building apps that are great, building apps that people love to use. so really, i think we should all strive for building apps that are just as efficient to use eyes-free as it is looking


at the screen. and with that, i am going to hand it back to raman. t. v. raman: thank you, charles. so to wrap up, accessibility on the android platform really, really helps you reach an ever-increasing range of users. the platform does a lot of work for you. but if you follow some of these guidelines that we are giving you and do some of the testing that charles


suggested, i guarantee you that, not only will your apps be usable by blind users, by low-vision users, by users on specialized interfaces, but you will, in general, discover that your user interfaces become more robust. they degrade gracefully, which means that your application just works in environments that you originally did not expect it to. and that's a very, very powerful thought. accessibility is the law in many places.


if you're selling to the enterprise, if you're selling to universities, your applications cannot be used if they don't meet certain accessibility requirements. but that's actually, in my opinion, just the initial educational reason why you should be worrying about it. in general, if you build your apps to be accessible, my own experience has been that those applications eventually end up being more usable for everyone. as an example, last year in our i/o talk on accessibility we


showcased tunein radio. i discovered that app two and a half years ago. and i loved it. it was very, very nicely done. we found it was accessible out of the box. and today that is one of the most used radio tune-in applications on android. so i think i'd love to see a lot more of those coming from each one of you.


thank you. charles chen: also, before we go to q&a, i just want to mention that we showed a lot of things here today. and i know you all are dying to see a real braille display in real life. so please drop by our sandbox. our sandbox is just out in front here of this hallway. it's accessibility. you can't miss it.


so please drop by and say hello. and also, we have partners here. and so come by and check it out. ok, and so with that, we'll go to questions. [applause] charles chen: yes? audience: yeah, with more complex items like multiple radio buttons in a group or things that need to swipe to do an action, are we going to have to include instructions


for what exactly is going on, which radio button is selected, whenever, say, one is an option? alan viverette: so if you're using built-in radio buttons, no. you can just let the built-in widgets do their job. for gestures, so as you may have noticed when i was doing regular scrolling, i was using two fingers. so when explore by touch is turned on, your one-finger gestures simply become two-finger gestures.


charles chen: yes, sir? audience: so my question relates to content descriptions. let's say you had a list of items, say movies. and then when you entered it, you got an image of the movie's poster art. and then you had the title. is it appropriate to make the poster a null content description?


should that somehow dynamically be sent or just say, this is a movie poster? alan viverette: so i think, in general, if you're going to add a content description, it should add meaningful information. so if you've already got the title of the movie, and the movie poster just reiterates the title, then you should probably avoid it. t. v. raman: yes, and i definitely wouldn't like the


thing to say "movie poster."because that doesn't really give me that much morefunctionality as an end user. so err on the side of makingyour application less chatty. audience: thank you. i think you had it first. audience: and what [inaudible] that he showed today are notapplicable for 4.0, right? it is available for onlyjelly bean, right? t. v. raman: yes.


alan viverette: correct. audience: right, ok. and the question that i have is, typically for the standard objects there's a fixed content description. is it possible to modify the content description? for example, i'm using a web view in my application. and whenever i launch my view, it says [inaudible]. and because i'm using, let's say, [inaudible] that i want to say something else, is it possible to do that?


charles chen: so my advice to you is to actually not put a content description on the web view. because, if you do, your content description will trump the normal behavior. so you'll actually lose all of our web handling abilities. and you'll end up having to reimplement the whole thing yourself, which is not what you want to do. so rather, you shouldn't do that. instead, you should author your page in a way that really


follows html5 accessibility best practices. and it should just work. if you have more detailed questions about that, i'd be happy to chat with you one-on-one offline. audience: ok. charles chen: and i'll look at your apps. audience: ok, thank you. thanks a lot. charles chen: no problem.


yes, sir? audience: hi. look, just with a lot of apps you often find that there's a sort of help guide to how to use the app. and this is not even in the realms of accessibility. now, seeing how you stumbled a bit with getting to the send button in the-- was it the instant messaging demo that you made earlier? when you first found that program, did you have to sort


of prod around to even know that there was a send button there? how can we make it sort of intuitive but accessible at the same time? t. v. raman: so you ask a very good question. so one way you can actually make it really intuitive for blind users is to have things appear in places you would expect. i'll give you an example of this from real life.


tonight, after the google i/o party, you will all go back to your hotel rooms. and when you open your hotel room door, you do not hunt around for the light switch. the light switch is right there. and i personally would like to see touch screen interfaces, in the next couple of years, develop that level of consistency, where things are where you expect them to be. today, for a blind user, touch exploration is our way out.


so we explore. but that, as you observed, is painful. and in a world where things have sort of settled down-- today, in the mobile space we are all innovating very rapidly. and in some sense, all the user interface controls are in different places depending on what the designer thought was best. but hopefully, things will settle down, where a year from


now, just as today you don't have to hunt for the light switch in your hotel room, you can find the ok button or the install button without actually hunting on the screen. audience: ok, can i quickly add, do you think it would be a good idea, if you opened your door to the hotel room and a voice said, "there's a light switch to your right"? t. v. raman: that would be a nice thing. but on the other hand, if it's always on the right, why do


you even actually need to say that? because, for instance, supposing we built a system like that. we said, every time you open the door it says where the light switch is. what is a deaf user going to do? what is a deaf-blind user going to do? we end up-- i think, it's hard to say these are


mutually exclusive solutions. but user interfaces work best when you don't notice them. you walk up to a door. the type of door handle tells you whether you should push or not. the door shouldn't say, push me or pull me. charles chen: also, i'd like to add that, in your specific question, that's actually one of the really powerful ways where you can use touch exploration and linear


navigation in tandem. if you remember during my demo, i did some touch exploration first on the web content. and i started doing gesture navigation. and this is the same thing that works in any part of android. you can touch explore to something. and then you can start doing gesture navigation from that point onwards.


so the first time you use an app, it might take you some time to know where things are laid out. but once you figure that out, in the future, you can get into the general vicinity, and then just use linear navigation to get there. t. v. raman: so for instance, on the phone today when i run chrome, i load cnn or bbc or whatever sites that i read news on often. i know that the top 1/3 of that page on that screen is


navigational stuff, and things that i would really not need to read on a regular basis. so i approximately put my finger halfway down, hear a headline, and then say, ok, now read from here. so it's a powerful paradigm. but a year from now, as these things get consistent, hopefully we'll be saying something more optimal. audience: so we have the option to use javascript to mimic native gestures, such as a swipe or a rotate.


now that you have the ability to go from-- normally one finger would swipe or two fingers would rotate. now, you're saying that i can use two fingers to swipe with the touch feedback. does that mean that web developers can continue using javascript libraries that mimic native gestures? because that's normally been a problem. charles chen: yes, so basically what's happening here is that, as a developer, you don't really know the


difference between-- you won't realize how we're doing the two-finger swipe thing. what's happening is, on the android end, you code it the same way as you would normally. and what you'll get for the end user is, if they use two fingers and they have touch exploration on, you'll see one finger. so you won't actually see that difference.


t. v. raman: so the answer to your question is you can continue using those libraries. and because we've done this consistently at a platform level, blind users of your application will know that they need to use an additional finger. so that one finger that they're using for touch exploration basically to the platform looks like a mouse pointer moving when accessibility is on. audience: is there a way to easily detect if accessibility


is on on the device? charles chen: yes. audience: and is it advisable to change your content if that flag is on? t. v. raman: so you can check programmatically, yes. unless you're doing something that is really heavily custom, where you think you can actually provide better semantics, i wouldn't actually change the content.
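for the first part of that answer, a small sketch of the programmatic check (the helper name is made up; both calls exist as of ice cream sandwich):

    import android.content.Context;
    import android.view.accessibility.AccessibilityManager;

    final class AccessibilityState {
        // true when an accessibility service such as talkback is running
        static boolean isAccessibilityOn(Context context) {
            AccessibilityManager manager =
                    (AccessibilityManager) context.getSystemService(Context.ACCESSIBILITY_SERVICE);
            // manager.isTouchExplorationEnabled() additionally reports explore by touch
            return manager.isEnabled();
        }
    }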


so, for instance, take an extremely custom view. alan showed you the example of a keyboard. but let's say you develop a fancy calendar app or something like that. and you have this custom canvas into which you have built up your calendar model using a couple of lists, a couple of grids, and whatever, and you feel, as an app developer, that by saying list, list, grid, and button the semantics of your app are being lost, then you can basically implement your own accessibility bits just the


way we do for some of the more complex platform widgets. but that's the level at which i would customize things. i would not sort of do a separate view or a separate content layout. because over time, the two will go out of sync, and you'll have problems. alan viverette: it's very rare that you'll save time by implementing something separate. charles chen: yeah, it only makes sense if you were doing


everything in opengl or something. and it's just a plain list. and you wanted to just have a simple list. alan viverette: i should mention, though, if you use a node provider, you can make something that's written in opengl using a surface renderer or a gl canvas totally accessible. and it would be indistinguishable from a real view hierarchy if you implement the


node provider correctly. charles chen: yeah. so in general, don't try to do that. t. v. raman: so the extra code you would be implementing is what alan and charles described, which is the node providers and exposing-- so you'd be exposing semantics through those virtual hierarchies. charles chen: anyway, if you have anything more specific,


we'd be happy to talk to you offline after this talk. yes? and i think this might be the last question, since we're running short on time. audience: hi, my name is daniel. thank you for this session. when peter sent alan the message in the presentation saying that he loves coffee, he managed to find the send button.


but the device did not give him feedback that the message actually was not sent. we could see it. but we couldn't hear it. so peter thinks he sent the message. but alan is still waiting for it. so what was the error in the programming of the application? or what was the source of this not giving this important


feedback that the message has not been sent? peter lundblad: that's a great question, actually. i think that it's a balance between, of course, giving too much and too little feedback. in this case, we should probably have given a little bit more feedback. but you can always go back and check if the message was actually sent if you really want to know. t. v. raman: i think it was--


also, when you do hit the send button successfully, it does say sent. and i suspect what happened here was that our concentration was more on doing a talk over demos as opposed to real usage. and the messaging application does give you feedback when you send successfully. it doesn't give you feedback when you fail to send. and so lack of feedback there is feedback, in some sense.


charles chen: and also i'd just like to plug one of our new accessibility events, type announce. this is exactly the use case where that would have come in handy. alan viverette: so type announce-- charles chen: we'll be upgrading that. alan viverette: --is when you want your app to just say something.


if you send an accessibility event with the type of announce, it will just get read verbatim.
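a minimal sketch of that (the helper and message text are made up; announceForAccessibility is new in jelly bean and wraps exactly this announcement event type):

    import android.view.View;

    final class Announcements {
        // speak a short confirmation through whatever service is running
        static void announceSent(View anyVisibleView) {
            anyVisibleView.announceForAccessibility("message sent");
        }
    }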


charles chen: thank you. ok, so any last questions or no? ok. alan viverette: one last plug, we're down the hall at the corner, accessibility booth. come see us.

t. v. raman: thank you, guys. and look forward to your apps.


