- Hello and welcome to the "State of Accessibility Podcast," episode 14 from TPGi on LinkedIn Live. I am David Sloan, Chief Accessibility Officer for Vispero. And yes, we missed October — if you were looking for October's podcast, we took a month off, but we are back. This week we're recognizing that Thursday, November 13th will be the 20th annual World Usability Day, where this year's theme is emerging technologies and human experience. Now, mobile devices are probably unlikely to be considered an emerging technology anymore, but with accessible mobile user experiences, there still seems to be a big opportunity for improvement. So today we are talking mobile accessibility and UX, and I'm joined by one of TPGi's Principal Accessibility Engineers, and our leading specialist in mobile app accessibility, John Lilly. So welcome to the podcast, John. Would you like to share a bit about your accessibility journey and where your interest in mobile accessibility came from?

- Sure, happy to. Thanks for having me. Like many others, my accessibility journey just happened by chance — I ask a lot of people that, and it's the same story. It was back in 2010, and I had no experience with accessibility or even knew what it was. I had never really even interacted with someone with vision loss. The only time I'd seen someone was in college, where there was a guy walking around campus with his white cane, just zooming through the sidewalks, and I was amazed how he could do that. But my background is in computer science, and I was in college and needed money and also wanted to get some college credits at the same time. So I asked my professor if there was any type of internship I could take, and he said there was one with the American Foundation for the Blind doing image processing. And I said, sure, why not? I need the money and I need the experience. So I went and got an interview and got the position. It turns out the image processing part of the internship was a project where they were taking pictures of device screens — this was back in 2010, and these were little things like blood glucose monitors and blood pressure cuffs — and we were measuring the color contrast of the device screen in different lighting conditions: bright lights, dark room, normal room, things like that. We had this whole apparatus where the camera was connected to and controlled by a computer, and there was a light sphere that let us control the lighting conditions consistently. So we could turn the light sphere way up and it would shine like a spot on the device screen, and we could take a picture of that. Or we could do a "dark room" — in quotes, because we draped felt over the whole apparatus so no light was coming in — and take a picture of the screen that way. On the computer we had some software that we wrote, and we would mark a pixel on the text and then a pixel off to the side of the text, just to get the color contrast difference between the text and the screen, including the glare and everything that came from the lighting. But that was a fairly short-lived project. It either completed or ran out of funding. That's the nature of non-profit projects.

- Yeah.

- But after that, I moved over to AFB Consulting, their digital accessibility consulting division, and kind of helped out there. And that's where I got my start.
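[Editor's note: for anyone curious what those pixel measurements feed into, here is a minimal sketch of the WCAG contrast-ratio math. The sampled color values are made up for illustration; a real measurement like John's would capture glare in the photographed pixels themselves.]

```swift
import Foundation

// WCAG relative luminance from sRGB channels in 0.0...1.0.
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    // Linearize each sRGB channel per the WCAG definition.
    func linearize(_ c: Double) -> Double {
        c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
}

// WCAG contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter luminance on top.
func contrastRatio(_ l1: Double, _ l2: Double) -> Double {
    let (lighter, darker) = (max(l1, l2), min(l1, l2))
    return (lighter + 0.05) / (darker + 0.05)
}

// Example: a sampled text pixel vs. a sampled background pixel.
let text = relativeLuminance(r: 0.1, g: 0.1, b: 0.1)
let background = relativeLuminance(r: 0.9, g: 0.9, b: 0.9)
print(contrastRatio(text, background))   // roughly 14:1
```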
- That's awesome. And AFB is a great organization. I love the research they do to support understanding of the use of tech in daily living for people who are blind or have low vision. And just that story of exploring accessibility needs that are also situational — you know, measuring the impact of glare on visibility. That's pretty cool.

- Yeah, yeah. They came up with a few good ideas for projects. There was another one we did that was an introduction to learning how to use NVDA. It was a video tutorial geared towards someone who didn't know how to use a computer or a screen reader — someone new to vision loss, mostly older individuals, probably someone with diabetes or something like that. So we did projects like that.

- Yeah, and again, that's understanding a transition into requiring assistive technology, as opposed to assuming that somebody using assistive technology has used it since the beginning and knows exactly how it works.

- Yeah.

- Yeah. So then you got into mobile accessibility. How did that happen?

- Yeah. So back in 2010, mobile was probably considered an emerging technology. I was in school during the whole mobile app craze — "there's an app for that," if you remember all that going on.

- Yep, yep.

- And I was always really into tinkering with things. While I was in high school, I built my own computer and did a senior project on that whole — I say ordeal, but it wasn't that bad. That's how I got into mobile app development and mobile devices in general. While I was in school, I just wanted to learn more about it. So even my senior project in college was an Android app that allowed the user to set an alarm for classes based on the distance they were from the location they needed to be. So if you were at home and it took 20 minutes to drive, you didn't set a time for the alarm to go off — you told it when the class was, and it would determine how long it would take to get there and sound the alarm early enough that you could drive over and get to class on time. It was also a big time for the iOS and Android "hacking" scene — I say hacking in quotes, but it was really just custom operating systems and things like that. So I probably installed a new operating system on my Android phone once a week just to try out new features, bleeding edge, things like that.

- Oh, I bet you could have sold that app to the school for a lot of money.

- Probably.

- If it's something that increases attendance, yeah.

- Yeah.

- So yeah, let's dive into exploring the specific challenges, particularly around native mobile app accessibility. When we talk about digital accessibility, we tend to default to web accessibility. So what would you say are the biggest differences between native mobile app accessibility and web accessibility?

- I would say the biggest thing, at least that I see, is the limited screen size of mobile devices. On the web you can have smaller screen sizes too, but on mobile you're pretty much limited to a really small screen, and usually in portrait mode — there's a requirement to support landscape as well, but most people use portrait. And there's a WCAG requirement that text needs to resize up to 200% without losing any functionality. That can be hard in some situations on mobile devices, specifically for some of the common design patterns. Like tabs: you have these small tabs at the bottom of the screen that contain text, and that text needs to increase up to 200% while the user still sees all the text inside the tab. So that's one of the biggest difficulties I see.
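[Editor's note: on iOS, the usual way to honor that resize requirement is Dynamic Type. A minimal sketch, assuming a plain UILabel; Android has a parallel mechanism with sp units and the system font scale.]

```swift
import UIKit

// A label that tracks the user's preferred text size, so it can grow
// toward the 200% that WCAG 1.4.4 asks for without clipping.
let label = UILabel()
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true  // rescales live if the user changes the setting
label.numberOfLines = 0                         // wrap rather than truncate at larger sizes
```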
The other thing I see is that the accessibility APIs for mobile devices are quite different from how the web works. The web is pretty much just a markup language, and it has really clearly defined specifications for HTML and ARIA. Those don't really apply to mobile, at least native mobile — the only things that really apply are whatever Google and Apple make their accessibility APIs support. So for example, on the web you have a radio button; on native iOS, radio buttons don't exist. A lot of times I see those design patterns get faked on something like iOS, where you make the screen reader announce that something is a radio button, but technically it doesn't have the role of radio button, because the words are just added to the name or to the accessibility value or something like that. There's also the issue of grouping. Info and Relationships specifies that when things are grouped visually, that grouping needs to be available programmatically. On iOS, there's really no way to programmatically group anything. You can add extra information to the accessible name, but that's a little cumbersome and not always super robust.
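[Editor's note: a minimal sketch of the workaround John is describing — collapsing a container into a single accessibility element and stitching the children's text into one accessible name by hand. The views here are hypothetical.]

```swift
import UIKit

// Hypothetical row with two labels that belong together visually.
let row = UIView()
let title = UILabel(); title.text = "Departure"
let detail = UILabel(); detail.text = "9:40 AM, Gate 12"
row.addSubview(title)
row.addSubview(detail)

// Collapse the container into one accessibility element and rebuild the
// relationship in the name. Fragile: this label must be kept in sync by
// hand whenever the children change — the "not always super robust" part.
row.isAccessibilityElement = true
row.accessibilityLabel = [title.text, detail.text]
    .compactMap { $0 }
    .joined(separator: ", ")   // "Departure, 9:40 AM, Gate 12"
```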
- So, yeah, it seems like a lot of the work is just understanding the constraints and the opportunities of the more limited ecosystem you're working within, and figuring out how to build something that's robust.

- Yeah, yeah. That's a big challenge.

- And I think it's something we're gonna come back to later in our conversation, but I just wanted to skip over to looking at it from the user perspective. From a UX perspective, what should designers and developers know about how people with disabilities use mobile devices and mobile apps, and maybe differences in behaviors compared to interacting with websites and applications?

- Yeah. So websites can be used on a really large screen, so you can have lots of content on a website, but on mobile devices, one screen is generally used for one activity. What I would like to see, and what I generally recommend, is that when you design apps, the primary function of that screen should be visually close to the top of the screen — that helps magnification users — but for screen reader users, make that function appear close to the top of the focus order, so they can encounter it pretty quickly. You've probably seen something called a floating action button on Android: those are the little buttons that float on top of the screen, usually towards the bottom right, which makes them easy to access with one hand — you just tap that button and that's the primary action. But a lot of times what I see is that the primary action appears at the very bottom of the focus order. So let's say you have an email app and the compose button is one of those floating action buttons. Well, if you're a screen reader user and you're not familiar enough with the app to use touch to explore, you would swipe through the screen to try to figure out where that compose button is. But in the focus order, it appears at the very bottom of the screen, so you have to swipe through all of your emails to find it. What you can do, and should do, is modify that focus order so the compose button appears as close to the top as still makes sense, so that all users can find that control pretty quickly.
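[Editor's note: John's example is Android, where you would adjust TalkBack's traversal order. The same idea on iOS might look like this hedged sketch, relying on the fact that VoiceOver walks a container's accessibilityElements in array order; the views are hypothetical.]

```swift
import UIKit

// Hypothetical email screen: the list fills the screen and the compose
// button floats at the bottom right, last in the default reading order.
let screen = UIView()
let emailList = UITableView()
let composeButton = UIButton(type: .system)
composeButton.setTitle("Compose", for: .normal)
screen.addSubview(emailList)
screen.addSubview(composeButton)

// Put the primary action first in the traversal order, so a screen reader
// user reaches it on the first swipe instead of after every email.
screen.accessibilityElements = [composeButton, emailList]
```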
- So that kind of focus on apps as things that help people do specific tasks — I've certainly heard that story in doing user research: if there's an app that allows me to do something, I'd rather use that than a website that allows me to do that plus a bunch of other things. The app gives you that constraint and that focus, and that can be fantastic from an accessibility perspective as well. It's more stripped down, more focused on supporting task completion — a payment app, or an app that lets you manage your bank account — and from an accessibility perspective, that helps simplify things. So really, really focusing on supporting that task is essential. And if you start to load the app with additional functionality, or content that would be better off on a website, that takes away the value of the app.

- Yeah. I think now they're trying to make it so you stay on your phone longer, just for advertising reasons, but originally apps were designed so that you could take out your phone, do something quickly, and put it away. And that's essentially the design principle I'm trying to push here: let screen reader users, or any other assistive technology users, do the exact same thing. Open up their phone, do what they need to do, put it back in their pocket as quickly as possible.

- Right, and I guess in some cases people aren't even taking the phone out of their pocket — if they're relying on audio or haptic feedback, it doesn't need to come out of the pocket at all.

- Yeah, yeah.

- And that brings up another whole side of understanding behaviors — maybe this is getting to the edge of a design perspective, but just knowing how people use devices. Having a fixed orientation, for somebody who maybe has their phone held on an arm on a wheelchair or a scooter, underlines the importance of not assuming a fixed display for UI design. But also, somebody might not be taking the device out of their pocket at all. So there are all sorts of accessibility requirements that become more essential in mobile app design and development. So thinking about it from an accessibility tester perspective — again, a lot of focus and resources on accessibility testing kind of assume you're working with a traditional website or web application. What adjustments do you think are most important for accessibility testers when testing native mobile apps?

- There's a phrase I commonly see in issues that are logged for accessibility, and it's usually something like "the screen reader says" or "the screen reader doesn't announce" this thing. That's probably good for the reader of the report, to give some context of what's going on, but it's important for the tester to understand that that is not actually the metric that needs to be met. What a screen reader does is look at what's called the accessibility tree, interpret it, and send it out to the user over speech or braille — we'll get to braille in a second. Just making the screen reader say something doesn't necessarily mean it's done correctly. Previously I mentioned radio buttons on iOS, so we can go back to that example. On native iOS you can definitely make a control say "yes, radio button, checked." But on native iOS, that's actually not correct, and I would fail it under 4.1.2 Name, Role, Value if I was testing it, because radio buttons don't exist in iOS. There's no role tied to that — roles are actually called accessibility traits in iOS — so it doesn't have a programmatically determined role. And the reason I would fail it is that it might sound fine for a screen reader user, but for a braille user, that's a very large piece of text to put on a braille display. Braille displays are these rows of refreshable braille cells, and they can be pretty small — the bigger the display, the more expensive, so people tend to lean towards the smaller ones because they're a little bit cheaper. If the control is faked like that on iOS, a braille display spells out that entire piece of text. But if you were on the web, it would be heavily abbreviated: "yes," then "radio," then BTN for button, and the checked or not-checked state is actually symbols — a bracket with an X in it for checked, and a bracket with nothing in it for not checked. That fits a lot better on a braille display than "yes radio button checked." So instead of faking a radio button, I would recommend making it a regular button and using the selected state. With VoiceOver, that would say "yes, button, selected," and on braille it would be abbreviated as "yes," BTN, and then SELD, the abbreviation for selected. That's what a braille display user would expect. And it has the added benefit of not needing any extra translation: if you had all of that spelled out in the accessible name or the accessibility value, you would need to take care of translating it into different languages. If you use roles and traits appropriately, VoiceOver just handles that itself and you don't have to worry about it.
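[Editor's note: a minimal sketch of that recommendation, assuming a simple UIButton; the anti-pattern is shown commented out.]

```swift
import UIKit

let yesButton = UIButton(type: .system)
yesButton.setTitle("Yes", for: .normal)

// Anti-pattern John would fail under 4.1.2: baking the role and state
// into the name, which a braille display then spells out in full.
// yesButton.accessibilityLabel = "Yes, radio button, checked"

// Instead, keep the real button trait and toggle the selected trait.
// VoiceOver announces "Yes, button, selected"; braille abbreviates it,
// and localization of the role and state comes for free.
func setChecked(_ checked: Bool) {
    if checked {
        yesButton.accessibilityTraits.insert(.selected)
    } else {
        yesButton.accessibilityTraits.remove(.selected)
    }
}
setChecked(true)
```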
- Mm-hmm. So the remediation advice you'd be providing as a tester really focuses on understanding your platform and its constraints, and not bringing web accessibility remediation advice into a platform where it doesn't fully apply.

- Yeah.

- It's not appropriate.

- They're totally different things. One supports something and the other doesn't, so that's something you've gotta take into consideration.

- And yeah, it just underlines that as a tester, yes, you can report that something's an issue, but if you want to be successful, then helping somebody remediate that issue in the most effective way is such an important part of the equation. So thinking about testing, what would you say are your most valued tools for testing native app accessibility?

- I would say accessibility inspectors are the number one thing. Even on the web, in the Chrome dev tools there's an accessibility tab that lets you see the whole accessibility tree. Those are indispensable when I'm testing — they let me quickly see if something has a role, if something has a name, what the values are, and things like that. You can do that with screen readers, but I have a pretty extensive background in development for web, native, and things like that, so normally when I'm testing, I'll inspect first and see what issues I can gather from that, and then run through with the screen reader afterwards to see if the experience is what's expected. On iOS, there's the Accessibility Inspector that's built into Xcode, which is macOS-only, so you have to have an Apple computer to use it. It doesn't let you see the full accessibility tree, but it will let you cycle through the different elements on the screen and see the actual accessibility information for each element: what traits it has, what the name is, and what the value is. And there are other things, like user labels, which a developer can set for voice input users. You can have tons of user labels for a control — different keywords or phrases a voice user might say to access that control — but they don't necessarily render as the accessible name.
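[Editor's note: on iOS those user labels are exposed through accessibilityUserInputLabels. A hedged sketch with a hypothetical control:]

```swift
import UIKit

// Extra phrases a Voice Control user might say to activate the control,
// without changing what VoiceOver reads as the accessible name.
let composeButton = UIButton(type: .system)
composeButton.setTitle("Compose", for: .normal)
composeButton.accessibilityLabel = "Compose"
composeButton.accessibilityUserInputLabels = ["Compose", "New email", "Write"]
```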
- Mm-hmm.

- And for Android, I don't wanna toot my own horn, but the one I use the most is the one I actually made. It's available for free on GitHub, and it's called the Accessibility Inspector. It's similar in functionality to Appium, if you've ever used that or heard of it. It takes a single snapshot of the screen and the accessibility tree, along with a screenshot of the app, and then lets you either click on the screenshot to select different elements or click on elements in the accessibility tree, and each will display the full accessibility information that TalkBack can see for that particular view. It's using the exact same APIs that TalkBack is using, so any information TalkBack can see, this program can see.

- Mm-hmm.

- It also has a cool feature where it will look at the locale — the language setting — of different pieces of text and tell you if that's actually included. So you don't have to guess based on whether the screen reader is pronouncing something differently: you can see whether an actual language attribute has been applied, so you can test for language support.
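[Editor's note: for the iOS side of that same check, the per-element language tag is accessibilityLanguage — a minimal sketch:]

```swift
import UIKit

// Tag an element's language so the screen reader switches pronunciation,
// the iOS counterpart of the language attribute John's inspector surfaces.
let greeting = UILabel()
greeting.text = "Bonjour tout le monde"
greeting.accessibilityLanguage = "fr-FR"   // BCP 47 tag; VoiceOver reads this with a French voice
```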
- Yeah, I remember when you shared that tool as part of your talk at the CSUN conference earlier this year, and it got a lot of positive feedback. So: Android Accessibility Inspector, on GitHub. We'll see if we can put a link up — I think it's a terrific tool, and this is definitely the opportunity to share it and encourage others to use it and provide feedback as well. But overall it seems like it's more a case of using inspection tools to help guide the manual testing process; full automation of native app accessibility still seems a ways off yet. Is that fair to say?

- Yeah, there are some limitations to full automation. You can test a handful of different success criteria, but most of it's gonna come from actual manual testing. It's not quite as extensive as web, and that's mainly because most of the things you would need exposed to do automated testing aren't exposed.

- Yeah, that's important to know. So anyone who's promising full automation of native mobile app accessibility is being extremely optimistic, I guess.

- Yes.

- Right now. So, you know, I've seen arguments that say there needs to be a separate standard for mobile accessibility because of the differences in app development and operation. And yet earlier this year W3C published the WCAG2Mobile resource, giving guidance on applying WCAG to native apps, and the updated EN 301 549 standard — which we're hoping is gonna be published early next year — has a specific section on software accessibility that could be applied to mobile apps. So do you think we have what we need from existing or upcoming standards to guide accessible app development, or do we still need something new?

- I don't think we necessarily need anything new. WCAG, I think, can do what we need it to for the most part. I do think it could use some improvements to be less web-centric, because there are some success criteria that specifically say "for web pages," things like that, and that's actually in the normative language. So someone might say, well, this isn't a web page, so it doesn't apply. But I don't know if they'll even consider changing any of the normative language until maybe WCAG 3 — I don't think any of the normative language changed between 2.0 and 2.2; besides some criteria being deprecated, it was mostly just additions. But maybe a change to the full name of it: "Web Content Accessibility Guidelines" is kind of limiting. I know you can apply it more broadly, but the actual name just feels a little limiting. So maybe a broader stroke for the guidelines — "Digital Content Accessibility Guidelines" or something like that — just so it can cover all paradigms.

- Yeah, no, that's a good point. I think the plan is that in WCAG 3, the acronym will stand for W3C Accessibility Guidelines rather than Web Content, so they don't have to change the abbreviation. But whether people register that change when they're still writing and saying it the same way, who knows. Yeah, I think that's an important change. There's also been some chat about hybrid apps — something that combines a native app with traditional web content embedded in there. Do they make design, development, and testing for accessibility easier or harder, in your experience?

- It makes it quite a bit more difficult when they do these hybrid apps. Once you encounter what's called a web view — which is essentially just a website embedded inside of an app — all of the native stuff goes out the door, and full HTML and ARIA apply. So you have to fully apply the HTML and ARIA specifications, and sometimes that's a little difficult, because a lot of times you can't see the actual underlying HTML to do a full analysis of whether it's coded to specification. It's also a little difficult to tell, at least on iOS, where a web view begins and ends. On Android, it will say "WebView," so you know when it starts, but iOS doesn't tell you. You can kind of gather it from clues, like once it starts announcing things like landmarks and heading levels — heading levels to a lesser extent, because native iOS is actually starting to support heading levels now, so you can't rely on that too much anymore. But there are some ways you can tell on iOS. I mentioned Appium before — if you use something called Appium Inspector, which is kind of difficult to get set up, it will give you a full hierarchy of the view, and you can see if something's inside a web view, so you can tell where to start applying the web specifications to that section. And you can actually inspect the actual HTML. On iOS, remote inspection needs to be enabled by the developer, so you probably won't see that in any actual production apps. But if you're working for an organization that makes an app and you are doing the testing on it, you could probably ask the developer to enable it for the development builds. Then you can connect the phone to the computer, open up Safari, and if that's enabled, you can open developer tools that will display everything in that web view on the phone. So you can inspect the HTML that way.
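[Editor's note: on newer systems that developer-side switch is a one-liner on the web view — a sketch, assuming iOS 16.4 or later and a debug build.]

```swift
import WebKit

let webView = WKWebView()

#if DEBUG
// Lets desktop Safari's Web Inspector attach to this web view's content.
// Keep it out of release builds — as John notes, production apps won't have it.
if #available(iOS 16.4, *) {
    webView.isInspectable = true
}
#endif
```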
- You can do the same on Android. It also has to be enabled by the developer — kind of, I'll get to that in a second — but you do the same thing with Chrome. If that setting is enabled, you open up Chrome and go to a URL, chrome://inspect, and that will list all of the remote devices attached to the computer. Then you can click on the one that lists the web view in your app, and it will display a screenshot of the phone. It's not exactly a screenshot — it's kind of live, so you can actually scroll and control the website through your computer that way, and inspect the full HTML and any JavaScript or CSS associated with it. If you don't have that setting enabled and you have access to a rooted Android device — this gets back to my Android "hacking" and custom operating systems — rooted means you have full system access, and there are ways to modify apps running in real time to change variables. So whenever Chrome asks if the setting is enabled, you can flip it and say, yeah, it's enabled, show me all this content, and you can expose the HTML for inspection even if it's not technically enabled. You can't do that on iOS, because I think ever since they started doing the Apple chips, around iOS 16 or 17, no one's been able to crack it to do a full jailbreak.

- So yeah, ultimately hybrid apps definitely add...

- Yeah.

- ...levels of complexity to testing.

- Yeah, it can be difficult.

- And for users as well, potentially. If there's been a focus on native app accessibility, but someone else is in control of the HTML and people haven't been paying attention to accessibility there to the same level, then the user experience is affected as well. I know we're heading towards the end of our time here, but I'd love to get your thoughts on Apple and Google's role. What opportunities are there to improve mobile accessibility from a platform, operating system, or even device design perspective? What changes would you like to see them make?

- For iOS, it's gonna be short. I would like to see them allow for proper semantic grouping of elements, and allow for associations between those elements — essentially like aria-describedby, but in a native sense. Right now you can't do that on iOS, which makes programmatic grouping of things a little more difficult. The other side of that, for Google, is completely different. What I would like to see them do is add full support for what are called HID braille displays — HID stands for human interface device. I think it was Android 15 where they announced that they officially support HID braille displays, but that's not actually the full story. HID is a standardized communication protocol between devices. Something like a keyboard or a mouse both use the HID protocol, and because that communication is standardized, any device can connect and use them. That's why you can connect a keyboard or a mouse to a mobile device and it just works: they're all using these standardized communication protocols. Well, there is one for braille displays and it's standardized, but Android doesn't fully support it. It seems like they're doing a kind of arbitrary check: first, on what device is connected to the phone — if it's not in their list of approved HID braille devices, it doesn't allow you to connect it. Secondly, if you do make it past that part, it also checks what type of service is running. By service I mean, with Bluetooth you have different types of services: you can have an HID service, you can have a service that reports battery information, things like that. So it's checking that you have a specific type of braille HID service — which I would expect — but they're only checking one type, over the old Bluetooth standard. The newer one is called Bluetooth Low Energy, and that's the one most devices are moving to, just because it's so much more energy-efficient. If your device uses only Bluetooth Low Energy and it's a braille HID device, you can't connect it to Android. It will work if it's not a braille HID device, but braille HID over Low Energy gets kicked out and isn't accepted. And I actually have a little project I'm working on that's an open-source HID braille display. It runs on this device here — I'm showing it to the camera — a little microcontroller, just a little circuit board about two inches by one inch, with both Bluetooth and Wi-Fi on it. I've written some custom code that allows it to act as a braille device, and it actually connects to all of them except Android: it'll connect to macOS, Windows, and iOS, but Android kicks it out because it uses Bluetooth Low Energy.

- Right, right.

- So I would like to see them actually fully support it, just so that somebody can use this. And really, this doesn't do anything super useful on its own. It's just a software base, so that if anyone wants to build a physical device, they can use this as the brains: the software receives the braille output, and they can drive the braille dots and things like that from there.

- So that's just a project you have on the side. How would somebody find out more about that?

- That's on GitHub as well. I can provide you some links; they're all under my name.

- Yeah, we'll add that to the podcast notes after we're finished. But yeah, making sure that people can use Android devices with whatever braille device they have seems a pretty logical and critical step for accessibility. Yeah.
- So just to close up: if you had some takeaway advice to help app designers and developers improve the accessibility of their app, what's the one thing you would encourage people to do?

- I would probably say to lean on the design patterns that Apple and Google have created. They were created mobile-first, and they're what users are used to: when a user first turns on their phone, they encounter only Apple or Google design patterns to navigate it. Use those design patterns. You don't necessarily have to use their libraries, like Material UI and things like that, but match the actual feel — the accessibility feel — of those design patterns, because they're what users are used to. They work pretty well with accessibility services for the most part, and they just make mobile devices easier to use — not just for touchscreen or sighted users, but for everyone.

- Yeah. And that emphasizes the point that most people are gonna spend most of their time using other apps, not yours. So the more you can be consistent, while also focusing on the user experience you want to provide, the better it is for everyone.

- Yeah, that's right.

- Cool. Well, John, thank you so much for joining this month's podcast. I've learned a huge amount, and hopefully everyone listening and watching has learned as well. Mobile app accessibility is not going away, and it's only gonna become more important that we get it right. So this is great information to share — thank you again for joining us. Next month's podcast is scheduled for December 11th; we're gonna be looking back over the year in digital accessibility. There's been plenty happening, and probably more to happen before the end of the year. So we will see you all then. Now you know the state of accessibility, and specifically the state of mobile accessibility. I'm David Sloan, thanking John Lilly, and reminding you that the state of accessibility is always changing, so please help us effect change.