- Welcome to "The State of Accessibility" podcast from TPGi on LinkedIn Live, I am Mark Miller, and this is my co-host David Sloan, Chief Accessibility Officer for TPGi, co-author of "What Every Engineer Should Know About Digital Accessibility," and a research and Accessibility strategy specialist. - Hey, Mark. Mark is a sales director for TPGi and a member of W3C's Web Accessibility Initiative, Accessibility Maturity Model Task Force, and also a technical wizard, figuring out how to get Zoom and LinkedIn Live webcast working. Good job. - This is my first time, so we may have been live for a few minutes before we started, so apologies for that. But David, it is June 18th, and that is only 10 days to the deadline of the European Accessibility Act. But that's just a reminder, we're not gonna talk about EAA today, I know we've talked about that plenty. But if you're out there and you're concerned with EAA, just note that we're right up against that deadline. But today, we're gonna talk about another abbreviation that seems to be on everyone's mind, and that's AI, right? I know when I go to read anything on Google or whatever that there's article after article after article on AI. The AI revolution shows no signs of slowing down, so we thought we'd reflect on that today, what's happened this year in the intersection of AI and what you and I are always thinking about, David, which is Accessibility, what conclusions we've drawn and lessons learned, and what we might expect to see happening. So I guess we can look at AI's relationship to Accessibility through a few different perspectives. So David, let's start talking about its impact on assistive technologies, does that sound good? - Yeah, that sounds good. And I guess we should sort of preface this by just noting just how much the digital Accessibility community is applying the strict scrutiny to AI in terms of how it's impacting our effort to make the digital world more accessible, and a lot of what we are gonna talk about is just kind of a high-level state of the situation right now. But I think it's definitely worth giving shout out to events and other resources for more in-depth analysis. And that just recently I attended an event hosted by We Are developers on Accessibility and AI, and it had five or six Accessibility specialists sharing their thoughts on AI from different perspectives, which was a terrific learning day. And there's a monthly podcast out, "Accessibility and Generative AI" hosted by Joe Devon and Eamon McErlean, which is well worth a listen. They spend a good best part of an hour talking to people from different parts of the Accessibility community, particularly looking at the impact of generative AI on Accessibility efforts, and there's so much conversation going on in LinkedIn. So there are lots of people talking about it, and I think for us, it's just like, let's kind of do this broad survey of where we are at the minute. Now, I know earlier, I think it was back in February, we had Ryan Jones on the podcast talking about how AI's influencing the Sparrow's assistive technology products, particularly JAWS. So the ability to describe images, whether that's of an image in a document or a website, or whether it's your surroundings as a user being given valuable feedback about what's a camera picking up in addition to me? What something I don't want to, if I'm on a Zoom call or if I'm a part of a podcast, so JAWS is able to provide that. 
And it seems like that's technology that's really appreciated by users. Another thing I've seen over the first part of the year is the popularity of wearables for blind and low-vision people, with Meta glasses as a good example. People talk about wearing the glasses and asking for a description of where you are: rather than holding up a phone to do that, the glasses are already there, so your hands are free. That's something that's really helping as well. - I was at CSUN this year, and Matt Ater was with me. Many people listening may know who Matt is: he works for Vispero, he's well known in the accessibility community, and he's blind. We were actually talking to a couple of the Meta developers, some of the people behind the Ray-Ban glasses, and Matt cornered them to go over his own app and have them help him understand some things about the Meta Ray-Ban glasses, because of how excited he is about this technology and how useful it is. So that's been a highlight, just from my personal observation. And the really cool thing is that I don't know anybody who wouldn't like a pair of the Meta glasses, right? So we look at it in terms of AI and its benefit to people with disabilities, but it has that curb-cut effect where everybody can find a use for the glasses, even without a disability. - Yeah. And let's just be clear, this is not product placement; we're not being sponsored by anybody we mention when we talk about AI products here. But yeah, I think there is that additional... - Or technology like that, we'll say. How does that sound? - And an interesting observation is that innovative technology intended to solve an accessibility issue for somebody with a disability can sometimes be seen very much like a medical device, rather than something you'd simply wear and use. These glasses are definitely at the style end of the spectrum rather than something that's purely functional. - I think it's also an interesting phenomenon to pay attention to with AI. You hear people talk about the singularity, when technology and humans are going to merge into one, if you're a futurist about these things. But one of the observations I've made with AI and a lot of these emerging technologies is that the line between what is an assistive technology and what is beneficial to everybody is starting to blur. I view that as very positive, because if you think about accessibility in terms of inclusivity, it's really a leveling, right? If we're all relying on similar technologies, then a lot of the differences and disparities that we might otherwise see start to fade away, or at least hopefully they do. - Yeah. Another interesting observation in that area is the way AI-generated voices are getting so good that it's almost impossible to tell the difference between a generated voice and a recorded one. That brings opportunities and challenges more generally, but specifically for people who rely on speech output, it narrows the difference between a very synthetic voice and a natural voice, and it gives you much more capability to choose what voice you have reading to you.
But that technology is also helping people who may be confronted with a future where they lose their speech: by archiving their own voice and building a vocabulary in their own accent, they can continue to use that voice after losing speech. And it's a way to- - How much more comfortable do friends and family become when that generated voice is now your voice? - Yeah. - For sure. Well, you mentioned that we had Ryan Jones on a little while back. He works here at Vispero, specifically on our JAWS technology, and there are a couple of interesting things happening within JAWS that we should mention. The first one is Picture Smart AI. - Yeah. That's harnessing AI to enhance, or stand in for, something we often talk about in digital accessibility: alt attributes, text alternatives for images, one of the core things you need to provide when you're building or creating digital content. Sometimes people forget, or they provide an inadequate description. By harnessing the capability of AI to describe an image in a richer way than a human may have done, or may have had time to do, it gives a screen reader user the opportunity to get more from a rich image than they might otherwise have done. JAWS supports that, and it could be any kind of image: a complex organization chart, or a photograph with lots of information in it. So that's definitely a feature people seem to really value. - Yeah. And it seems like social media, David, is an area where it's been very difficult to get alternative text onto content, given the ways content is presented there. So to be able to actually get a description of, say, a photo of your family is hugely beneficial. But I do think there's a caution there: this is not a replacement, and it doesn't suddenly mean that alternative text is unnecessary, because alternative text serves a very specific purpose, and that purpose may not be to describe the image, but to describe the value of that image on that page, whether the image is linked to something or has some other contextual significance. So if you're out there thinking, woo, we don't have to do alternative text anymore, that's not the case; you still need to do it. Where the future takes us, who knows, but right now, AI can't cover for missing alt text. - Right. It certainly doesn't take away the responsibility of a digital content creator to include that text equivalent. - It's kind of a "yes, and," for those of you doing your improv classes: yes, alternative text, and we've got this great thing in JAWS that can describe images where the alternative text has fallen down.
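To make that distinction concrete, here is a minimal sketch, as TypeScript DOM code, of how alt text encodes an image's purpose in context rather than its appearance. The file names and strings are hypothetical; this illustrates the principle being discussed, not anyone's production code.

```typescript
// Three images, each with alt text chosen for its role on the page
// rather than its appearance. (File names and text are hypothetical.)

// 1. A linked image: the alt text names the link's destination,
//    because that is what the image *does* here.
const homeLink = document.createElement("a");
homeLink.href = "/";
const logo = new Image();
logo.src = "/images/logo.png";
logo.alt = "TPGi home";
homeLink.append(logo);

// 2. An informative image: the alt text conveys the information the
//    image carries, not just "a chart".
const chart = new Image();
chart.src = "/images/q2-sales.png";
chart.alt = "Bar chart: Q2 sales grew 12% over Q1";

// 3. A purely decorative image: empty alt tells screen readers to skip it.
const flourish = new Image();
flourish.src = "/images/divider.png";
flourish.alt = "";

document.body.append(homeLink, chart, flourish);
```

An AI-generated description can enrich case 2, but it can't know that case 1's image is a link to the home page, or that case 3 should be silent; those choices depend on context only the author has.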
- So what about Face in View? That's another interesting feature. - Yeah, I touched on that a little earlier in our chat. Essentially it's using the same technology, but to describe what your camera is seeing, so that you can reposition yourself to face the camera, or move out of the frame anything in the background that you don't want there. And this is a great example of recognizing JAWS as a workplace productivity tool. Its primary focus is to help people who are blind or have low vision be productive in the workplace, earn a living, and make progress in their careers. Especially during the pandemic and since, there's a lot more remote working, and video calls rather than in-person meetings are more frequent in the workplace, so this is a way to help people make sure they're presenting themselves in the professional way they want, without things in the background they'd rather not show. - And being able to take responsibility for that themselves, because independence is a lot of what we talk about here. Back in the early days of Zooming with people who are blind, there was always a period of time when they would ask, oh, how does my video look? And you would say, oh, open up your laptop a little more, it's pointing low, I can see nothing but your keyboard. You'd have these interactions where you were essentially talking them through how to point the camera. And it's a suboptimal experience, because they're not being independent; they're relying on you for that. Now, with Face in View, my experience is that people who are blind pop into a Zoom call already set up, without having to have that extra interaction. I'm not blind, so I can't speak for them, but I think that's got to be a much better experience and a much better feeling of independence. So we've covered the assistive technology side of this, and we could probably talk about it forever, particularly if we got into speculating about the future. But AI is also helping us do other things, like generate content and test for accessibility, and there are all sorts of capabilities, maybe not fully developed yet, that we're starting to see evolve. So how do you see AI changing the way we create digital content right now, and what effect does that have on accessibility? - Yeah, it's a good question. I think it's a nice bridge from assistive technology, because one of the things generative AI tools make possible is turning ideas into usable content, making it easier to generate content through prompts. That can be an assistive technology for people who might have difficulty writing, or difficulty turning great ideas into content that other people can engage with. Overcoming a communication barrier or a comprehension barrier, whatever it may be, is something people have been adopting this technology for. - Yes, for sure. - And it's a great way of illustrating how people with disabilities are often right at the forefront of adopting new technology in order to overcome barriers that exist in the real world. Rather than being perceived as technology laggards, people with disabilities are often leading the way. So functionality that's available to everybody can also be an assistive technology. - I can give you a personal example here, because I've done a lot of research into AI, and as you know, and as people who've been listening to me for a long time know, I have dyslexia.
And one of the things people talk about is that this is going to be a new age for people with dyslexia, because the skills that come along with it, the narrative skills, the ability to turn ideas over in your head, the ability to connect seemingly unconnectable thoughts to create new ideas, all of that is supported by AI. People with dyslexia can dig into AI and start to pull data out in a way that's more meaningful to that way of thinking. In addition, there are things like Grammarly and other tools that help even just with proofreading. Tools like that have made a world of difference for me; I get people saying, oh, I didn't even know you were a bad speller, I didn't know you had dyslexia, because of the way technology covers for that now. And having that pulled out of the conversation is fantastic, because there has always been, and I think it's hard for there not to be, this assumption that bad spelling is an intelligence marker and not something else. One of the first ways I started using AI, aside from specific tools like Grammarly and other editing tools, was to summarize things. If there was a large article online, I would ask AI to summarize it. That didn't mean I skipped reading the whole article; it meant I had a framework before I started reading. And I don't know if that's even a dyslexic thing; maybe it would help anybody. But certainly, as somebody with dyslexia, if an article is really long, with the cognitive fatigue and the short-term memory issues, it's much more difficult for me to hang on, and I often didn't finish articles. So that was the first way, and then the way I can do research with AI makes much more sense to me than searching with Google and the like. So it's interesting how it's flipping some things around, and from what I've been reading, with things like dyslexia there may actually be a rise in the value of that way of thinking. - That sounds like another example of something that's essential for some, useful for all. - It's like the reverse curb-cut effect: something useful for everyone, where certain people with certain disabilities say, wow, this is really a force multiplier for me. - Yeah. You asked a little earlier about other ways AI is changing how we create digital content, and I was thinking of examples of things that help coding. There's been a lot of talk about vibe coding, using an AI tool to generate code without having to write the code or even know the language. It reduces the barrier between ideas and functional code you can test. That could be somebody who doesn't know a programming language turning an idea into functional code to try out, or a developer who already has coding skills just speeding up the process. There are opportunities there, especially for rapid prototyping: comparing ideas and testing which one is best. But clearly there are accessibility risks, as the sketch below illustrates. - Right.
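To ground that caution in something concrete, here is a hypothetical TypeScript sketch of one of the most commonly cited accessibility problems in generated UI code; it is not taken from any particular tool's actual output.

```typescript
// A common accessibility problem in generated UI code: a <div> with a
// click handler looks like a button but isn't one. It receives no
// keyboard focus, doesn't activate on Enter/Space, and announces no
// role to a screen reader.
const bad = document.createElement("div");
bad.className = "btn";
bad.textContent = "Submit";
bad.addEventListener("click", submitForm);

// The repair is usually trivial once a human spots it: the native
// element brings focusability, keyboard activation, and the correct
// role for free.
const good = document.createElement("button");
good.type = "submit";
good.textContent = "Submit";
good.addEventListener("click", submitForm);

// submitForm is a stand-in for whatever the control actually does.
function submitForm(): void {
  console.log("form submitted");
}
```

The point of the contrast: both versions "work" for a sighted mouse user, which is exactly why generated code can look finished while still carrying barriers.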
- I guess we're still figuring out the trade-off between the speed of creating something and the potential effort required to repair what's been created, because the code isn't as good as we would like from an accessibility perspective. We're still learning, with AI-generated or AI-supported code, just how different the accessibility problems in that code are. I think there's a cautionary tale from Figma Sites. Figma recently released this extension to their product suite that generates functional code; the idea is that it's not production-ready, it's there for prototyping. But they released it, and the quality of the code from an accessibility perspective was not great. Even though people might say, well, this is just a first look at something, it was still shared with the public with significant accessibility barriers, and that tells you that accessibility was not something that had to be met for a product to be released to the public. - I think the good news is that it was very quickly recognized, and it created a really good discussion, in the accessibility community and beyond, about the cautions we need when it comes to generated code. The big overarching lesson is that AI is still built on a dataset that humans created, and we probably weren't in a place, with our coding and how we included accessibility in code, that allows AI to produce accessible code from that data. Now, some people will argue that you can prompt your way into more accessible code, but I think even that's difficult to do. You have to remember that whatever faults we had as a society, as a bunch of people generating content, code, and data, those biases exist in the dataset AI relies on. So of course we're going to see them, and the question is, if AI is now doing things quicker, faster, stronger, how is it multiplying that problem? And how do we make sure we don't allow that to happen and take a step backwards? - Yeah, and we'll come back to bias in a couple of minutes, but I wanted to look briefly at AI and its relationship to how we test digital content for accessibility. Clearly we want to get to a point where testing is verifying that you've done everything right, rather than finding all sorts of things wrong. We provide accessibility audits to our clients, as many other organizations do. Accessibility audits find lots of problems, and they can be a fairly labor-intensive approach that requires expertise - Yeah. - to find the problems, tell somebody where they all are, and then hope they fix them. If we were designing accessibility practice from scratch, we wouldn't pin so much on the manual accessibility audit, but this is where we are, and it's still an important part of helping people improve accessibility. Clearly, though, there is a goal to make accessibility testing more automated, and people have different perspectives on how far we can get and how quickly we can get there.
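As a point of reference for what rule-based automated checking looks like today, here is a minimal sketch using the open-source axe-core engine in a browser context (assuming a bundler with esModuleInterop). It is one common approach, not TPGi's tooling, and deliberately not a complete test.

```typescript
// A sketch of automated accessibility checking with the open-source
// axe-core engine, run against the current page in the browser.
import axe from "axe-core";

async function auditPage(): Promise<void> {
  // axe.run evaluates the document against axe's rule set and returns
  // the failures it can detect programmatically.
  const results = await axe.run(document);

  for (const violation of results.violations) {
    console.log(`${violation.id} (${violation.impact}): ${violation.help}`);
    console.log(`  affected elements: ${violation.nodes.length}`);
  }

  // Caveat, echoing the discussion above: an empty violations list
  // means no machine-detectable failures were found. It is not
  // evidence that a person using assistive technology can actually
  // complete their task on this page.
}

auditPage().catch(console.error);
```

Checks like these are exactly the "stuff the machine can do better than me" referred to below; the open question in the conversation is how much further automation can go beyond them.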
But there are lots of efforts toward automating as much accessibility testing as possible, to the extent that it's possible to do that, taking advantage of what machines are good at and what humans aren't so good at. The principle of automating as many tasks as possible is something everyone I talk to in accessibility agrees with: yep, I want to automate the stuff the machine can do better than me. I don't think anyone's arguing with that. - Yeah, there are a couple of things I think about here. One is that it's very seductive, in the sense that, as you just said, the manual effort it takes to properly evaluate, let's say, a website for accessibility is pretty significant. People want to make that faster, easier, and less expensive in a pretty big way because of the costs associated with it. To me that says: let's be very, very careful. We want something like that so badly that the temptation to run toward it too quickly is there. That's the first piece. The second piece is that we can't forget, in my opinion, and I want to hear what you think about this, David, that any sort of automated testing, even AI testing, is still not the same as a person. Let's back up: when I think about automated testing or a person testing, what I'm really thinking about is a functional test, a technical conformance test against the guidelines. Does it conform to the WCAG guidelines or not? When a human being finds an issue and says, this doesn't meet the guidelines, it's because they hit a user experience issue that uncovered it: my screen reader couldn't continue, I had a keyboard trap, I couldn't perceive something, I didn't know what this form field meant. That's very different from technology saying, I'm looking at the code, and based on what I know of code and what I know of the guidelines, there's not a match here. I would never claim that testing a website against the guidelines is a usability test, but when a person using assistive technology is the one finding the issues, or verifying the issues aren't there, it's much closer to one than what AI can do by inspecting the code. I don't know what that means from a technical standpoint, because I'm not a technical person, but when I think about AI coming in and giving a rubber stamp, yes, I looked at this code and it's accessible, carry on, you're all set, it makes me very nervous, because it's missing that piece where a human being actually, successfully used the site with assistive technology, the ultimate measure. So I don't know, does that sum it up, or am I overthinking it? - Yeah, I think you're touching on the context of use, and the successful use that ultimately judges whether a product works or not. And you can get too fine-grained: you might look at a specific component, and an AI might have generated it and verified that it meets some technical level of accessibility, but was it the right component for the user journey? And is it the right component in the context of the larger business goal and user goal?
So I worry that people might focus too much on the details and not on the collective picture: putting everything together, does this provide a good user experience? We know that even though WCAG has priority levels, some instances of a barrier are going to have a more disruptive impact on users than others, because of where the barrier lies in a particular user flow. So ultimately, we don't want to see this as a way to avoid working with user representatives to provide the best possible user experience. The ideal is that AI tools free up time to allow you to do more of that user research and more of that work with users. - And not to forget, the guidelines are there as guidelines, to give us an initial way to determine whether something is close to being accessible or not. But whether you're looking at it from the standpoint of the law, like Title III of the ADA, which says places of public accommodation need to be usable by people with disabilities, or just from the effort of, we need to make this usable: the one piece that continues to be missing is a person actually trying to use it. That's where we can get too wrapped up in the guidelines and the ability to assess against the guidelines. Maybe something passes all the guidelines, but when a screen reader user or another type of user tries to use it, something that wasn't visible in the code is stopping them, and you've failed that other test: is it usable by a person with a disability? So that's the caution. Not that you don't want to use automation, not that you don't want it to speed things along, but you really do need to make sure you're not relying on it a hundred percent. - Yeah. - Until the day we have something that tells us differently, you know? - Yep, yep. - We're a little over time, which I think is fine, since I spent some time fooling around technologically to get us started. Are there any last points you want to make before we wrap up? - Well, I think we need another podcast, because we have so much more to talk about- - We just scratched the surface. - And I feel like we might want to come back and talk more about some of the points we didn't get to. We raised the issue of disability bias, and there's also the question of how all this impacts organizational accessibility strategy. So rather than covering those at a high level now, let's think about coming back to them in our next podcast. Overall, from what we've heard so far, there's clearly opportunity, and as with any tool, it's about figuring out how to use AI-powered tools to do things quicker and better than we can, in the context of this broader set of tasks. One thing that I love, and I don't know who said it first, but I've heard it a couple of times: generative AI tools in particular are like an intern fresh out of college with boundless energy. So what accessibility tasks could you find for somebody who fits that profile? With that boundless energy, they're not going to say, I don't want to do this anymore; they'll keep offering to do the task. Thinking about that is part of harnessing AI, but clearly there's a need for caution too, and I think we'll talk more about that next time. - Great. Well, thank you, David, and thanks everyone for listening. Now you know the state of accessibility.
I am Mark Miller, thanking David Sloan and reminding you that the state of accessibility is always changing, so please help us effect change.