- [Mike] Thank you for joining today's session, we will begin momentarily. - Hi and welcome. This is Laura Miller and I am here to welcome everyone in honor of GAAD 2022 to the event Ask Me Anything About Kiosk Accessibility and How to Create an Accessible Kiosk Experience. With me I have Matt Ater, vice president of business development, and Ryan Jones, solutions architect at TPG Interactive and parent company Vispero. And with that, Matt, Ryan, did you wanna say hello and welcome? - Well, good afternoon everybody on the East Coast and good morning on the West Coast and anywhere else in the world you are. I'll say this: as we talk about accessible kiosks, as somebody who spends a lot of time traveling and was just in two airports, two hotels in two days and restaurants in between, I didn't find one accessible device. And whether it is a payment machine, a self-ordering experience, or a ticket check-in kiosk in a hotel, none of them were accessible. And it's just amazing to me that we're still having this conversation. So I'm looking forward to some of the questions that will be hammered at us today in terms of not just the why, but the how we go about solving this problem. So, Ryan. - Yeah. Hi everyone, I'm Ryan Jones. And kind of to piggyback on what Matt said, I think after COVID has slowed down and we've all sort of emerged from our cocoons that we were in for a long time, we're noticing, as Matt said, this inaccessibility of kiosks, and we're also noticing that there are more self-service machines and kiosks out there than when we all went into our cocoons a couple of years ago before COVID. So we're coming out and seeing all these new things, and they're not usable for many of us who are blind or have low vision. - Well, thank you both for that intro. And to get everyone started, I also wanted to remind folks that you can ask questions in the question and answer bubble at the bottom of the Zoom menu.
And then you can also sort of add your questions in chat, but we recommend the Q&A bubble there. And to get us started, as someone who works in the kiosk space pretty extensively, I just wanted to sort of set the stage that I don't know how many of you have heard or read about the McDonald's kiosks that now include JAWS, the JAWS kiosk screen reader. But if you travel with Matt or Ryan anywhere, you often will end up testing those kiosks because you can't pass a McDonald's. - Oh, we didn't stop anywhere and test those. Come on. - You can't go anywhere without testing the McDonald's kiosks if you're traveling with Matt. We don't always eat, well yeah, sure we do. - At least get ice cream, if nothing else. - Right, french fries. - Or an iced tea. Or an iced tea, I mean. - Absolutely. So those are great examples of devices that we've worked on to make them accessible. It's sort of the proof is in the pudding of how we made out. So we highly recommend that you check those out and then you can tell us the difference between the accessible McDonald's kiosks and hopefully the ones that Matt was encountering in his travels. So I do have a question in the Q&A. So again, if folks wanna remember to enter your questions into the Q&A. But first question is, "If I already have self-service kiosks or if I already have devices that are deployed, can I make those accessible?" - Yeah, I mean, a lot of it comes down to, and this was ones that are already deployed, right? Is that the question? - Yes, for already deployed kiosks. - So I look at like anything else, we have to start with making sure that the application has basic I'm gonna say usability. Accessibility has a part in this, but in a kiosk, we don't look for straight-up compliance from a software perspective, we look at workflows being usable. And I think the air carrier, and I'm gonna say that right, the Air Carrier Act, what was it? - Yeah, the Air Carrier Access Act, yep. - Yeah, Access Act, sorry, ACAA. 
And the rules around that were around is it functionally usable? And there may be a better term and I'm not down to the laws and regs here, I'm just talking about from that perspective. Can somebody perform the tasks that they're required to do? And so when we look at an existing implementation of an application and it's not, you have both the hardware to look at and the software, but I'm gonna talk about the software here. We need to be able to make sure that we can use a screen reader with it. It doesn't have to say everything on the screen. It has to be able to go through the workflow. And then on the hardware side, there's definite regs around height, reach, and other things, tilt of screen. And we obviously have to solve that problem as well. And then we may need an input device, a headphone jack, and all this depends on the workflow and any privacy-related stuff. I mean, I saw a ticketing kiosk for a local train and that one didn't need a headphone jack 'cause there was no privacy, there's nothing to worry about. It wasn't on a loud platform. I could tap the bottom right corner of the screen, it started talking, I went through the total workflow. I didn't use the touch screen. Once I hit that bottom right corner, it told me to use the keypad. So all of that functionality existed, and could we do the same with any kiosk? Well, a lot of it depends on the infrastructure, the software application, and sometimes the operating system. - And one thing to add maybe to that, an example of taking a kiosk that's already out there and making it more accessible would be the McDonald's project that we work on where those kiosks are already there. And part of the process of making them accessible was adding a keypad. So there was some hardware retrofitting that had to be done, but it didn't have to be totally redesigned or rebuilt, and then the work on the software side to make sure that it was usable. 
But in general, the kiosk was already there and it was just a matter of making those software and hardware modifications to it. - And Ryan and Matt, there is a question that came in on this topic which is, "Is there a resource to learn more about how McDonald's accessibility implementation was executed?" And we have some press releases and we have the demo video that I can attach. Does that sound like a good answer to that question? - I mean, I think the demo video is something we have to get permission to share, but the, in terms of... Are you asking how they went about implementing it? I'm sorry if I misunderstood the question. - Well, yeah, how it was executed, how the implementation was executed. - Well, I don't know that the press releases or the demo video explain that. There was work done both on a tagging perspective inside the application, because out of the box, the application needed work, and that happens with any application and this is no different than web. You had to turn around and you had to support multiple languages. You had to provide different instructions. Because one of the things to think about when we talk about kiosks, we cannot assume that the person using it has ever used a computer and has ever used a mobile device. So everything we do with a screen reader today, whether it's on a desktop application or a mobile application, assumes the person's using a keyboard, a full size keyboard or a touch screen. And let's use the example of a keyboard. If I wanted to move to a heading, I hit the letter H. But I know what a heading is because I use a computer all day. But in reality, the traditional customer who may walk up and use this kiosk, who could be blind, may not use a keyboard or a computer. So we can't set the expectations that the user's gonna be an expert at what...
The effort that went into the McDonald's one, which is a very complex user experience, I mean, there's a lot of stuff happening on the screen. You had to help them lay out the, not change the way the screen looks, but put tags on the screen so that the screen reader could move to those sections and then provide instructions on how to use it. Because they're not gonna be familiar with maybe the keypad they're using that's on the McDonald's device or any kiosk. And they may only use this once in their life, or they may use it 10 times in their life, but they're not using it every day like I do with my computer or I do with my phone. And so there's a lot of instructions that get baked into it. You had to support the multiple language tagging, which is fine from a concept perspective, but then when you think about pronunciation of words that may be on the screen... I can customize those. So there's a slew of activity that went into a McDonald's, which is way more complex than say a locker. A locker is a very simple kiosk experience. A kiosk to buy a train ticket is a very simple kiosk. Whereas a kiosk at the airport to check into your flight is a more complex kiosk because they're wanting you to type things in using an onscreen keyboard. So there's a wide range of tasks and not one of them matches up as simple. And I like to think of, I'm sorry, I'm rambling on this topic, but I like this topic. So a very complex experience could be a movie theater. And it wasn't complex 20 years ago because you went to the movie theater and you... But if you've been to a movie theater today, it's not open seating in most theaters, it's pick your seat. And if you're with two or three people, now you have to pick two or three seats that are next to each other in the viewing area that you wanna sit in. So that's a very complex user experience and takes a lot of thought process from a UX perspective of how you're gonna drive the screen reader user, and I'm using them as an example.
It's gonna take a lot to drive that user through that experience. - So Matt, so kiosk complexity varies for different use cases is what you're saying? - Yeah, mostly because of the application, right? Exactly. - Okay, so somebody asked, "What features are in a kiosk with JAWS installed that are not in other kiosks?" - I don't know if I understand that. Are those features of JAWS or features of the kiosk? - Well, I think maybe I can take a stab at that one. And Ryan, you might wanna jump in here as well because of some of the things that we see. So some of the features of JAWS that add features to your kiosk. So when you plug in the headphones and JAWS starts, there's a welcome message for instance. And that would be an introduction to the kiosk, maybe some description of what the layout of the kiosk is so you know where vital components are. And then some additional information about how to navigate or what the purpose of the kiosk is, right? So that is a feature that you get with JAWS because you can add that sort of introductory message for instance. Another one would be the ability to blank the screen when you're working, because- - That actually is built into JAWS, so I don't know that that one's unique, but yes. - Well, it's unique because it has JAWS. If it doesn't have JAWS, then the kiosk may not be able to blank the screen, correct? - I think any ATM is a good example of one that can totally blank the screen. I mean, that's an offer that people have, but. And I think ATMs are the perfect experience in that scenario of what you just discussed. - Yeah. Okay. So other features would be the ability to switch languages. So the speech output, you can change the language based on what's on the kiosk itself, correct? For JAWS? - Yeah, I mean, JAWS will detect language switching on its own, but the lang attribute in the application will do that for you.
But sometimes certain applications may not be able to pass a lang attribute, which is basically saying, "Hey, this is Spanish. Now you need to switch." Some applications that are not browser based may not be able to do that, so we have to do it for the application for the screen reader to switch. - Okay, there are some- - I think headphone insertion, if you didn't mention that, would be a top one, something that is not standard in JAWS today, but is supported in the kiosk version. - Well, and I think the question is more about kiosks. What are the features in a kiosk with JAWS installed that are not in other kiosks? So just in terms of, not the differences of JAWS versus JAWS Kiosk, but the difference between the kiosk without JAWS and a kiosk with JAWS. So one difference might be that JAWS modifies based on what... JAWS reads the content, so if you update the application on the kiosk, JAWS will actually read the new information that's on the kiosk. So if a kiosk is not using JAWS and is using some other voice that's maybe hard coded in, then the content may or may not change, right? So that'd be another feature that is available in JAWS that's not, or not feature, but something that you see in JAWS that you don't see in other kiosks. - JAWS is a full-blown screen reading software, so it's reading and responding to the actions and the text that's on the screen. Whereas some text-to-speech implementations on a kiosk could be made where everything has to be predefined, like an exact transcript of what the speech will say, or even sometimes prerecorded messages. So JAWS in the kiosk implementation is allowing for no matter what you do with the interface, no matter what text you put on the screen, as long as the accessible tagging is done, then JAWS is gonna be able to read that and you're not having to have a separate interface just for the words that you want the text-to-speech to repeat. - I do have some follow-up questions from folks.
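The language-switching behavior described here, where a lang attribute on the content tells the screen reader which voice to speak with, can be sketched as a small lookup. This is an illustrative sketch only, not the actual JAWS implementation; the `ScreenItem` shape and the voice names are invented, and the fallback-to-default behavior stands in for what happens when an application cannot pass a lang attribute.

```typescript
// Hypothetical sketch of lang-attribute-driven voice switching.
// The ScreenItem shape and voice names are illustrative, not JAWS APIs.

interface ScreenItem {
  text: string;
  lang?: string; // BCP 47 tag from the application's markup, e.g. "es" or "es-MX"
}

const VOICES: Record<string, string> = {
  en: "English voice",
  es: "Spanish voice",
  fr: "French voice",
};

// Pick the synthesizer voice for one item; fall back to the kiosk default
// when the application did not tag the content, or tagged it with a
// language the kiosk has no voice for.
function pickVoice(item: ScreenItem, defaultLang = "en"): string {
  const tag = (item.lang ?? defaultLang).toLowerCase().split("-")[0];
  return VOICES[tag] ?? VOICES[defaultLang];
}
```

For non-browser applications that cannot supply a tag at all, the `item.lang` field would simply be absent, which is the case Matt describes where the screen-reader side has to handle the switch itself.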
One is, "When you say tagging, do you mean customization on the JAWS side or on the kiosk software itself?" And that comes from some description that Matt had given earlier about how we tagged the app. The actual question is, "Is there any customization required on the kiosk software or compatibility with screen reader that can be done using JAWS script?" - So we can, and Ryan may jump into this, we could do either. There's some benefits in doing the tagging in the application using ARIA labels or some other form because of some language techniques. But then some of the things that's unique to JAWS that it can do is it can say, "Hey, I saw you touch the screen, so I'm going to give you different instructions." That's in a script, and then the messages that come with that, which would be the instructions on how to use it, are now going to tell you to use swipe gestures rather than keyboard presses. And then as soon as you move your finger down off the screen and touch the keypad and say press the right arrow, the instructions then switch to the keypad instructions. And this gives the user the ability to switch between the two different modes of input. And then the JAWS scripting is handling detecting what the press was, and then it's grabbing the instructions from the ARIA labels or other techniques to provide those instructions. It just depends on, again, one, the complexity of the application, and two, the ability of the organization to make modifications to the application. In some instances, they want us to do everything through scripts. In others, they would like to do everything inside the applications so that they have some control of their destination. - And usually even at a very fundamental level, there's still things that you almost have to do at the application level. So an example would be keyboard access.
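The input-mode switching Matt describes, touch gestures versus keypad presses, each with their own instructions, boils down to tracking the last input source and announcing the matching help text only when the mode changes. A hypothetical sketch: the `InputModeAnnouncer` class and the message strings are invented, and a real implementation would pull the instruction text from ARIA labels or similar tagging rather than hard-coding it.

```typescript
// Sketch: swap instruction sets based on the last input modality.
// Class name and message text are illustrative, not what JAWS Kiosk speaks.

type InputMode = "touch" | "keypad";

const INSTRUCTIONS: Record<InputMode, string> = {
  touch: "Swipe right to move to the next item. Double-tap to select.",
  keypad: "Press the right arrow to move to the next item. Press Enter to select.",
};

class InputModeAnnouncer {
  private mode: InputMode | null = null;

  // Returns the instruction string to speak when the modality changes,
  // or null when the user stays in the same mode (no need to repeat).
  onInput(source: InputMode): string | null {
    if (source === this.mode) return null;
    this.mode = source;
    return INSTRUCTIONS[source];
  }
}
```

The "announce only on change" choice mirrors the goal stated elsewhere in the session: get the user through the workflow fast, without re-reading instructions on every key press.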
So if you're using a keypad, that's basically, you're at the application software level, that keypad is sending in keystrokes to the software, almost like if you had a USB keyboard plugged in, so Tab or Shift Tab, or the Enter key map to different things on that keypad. And so the application needs to be aware of those key presses, and widgets on the screen have to be able to respond to the Enter key, for example, when you're using a keypad like that. Whereas if your application's just supporting touch screen right now, it may not have some of those things built into it to respond to key presses. So those are some basic things, and that's really not much different than regular web accessibility, but it's often overlooked in kiosks because most of the time you're just developing for a touch screen and you're not thinking about keyboard access. But then we have to address that when we get into accessibility, even though maybe we're only using a very small keypad and not a full keyboard. - And I think there are some other things that they have to do in the application. And I think of modals as an example, Ryan, we still need the modals to lock the user into the modal, into the dialog. We can't have them go behind and read the text behind the dialog. And that's a requirement. So there are some basic things in terms of accessibility, and I think of keyboard access as number one; two, the modals; three, proper labels on things if they don't have them, though obviously we can do it through scripting. - And having real text, ensuring that we're not having images of text. JAWS is not gonna be able to read what's on that image. You've gotta have real text there for JAWS to read. - If the image is relevant, right? Combining items into specific, combining some...
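Ryan's point about keypad-driven focus and Matt's point about modals can be combined in one sketch: Tab and Shift+Tab from the keypad move focus through the widgets, and while a dialog is open, focus wraps within the dialog's own controls instead of escaping to the screen behind it. The `Widget` model and key names here are invented for illustration; a web-based kiosk would do this with real DOM focus management.

```typescript
// Sketch: keypad keystrokes drive focus, and an open modal traps it.
// The widget list and key names are illustrative.

interface Widget {
  id: string;
  inModal: boolean; // true if the widget belongs to the open dialog
}

function nextFocus(
  widgets: Widget[],
  currentIndex: number,
  key: "Tab" | "ShiftTab",
  modalOpen: boolean,
): number {
  // When a modal is open, only its widgets are reachable (focus trap).
  const reachable = widgets
    .map((w, i) => ({ w, i }))
    .filter(({ w }) => !modalOpen || w.inModal);
  const pos = reachable.findIndex(({ i }) => i === currentIndex);
  const step = key === "Tab" ? 1 : -1;
  // Wrap around within the reachable set rather than escaping it.
  const next = (pos + step + reachable.length) % reachable.length;
  return reachable[next].i;
}
```

The trap is the "lock the user into the dialog" requirement Matt names: tabbing past the last dialog control returns to the first one, never to the content behind.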
There may be three or four elements that all are part of one, and they may have put the tab structure into all four elements and I don't need it because, okay, it's the name of the item, it's the price of the item, and maybe it's the calories of the item. I don't need that to be three different tab stops. I need that to be one. And so there's some of that that we need to do, because our goal is to get the user in and out as fast as possible, or whatever the standards of the organization are. We don't want somebody taking 10 minutes to order a cheeseburger. We want them to get out in whatever the standard is for that organization. - Great. We do have another question. "As someone who," and this is from Palo in Brazil, "As someone who only learned and developed for web content, mainly sites and online systems, what should I learn to create an accessible kiosk? Are there guidelines or documentations specific for that purpose?" - Well, I think, Laura, you could probably speak to guidelines, but I mean, a lot of it depends on what's the purpose of the kiosk. There's definitely guidelines on the hardware and on the software side. I think if you start with the WCAG standards, you're halfway, 90% there probably. It's not that there's not some other things we need to focus on, but it's similar things. Is there a tab order? Does the tab order make sense? Can I reach the elements that I need to interact with? Can I interact with those elements? Proper labels on items. These are not strange things when it comes to accessibility. If you're not from the accessibility industry, then you may need to be adding some of that skill to your knowledge base. - And many of the standards, I'm just gonna add, many of the software-related standards that we've seen for other countries, for example, Canada, some of their standards, a lot of times they all point back to the WCAG guidelines, which are those fundamental accessibility guidelines.
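Collapsing the item name, price, and calories into one tab stop, as Matt describes, can be sketched as computing a single combined accessible name for the group. The `MenuItem` shape and the comma formatting are invented; in a web-based kiosk this would typically be done with grouping markup and one focusable container whose accessible name covers all three pieces.

```typescript
// Sketch: collapse several related elements into one tab stop by
// giving the group a single combined accessible name.
// The MenuItem shape is invented for illustration.

interface MenuItem {
  name: string;
  price: string;
  calories?: string; // optional: not every item lists calories
}

// One announcement per stop, e.g. "Cheeseburger, $1.29, 300 Cal",
// instead of three separate tab stops for name, price, and calories.
function accessibleName(item: MenuItem): string {
  return [item.name, item.price, item.calories]
    .filter((part): part is string => Boolean(part))
    .join(", ");
}
```

One key press then skips the whole item, which is exactly the speed benefit Matt describes later when comparing key presses to listening through a full menu.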
So that's really a core place to start from a software perspective. - And I would just add, in absence of specific guidelines for kiosk hardware and the whole entire solution, the kiosk solution, which for instance, the ADA does not really have defined solutions for. Some of it is defined, but some of it is a little left up to interpretation. But the reality is that when it comes down to it, it's how accessible, how usable is it? So if you create a usable experience and one that can be used by people who are blind and who have low vision, then it will likely hit most of the highlights of any specific standards that are out there. There are other standards, like Canada has some really specific standards, but at the end of the day, you get pretty close just by building one that is usable and keeping in mind testing and just the user experience for particularly blind and low vision users. Anything else to add there? - [Matt] Nope. - Okay. I think we have another one. "What makes something a kiosk? We're talking about kiosks here. Is it a kiosk if I'm doing clock time, clock entry on a wall panel, or are we only talking about self-service devices at a restaurant like McDonald's?" - I think it's the broader one, which is I think of time, time systems on the wall. I think it could... Anything that you have to go up and use that's a closed system is a better term for it than say a kiosk. Some may try to extend that to TVs and other things like that. I'm not getting to that. I'm speaking about things in public spaces. And I've seen in towns where they have wayfinding kiosks, where you go up and you wanna look up what restaurants are near you, and they're in the middle of a town square. That's a self-service device.
You see them in shopping malls where it's the map of the shopping mall and you can walk up and type in the name of what store you're looking for, and it shows you on the map where you are and where it is, and it helps guide you to that, visually, let's just say that. Totally inaccessible today. I use the ones like ticketing kiosk, payment kiosk, those. We tend not to think of ATMs as kiosks, they're self-service, but they're, I think they're the predecessor to what we consider a kiosk today. And today you'll see in banks, you will see self-service as they wanna call them, tellers that are broader than an ATM, and so there's a range of that type of technology as well. But yeah, I wouldn't think of traditional digital signs that are just changing content on the sign. Whether that needs to be solved someday is another story. But there's interactives in museums that are kiosks today, officially they're kiosks. Not all interactives in a museum are that way. Does that answer that, Laura or Ryan? Do you have other thoughts? - I think usually there has to be in a kiosk term, there has to be some transaction that you're performing, whether it's purchasing, whether it's directional, whatever it is, there's a transactional nature to a kiosk where you're interacting with it. And as Matt said, closed system would be another term. One thing I always think about is a kiosk is gonna be a device where the user is limited to whatever experience and technology that kiosk provides. So in other words, we don't have access to our traditional assistive technology like we might have if we're on a smartphone app that we could put on different platforms or a website that we could use from a laptop or a Mac or a PC or whatever it is. So we're limited to that particular device and that platform and whatever input that it has. It's other things that sort of play into what's a kiosk. Because it is a gray, somewhat of a gray area, but there's these general themes that lead us to that conclusion. 
- And there's, I would say to add to what Ryan just said, there's some that are more like computers and I'll use internet cafes as an example. There's cruise ships with internet cafes on them. Could a blind person walk up and use those without installing something else on it? And they can't. So the benefit of having the screen reader added to that, which may be more like the original screen reader that's more locked down, meaning that there's a lot of functionality you don't have. So some of the other functionality that may exist in JAWS Kiosk that's not in the standard JAWS is that every time you plug in your headphones, you get the default settings, you don't get your settings. So if I walked up and used it and I changed the voice and I changed the speed at an internet cafe, as soon as I leave, it should reset back to the factory defaults. And that's something that's done in those internet cafes that are on some cruise ships, they're in libraries today. So those are locked systems because they basically allow you only to access the browser. - We did get a question. Are we good? I'll go with the next one. "Is there a minimum set of hardware components that should be on a kiosk? Nav pad, refreshable braille device, et cetera. And is that changing or likely to change? And if so, what would be the expected lead time to incorporate those devices?" - So I'll start with it and Ryan may add to it. I mean, any kiosk may have its, I'm gonna stick to the accessibility items rather than non-accessibility, meaning I'm not gonna talk about printers and payment terminals and things like that. But just to the accessibility perspective, they need to be a modern kiosk. And so what do I mean by that? You need a touch screen that does support 10-point touch. 
You need some form of input beyond just the screen, because not everybody, and not just blind folks, but other people with disabilities or other people who are not comfortable with touchscreens, need access to another way to move through the workflow of the kiosk. You need a headphone jack, both from a privacy perspective, as well as comfort for the user. And yes, that's the old fashioned 3.5 millimeter jack, the traditional headphone jack, not the Lightning jack you'd have on the iPhone or something like that. Those are kind of the main hardware pieces. Now somebody's gonna say, why not braille? Well, there's a couple challenges with braille. One is the cost of it, two is the ability for it to stay clean, and three is the weatherproofing of it. Not everybody reads braille, so there has to be, and I'm talking about the refreshable braille, not braille labels or directions on a kiosk, but the refreshable braille. And so I apologize for not defining that ahead of time 'cause a lot of people who may be on here may be thinking of braille that you may see on an ATM or braille that you may see on an elevator. That is expected regardless, to provide basic instructions to somebody. And when I say basic, I don't want you to go write paragraphs and paragraphs on how to use this thing. That's the intent of plugging in the headphones and providing the auditory feedback. When it comes to refreshable braille though, there's a few cases where it's been useful, and I think libraries and internet cafes are one. Another could be somewhere where the content is more complex and having the ability to read more may be useful. And I think of government buildings where there may be some courtrooms and they may need the refreshable braille because they need to say, "Okay, which courtroom am I assigned to?" And you have to type in your name and your social security number or your phone number and it tells you which courtroom to go to.
That gets, now you get this long list and what's easier to look at when you may not know all the content that you're looking for? And that's an example where braille may be useful. But again, it depends on the percentage of people who read braille that are the audience of that specific kiosk. So there's some decisions that people need to go through with that. Ryan, did you have other thoughts on this? - Yeah, I think that generally covers it. And one of the things that's common across all the standards that we've interacted with is, as Matt was mentioning about braille labels is you do have to label other peripheral devices with braille. So if you have a barcode scanner, there needs to be a braille label so that the user can understand this is the barcode scanner or this is the ticket reader or this is the cash dispenser or this is the printer. So all of those other devices that you would have to interact with or peripherals that you would interact with on the kiosk, they are required to have a braille label, a physical label there so that the blind user will know what they are. - Great. We have a couple more questions. "Aside from the screen reader, are there other assistive technologies we need to consider on the kiosk? For example, magnification, high contrast, and voice recognition." - You wanna start with it, Ryan? Or you want me to take? - Well, I think, so voice recognition is an interesting one that there's been a lot of discussion about and all of these things are really good things to happen. Are they all required? Maybe, maybe not. So a lot of times it really depends on what is the purpose of the kiosk and what are the environmental challenges? So I think voice control, for example, certain types of venues aren't going to be very conducive for that. If you're at an airport or let's say a restaurant or some other, a ticketing at a train station where you're gonna have very loud ambient noise around you. 
Voice control may be somewhat of a challenge because then you're gonna have ambient sounds interfering with that microphone. And obviously privacy concerns can come into play with voice control, especially if you were verbalizing any sort of personally identifiable information or anything like that. So there could be a time and a place for that, but there's not gonna be one answer that fits all scenarios. It's gonna be very much customized to what is the real goal of the kiosk and what is the environment that it's in? And Matt, I think you can speak to some of the low vision aspects as you use that quite a lot yourself. - Are you talking the magnification or were you talking the- - Yeah, I mean, so magnification, what are your thoughts on, should there be magnification on a kiosk? What concerns for low vision users would we wanna think about? - Yeah, so I think that, again, it depends on the use case. When I'm buying a train ticket and I have some speech output, it works for me. Some complex screens are very difficult to use with magnification. And it also depends on the height of the screen. And sometimes magnification may be really hard to use if it's a really tall screen. As somebody who's low vision myself, I find magnification doesn't work as well in those experiences. That doesn't mean that it's not necessary to include, and magnification could be added to any requirement. Traditionally, we think about accessibility from the highest risk of a user not being able to interact with it, as the speech output. I think your comments around speech input are right on. The part of the challenge that I would add, if this wasn't covered, is that depending on the complexity of the kiosk, speech input may be really difficult. And let's use the example of a menu ordering system. If I walked up and just said, "I wanna order," or something like that, what is it gonna do? It's gonna say, "What do you want?" "Well, I want you to read the whole menu."
Well, there's 46 sandwiches to choose from. Isn't it nice that in the case of a menu system, allowing me to filter down to what may be more relevant to me, maybe I only care about chicken or beef or pork or fish. And so having the ability to filter that down is very useful. Could you do that through speech input? Possibly. And natural language processing is really good today, but we also have to consider the speed of how long that transaction takes. And so if you get frustrated or it doesn't accept it because of you not knowing exactly what you wanna answer or what you want, it may take a long time. And so what I like the most about having speech output when I think about this is I like the fact that I can hear one piece of the information, the name of the product, and I'm using the menu again. And so if I heard Big Mac and I don't want Big Mac, I don't need to listen to the amount it costs. I don't need to listen to the calories. I press the key to move forward again. And then it says Little Mac. Then I press again, it says Quarter Pounder with Cheese, and that's the one I'm interested in. And it says the dollar amount and the calories and tells me how to interact with it. That's faster for me because I don't have to listen to the entire thing. And I think for anybody who's used smart home technology, and there's probably a few people out there listening who have, we love them for what they do, but when it comes to speeding them up or interacting with them as you're using it, if you're listening to something, no, skip, no, skip, no, skip. Do I have to keep saying that? I mean, I'm pretending if that's the method. Whereas with a key press, I can just do that and interrupt the speech and move forward. And that's some of the benefits that a key press or a swipe gesture can provide rather than a speech input. - One of the things- - Go on, Ryan. - Sorry Laura, I'll just add one other point.
So sometimes the voice input is there to help address mobility impairments. And one way to ensure that that is addressed also is ensuring that your touch screens respond to prosthetic limbs, maybe via pressure-sensitive touch rather than capacitive touch alone. The prosthetic is a big one because you can't just depend on it being someone's finger with the electrical properties of a finger. If someone has a prosthetic hand or arm or something, or they're using a pointing device like that, you still want to make sure those screens will respond, to serve the need for those situations where a user will be using a prosthetic or some other device to interact with the touchscreen. - Good point. So someone did ask some questions about our testing process. "Do we test design prototypes of kiosk software with people who are blind or have low vision? And can we explain that user testing process, or just give some overall information about our testing process in general?" And I was gonna take that one, 'cause I've had a little bit of time to think about it. I wanted to first talk about the fact that a lot of the testing that we do is on solutions, like full solution user testing, and user testing with customers who will actually be using the systems in situ, like in their environment. Because you find out a lot when you do user testing of a deployed kiosk, maybe a pilot or something like that, before you've rolled out the entire thing. So that would be the kind of testing that I think yields a lot of really great information. And it's great when you do it actually in the store because you find out things about noise and whether or not a device is going to work in that actual environment, right? Is it too loud? Is there ambient noise? Is there a subway going by? That kind of thing. And so some of the user testing that we do is actual solutions testing.
Others are, I mean, we do review applications, we do design reviews, we do actual user testing of the software itself to provide feedback throughout the development of a kiosk application. And we do the same with hardware. We'll do testing, not usability testing, but we'll do accessibility testing of the hardware against the standards. But we still find that the best information comes from reviewing the hardware and the software in use together to see what actually provides a usable, accessible solution. And now I know Ryan and Matt are going to have answers to this that exceed mine, but go ahead. - Ryan? - Yeah. I think you generally covered it. I mean, one of the things that we often like to do with the software testing is use some of our own employees across the Vispero brands who are blind or low vision to do some of that testing with us. And you can do that in a way to say, let's say it's a restaurant ordering system. Your goal is to buy a chicken sandwich, and then you're gonna let them do that and you're gonna watch them and see how they go about that task with the screen reader. The tutor messages or the help messages that the screen reader is speaking, are they enough? Are they giving them what they need to get started? Are they giving them enough as they go through the ordering process, or are they giving them too much? And then you sort of go back and evaluate with them: "Okay, what was your experience? What worked? What didn't work? What are your recommendations?" So that user feedback is really important right now, especially where kiosk accessibility is still fairly young in its lifespan. With websites and smartphone apps, we have all these years of history and years of users experiencing these things, and data to show what people do and don't do. With kiosks, it's very new. So we find that this user testing, as much as we can do it, is really helpful right now in helping shape what these interactions are like.
- And Matt, you had mentioned really early on some things about COVID, or maybe it was Ryan, actually. Things about COVID and how we've sort of been driven more towards kiosks. And I would say kiosks have been around for a long time, but the extensive use of them in every location and the fact that it's a touchscreen instead of a QWERTY keyboard are the two big shifts that have happened in more recent years that really have affected accessibility. Because a QWERTY keyboard sort of automatically is more accessible than a touch screen just by default of having something tactile to interact with. It's not necessarily accessible, but it's going to be more accessible than a touch screen. And even more technology now, one of those user tests that we have done, Ryan, I know you probably will remember me saying this. We had a kiosk that was completely flat, it was like a tabletop kiosk. It had a really flat screen, and the screen itself was only maybe 10 inches of touch screen real estate. The rest of the flat surface was just a design feature that you could put stuff on and hang out at. But from a tactile perspective, there was no indication where the actual touch screen was versus the larger surface that was there just for aesthetics. And so I think that as kiosks have changed from a user perspective, from a technology perspective, it's changed the accessibility. So as we see more touch screens, as we see more technology being fancy and slick and minimalistic, they become less accessible just by default, unless you pay attention to the accessibility. - I wanna make a comment on the QWERTY keyboard, 'cause I think there's an expectation that they're needed. And a lot of it, again, depends on what you're trying to accomplish on the kiosk.
Additionally, the most important thing about a QWERTY keyboard is it's gotta feel right, and all the keys gotta be where a user would expect them to be on a QWERTY keyboard. If for some reason you buy a keyboard and the backslash key is where the Control key is, and the user's trying to press Control-something to move the screen reader and they can't, you've made this really difficult for them, because the one thing a blind person can't do is hunt and peck on a keyboard. So they need to know where the keys are. Now somebody's gonna say, "Fine, I'll put braille on every key." Please don't do that. One, not everybody reads braille, and two, I can't sit back and read every braille character to figure out where it is. So if you're requiring some level of input that's beyond a selection process, meaning you have to input your name or your phone number or your address, you're probably better off using a QWERTY keyboard or some other method for inputting stuff. A lot of people are gonna want to use an onscreen keyboard. And that may be the only choice that you have based on the form factor you have. But we have to keep in mind that entering something on a nonstandard QWERTY keyboard, or on an onscreen keyboard at a screen size that they're not familiar with, is going to take time. And what do I mean by that? Both Ryan and I use a touchscreen every day on our iPhones. That's a screen size that we're familiar with. We know exactly where the main characters are. When we switch to symbols, it takes a little bit longer because we may not use them all the time. Or when we're using numbers, they're probably fine. But the minute I take that and move it from a, I don't know, three-inch-across screen to an eight-inch-across screen, or whatever that may be on that specific self-service device, it's gonna take me longer to use that keyboard.
It's a fact, and if anybody tells you otherwise, they're wrong. It's gonna take time. And so familiarity with the device, or making it as simple as possible, is key. You want somebody to perform the task and get out, okay? They need to be able to do what they need to do and move on to the next task. And if you make it difficult for them, it will drag out the process for the user. And they'll probably, you know what, walk away. - Any other questions that we have from the chat or the Q&A? - I encourage everybody to carry around their headphones and plug them into any device where you see a headphone jack, and try to use it. You're gonna find stuff that does work. You're gonna find stuff that's halfway there. You're gonna find stuff that used some level of technology but, over time, with changes to the kiosk, is no longer accessible or usable by a blind person. And I was at a place where I just wanted to make a key. Like anybody else, you used to take a key to the hardware store and somebody would go behind the counter and make the key for you. And now it's a self-service device. And I think, Laura, you were with me when I did this. And it was just excitement. I saw a headphone jack, I got excited. There was a keypad. In my opinion, the keypad was too low to the ground, but whatever, there was a keypad, I could sit on the floor and use it. And I plugged in the headphones. I went through the entire process and I got close to the end, and then that last screen wasn't made accessible. So I couldn't complete the task. Around the same time I think we were at the US Postal Service. And I can call it the Postal Service 'cause why not, right? And I just wanted to buy a stamp, something simple. I think it was a stamp. And I got to an item on the kiosk and it said, "This was not configured for audio." That was the message attached to that item when I got to that section of the screen. I'm like, "How can something not be configured for audio?"
I got that far, but because of the way they implemented their accessibility and their audio navigation, they basically had to have every single element configured for audio rather than a screen reader that properly reads the content. So in the case of McDonald's, McDonald's can make changes to their kiosk. If they make a really radical change, they probably need to loop us in. If they're following what they've done across the board and adding new screens but keeping the standards consistent, they may want us to check it, but they're ready to go. So keep that in mind when you're implementing text-to-speech, whether you're implementing JAWS or something else: think about the level of effort it's gonna take to make changes. Having a screen reader that can just adapt and work within your application is useful, versus, say, one where you say, "Oh, I'm gonna program the sound." And I think of voting machines. I'm amazed, the control that they use on voting machines is kind of, I don't know, Ryan, you tell me if you get the same one, it's got like a dial and then it's got a triangle pad. - Triangle, yeah. - Right, right? - In fact, I voted yesterday and used one of these, actually. - Oh, did you? - Yes, yes. - And so the weirdest thing about it is somebody records, and I don't know if this happened to you, but it wasn't a screen reader. - Yeah, it's a verbal recording. - Somebody recorded it. It's an audio recording of each element on the screen, right? - [Ryan] Exactly. Yep. - Yeah, so, and I'm not talking a text-to-speech recording, I'm talking somebody's voice was recorded for each element on the screen. - Yeah, somebody verbally read those names and those items and instructions into that machine. - It makes me laugh every time I vote that that's what I'm getting.
- And the challenge there is, okay, now you've got this person's accent to listen to, or the way they pronounce things may be different from what you're used to when you hear text-to-speech, and then there's language things. So there's all these factors involved with that, but yeah. - Yeah, and I don't even... Mine defaulted to English. I don't remember getting a choice to switch to Spanish, but if they had, would they have read all of it, or would I have just gotten the English recording? And not that the name wouldn't have changed, but maybe the title would've changed to a different pronunciation based on language. So I don't know, I'm fascinated still today that one, we have inaccessible websites and mobile apps, but two, that with this self-service stuff, which has been around for a while if you think of ATMs and airlines and others who have joined into that self-service model, we're still finding inaccessible machines today. I was on the road last weekend and we were driving to New York and we stopped at the Delaware House. And there was a row of kiosks that were totally inaccessible. Not one of them was usable. - Hey Matt, we do have a question if you're... Sorry. - Yep. No, let's do it. Let's hit the questions. - Okay, this is a good one. "Any advice or thoughts around connecting personal mobile devices to kiosks to complete tasks?" And I wanna give you a second to think about that, and just say one of the things we've seen that has nothing to do with accessibility, and actually probably is not accessible, is those QR codes that allow you to scan and then pull up, basically, a way to navigate the kiosk using your phone's touch screen, but it mirrors the kiosk's touch screen experience. That is one way people connect a personal mobile device to the kiosk, but it just mirrors the touch screen. So I would say that's not accessible. - It's not accessible today. I think there's several things.
And the problem comes back to, one, let's think about the user and whether or not each user has access to a mobile phone. And we can't make the assumption that they do. We like to think so, because, I don't know, there's one here on my desk as I hold one up. I assume Ryan has one on his desk. Laura has one on her desk, okay? Or behind me, I probably have three more tablets. But we can't assume that the user has one. Two, they have to be able to find the QR code. You could put a QR code on the screen or on the side and label it in braille, and hopefully somebody who can read braille can find that QR code. Next, if the application can be totally controlled by the phone, and it's not just a simulation, not simulating clicks on the screen of the kiosk, you could probably... I mean, this is similar to going to restaurants today. Just yesterday I was at lunch and there's a QR code in the middle of the table. You scan the QR code, you get the menu, and you could place your order right there in the app. That process can work, assuming that that app is fully accessible. But in terms of giving the control over to the phone, I think having it do that is still early stages. For one, not all kiosks support Bluetooth connectivity, or some type of wifi connectivity. It doesn't mean it can't be done. I've seen prototypes of stuff like this. But then the question is, if you already have to plug in your headphones to get audio, are you gonna pass the audio through that mobile app as well, so that the user can use the app on their phone and hear the audio being transmitted to their phone? And so there's data issues, there's privacy issues. There's a few issues, such as security, that come into play. And so I think we're still in the early stages of that game.
- Yeah, and I think that those technologies are great technologies that add to a kiosk experience, but using them by themselves, sort of in lieu of other known accessible technologies, is not necessarily the way to go. - I think people thought of this to make it so you didn't have to touch the screen during COVID. I mean, that was the original- - It was a sanitation thing. - It was sanitation. - Yeah, it was, yeah. - And in reality, most of these screens could be wiped down pretty quick. The keypads could be wiped down. So I don't know that it's as big of a deal as it was intended to solve. Similar to when we first came out with temperature-checking kiosks during COVID. It was like a boom, and now they could be sitting on the shelf. - One good use I've seen with QR codes is maybe the login flow. So if you have a kiosk with a rewards system, for example, you could have a flow where a QR code comes up in the app on your phone. You use that to sign in to the kiosk for the rewards. And that keeps you from having to use the onscreen keyboard to enter a username and password, for example. - And Ryan, I think the other is lockers, right? With the locker kiosk, traditionally you're gonna get a message: hey, go to the locker, scan the QR code, and it'll tell you which locker to go get your stuff out of. Right, that's a perfect use case for both the QR code and using your mobile phone. But once again, don't forget, not everybody carries a mobile phone. Now, if the locker's there- - It can't be the only way to do something. - If the locker's there so you could pick up your phone, then you're probably a mobile phone user. - Although your phone might be broken, and that might be why you're picking it up. - Yeah, it might be why you're getting a new one. Exactly. - Well, I think we are out of time, and hopefully we've answered as many questions as were out there and that this was educational.
If folks have questions that have not been answered or that they think of later, you are welcome to email us. I am at lmiller@thesparrow.com and I'll type that into the box here. Oh, that's not public. Try that again. And then I can forward your question on to the appropriate person or answer it. If you have any interest in any of the things that we've talked about today, then feel free to send me an email. And definitely let us know if there are particular topics that you're interested in hearing more about with regard to kiosks and kiosk accessibility. Thank you, Ryan. Thank you, Matt. - Thanks, Laura. Thanks, Ryan. Good seeing you guys. Take care. - Thank you, everyone. - Thank you, thanks all. - Happy Global Accessibility Awareness Day. - Yes. - Absolutely, thanks.