- [Mike] Hi everyone, my name is Mike Mooney, I'm the Digital Marketing Manager at TPGi, and I'd like to thank everyone for joining us for an exciting webinar today on the mobile accessibility testing toolbox with Senior Accessibility Engineer Laurie Pagano. I have a few housekeeping items. First off, this session is being recorded and we will email everyone the recording after the event. We have live captions available, so feel free to use those as needed. And lastly, if we have time, we will have a live Q and A at the end, so please use the live Q and A box, and if there's time, Laurie will answer those questions later. Without further ado, Laurie, I will let you introduce yourself and we can get started, thanks.

- [Laurie] Thank you. Thanks everyone for coming, welcome. My name is Laurie Pagano, I'm a Senior Accessibility Engineer here at TPGi. I focus a lot on mobile testing in my day-to-day, and this is the mobile accessibility testing toolbox. Today I'm gonna talk about some of the helpful tools and tips you'll want to become familiar with when you're starting to think about mobile accessibility testing. After a bit of introduction, I'll start with the device settings and assistive technologies, the things that come more or less built right into Apple and Google devices, and you'll see a little bit later what I mean when I say "more or less." As well as some additional testing tools, or the bonus tools, that may not be as obvious or that will just make it easier to test, but either way will really begin to round out your toolbox. Note that for the sake of efficiency, if at any point I just say tools, it's safe to assume I'm referring to everything I just mentioned, device settings, assistive technologies, and the additional testing tools, as a unit. Tools will just be shorthand for all of those things. This is not an exhaustive list of all the settings or assistive technologies that people with disabilities might use with a mobile device. This is just the beginning. What I'm highlighting today are some of the more useful introductory tools for mobile accessibility testing, but there are many, many different options for configuring a mobile device to work best for folks with disabilities or for any number of needs or preferences. So as you become more familiar and more comfortable with what's in today's talk, I really do encourage you to dig deeper into some of the tools that I didn't get a chance to shout out, and make room in your toolbox for them as well. I promise you the size of the shout-out has very little to do with the prevalence of use or the importance of any one tool over another, but just with the amount of time that we have available to us today. On top of there being more tools than we can comfortably cover in a single talk, there are also considerations like normal, regular software updates, which may move things around, adjust the way they work, or even add and remove features and settings entirely. And the big software version updates tend to happen just before any time I give this talk. So I tried to set aside a few places to highlight some of these changes, but just know that what's available at the time of this talk might be a little bit different depending on your device, what versions of the operating system or applications you have, any number of things. And speaking of operating systems, today I'll be talking about Apple's iOS and Google's Android operating systems.
And when I talk about Android devices for testing, I specifically mean a Pixel, since the Android operating system is fragmented and has its differences in the user interface, the features, and the functionality across all the different devices offered by different manufacturers. If you can only choose one device to test with, I do recommend going with a Pixel to get the pure Android experience.

A note on compatibility testing. For testing native mobile content, some of the considerations for accessibility are going to be pretty much the same as anywhere else. An image without a text alternative on the desktop or in any web-based content is still going to be an image without a text alternative in a mobile app, right? But of course, there are also plenty of differences. One obvious difference is just finding where all the different tools can be turned on and off, and how to use them. Enter this webinar. One difference that's maybe less obvious when you're thinking about starting mobile accessibility testing, but becomes very obvious once you've started, is that on a native mobile app, there is no inspect element to go digging around in the code. So unless it's your product that you've personally made, or you've been provided with the code, you're unlikely to have access to the code for the application that you're testing. So if you aren't doing that, or maybe you aren't planning to make specific recommendations based on the code, or you just aren't necessarily aligning your testing with any version of the Web Content Accessibility Guidelines, then you may be planning for compatibility testing. And with compatibility testing, you'll be using combinations of settings and assistive technologies that are commonly used by people with disabilities to access the content, and determining: is this product compatible with these combinations?

And we're off. First things first, where are the accessibility settings? For iOS, the accessibility settings are right in the top level of the device Settings app. Up through iOS 12, they were buried a little bit in the General screen of the Settings app, but then they got promoted. iOS 15 released to the public just last week, and while all of my screenshots are showing iOS 14, in 15 they are still in the same top-level location. Also in 15, on top of changing settings at the operating system level, you have the opportunity to set some of these accessibility settings differently for individual apps, on an app-by-app basis. And like I said, I will try to touch on some other new iOS 15 features, particularly if they'll be helpful for testing, but I will generally be referring to iOS 14 today. I don't expect there to be too many differences as far as testing is concerned, but there are a few neat accessibility updates that won't affect testing but are still good to know about. In Android, the accessibility settings are also in the top level of the device Settings app. And again, this and any other Android screenshots you see today are as seen on a Pixel. In Android 12, which rumor says is releasing to the public early next week, you can expect to find a few of the display settings shuffled around and reorganized into new sub-menus. So it may look a little bit different from Android 11, but it should all still be there.
As part of the Android Accessibility Suite, you can enable an accessibility menu shortcut which, as of this week, will either stick a handy accessibility button right at the bottom of the screen or, depending on your system navigation settings, launch with a two-finger swipe gesture from the bottom of the screen. This will give you shortcut access to a lot of functionality that otherwise might rely on the physical hardware buttons, like taking screenshots, locking the screen, adjusting volume and brightness, and directly opening the accessibility settings screen, which will be helpful for you as a tester. This shortcut does get an update in 12, so you can expect a few changes to the handy button itself, but the rest of the functionality should still be there. The Android Accessibility Suite is a suite of applications that includes this Accessibility Menu, the screen reader TalkBack, which of course we will talk about, and two other apps, Select to Speak and Switch Access. This should come pre-installed on most Android devices, but in case it's not, it can be downloaded from the Google Play Store for free, and then it will integrate seamlessly with the other settings. And this extra step of downloading the suite is what I meant earlier by "more or less" built right in.

And right away, we'll be able to start the testing process just by noting: does the app I'm testing respond to the changes I've made in the device's accessibility settings? Does it respond to some but not others? Or does it not respond to anything? As I go forward and introduce these tools, just like I'm doing here, I'll be pointing out some of the common issues that you're likely to find when testing with that tool. And you'll find that the same issue will pop up a lot: does the app I'm testing respond to the options provided, or does it provide its own equivalents?

I really like grabbing low-hanging fruit first, so we'll start with the settings to resize text, which are pretty straightforward. In iOS, there are two locations where you can change the text size at the device level. Device level, meaning that changing this one setting in this one place will affect text across all default Apple apps like Mail, Calendar, and the Phone, as well as any other app that supports Dynamic Type. One place in Settings is the Display & Brightness screen, and then Text Size. This screen provides a range of seven text size options, with the default falling right in the middle of that range. But to open a wider range of text sizes, including larger sizes for accessibility, you can go to the accessibility settings screen, then Display & Text Size, and then Larger Text. And by toggling that Larger Accessibility Sizes control at the top of the screen, you'll introduce five additional larger text sizes. For Android, there are two options for resizing content, and as of today, both are found right on the accessibility settings screen, but I believe these will be moved into a sub-menu in Android 12. One is Font Size, which simply resizes the text. There aren't as many options here for font size as iOS has. But Android also has Display Size, which you might use to change the size of text as well as any containers that define the layout, any controls or icons, and you can use a combination of Font Size and Display Size to further adjust the sizing of the content.
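As a concrete illustration of the developer side of this, here is a minimal Swift sketch, assuming a UIKit app, of how a label opts into Dynamic Type so it responds to these text size settings. The view controller, label, and text here are purely illustrative:

```swift
import UIKit

final class ExampleViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let label = UILabel()
        // Ask for a system text style instead of a fixed point size,
        // so the size comes from the user's text size setting.
        label.font = UIFont.preferredFont(forTextStyle: .body)
        // Opt in to resizing on the fly if the user changes the
        // setting while the app is running.
        label.adjustsFontForContentSizeCategory = true
        // Let text wrap rather than truncate at larger sizes.
        label.numberOfLines = 0
        label.text = "This text follows the device text size setting."

        label.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(label)
        NSLayoutConstraint.activate([
            label.leadingAnchor.constraint(equalTo: view.layoutMarginsGuide.leadingAnchor),
            label.trailingAnchor.constraint(equalTo: view.layoutMarginsGuide.trailingAnchor),
            label.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
    }
}
```

By contrast, a hard-coded size like UIFont.systemFont(ofSize: 17) ignores the text size setting entirely, which is exactly the kind of issue this testing surfaces.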
Some common issues you might find when you test text resize through the device settings are portions of text that do not resize, text that is cut off with an ellipsis (dot dot dot), text that is cut off by the boundary of its container, or text that does not respond to text size settings at all. I'm gonna back up for a minute, because previously I mentioned Dynamic Type for iOS, and I do wanna mention that developers need to actively support Dynamic Type for the text of a native iOS app to successfully resize and accommodate the setting. By using text styles with Apple's system support, like in the sketch above, they'll get that Dynamic Type support built right in. But this isn't the case for Android: as long as a scalable unit has been used to define the font size, Android's resizing features should work across the operating system unless they've been actively suppressed.

Okay. Another low-hanging fruit you might wanna start with is the color and contrast settings. This one is maybe not as low-hanging as resizing text, because there are a lot of different options for adjusting color and contrast, and different combinations of them might work best for different folks. So again, even though I've highlighted only a few, I certainly wouldn't dismiss any of them. But I still count this as low-hanging because, even with all the options, and while there are some additional tools that will help you identify specific colors and color contrast ratios, which I promise I'll cover later in additional testing tools, you still aren't really learning a whole new set of skills if you're already familiar with testing color and contrast on the desktop or in web-based content. On iOS, most of the settings that affect color or contrast can be found in the accessibility settings screen and then Display & Text Size. Here are settings for Classic Invert and Smart Invert, which respectively will invert all displayed colors, or all displayed colors except for media like graphics and video and some user interface elements. Dark Mode uses darker backgrounds and colors to decrease brightness and reduce eyestrain. You might debate whether this is in and of itself an accessibility setting, but regardless, any changes made to the visual design in order to support Dark Mode could certainly introduce new issues that are probably contrast related. On iOS, Dark Mode is not found through the accessibility settings screen; instead, you can navigate from Settings directly to Display & Brightness to choose between Light and Dark modes. Android has its options for color and contrast too, and they are currently available right on the accessibility settings screen, although I'm certain these will be moved to a sub-menu in Android 12. Dark theme is available as a single switch to toggle on or off, and color inversion will invert all displayed colors. Starting in Android 10, you can use dark theme and color inversion together for a new combination of support. Some common issues you might find when you test color and contrast through the device settings are, of course, insufficient contrast for text elements, insufficient contrast for user interface elements and states, insufficient contrast for images and icons that convey information, or you might find content that does not respond to color and contrast settings at all.

Now that we have some of the more straightforward tools in the toolbox, I'll pivot just a bit and get into one of the more daunting of the mobile assistive technologies for new testers: the screen readers.
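A lot of what screen reader testing checks for boils down to a handful of properties that developers set on each element. As a point of reference for the issues coming up, here is a minimal, hypothetical Swift sketch of the UIKit accessibility properties VoiceOver reads; the view and the strings are illustrative only:

```swift
import UIKit

// A custom tappable view that VoiceOver can't interpret on its own.
final class FavoriteToggleView: UIView {
    func configureAccessibility(isFavorited: Bool) {
        // Expose this view to VoiceOver as a single element.
        isAccessibilityElement = true
        // Name: what VoiceOver announces first.
        accessibilityLabel = "Favorite"
        // Role: announced as "button" after the name.
        accessibilityTraits = .button
        // State/value: communicated after the name and role.
        accessibilityValue = isFavorited ? "On" : "Off"
        // Hint: optional usage guidance, read last (and only if hints are on).
        accessibilityHint = "Double tap to toggle."
    }
}
```

When any of these are missing or wrong, you hear it immediately in testing, which is what the common screen reader issues later in this section come back to.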
Even if you feel like a total pro with other screen readers like JAWS, you're probably used to using a certain set of keyboard commands to navigate around, and you probably have a certain expectation about how to interact with content when using a screen reader. Switching to a totally different method of operation, like swipe gestures, can take some time to get the hang of. Across Apple products, you'll find VoiceOver. And even if you're familiar with VoiceOver on macOS, VoiceOver on iOS is going to be a little bit different. Some of that does have to do with the different needs and abilities of a touch screen interface, like I mentioned, but also with release schedules and even just what features Apple has decided to support, and when and where they decide to support them. You can enable VoiceOver from the accessibility settings screen and then VoiceOver. There's a control to turn it on and off, and a whole bunch of settings to customize behavior, like how fast it speaks or how much it says. And starting in iOS 13, you can customize gestures and commands. There's one option, the Caption Panel, that I do want to highlight because I think it's really helpful for testing, and it can be found a bit toward the bottom of the VoiceOver screen. Turning the Caption Panel on will display VoiceOver speech output as text at the bottom of the screen, as a bit of a visual supplement to the audio output that VoiceOver already provides. The Caption Panel may be familiar if you've used the macOS version of VoiceOver, but for iOS it only became available in iOS 13, so you can find it in iOS 13 and above. And this does not happen by default, so if this is a feature that you're interested in, you do have to turn it on.

Android has TalkBack as its screen reader, which you can enable from the accessibility settings screen and then TalkBack. It also has settings to customize things like speaking rate, verbosity, and gestures. And again, TalkBack is part of the Android Accessibility Suite, which I mentioned back with the accessibility menu buttons. So it is technically a separate app, which lets it run on a different update schedule than the operating system as a whole, but it integrates with the rest of the settings, and even for the devices where it doesn't come pre-installed, it still feels built right in. If TalkBack doesn't automatically give you that visual caption option like the one I pointed out for VoiceOver, you can go from the TalkBack screen to Settings, then Advanced settings, then Developer settings, to toggle Display speech output.

To start, here is a quick and dirty, easy, limited list of standard gestures for basic navigation. These, I think, are the very bare minimum that will keep you focused on navigating without panicking that you've lost control of your device or getting caught up in memorizing the complete list of gestures for each screen reader. And because these are standard enough to be shared across iOS and Android, I didn't even split them up by operating system; they're going to be the same for either device. To explore the screen, drag one finger around the screen to announce whatever is under your finger. This is a non-linear way to get an idea of what is on the screen and where.
And because many apps have common layouts, like a back button at the top left or tab navigation along the bottom, this can be really helpful for jumping directly to a certain part of the screen where you can reasonably expect to find whatever it is that you're looking for. And if the element under your finger has been created with even a whisper of accessibility in mind, you'll be able to tell what has screen reader focus by hearing the announcement, seeing the announcement in those caption panels, or seeing the screen reader focus outline. This shows up as a black and white outline for VoiceOver, which helps it show up on most backgrounds, and a green outline for TalkBack. If you know where an item is, you can also just tap directly onto it to give it screen reader focus with a one-finger single tap. A one-finger swipe to the right or to the left moves the screen reader focus to the next or previous item of any type, sequentially in the reading order. And a one-finger double tap activates the button or control that currently has screen reader focus. You can double tap anywhere on the screen and it will activate the item; you don't have to double tap directly over the item, as long as it has screen reader focus.

And then on top of those very basics are gestures that are multi-finger or multidirectional and introduce another layer of options for navigation. In iOS, the Rotor gives you the opportunity to switch between different navigation options, or to conveniently adjust some settings, like hints or speaking rate, without going all the way back through the Settings app. Some of what you can access through the Rotor is just there by default, and it changes based on what makes sense with the content that's on the screen at the time. But you can also customize the list of possible options from the VoiceOver screen in the Settings app; there is an option there for Rotor. To access the Rotor, you'll do a sort of twisting motion with two fingers, like you're turning a dial directly on the screen, and you'll know you've got it because you'll either see it appear visibly on the screen or hear VoiceOver start to announce the options. At each step of the Rotor, you'll see or hear which option you're activating. And then once you've got the option you want, you can swipe down or up with one finger to move forward or backward through that option on the screen, or, if it's a setting, to adjust that setting. For example, if I used the Rotor to activate headings, like I have in my screenshot, I could then start swiping down, down, down with one finger to navigate forward heading by heading through just the headings, or up, up, up to navigate backward through them. Which is a way of navigating that might sound familiar if you have experience using quick navigation commands for other screen readers. Then I can switch back to swiping left and right to explore the content surrounding that heading. Or if I used the Rotor to activate something like speaking rate, I could swipe up or down to make VoiceOver speak faster or slower on the fly. And as a side note, if you started gearing up to use your Rotor for adjusting a lot of settings, which I don't expect to be necessary, but if you did, iOS 15 introduced a VoiceOver Quick Settings option that you can access with a two-finger quadruple tap, so you can get to the settings quickly that way rather than by filling up your Rotor.

Earlier this year, Google and Samsung released a co-developed TalkBack update that introduced a few significant changes.
This also offers a little bit more consistency among the different Android devices by making TalkBack the default screen reader on Pixel and now most Galaxy devices. But the way you interact with TalkBack may be different between these two versions, depending on your device or software. If you have an older version of TalkBack or Android, you have the context menus, plural. The global context menu provides global options for interacting with TalkBack or the device itself, and the local context menu provides some local options for interacting with the more immediate content. And like the VoiceOver Rotor, the local context menu options change based on the currently displayed content. To access the global context menu, you'll swipe down and then right, all in one move, drawing a right angle or a capital L shape. To access the local context menu, you swipe up and then right. Then you can select whatever you need from whichever menu you've pulled up. In the newest versions of TalkBack and Android, these two menus have been combined into a single TalkBack menu, which you can still access by swiping down and then right in that L shape. If you have a device that supports multi-finger gestures, you can also perform a three-finger tap to get that TalkBack menu. Then you can select what you need from the menu. And you can customize what appears in the menu as well, from the TalkBack screen, then Settings, and then Customize menus. If you have any question about your version, you'll be able to tell what your device supports when you swipe down and then right, because the menu that appears will be titled either global context menu or TalkBack menu.

At one point, getting to the navigation options was done by swiping up or down to cycle through them, and then switching to swiping left or right to navigate by that option: by heading, by word, by whatever you've picked. And then you'd swipe up or down again to get back to the default navigation and explore the surrounding content. More recently, this is called reading controls, and you'll swipe down and then up, or up and then down, to cycle through those options. If your device supports the multi-touch gestures, you can also cycle through them using a three-finger swipe, which can be three fingers swiped either right and left or up and down. Like the VoiceOver Rotor, there will be both a visual and an audio indication of which option you've activated. And once you've got the option you want, it's also a bit closer to the Rotor now, in that you can swipe down or up with one finger to move forward or backward through that navigation option on the screen, and then easily switch to swiping right or left to go back to the default reading order and go item by item. You can also customize what appears in the reading controls, including settings like verbosity or hiding the screen display; like the TalkBack menu, that's from the TalkBack screen, then Settings, then Customize menus. Some of the other features from that update, for the curious, are lots of other new multi-finger gestures, more customization features, the ability to use voice commands to control TalkBack, and some updates to the braille keyboard as well, which is fairly new itself.

A feature that you might like, whether you're just getting the hang of mobile screen readers or you've decided to learn more gestures, is the Practice Area, which is available on both operating systems.
On iOS, after you activate VoiceOver, a VoiceOver Practice button appears just beneath the VoiceOver toggle, and it opens a scratch pad. As you make gestures on the scratch pad, it will repeat the gesture that you made, and it will tell you what it does. On Android, there is a Tutorial and help section in the TalkBack settings that gives options for a guided hands-on tutorial or a practice gestures area, much like the VoiceOver Practice. So if you can make your way through the content you need with just the basic navigation gestures that I shared, maybe you don't need to learn and memorize every possible gesture right away. But as you get more comfortable with navigating and encounter different kinds of content, or more complicated content that requires some of these other gestures, you'll have a good foundation already to build on and some good resources to start learning more.

Some common issues you might find when you test with a screen reader are images with missing or incorrect text alternatives. Note that VoiceOver is getting smarter, so it will describe graphics and, in iOS 15, even photographs based on whatever context it can pick up. It isn't perfect, so I do recommend you pay close attention to text alternatives when testing, since they may or may not have been provided by a human being. Visible information, structure, and relationships that are not communicated to users through a screen reader announcement; that's things like visible text that's completely skipped over, or headings that haven't been defined as headings. Content that is visibly hidden and meant to be completely hidden but is still announced by the screen reader. An illogical reading order. Interactive elements that are not able to be operated while the screen reader is enabled. Content updates that are not communicated, like error or status messages. And interactive elements with missing or incorrect names, roles, states, or values.

To make your life easier, you can make use of accessibility shortcuts. I mentioned earlier that Android has the option to add an accessibility menu, but you can also set other accessibility shortcuts on either device to quickly toggle different frequently used settings and turn assistive technologies on or off. For iOS, there's an Accessibility Shortcut option at the bottom of the accessibility settings screen. On this Accessibility Shortcut screen, you can choose any number of accessibility options to include in the shortcut, including VoiceOver, some color and contrast settings, and a lot more. You can then triple-click the side button on the right side of the device for an iPhone X or later, triple-click the home button if your device has that, or add a button to the Control Center. If you've selected one tool, it will just toggle that on and off, and if you've selected multiple, it will bring up a menu where you can choose which one to turn on and off. Android also has an accessibility shortcut, but instead of that one-stop shop for choosing what you want included in the shortcut, you choose it from the settings screen where you'd initially enable each setting or assistive technology. So from the TalkBack screen or from the Voice Access screen, there will be a control to enable the shortcut, and you can choose from a few options for how to trigger it. One is pressing the physical up and down volume buttons together at the same time to toggle the one tool or choose from several.
Another is to add it to the accessibility button or the two-finger swipe gesture, which is the same shortcut where the accessibility menu lives. And that last trigger is what's getting an update next week: in Android 12, it will take the form of a floating shortcut button. One shortcut that has its own unique trigger is magnification. If you enable the magnification shortcut, you can just triple tap the screen with one finger to magnify your screen.

For the sake of offering as much information as I can, these are still not all of them, but here are some more options for color and contrast. For iOS, you can Reduce Transparency, Increase Contrast, Differentiate Without Color (which is available in iOS 13 and up), use Color Filters, and Reduce White Point. For Android, you can turn on color correction or high contrast text, which is an experimental setting. Some additional settings for visual support: for iOS, there's Zoom, which will magnify the whole screen or part of the screen. The Magnifier, which uses the camera to magnify things in person, and which became a default app in iOS 15. Spoken Content, which includes Speak Selection and Speak Screen, which do what they sound like they do. Bold Text. Button Shapes, to add some visual cues to differentiate button controls from plain text. On/Off Labels, to make it clear whether a switch control is set to on or off. And reduced motion or animation. For Android: Magnification, which does magnify the whole screen; Select to Speak; Large mouse pointer, for if you've paired a physical mouse with your device; and Remove animations.

Additional settings for mobility and touch. For iOS: Voice Control, which will let you control the device with spoken commands, and is available in iOS 13 and up with additional language support in iOS 15. Switch Control allows you to control the device with a switch, such as an external device with a physical button, or by using the camera to capture movements like a tilt of the head, or, in iOS 15, with simple mouth sounds, to interact with elements as they are highlighted on screen. AssistiveTouch lets you replace swipe gestures, pressing physical buttons, or moving the device with a customized touch action, or pair a pointer device, like a physical mouse. Reachability, to help make the top corners of the screen content easier to reach when your hand is all the way down at the bottom of the device. Haptic Touch and Touch Accommodations options, to adjust things like hold duration. And Back Tap, which runs a shortcut when you tap the back of the device, and is available in iOS 14 and up. Android has Voice Access and Switch Access, which are similar to Voice Control and Switch Control on iOS. Currently in beta is an update adding the ability to use facial expressions to control your Android device. Autoclick, or dwell timing, is tied to using a physical mouse. Touch and hold delay, for hold duration. Time to take action, or accessibility timeout, to dictate how long temporary messages might stay on the screen. And vibration and haptics options.

And some additional settings for hearing and audio. For iOS: Hearing Devices, to pair Made for iPhone hearing aids or sound processors, as opposed to going through the general Bluetooth settings. Sound Recognition, which listens for important audio in the environment, like babies crying or fire alarms. LED Flash for Alerts, so there's an additional visual method of communicating notifications. And Subtitles & Captioning.
For Android, there's Sound Amplifier, which was available in Android 9 and updated in Android 10. Live Transcribe transcribes environmental speech, like people talking around you, into text. Sound Notifications, like Sound Recognition on iOS. Live Caption captions audio that's being played by the device, and is available in Android 10 and up. And hearing aid support for paired devices. And if you do want to learn more after this, either because you immediately forgot everything that I said, or you're ready to learn more gestures, or you want to learn some of those other device settings that I unfortunately had to breeze through, Apple and Google both of course have accessibility support pages with more information about how to use some of these on-device tools.

Additional testing tools. These are the bonus tools that I mentioned at the top of the talk. They're bonus in that they aren't at all built right into iOS or Android devices, but they're not bonus in the sense of going above and beyond. They are important for testing, and you should definitely, definitely, definitely put these in your toolbox. First is a physical keyboard. This is likely to be Bluetooth rather than wired, but it could be wired. Putting a physical keyboard in the toolbox is crucial because it is not at all uncommon for a person to pair one with their mobile device. It can be used as an input method for screen reader users, or for users with limited mobility or dexterity that makes it difficult to use gestures full-time or at all. Lots of folks like to use them specifically for working on tablets. And you will want to test the keyboard both with and without a screen reader enabled. In iOS, first you'll need to turn on Full Keyboard Access to support keyboard use. This was introduced in iOS 13, so it's still fairly new, and you might find a few wrinkles in the keyboard support for some iOS apps. Full Keyboard Access can be found from the accessibility settings screen, then Keyboards, and then Full Keyboard Access. On this screen, you'll find not only the control to enable the setting, but also some information about the key commands you'll use to navigate an iOS device, and you can customize those commands. For iOS, I do recommend going out of your way to turn Full Keyboard Access all the way off if you aren't actively testing with a keyboard, because it may interfere with VoiceOver focus and touch gestures. Rumor has it that this was a bug that was fixed in iOS 15, but it is one of the settings you can add to that accessibility shortcut, so it's easy enough to toggle it on and off, just to be safe. Android has had keyboard support a bit longer, and you shouldn't need to turn anything on to use it; just pair the keyboard. But you can find a list of basic system-wide keyboard commands to help you navigate in Settings, then System, then Languages & input, and then Physical keyboard. And some more specific keyboard commands to help you navigate with TalkBack are in the TalkBack settings, then Advanced settings, and then Keyboard shortcuts. And if you're going to be testing on multiple devices, say one for iOS and one for Android, treat yourself to a multi-device or easy-switch keyboard. Pairing and unpairing a keyboard to alternate between devices is annoying, and you'll thank me. Some common issues you might find when you test with a physical keyboard are interactive elements that are not operable with or without the screen reader enabled.
You might find that a control works completely differently when you've paired a keyboard with a screen reader versus when you're using only the keyboard, and the only way to be sure is to test both ways. An illogical focus order when using Tab or the arrow keys, or one of the keyboard commands identified on the settings pages. Or missing focus indicators while the screen reader is not enabled. This is another reason why it's important to test both with and without a screen reader. When a screen reader is on, remember, there will be that visible screen reader focus. Again, that's the black and white outline for VoiceOver and the green outline for TalkBack. But when you turn the screen reader off, notice: is there any visible indication of keyboard focus? iOS generally uses a blue outline or shadow by default, although that can be changed in the Full Keyboard Access settings so that there's more visual contrast. And Android uses a gray shadow or background. And to reiterate, the different combinations are keyboard with screen reader, keyboard without screen reader, and screen reader without keyboard. And when testing that last screen reader without keyboard combination on iOS, turn Full Keyboard Access all the way off.

Automated testing can be done on native mobile content. Of course, just like for web-based content, automated testing tools are there to supplement manual testing, never to replace it. You may choose to start your testing with an automated test just to dip your toe in; I like to run these at the end for a final gut check. Either way, they're quite helpful for identifying a few issues that may be difficult to test manually with total accuracy. For iOS, if you have access to a computer running macOS with Xcode and a free developer account, you have access to the Accessibility Inspector. If you have access to macOS but don't have Xcode, you can download it for free in the App Store. Once you have Xcode open, you can find the Accessibility Inspector in the menu bar at the top of the screen under Xcode, then Open Developer Tool, and then Accessibility Inspector. You'll need to physically connect your testing device to the computer, and then you'll have access to a few options that are helpful. One is inspecting by element. You can select elements on the device to see a very basic rundown of some of the properties that have been set on them by the developer, and get a better understanding of what is making VoiceOver announce the things that it does for any particular element. You can also run an audit. This will show a list of potential issues for the screen content, and it is particularly helpful for identifying elements that might have insufficient color contrast, and target sizes for buttons and controls that may be too small to comfortably select with a tap. Those are two things that can really be tricky to accurately test when you're doing compatibility testing. It also shows things like missing labels and text that may not support Dynamic Type, but I've found that those are easier to test manually with manual testing tools. Android has a few automated scanners that you can download directly to your device. The one I use is Accessibility Scanner, which is a Google app, and it's available for free in the Play Store. Accessibility Scanner lets you enable a floating button right on the screen, which you can move around wherever you'd like.
You can then run a snapshot scan of the screen, or capture an interaction, and it will provide you with a list of potential issues, like insufficient color contrast for text and images, touch target sizes that are too small, and also issues with labels and descriptions, each of which you can select to get more information about the issue. I did already mention all of these with each automated testing tool, but to recap, some of the more common issues and the most useful information you're likely to find in your automated test results are cases of insufficient color contrast or target size. An automated scan will give you real, actual numbers for these things, which is very useful for a tester. You may also find information about missing or incorrect names and labels useful if it will help you determine why a screen reader identifies a certain element the way that it does.

These poor last few didn't quite fit into a category, but here they are. Screen mirroring. If you're testing an app, chances are pretty good you'll need to communicate your findings to another person. And while each device will, of course, let you take screenshots or record yourself going through a process right on the device, screen mirroring can be really helpful when you need to get those screenshots or recordings from the device to your computer and then into an email or document, or if you need to hop on a call and share your screen. It just helps get everything handy all in one place for your workflow. Another thing it can be helpful for is testing some color contrast using the Colour Contrast Analyser tool, which is coming up on the next slide, so I will come back to that. I like to use LonelyScreen for iOS to screen mirror, which uses AirPlay to cast your screen to a laptop or computer. For Android, I use scrcpy ("screen copy"), which does require you to be physically plugged in to your computer. Both of these are free downloads, although LonelyScreen works as a free trial, so if you'd like to get rid of the free trial pop-up, you may opt for the paid version.

Back to the Colour Contrast Analyser. If you aren't already familiar with it, this is a free tool from TPGi that does as it says: analyzes color contrast. You can pick colors straight from the screen using an eyedropper tool. You can also use sets of sliders to find a specific color value. Or if you do already know the actual color values that are being used, you can enter them directly into the tool. Either way, you'll get a contrast ratio and some guidance related to it, and you can use all of this data when you document your testing results. This tool can be helpful if you see a color combination on your mobile app that feels like it's going to be trouble but isn't being picked up by an automated scan, or if you don't have access to those automated testing tools at all. In a pinch, if you have your screen mirroring set up, you'll be able to use the eyedropper from the Colour Contrast Analyser to grab a little sample of the colors and verify the contrast. Now, this method is not perfect. A screen cast to another screen, with pixels picked up by an eyedropper, is quite a journey, so it probably won't spit out the exact colors that the app is actually using. Regardless, it can still be helpful for checking color combinations that may need more attention or just need to be verified. And again, that's available from TPGi at tpgi.com/color-contrast-checker, but you can also just search the internet for TPGi color contrast and it will pop right up.
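For reference, the number these tools report is the standard WCAG contrast ratio, computed from the relative luminance of the two colors:

$$\text{contrast ratio} = \frac{L_{\text{lighter}} + 0.05}{L_{\text{darker}} + 0.05}$$

where $L$ is a color's relative luminance, from 0 for black to 1 for white, so the ratio runs from 1:1 (identical colors) up to 21:1 (black on white). WCAG 2.1 Level AA looks for at least 4.5:1 for regular text and 3:1 for large text.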
I'm so glad that I left a little bit of time for questions, 'cause I was afraid I wasn't going to. But that's all I have. So I'm hoping that if you didn't know where to start before, now you do. A manageable set of tools and the knowledge of how to use them go a really long way toward taking that first step. So with that, I did see a couple of questions come in that I think I can address right off the bat. With Android today, I was talking about Android 11. Even though I know that Android 12 is coming up, it's not out yet, and so I could only go based on rumors. So, Android 11.

For one button or gesture that will turn TalkBack on and off, there is the accessibility shortcut for that, which you can turn on from the TalkBack settings screen, and you can choose which button or gesture turns TalkBack on and off.

Goodness, would you test with both Switch Control and a Bluetooth keyboard, or choose one? I usually recommend the Bluetooth keyboard, just because it's easier for testers that are new to access and to learn how to use. But usually, if keyboard accessibility has been taken into account when developing the app, it will translate fairly well to Switch Control. If you do have access to both, certainly, certainly use both.

I reside in a country where only a minority use iOS and the Google Pixel is not distributed officially in the country, but do you think we could use a phone like a Samsung or another device for Android testing? If you know your user base, definitely err on the side of what you know is in use. Like I said, Samsung and Google are sort of getting a little bit closer together in terms of how they operate and how they both use TalkBack, so you can certainly use a phone like a Samsung to do some testing.

Goodness, was there an automated tool for iOS testing? There is an automated tool for iOS testing; I think you came in immediately after I mentioned it. It's built into Xcode, so if you connect to a MacBook with Xcode, there is an automated scan there.

If a mobile app that is being used to remotely operate a kiosk screen has a timeout feature, in other words, the app will disconnect from the kiosk if there's no activity in the app for a period of time, are there any accessibility implications? There are always accessibility implications for a timeout feature, but there are a lot of different options for, I want to say, getting around that, for helping to reduce the implications: any sort of method of turning it off, though if it's a kiosk, I assume it would have to be temporary. Probably the most common thing to do would be to allow a user to extend the session via a pop-up or an alert in real time.

If you have a native app that has both native content and embedded web views, are the text resize settings honored by both types of content? That is a good question, because sometimes they are not. iOS does have some, well, not iOS, CSS has some things you can add to it to support iOS's Dynamic Type in web content. That is more than I have time for in the remaining five minutes of this talk, but much of the content should honor the resize settings. However, there is a separate resize setting in both Chrome on the Android device and Safari on the iOS device. So if you are viewing specifically web content on a mobile device, you'll want to change the settings in the browser rather than in the Settings app.

Can you turn on or off the VoiceOver feature of trying to guess an image's alt text or description, and is this the same feature on TalkBack?
I don't turn it off, and I will tell you why: I don't want it to accidentally not announce something that it should be announcing. So I would rather it just announce as much as it can, and then go into settings and play with the VoiceOver verbosity settings to see if that is what is making VoiceOver announce the things it does. Like I said, with the automated scan, that information can be helpful, but being able to toggle it and see the differences is, I think, even more helpful. I have three minutes left. Mike, do I have time to answer any more? Should I?

- [Mike] Yeah, if you want to answer, I would say one more. That'd be fine if it's, you know, short enough. If not, we can take these, I can send them to you after, and if you have some time to answer them, I can email them all out to everyone.

- [Laurie] I see a short one right in front of me. I skipped around, so I don't know if there are any I missed, so I can revisit them, but before I go: what does reduce white point do? It will turn everything on screen a little bit darker. So when you've got your screen, say, in a dark room, and everything on the screen is white, like the white background of a website, it's very, very bright and in your face. And you can reduce white point to just sort of dim it a little bit more than the screen dimming feature does, as like a baseline. Hopefully that helps.

- [Mike] Awesome. Awesome job, Laurie. Thanks everyone for attending today. I will send out an email following the event once the recording has been processed, with the slides as well as the recording, and we'll add any tips and tools that Laurie mentioned today in the email as well, along with some questions that Laurie will answer after the webinar. So thanks again Laurie, thanks again everyone else, have a great day, afternoon, evening.

- Thanks.

- Bye.