- [Mike] Hi everyone, thanks for joining the webinar today. We will begin momentarily as people start to sign in. Hi everyone, thanks for joining the webinar. We're gonna begin in a moment, just waiting for a few more people to sign in, and we'll start momentarily.

All right, good morning, good afternoon. My name is Mike Mooney, I'm the Digital Marketing Manager at TPGi. I want to thank everyone for joining us today for the Mobile Accessibility Testing Toolbox with Laurie Pagano, Senior Accessibility Engineer at TPGi. Before we get started, I just have a few housekeeping items to go over. First off, this session is being recorded, and we will email everyone the recording after the event. We have live captions available, so feel free to use those as needed. And lastly, we will have time for live Q&A at the end of the webinar, so please use the Q&A box to submit your questions, and Laurie will answer as many as she can as time permits. And if anyone needs any accessibility support or training, feel free to reach out to us after the webinar, and we can connect you with a TPGi expert. And with that, I will let Laurie begin. Laurie.

- [Laurie] Thanks Mike, and thanks everyone for joining. Welcome, hello, hello. As mentioned, my name is Laurie Pagano, I'm a Senior Accessibility Engineer here at TPGi. Welcome to the Mobile Accessibility Testing Toolbox. I spend much to most of my time at TPGi working with mobile devices and content, and I remember it can be a little overwhelming and a little confusing to switch from working with desktop web content to working with mobile. So today I'm going to cover some helpful tools and tips you'll want to become familiar with when you are ready to start exploring mobile accessibility testing.

After a bit of an introduction, I'll start with the device settings and assistive technologies, the things that generally come built right in to Apple and Google devices; you'll see later what I mean when I say generally. Then there are some additional testing tools, the bonus peripheral tools that may not be so obvious or that will just make it easier to test, but either way will really help to round out your toolbox. And for the sake of efficiency, at any point over the hour when I say tools, just tools, you can safely assume that I'm referring to everything that I just mentioned: device settings, assistive technologies, and the additional testing tools, all as one unit. Tools will be shorthand for all of those things.

Okay, let's get started. Some important points. This is not an exhaustive list of all the settings or assistive technologies that people with disabilities might use on a mobile device, this is just the beginning. What I'm highlighting today are some of the more useful introductory tools for mobile accessibility testing, but there are many, many, many different options for configuring a mobile device to work best for various disabilities, needs, or preferences. So as you become more familiar and more comfortable with what's in today's talk, I really do encourage you to dig deeper into some of the tools that didn't get the spotlight and make room in your toolbox for them as well. I'll try to at least mention as many as possible, but the amount of time I dedicate to any tool doesn't mean any one is more important than another, just that we only have an hour.
And on top of there being more tools than we can comfortably cover, there are also considerations like normal, regular software updates, which may move things around, adjust the way they work, or even add and remove features and settings entirely. So just be aware that what's available at the time of this talk might be a little bit different depending on your device and what versions of the operating system or applications you have.

Speaking of operating systems, today I'll be talking about iOS from Apple and Android from Google. And when I talk about Android devices for testing, I'm specifically referring to Pixel, since the Android operating system is fragmented: there are some differences in the user interface and the features and the functionality across all the different devices offered by different manufacturers. So, if you can only choose one device to test with, I generally recommend going with a Pixel to get the pure Android experience.

A note on compatibility testing. For testing native mobile content, some of the considerations for accessibility are going to be pretty much the same: an image without a text alternative on the desktop or in any web-based content is still going to be an image without a text alternative in a mobile app, that's pretty straightforward. But of course there are plenty of differences. One obvious difference is just finding where all of the different tools can be turned on or off and how to use them, hence the toolbox. One difference that's maybe less obvious when you're thinking about mobile accessibility testing, but becomes pretty obvious once you've started, is that on mobile, particularly in a native mobile app, you may not be able to see behind the scenes or go digging as easily as you might for desktop web content. So, unless it's your product that you've personally made, or you've been provided with the code, you're unlikely to have access to the code for the application you're testing. So, if you aren't looking at the actual code, or maybe you aren't planning to make specific recommendations based on code at all, or you just aren't necessarily aligning your testing with any version of the Web Content Accessibility Guidelines, WCAG, then you may be looking at compatibility testing. With compatibility testing, you'll be using combinations of accessibility settings and assistive technologies that are commonly used by people with disabilities to access the content, and determining: is this product compatible with these combinations?

And here we go. First things first, where are the accessibility settings? For iOS, the accessibility settings are right in the top level of the device settings app. Up through iOS 12, they were buried in the General screen of the Settings app, but then they got promoted in iOS 13 to be right off the main settings screen. And as of the latest major release, iOS 15, this past fall, you can both change settings at the operating system level and set some accessibility settings differently for individual apps, on an app-by-app basis. In Android, the accessibility settings are also in the top level of the device settings app. And again, this and any other Android screenshots you see today are as seen on a Pixel. In Android 12, the accessibility settings have been shuffled around and reorganized a bit with sub-menus, but all of the major players are still there. And some of these settings might appear in multiple places, so I'll be taking you to them via the Accessibility settings screen.
But if you're confused, you can always search the Settings app to find them. As part of the Android Accessibility Suite, you can enable an Accessibility menu. This will give you a handy Accessibility button, available all the time from whatever screen you're on, that provides shortcut access to a lot of functionality that otherwise might rely on the physical hardware buttons, like taking screenshots, locking the screen, and adjusting volume and brightness. And it also provides direct access to the Accessibility settings screen. The Android Accessibility Suite is a suite of applications that includes this Accessibility Menu, the screen reader TalkBack, which of course we'll talk about, and two other apps, Select to Speak and Switch Access. This should come pre-installed on most Android devices, but in case it's not, it can be downloaded from the Google Play Store for free, and then it will integrate seamlessly with the other settings. And this extra step of downloading the suite is what I meant earlier by generally built right in.

And right away, we'll be able to start the testing process just by noting: does the app I'm testing respond to all of the changes I've made in the settings? As I go forward and introduce each tool, just like I'm doing here, I'll be pointing out some of the common issues that you're likely to find when testing with that tool. So you'll find that this same issue will pop up a lot over the hour: does the app I'm testing respond to the options provided in the device's accessibility settings, or does it provide its own equivalents?

Okay, I like easy, so I'll start with the settings to resize text, which are fairly straightforward. In iOS, there are two locations where you can change the text size at the device level. Device level meaning that changing this one setting in this one place will affect the text across default Apple apps like Mail, Calendar, and the Phone, as well as any other app that supports Dynamic Type. One place in Settings is the Display & Brightness screen, and then Text Size. This screen provides a range of seven text size options, with the default falling right in the middle of that range. But to open a wider range of text sizes, including larger sizes for accessibility, you can go to the Accessibility settings screen, then Display & Text Size, then Larger Text. By toggling the Larger Accessibility Sizes control at the top of the screen, you'll introduce five additional larger text sizes. And developers do need to actively support Dynamic Type for the text of a native iOS app to successfully resize and accommodate this setting; using Apple system fonts and text styles will get that support built right in.

For Android, there are two options for resizing content. They used to be in the top level of the Accessibility settings, but as of Android 12, they live in a Text and display sub-menu. One of them is font size, which simply resizes the text. There aren't as many options here for font sizes as what iOS has, but Android also has display size, which you might use to change the size of text as well as any containers that define the layout, or any controls or icons. And you can use a combination of font size and display size to further adjust the sizing of content. As long as a scalable unit has been used to define sizes, Android's resizing features should work across the operating system, unless they've been actively suppressed.
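To make the iOS side of that concrete, here's a minimal Swift sketch of the Dynamic Type support Laurie describes. The view controller and label names are hypothetical; the APIs are the standard UIKit ones.

```swift
import UIKit

final class ArticleViewController: UIViewController {
    // Hypothetical label, used to illustrate Dynamic Type support.
    private let bodyLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Use a system text style instead of a fixed point size, so the
        // font tracks the user's chosen text size, including the larger
        // accessibility sizes behind the Larger Accessibility Sizes toggle.
        bodyLabel.font = UIFont.preferredFont(forTextStyle: .body)

        // Resize live when the user changes the setting, not just on relaunch.
        bodyLabel.adjustsFontForContentSizeCategory = true

        // Allow wrapping so larger sizes don't truncate the text.
        bodyLabel.numberOfLines = 0

        view.addSubview(bodyLabel)
    }
}
```

Text styled with a fixed point size, by contrast, will simply ignore the user's text size setting, which is exactly the "does not respond at all" issue covered next.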
Some common issues you might find when you test text resize through the device settings: portions of text that do not resize, text that is truncated with ellipses (…) affecting meaning, text that is cut off by the boundary of its container affecting meaning, or text that does not respond to text size settings at all. And it is important to note that Safari on iOS and Chrome on Android have their own text resize settings in each browser. So, if you're using the text resize settings through the Settings app, you're unlikely to see a change there for web content. If you want to test text resizing for web content viewed in the browser, you'll want to use the browser's text resize settings.

Another low-hanging fruit you might want to start with is the color and contrast settings. This one is maybe not as low-hanging as resizing text, because there are actually a lot of different options available for adjusting color and contrast, and different combinations of them might work best for different folks. So, again, even though I've highlighted only a few, the rest of them deserve respect as well. But I still count this as low-hanging because, even with all the options, and while there are some additional tools that will help you identify specific colors and color contrast ratios, which I promise I'll cover later in additional testing tools, you still aren't really learning a whole new set of skills if you're already familiar with the concept of testing color and contrast on the desktop or in web-based content.

On iOS, most of the settings that affect color or contrast can be found in the Accessibility settings screen, and then Display & Text Size. Here there are settings for Classic Invert and Smart Invert, which respectively will invert all displayed colors, or all displayed colors except for media like graphics and video and some user interface elements. Android has fewer options for color and contrast. And as of Android 12, the color settings have also been moved from the top level of the Accessibility settings to the Text and display sub-menu, along with the font and display size. Similar to iOS, Android has color inversion, which will invert all displayed colors.

Dark Mode, as it's called on iOS, or Dark theme, as it's called on Android, uses darker backgrounds and colors to decrease brightness and reduce eye strain. You might debate whether this is in and of itself an accessibility setting, but regardless, any changes made to the visual design in order to support Dark Mode could certainly introduce new issues that are probably contrast related. On iOS, Dark Mode is not found through the Accessibility settings screen at all; instead, you can navigate from Settings directly to the Display & Brightness screen to choose between light and dark modes. On Android, it's in the top level of the Accessibility settings screen, available as a single switch to toggle on and off. And starting in Android 10, you can use Dark theme and color inversion together for a new combination of support.

Some common issues you might find when you test color and contrast through the device settings are, of course, insufficient contrast for text elements, insufficient contrast for user interface elements and states, insufficient contrast for images and icons that convey information, or content that does not respond to color and contrast settings at all.
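On the development side, whether content holds up under these settings often comes down to a couple of small opt-ins. A hedged Swift sketch, with hypothetical view names but standard UIKit APIs: Smart Invert only leaves photos and video alone if the developer flags those views, and Dark Mode support is simplest with the dynamic system colors.

```swift
import UIKit

final class PhotoViewController: UIViewController {
    // Hypothetical image view showing a photograph.
    private let photoView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Smart Invert is supposed to leave media alone, but that relies on
        // views like images and video players being opted out of inversion.
        photoView.accessibilityIgnoresInvertColors = true

        // Dynamic system colors resolve automatically for both Light and
        // Dark Mode, instead of hard-coding one background color that may
        // lose contrast in the other appearance.
        view.backgroundColor = .systemBackground

        view.addSubview(photoView)
    }
}
```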
Now that we have some of the more straightforward tools in the toolbox, I'll pivot just a bit and go for one of the more daunting of the mobile assistive technologies for new testers: the screen readers. Even if you feel like a total pro with other screen readers like JAWS, you're probably used to using a certain set of keyboard commands to navigate around, and you probably have certain expectations about how to interact with content when using a screen reader. Switching to a totally different method of operation, like swipe gestures, can take some time to get the hang of.

Across Apple products, you'll find VoiceOver. And even if you're familiar with VoiceOver on macOS, VoiceOver on iOS is just a little bit different. Some of that does have to do with the different needs and abilities of a touch screen interface, like I mentioned, but also with release schedules and just what features Apple might decide to support, when and where they decide to support them. You can enable VoiceOver from the Accessibility settings screen, and then VoiceOver. There is a control to turn it on and off, and a whole bunch of settings to customize behavior, like how fast it speaks or how much it says, and starting in iOS 13, you can customize the gestures and commands that you use to control it. And there's one option, the Caption Panel, that I want to highlight because I think it's really helpful for testing. It can be found a bit toward the bottom of the VoiceOver screen. Turning the Caption Panel on will display VoiceOver speech output as text at the bottom of the screen, as a bit of a visual supplement to the audio output that VoiceOver already provides. The Caption Panel may sound familiar if you've used the macOS version of VoiceOver, but for iOS, it only became available in iOS 13, so you can find it in iOS 13 and above. And this does not happen by default, so if this is a feature you're interested in, you'll have to turn it on.

Android has TalkBack as its screen reader, which you can enable from the Accessibility settings screen and then TalkBack. It also has those same settings to customize things like speaking rate and verbosity and gestures. And again, TalkBack is part of the Android Accessibility Suite, which I mentioned back with the Accessibility menu button. So it is technically a separate app, which lets it run on a different update or release schedule than the operating system as a whole, but it integrates with the rest of the settings, and even on the devices where it doesn't come pre-installed, it feels built right in. If TalkBack doesn't automatically give you a visual caption option like the one I pointed out for VoiceOver, you can go from the TalkBack screen to Settings, then Advanced settings, then Developer settings, to toggle Display speech output.

To start, here's a quick and dirty, easy, very limited list of standard gestures for basic navigation. These, I think, are the very bare minimum that will keep you focused on getting comfortable with the screen reader, without panicking that you've lost control of your device if you're not used to it, or getting caught up in memorizing the whole entire list of gestures. And because these are standard enough to be shared across iOS and Android, I didn't even bother to split them up by operating system; they're going to be the same for either device. To explore the screen, drag one finger around the screen to announce whatever is under your finger.
This is a non-linear way to get an idea of what is on the screen and where, because many apps have common layouts, like a back button at the top left or tab navigation along the bottom of the screen. This can be really helpful for jumping directly to a certain part of the screen where you might reasonably expect to find whatever it is that you're looking for. And if the content under your finger has been created with even a whisper of accessibility in mind, you'll be able to tell what has screen reader focus by hearing the announcement, seeing the announcement in those caption panels, or seeing the screen reader focus. This shows up by default as a black and white outline for VoiceOver, which helps it show up on most backgrounds, dark or light, and as a green outline for TalkBack.

If you know where an item is, you can also just tap directly onto it to give it screen reader focus with a one-finger single tap. A one-finger swipe to the right or to the left moves the screen reader focus to the next or previous item of any type, sequentially in the reading order. And a one-finger double tap activates the button or control that currently has screen reader focus. You can double tap anywhere on the screen and it will activate the item; you don't have to double tap directly over it, as long as it already has that screen reader focus.

And then on top of those very basics, there are gestures that are multi-finger or multidirectional, and they introduce another layer of options for navigation. On iOS, the rotor gives you the opportunity to switch between different navigation options, or to conveniently adjust some settings, like whether VoiceOver announces hints or how fast it speaks, without going all the way back through the Settings app. Some of what you can access through the rotor is just there by default, and it changes based on what makes sense with the content that's on the screen at the time, but you can also customize the list of possible options from the VoiceOver screen in the Settings app; there is an option there for the rotor.

To access the rotor, you'll do a sort of twisting motion with two fingers, like you're dialing a knob directly on the screen, and you'll know you've got it because you'll either see it appear visibly on the screen, or you'll hear VoiceOver start to announce the options as you "turn the dial." At each step of the rotor, you'll see or hear each option that you're activating. And then once you've got the option that you want, you can swipe down or up with one finger to move forward or backward through that option on the screen, or, if it's a setting, to adjust that setting. For example, if I used the rotor to activate headings, like I have in my screenshot here, I could then start swiping down, down, down with one finger to navigate forward, heading by heading, through just the headings, or swipe up, up, up to navigate backward through them, which is a way of navigating that might sound familiar if you have experience using quick navigation commands for other screen readers. Then I can switch back to swiping left and right to explore the content surrounding that heading. Or if I used the rotor to activate something like speaking rate, I could swipe up or down to make VoiceOver speak faster or slower on the fly.
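As an aside for anyone who also builds apps: most rotor options like Headings are assembled by VoiceOver itself from the accessibility properties of the content, but an app can contribute its own entries with UIAccessibilityCustomRotor. A minimal sketch, assuming a hypothetical list of section views held by the view controller:

```swift
import UIKit

final class ArticleViewController: UIViewController {
    // Hypothetical views acting as section landmarks, in reading order.
    private var sectionHeadings: [UIView] = []

    override func viewDidLoad() {
        super.viewDidLoad()

        // A custom rotor entry named "Sections" that VoiceOver users can
        // dial to, then swipe up/down to jump between our landmarks.
        let sectionsRotor = UIAccessibilityCustomRotor(name: "Sections") { [weak self] predicate in
            guard let self = self, !self.sectionHeadings.isEmpty else { return nil }

            // Where is VoiceOver currently, within our list?
            let current = predicate.currentItem.targetElement as? UIView
            let currentIndex = current.flatMap { self.sectionHeadings.firstIndex(of: $0) }

            // Step forward or backward depending on the swipe direction.
            let nextIndex: Int
            switch predicate.searchDirection {
            case .next:     nextIndex = (currentIndex ?? -1) + 1
            case .previous: nextIndex = (currentIndex ?? self.sectionHeadings.count) - 1
            @unknown default: return nil
            }

            guard self.sectionHeadings.indices.contains(nextIndex) else { return nil }
            return UIAccessibilityCustomRotorItemResult(
                targetElement: self.sectionHeadings[nextIndex],
                targetRange: nil)
        }

        // Expose it; it appears in the rotor alongside the built-in options.
        accessibilityCustomRotors = [sectionsRotor]
    }
}
```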
And as a side note, if you started gearing up to use your rotor for adjusting a lot of settings, which I would not expect to be necessary, but if you did: iOS 15 introduced a VoiceOver quick settings option that you can access with a two-finger quadruple tap, so you can access some settings quickly that way, rather than by filling up your rotor.

About a year ago, Google and Samsung together released a co-developed TalkBack update that introduced a few significant changes. This also offered a bit more consistency among the different Android devices by making TalkBack the default screen reader on not only Pixel, but now most Galaxy devices as well. But the way that you interact with TalkBack may be different, depending on your device or software.

If you have an older version of TalkBack or Android, you have the context menus, plural. The global context menu provides global options for interacting with TalkBack or the device itself, and the local context menu provides some local options for interacting with the more immediate content. And like the VoiceOver rotor, the local context menu options change based on the currently displayed content. To access the global context menu, you'll swipe down and then right, all in one go, drawing a right angle or a capital L shape. And to access the local context menu, you'd swipe up and then right, and then you can select whatever you need from whichever menu you've called. In the newest versions of TalkBack and Android, these two menus were combined into a single TalkBack menu, which you can still access by swiping down and then right in that same L shape. If you have a device that supports multi-finger gestures, you can also perform a three-finger tap to get that TalkBack menu, and then you can select what you need from the menu. And you can customize what appears in the TalkBack menu as well, from the TalkBack screen, then Settings, and then Customize menus. If you have any question about your version, you'll be able to tell what you have when you swipe down and then right, because whichever menu appears will either be titled global context menu, for the older version, or TalkBack menu, if you have the newer version.

At one point, getting to the navigation options was done by swiping up or down to cycle through them, and then switching to swiping right or left to navigate by that item: by heading, by word, by whatever you've picked. Then you'd have to swipe up or down again to get back to the default navigation and explore the surrounding content. After the update, this is called reading controls, and you'll swipe down then up, or up then down, to cycle through the options. If your device supports the multi-touch gestures, you can also cycle through them using three-finger swipes, which can be three fingers swiped either right and left or up and down. And like the VoiceOver rotor, there will be both a visual and audio indication of which option you activated. And also like the VoiceOver rotor, once you've got the option that you want, you can then swipe down or up with one finger to move forward or backward through that navigation option on the screen. And then you can easily switch to swiping right or left to go back to the default reading order, item by item. You can also customize what appears in the reading controls, including settings like verbosity or hiding the screen display. Like customizing the TalkBack menu, that is from the TalkBack screen, then Settings, and then Customize menus.
Some of the other features from that update, for the curious, are lots of other multi-finger gestures, more customization features, the ability to use voice commands to control TalkBack, and some updates to the braille keyboard.

A feature that you might like, whether you are getting the hang of mobile screen readers or you've decided to learn more, is the practice area, which is available on both operating systems. On iOS, after you activate VoiceOver, a VoiceOver Practice button appears just beneath the VoiceOver toggle. It opens a scratch pad, and as you make gestures on it, it will repeat back to you the gesture that you made and tell you what it does. On Android, there is a tutorial and help section in the TalkBack settings that gives options for a guided hands-on tutorial, or there's a practice gestures area, much like the VoiceOver Practice. So, if you can make your way through the content you need with just those basic navigation gestures that I shared, maybe you don't need to learn and memorize every possible gesture right away. But as you get more comfortable with navigating, and as you encounter different kinds of content, or more complicated content that requires some of the other gestures, you'll have a good foundation already to build on and some good resources to learn more.

Some common issues you might find when you test with a screen reader: images with missing or incorrect text alternatives. A note here that VoiceOver is getting smarter, so it will describe graphics, and in iOS 15, it will describe photographs based on whatever context it can pick up. It isn't perfect; it identifies my dog as a horse. So I do recommend that you pay close attention to text alternatives when testing, since they may not have been provided by a person. Visible information, structure, and relationships that are not communicated to users through a screen reader announcement; that's things like visible text that's completely skipped over, or headings that haven't been defined as headings. Content that is visibly hidden and meant to be completely hidden, but is still announced by the screen reader. An illogical reading order. Interactive elements that are not able to be operated while the screen reader is enabled. Content updates that are not communicated, like error or status messages. And interactive elements with missing or incorrect names, roles, states, or values.
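Many of those announcement issues trace back to a handful of properties on the native side. A hedged Swift sketch, with hypothetical controls but standard UIKit accessibility APIs, showing a name, a heading trait, and a manually announced status update:

```swift
import UIKit

final class CheckoutViewController: UIViewController {
    // Hypothetical controls, for illustration only.
    private let deleteButton = UIButton(type: .system)
    private let sectionTitle = UILabel()
    private let statusLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Name: give an icon-only control a meaningful label, so the screen
        // reader announces something better than the image asset name.
        deleteButton.accessibilityLabel = "Delete item"

        // Structure: mark a visible heading as a heading, so screen reader
        // users can navigate by headings (e.g., via the VoiceOver rotor).
        sectionTitle.accessibilityTraits = .header

        // Content updates: explicitly announce a status message that just
        // appeared on screen, so it isn't silently missed.
        statusLabel.text = "Item removed"
        UIAccessibility.post(notification: .announcement,
                             argument: statusLabel.text)
    }
}
```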
To make your life easier, you can make use of accessibility shortcuts. I mentioned earlier that Android has the option to add an Accessibility menu shortcut. You can also set other accessibility shortcuts on either device to quickly toggle different frequently used settings and turn assistive technologies on or off. For iOS, there's an Accessibility Shortcut option at the bottom of the Accessibility settings screen. On this Accessibility Shortcut screen, you can choose any number of accessibility options to include in the shortcut, including VoiceOver, some of the color and contrast settings, and others. You can then triple-click the side button on the right side of the device for an iPhone X or later, triple-click the home button if your device has that, or you can add a button to the Control Center. If you've selected one tool, it will just toggle that one on and off, and if you've selected multiple tools, it'll bring up a menu where you can choose which one to turn on and off.

Android also has an accessibility shortcut, but instead of that one-stop shop for choosing what you want included in the shortcut, you choose it from the settings screen where you'd initially enable each of the settings or assistive technologies. So, from the TalkBack screen or from the Voice Access screen, there will be a control to enable the shortcut. And then you can choose from a few options for how to trigger it. One is pressing the physical up and down volume buttons together at the same time, to toggle the one tool or choose from several. The other is the Accessibility button, which is the same place where the Accessibility menu lives, and you can use that to toggle one shortcut, or, with a long press, choose from multiple. And as of Android 12, you can also display the Accessibility button as a floating button, or a series of floating buttons for multiple shortcuts. One shortcut that does have its own unique trigger is magnification. If you enable the magnification shortcut, you can choose any of the options that I just mentioned, or you can just triple tap the screen with one finger to magnify your screen.

For the sake of offering up as much settings information as I can, these are still not all of them, but here are some more options for color and contrast. For iOS, there's Reduce Transparency, Increase Contrast, Differentiate Without Color, which is available in iOS 13 and up, Color Filters, and Reduce White Point. For Android, there is color correction and high contrast text.

And some additional settings for visual support. For iOS: Zoom, to magnify the screen or part of the screen. Magnifier, which uses the camera to magnify things you see in person, and which became a default app in iOS 15. Spoken Content, which includes Speak Selection and Speak Screen. Bold Text. Button Shapes, to add some visual cues that differentiate button controls from plain text. On/Off Labels, to make it clear whether a toggle switch is set to on or off. And Reduce Motion. For Android: magnification, which magnifies the screen or part of the screen. Select to Speak. Remove animations. Large mouse pointer, for if you've paired a physical mouse with your device. And bold text.

Additional settings for mobility and touch include, for iOS: Voice Control, which will let you control the device with spoken commands and is available in iOS 13 and up, with additional language support in iOS 15. Switch Control, which allows you to control the device with a switch, such as an external device with a physical button, or by using the camera to capture movements like a tilt of the head, or, in iOS 15, simple mouth sounds, to interact with elements as they are highlighted on the screen. AssistiveTouch, which lets you replace swipe gestures, pressing physical buttons, or moving the device with customized touch actions, or pair a pointer device like a physical mouse. Reachability, to help make the top of the screen easier to reach when your hand is all the way down at the bottom of the device. Haptic Touch and Touch Accommodations options, to adjust things like hold duration. And Back Tap, which runs a shortcut when you tap the back of the device and is available in iOS 14 and up. For Android: Voice Access and Switch Access, which are similar to Voice Control and Switch Control on iOS, and as of Android 12 include using facial expressions to control your Android device. Autoclick, or dwell timing, which is tied to using a physical mouse. Touch and hold delay, for hold duration. Time to take action, or accessibility timeout, to dictate how long temporary messages might stay on the screen. And vibration and haptics options.
And some additional settings for hearing and audio. For iOS: Hearing Devices, which is specifically to pair Made for iPhone hearing aids or a sound processor, as opposed to going through the general Bluetooth settings. Sound Recognition, which listens for important audio in the environment, like babies crying or fire alarms. LED Flash for Alerts, so there's an additional visual method of communicating notifications. And Subtitles & Captioning. For Android: Sound Amplifier, which was available in Android 9 but updated in 10. Live Transcribe, for transcribing environmental speech, like people talking around you, into text. Sound Notifications, like Sound Recognition on iOS. Live Caption, for captioning audio that's being played by the device, which is available in Android 10 and up. And hearing aids, to support paired devices.

And if you do want to learn more after this, either because you immediately forgot everything or you're ready to learn more about screen reader gestures or any of those other device settings that I just flew through, Apple and Google of course both have accessibility support pages with more information about how to use these on-device tools.

Additional testing tools. These are the bonus tools that I mentioned at the top of the talk. They're bonus in that they aren't built right in at all to an iOS or Android device, but they're not bonus in the sense of going above and beyond; they're still pretty important for testing.

First up is a physical keyboard. This is likely to be Bluetooth rather than wired, although it could be wired. Putting a physical keyboard in the toolbox is crucial because it is not at all uncommon for a person to pair one with their mobile device. It can be used as an input method for screen reader users, or for users with limited mobility or dexterity that makes it difficult to use gestures full time or at all. And lots of folks like to use them specifically for working on tablets. You'll want to test the keyboard both with and without a screen reader enabled.

In iOS, first you'll need to turn on Full Keyboard Access to support keyboard use. This was introduced in iOS 13, so it's fairly new, and you might find a few wrinkles in the keyboard support for some iOS apps. Full Keyboard Access can be found from the Accessibility settings screen, then Keyboards, and then Full Keyboard Access. On this screen, you'll find not only the control that enables the setting, but also some information about the key commands you'll use to navigate an iOS device, which I do recommend reading through if you're going to be testing with Full Keyboard Access, because they might be different than what you expect, and you can customize them if you'd like. You can also use a keyboard to navigate with VoiceOver, though there are a few differences in key commands, even from those used with Full Keyboard Access. The keyboard commands to navigate with VoiceOver are documented in Apple's iPhone user guides online. But I do recommend going out of your way to turn Full Keyboard Access all the way off if you aren't actively testing with it, because it may interfere with VoiceOver focus and touch gestures. Rumor has it that this was a bug that was fixed in iOS 15, but it is one of the settings that you can add to the accessibility shortcut, so it is easy enough to just toggle it on and off to be safe.
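For what it's worth on the app side, developers can also offer hardware keyboard shortcuts of their own on top of Full Keyboard Access's system-level navigation, via UIKeyCommand. A small sketch, with a hypothetical action:

```swift
import UIKit

final class SearchViewController: UIViewController {
    // Hardware keyboard shortcuts this screen responds to.
    override var keyCommands: [UIKeyCommand]? {
        [
            // Cmd-F moves focus to the search field (hypothetical handler).
            UIKeyCommand(title: "Find",
                         action: #selector(focusSearchField),
                         input: "f",
                         modifierFlags: .command)
        ]
    }

    @objc private func focusSearchField() {
        // Hypothetical: would move keyboard focus to a search field here.
    }
}
```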
Android has had keyboard support a little bit longer, and you shouldn't need to turn anything on to use it; just pair the keyboard and go. But you can find a list of basic system-wide keyboard commands to help you navigate in Settings, then System, then Languages & input, and then Physical keyboard. Some more specific keyboard commands to help you navigate with TalkBack are in the TalkBack settings, then Advanced settings, and then Keyboard shortcuts. And if you're going to be testing on multiple devices, say, one device for iOS and one device for Android, treat yourself to a multi-device or easy-switch keyboard, because pairing and unpairing keyboards to alternate between devices is annoying, and you'll thank me.

Some common issues you might find when you test with a physical keyboard are interactive elements that are not operable with or without the screen reader enabled. You might find that a control works completely differently when you're using a keyboard to operate the screen reader versus when you're using only the keyboard or only swipe gestures, and the only way to be sure is to test all ways. An illogical focus order when using tab or the arrow keys or one of the keyboard commands identified in the settings pages. Or missing focus indicators while the screen reader is not enabled. This is another reason why it's important to test both with and without a screen reader. When a screen reader is on, remember that there will be that visible screen reader focus; again, that's the black and white outline for VoiceOver and the green outline for TalkBack. But when you turn those screen readers off, notice: is there any visible indication of where the keyboard focus is? iOS generally uses a blue outline and shadow by default, although that can be changed in the Full Keyboard Access settings so that it has more visual contrast, and Android uses a gray shadow or background.

Automated testing. It can be done on native mobile content, of course. Just like for web-based content, automated testing tools are there to supplement manual testing, never to replace it. You may choose to start your testing with an automated test just to dip your toe in; I like to run them at the end for a final gut check. Either way, they are particularly handy for identifying a few issues that may be difficult to test manually with total accuracy.

For iOS, if you have access to a computer running macOS and Xcode and a free developer account, you have access to the Accessibility Inspector. If you have access to macOS but don't have Xcode, you can download it for free in the App Store. Once you have Xcode and have opened it, you can find the Accessibility Inspector in the menu bar at the top of the screen under Xcode, then Open Developer Tool, and then Accessibility Inspector. You'll need to physically connect your testing device to the computer, and then you'll have access to a few options that are helpful. One is inspecting by element. You can select elements on the device screen to see a very basic rundown of some of the properties that have been set on it by the developer, and get a better idea of what is making VoiceOver announce the things it does for any particular element. You can also run an audit. This will show a list of potential issues for the screen content, and is where you can identify elements that might have insufficient color contrast, and target sizes for buttons and controls that may be too small to comfortably select with a tap.
Those are two things that can be real tricky to accurately test when you're doing compatibility testing. It also shows things like missing labels and text that may not support Dynamic Type, but I've found that those are easier to test manually. Android has a few automated scanners that you can download directly to your device. The one I use is the Accessibility Scanner, which is a Google app and is available for free in the Play Store. The Accessibility Scanner lets you enable a floating button right on the screen, which you can move around wherever you'd like. You can then run a snapshot scan of the screen or capture an interaction, and it will provide you with a list of potential issues, like insufficient color contrast for text and images, like too-small touch target sizes, and also issues with labels and descriptions, each of which you can select to get more information about the issue. And I did already mention all of these, but to recap, some of the most common issues and most useful information you're likely to find in your automated test results are cases of insufficient color contrast or target size. An automated scan will give you real, actual numbers for those things, which is very useful for a tester. You may also find the information about missing or incorrect names and labels helpful, if it'll help you determine why a screen reader identifies a certain element the way that it does.

And these poor last few didn't quite fit into a category, but here they are. Screen Mirroring. If you're testing an app, chances are good you'll need to communicate your findings to another person. And while each device will, of course, let you take screenshots or record yourself going through a process right on the device, Screen Mirroring can be really helpful when you need to get those screenshots or recordings from the device to your computer, and then to an email or document. Or if you need to hop on a call and share your screen with someone in a way that's not holding your device awkwardly up to the webcam, which I have done, Screen Mirroring can just help get everything handy, all in one place, for your workflow. Another thing it can be helpful for is testing color contrast using the Colour Contrast Analyser tool, which is coming up on the next slide, so I will come back to that point. I mostly use Windows, and I like to use LonelyScreen for iOS, which uses AirPlay to cast the screen to a laptop or computer. For Android, I use scrcpy ("screen copy"), which does require you to be physically plugged into your computer. Both of these are free downloads, although LonelyScreen works as a free trial, so if you'd like to get rid of the free trial pop-up, you may opt for the paid version.

Back to the Colour Contrast Analyser. If you aren't already familiar with it, this is a free tool from TPGi that does as it says: analyzes color contrast. You can pluck colors straight from the screen using an eyedropper tool. You can also use sets of sliders to find a specific color value. Or if you do already know the actual color values that are being used, you can enter them directly into the tool. Either way, you'll get a contrast ratio and some guidance related to WCAG, and you can use all of this data when you document your testing results. This tool can be helpful if you see a color combination in your mobile app that feels like it's going to be trouble but isn't being picked up by an automated scan, or if you don't have access to one of those automated testing tools at all.
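For the curious, the ratio these tools report is defined by WCAG: compute the relative luminance of each color, then take (L1 + 0.05) / (L2 + 0.05), with L1 the lighter of the two. A small self-contained Swift sketch of that math, using #767676 text on white as an illustrative input, which lands just over the 4.5:1 AA threshold for normal-size text:

```swift
import Foundation

// WCAG 2.x relative luminance for sRGB components in the range 0...255.
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    func channel(_ c: Double) -> Double {
        let s = c / 255.0
        return s <= 0.03928 ? s / 12.92 : pow((s + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)
}

// Contrast ratio: (L1 + 0.05) / (L2 + 0.05), L1 being the lighter color.
func contrastRatio(_ l1: Double, _ l2: Double) -> Double {
    let (lighter, darker) = (max(l1, l2), min(l1, l2))
    return (lighter + 0.05) / (darker + 0.05)
}

// Example: #767676 on #FFFFFF is about 4.54:1, just past the AA 4.5:1 bar.
let textLum = relativeLuminance(r: 118, g: 118, b: 118)
let bgLum = relativeLuminance(r: 255, g: 255, b: 255)
print(String(format: "%.2f:1", contrastRatio(textLum, bgLum)))
```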
In a pinch, you can send a screenshot to yourself, or if you have your Screen Mirroring set up, you'll be able to use the eyedropper from the Colour Contrast Analyser to grab a little sample of the colors and verify the contrast. Now, this method is not perfect. It will be a screen displayed on another screen with pixels picked up by an eyedropper; that's quite a journey. And so it probably won't spit out the exact colors that the app is actually using, but regardless, it can still be helpful for checking color combinations that may need more attention, or to flag them to be verified. And again, that's available from TPGi, tpgi.com/color-contrast-checker. But you can also just search the internet for TPGi Color Contrast and it will pop right up.

Whew, we made it. I'm hoping that if you didn't know where to start before, you do now, or if you thought you knew, now you're more sure. So with a manageable set of tools and the knowledge of how to use them, you should feel well equipped to start testing content on mobile devices. And I did not leave a whole lot of time to answer questions, I'm so sorry, but I do see some that came in that I think I may have covered.

How do you share the audio from the screen reader on the various platforms using VoiceOver and TalkBack? Actually, some of the Screen Mirroring software will do that. If you want to do that, you have to turn one on and then the other; for example, turn on VoiceOver and then mirror the screen, or vice versa, for it to pick up. Otherwise, it is really helpful to turn on those caption panels if you need to share something and can't quite get the audio to pick up.

Is there an accessible Screen Mirroring program that can be used for iOS and Android? I did share my favorites for Windows, and I don't do this often on a Mac, so I don't have any handy, but I can find out and follow up.

When you test for accessibility, do you test only on the most updated mobile devices, and why? I usually do, because I'm only testing on one. But if you've got software that has only just come out, it might make a little bit of sense to wait a bit and let some of the wrinkles get ironed out. But the most updated device and the most updated software is usually a safe bet. It will last you longer than having an older device, and it will just confirm that the most updated versions will support the content. Mike, we have three minutes, am I cutting it close or am I good?

- [Mike] Yeah, you're good if you wanna answer a couple more. I know we're close to the top of the hour, so if you wanted to pick one or two more, that's fine.

- [Laurie] Okay, okay, cool. Is there a way to record the gestures a customer makes while navigating an app so we can identify where he has issues? There is a way, right on the device for Android, to record taps and gestures, and I don't have a handy little roadmap like I did for the rest of the settings, but I will follow up with that, because it is very convenient to see what exactly is being done to get the results.

And when a Bluetooth keyboard is being used with a web form on an Android device, is there a way to jump to the soft keyboard to utilize custom auto-suggest, for example, a keyboard command perhaps? That actually is a good question. Typically, connecting a physical keyboard just drops the soft keyboard altogether. I will of course follow up on this one as well, just to make sure, but I have not, off the top of my head, found a way to do that.
- [Mike] Awesome, well, I appreciate the time and excellent resources that you've shared today, Laurie. Thanks everyone for attending. We'll follow up after this session, in one or two days, with the recording, and any resources mentioned today we'll share in an email. And if anyone else has any questions for Laurie after the fact, feel free to reach out to IDA at ida@tpgi.com and we can assist you in any way. So, thanks again Laurie.

- [Laurie] Thank you.

- [Mike] Have a good one, bye everyone.