- [Stef Cushchnir] Good morning, good afternoon, good evening everyone. Thanks for joining the webinar. Just waiting for people to come into the room now. We'll give it a few more minutes. Oh, we have people from all over joining today, Romania, Lisbon, Maryland, Wisconsin, wow. All right everybody, my name is Stephanie Cushchnir. I'm part of the business development team here at TPGi. We'll give it one more minute to the top of the hour, and then we'll get started. I have a few housekeeping things that I'm gonna read as we get started. Okay, great. Thanks everyone for joining us today. This is "Intro to Mobile Accessibility Testing Tools," with Laurie Pagano. I have a few housekeeping items, as I mentioned. This session is being recorded and we will email everyone the recording after the event. We do have captions available, so feel free to use them as needed. We will have time, hopefully, for live Q and A. Laurie will keep an eye on the questions in the Q and A box. Please use the Q and A box, and we'll answer as many questions as we can. If we run out of time, I will gather all those questions and be sure to send them out with the recorded session, along with answers from Laurie. We sometimes miss questions if they're in the chat, so try not to put them in the chat. Lastly, I'd like to mention, if anyone needs any Accessibility support, training, or usability testing, I will be sending out an email with a link to schedule a time to speak with one of our experts after the webinar. And with that, I will let Laurie get started and provide an introduction of herself. - [Laurie Pagano] Thank you. Thanks and thanks everyone for joining. I'm Laurie Pagano. I'm a principal Accessibility engineer here at TPGi. As mentioned, this is "Intro to Mobile Accessibility Testing Tools." I spend quite a bit of my time at TPGi working with mobile devices and mobile content. If you're already familiar with Accessibility testing or assistive technologies as it all relates to desktop web content, I know firsthand that it can be overwhelming and a little, or a lot, confusing to switch gears and focus on mobile. So today I'm going to introduce you to some of the different ways folks can access content on mobile devices, including helpful tools and tips you'll want to become familiar with if you're testing Accessibility for mobile content. After a brief introduction, I'll start with the device settings and assistive technologies, or the things that generally come built right into popular devices, as well as some additional testing tools, the bonus peripheral items that may not be so obvious, or that will just make it easier to test. Either way, they will really give you a more well-rounded set of tools to work with. And for the sake of efficiency, at any point, if I just say tools, you can assume I'm referring to everything, all of that: the device settings, the assistive technologies, and the additional testing tools. So "tools" is just gonna be shorthand for all of those things. So let's get started. Some important points: this is not an exhaustive list of all settings or assistive technologies that people with disabilities might use with a mobile device. This is really just the beginning. What I'll be highlighting are some of the more useful intro tools for mobile Accessibility testing, but there are many, many, many different options for configuring a mobile device to work best for various disabilities, needs, or preferences.
So as you become more familiar and more comfortable with the things that I mention today, I really do encourage you to dig deeper into some of the tools that didn't get the spotlight, and get familiar with those as well. I'm going to at least mention as many as possible, but the amount of time I dedicate to any particular tool is not any sort of statement; it's just that we only have an hour. On top of there being more tools than we can comfortably cover, there are also things like normal, regular software updates, which may move things around, adjust the way they work, or even add and remove features or settings entirely. As of today, we're working with iOS 17 and Android 14, but just be aware that what's available today might be a little bit different, depending on your device and which versions of the operating system or applications you have. And speaking of operating systems, as I mentioned, today I'll be focusing on iOS from Apple and Android from Google. And speaking of Android, when I talk about Android devices for testing, I specifically mean a Pixel. Since the Android operating system is fragmented, there are some differences in the user interface, the features, and the functionality across all the different devices offered by all the different manufacturers. So for anything I mention today, you can assume I'm referring to a Pixel, and if you can only choose one device to test with, I do recommend going with a Pixel. A note on compatibility testing: for testing native mobile content, some of the concepts that relate to Accessibility are going to be pretty much the same. An image without a text alternative is an image without a text alternative, no matter where it's accessed, right? That's pretty straightforward. But of course, there are also plenty of differences. One obvious difference is just finding all the different settings and assistive technologies, where they can be turned on or off, and how to use them, which is where I come in. One difference that becomes pretty darn obvious once you've gotten started is that on mobile, particularly in a native mobile app, you may not be able to go dig around behind the scenes or in the code as easily as you might for desktop web content. So unless it's your product that you've personally made, or you've been provided with that code, you're unlikely to have access to what makes the application you're testing do what it does. So if you aren't looking at that actual code, or maybe you aren't planning to make specific recommendations based on code at all, or if you just aren't necessarily aligning your testing with the spirit of the Web Content Accessibility Guidelines, or WCAG, then you may be looking at compatibility testing. And with compatibility testing, you'll be using combinations of Accessibility-related settings and assistive technologies that are commonly used by people with disabilities to access the content, and determining: is this product compatible with these combinations? And here we go. First things first, where even are the Accessibility settings? For iOS, as of iOS 13 a while back, the Accessibility settings graduated to the top level of the device settings app. iOS 15 introduced the ability to change settings both globally at the operating system level and on an individual, app-by-app basis. In Android, the Accessibility settings are also in the top level of the device settings app. And again, this, and any other Android screenshots that you see today, is as seen on a Pixel.
Android 12 shuffled the Accessibility settings around and reorganized a bit with submenus, and some of these even appear in multiple places, both in and out of the Accessibility section. So while I'll be taking you to them by way of the Accessibility settings screen, if you get confused, you can always just perform a search for them. As part of the Android Accessibility Suite, you can enable an Accessibility menu. This will give you a handy Accessibility button, available all the time from whatever screen you're on, that provides shortcut access to a lot of functionality that otherwise might rely on the physical hardware buttons, like taking a screenshot, locking the screen, or adjusting the volume and brightness. And it also provides direct access to that Accessibility settings screen. The Android Accessibility Suite is a suite of applications. It includes this Accessibility menu, the Screen Reader TalkBack, which, obviously, we're going to talk about, and Select to Speak. It used to include Switch Access, but Switch Access was spun off into its own app in early 2023, I believe. This should come pre-installed on most Android devices, but just in case it's not, it can be downloaded from the Google Play Store for free, and then it integrates seamlessly with the other settings. And right away, we'll be able to start the testing process just by noting: does the app I'm testing respond to any changes that I've made in the device's Accessibility settings, or does it respond to some, but not others, or does it not respond to anything? And as I go forward and introduce each one of these, just like I'm doing here, I'll be pointing out some of the common issues that you're likely to find when testing using that tool. So you'll find that this same issue will pop up a lot: does the app I'm testing respond to the options provided in the device's Accessibility settings, or does it provide its own equivalents? Okay, I like things that are easy, so we'll start with the settings to resize text, which are somewhat straightforward. In iOS, there are two locations where you can change the text size at a device level, meaning that changing this one setting in this one place will affect the text across default Apple apps, like Mail, Calendar, and Phone, as well as any other app that supports Dynamic Type. One place in settings is the display and brightness screen, and then text size. This screen provides a range of seven text size options, with the default falling right in the middle of that range. But to open a wider range of text sizes, including larger sizes for Accessibility, you can go to the Accessibility settings screen, then display and text size, and then larger text. By toggling the larger Accessibility sizes control at the top of the screen, you'll introduce five additional, even larger text sizes. And developers do need to actively support Dynamic Type for the text of a native iOS app to successfully resize and accommodate that setting. If they've used Apple system fonts and text styles, rather than custom fonts, they will get that Dynamic Type support built right in. For Android, there are two options for resizing content. As of Android 13, they live in Accessibility settings, then display size and text. One of your options is font size, which simply resizes the text. Android 14 added several font size options, now seven, up from four, which was intended to allow resizing all the way to 200%. Android also has five display size options, which you'd use to change the size of everything. That's text as well as any containers that define the layout, controls, and icons, and you can use these in combination with one another to further adjust the sizing of the content. And as long as scalable units have been used to define sizes, Android's resizing features should work across the operating system, unless they've been actively suppressed.
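To make that developer-side point about Dynamic Type concrete, here's a minimal UIKit sketch of what opting in typically looks like; the custom font name is just an illustrative example, not something from the talk.

```swift
import UIKit

// Minimal sketch: opting a label into Dynamic Type.
let label = UILabel()

// A system text style tracks the user's text size setting automatically.
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true
label.numberOfLines = 0 // let resized text wrap instead of truncating

// A custom font needs UIFontMetrics to scale with the same setting.
if let custom = UIFont(name: "AvenirNext-Regular", size: 17) {
    label.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: custom)
}
```

On the Android side, the rough equivalent is defining text sizes in scalable sp units rather than fixed units, which is what lets the font size and display size settings do their job.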
Some common issues you might find when you test text resize through the device settings are portions of the text that do not resize; text that's cut off with ellipses, or the dot, dot, dot, in a way that affects the meaning; text that is cut off by the boundary of its container, affecting its meaning; or text that does not respond to text resize settings at all. And as an important side note, Safari on iOS and Chrome on Android do have their own text resize settings. So you shouldn't expect to see the system text settings apply to web content when viewed in those mobile browsers, but by going through the settings that are found right in the browser, you'll find options there for page scaling and text scaling. Another piece of low-hanging fruit you might want to start with is the color and contrast settings. This one is maybe not as low-hanging as resizing text, because there are actually a lot of different options available for adjusting color and contrast, and different combinations of them might work best for different folks. But I still count this as low-hanging because, even with all the variety of options to narrow down, and while really the bulk of contrast testing comes from the additional tools that help you identify specific colors and color contrast ratios, which I promise I'll cover later, you still aren't really learning a whole new set of skills if you're already familiar with the concept of testing color and contrast on the desktop or in web-based content. In iOS, most of the settings that affect color or contrast can be found in the Accessibility settings, and then display and text size. Here there are settings for Classic Invert and Smart Invert, which, respectively, will invert all displayed colors, or all displayed colors except for media, like graphics and video, and some user interface elements. You may notice that there is also an option to increase contrast, though if you're looking to WCAG for your color contrast testing needs, you'll find that you can't rely on contrast-enhancing technology such as this to provide that sufficient contrast. Android has fewer options for color and contrast. As of Android 13, these have landed in an Accessibility settings submenu named color and motion. Similar to iOS, Android has color inversion, which will invert all displayed colors. Now, while all those invert settings do usually darken the screen, for testing, you are more likely to want to pay attention to dark mode instead. Dark mode, as it's called on iOS, or dark theme, as it's called on Android, uses darker backgrounds and colors to decrease brightness and reduce eye strain. And while this isn't necessarily an Accessibility setting, any design choices made to support dark mode could certainly introduce new issues, and they're probably contrast-related. On iOS, dark mode is not found through the Accessibility settings screen at all. Instead, you can navigate from settings directly to display and brightness to choose between a light and dark appearance. And on Android, it is in the Accessibility settings, under color and motion. It's available as a single switch to toggle on or off.
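As a quick developer-side aside, both of those behaviors hang off properties an app can set; here's a minimal UIKit sketch, with illustrative names, of exempting media from Smart Invert and of the kind of dynamic color that makes dark mode work.

```swift
import UIKit

// Ask Smart Invert to leave this image's colors alone, matching the
// "inverts everything except media" behavior described above.
let photoView = UIImageView()
photoView.accessibilityIgnoresInvertColors = true

// A color that resolves differently in light and dark mode. Hard-coded
// colors that skip this step are where dark mode contrast issues creep in.
let background = UIColor { traits in
    traits.userInterfaceStyle == .dark ? .black : .white
}
```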
Some common issues you might find when you test color and contrast through the device settings are, of course, insufficient contrast for text elements, insufficient contrast for user interface elements and states, insufficient contrast for images and icons that convey information, or you might find that content does not respond to color and contrast settings at all. Now that we have some of the more straightforward tools in the toolbox, I'll pivot just a bit and dive into one of the more daunting of the mobile assistive technologies for new testers: the Screen Readers. Even if you feel like a total pro with other Screen Readers like JAWS, you're probably used to using a certain set of keyboard commands to navigate around, and you probably have certain expectations about how to interact with content when using a Screen Reader. But switching to a totally different method of operation, like swipe gestures, can take some time to get the hang of. So across Apple products, you'll find VoiceOver. Even if you're familiar with VoiceOver on macOS, on, like, a laptop, VoiceOver on iOS is going to be a bit different. Some of that does have to do with the different needs and abilities of a touchscreen interface, but also with release schedules, and even just what features Apple decides to support, when and where. You can enable VoiceOver from the Accessibility settings screen, and then VoiceOver. There is a control to turn it on and off, and a whole bunch of settings to customize behavior, like how fast it speaks or how much it says. And as of iOS 13, you can customize the gestures and commands used to control it. There's one option that I do wanna highlight, because I think it's really helpful for testing, and that's the caption panel. It can be found towards the bottom of the VoiceOver screen as a switch or a toggle. Turning the caption panel on will display VoiceOver speech output as text at the bottom of the screen, as a bit of a visual supplement to the audio output that VoiceOver already provides. You may be able to see this in my screenshot on the slide, but I believe it's very small. This may sound familiar to you if you've used the macOS version of VoiceOver. It also has a speech output caption panel, and it's really helpful if you have trouble understanding something that VoiceOver says, or for recording a video of an interaction to share with others. And the caption panel does not appear by default, so if this is a feature you are interested in, you'll have to turn it on. Android has TalkBack as its Screen Reader, which you can enable from the Accessibility settings screen, and then TalkBack. It also has those settings to customize things like speaking rate, verbosity, and gestures. And again, TalkBack is part of the Android Accessibility Suite, which I mentioned back with the Accessibility menu button. So it is technically a separate app, which means it can run on a different update or release schedule than the operating system as a whole, but it integrates with the rest of the settings, and even for the devices where it doesn't come pre-installed, it ends up feeling like it's built right in. If TalkBack doesn't automatically give you that visual caption option, like the one I pointed out for VoiceOver, you can go from the TalkBack screen to settings, then advanced settings, then developer settings, and then toggle the display speech output option. To start, here is a quick, easy, very limited list of default standard gestures for basic navigation.
These, I think, are the very bare minimum that will keep you focused on getting comfortable with the Screen Reader, without immediately panicking that you've lost control of your device, or getting caught up in memorizing the whole entire list of gestures. And because these are standard enough that they're shared across iOS and Android, I didn't even bother to split them up. They're going to be the same for either device. To explore the screen, drag one finger around the screen to announce whatever is under your finger. This is a non-linear way to get an idea of what is on the screen and where. Because many apps have common layouts, like a back button at the top left or tabbed navigation along the bottom of the screen, this can be really helpful for users who want to jump directly to a certain part of the screen where they might reasonably expect to find what it is that they're looking for. And if the content under your finger was created with any amount of accessibility in mind, you'll be able to tell what has Screen Reader focus by hearing the announcement, seeing the announcement in those caption panels, or seeing the Screen Reader focus indicator. This shows up by default as a black and white outline for VoiceOver, which helps it show up on most colors of background, and as a green outline for TalkBack. If you know where an item is, you can also just tap directly onto it to give it Screen Reader focus, with a one-finger single tap. A one-finger swipe to the right or to the left moves the Screen Reader focus to the next or previous item of any type, sequentially, in the reading order. And a one-finger double tap activates the button or control that currently has Screen Reader focus. You can double tap anywhere on the screen and it will activate the item, as long as it has Screen Reader focus. You don't have to be directly over the item, like you would if a Screen Reader was not on. And then on top of those very basics, there are gestures that are multi-finger or multi-directional and introduce another layer of options for navigation. On iOS, the rotor gives you the opportunity to switch between different navigation options, or to conveniently adjust some settings, like whether VoiceOver announces hints or how fast it speaks, without going all the way back through the settings app. Some of what you can access through the rotor is just there by default, and it changes based on what makes sense with the content that's on the screen at the time, but you can also customize the list of possible options from the VoiceOver screen in the settings app. There's an option there for the rotor. To access the rotor, you'll do a sort of twisting motion with two fingers, like you're dialing a knob, directly on the screen, and you'll know you were successful because you'll either see it appear visibly on the screen or you'll hear VoiceOver start to announce the options as you turn the dial. At each step of the rotor, you'll see or hear which option you're activating, and then once you've got the option you want, you can swipe down or up with one finger to move forward or backward through that option on the screen. Or, if it's a setting, to adjust the setting. For example, if I used the rotor to activate headings, like I have in my screenshot, I could then start swiping down, down, down with one finger to navigate forward, heading by heading, through just the headings.
Or swipe up, up, up to navigate backward through them, which is a way of navigating that might sound familiar if you have experience using quick navigation commands with other Screen Readers. Then I can switch back to swiping left and right to explore the content surrounding that heading. Or, if I used the rotor to activate something like speaking rate, I could swipe up or down to make VoiceOver speak faster or slower on the fly. And if you started gearing up to use your rotor for adjusting a ton of settings, which I wouldn't expect to be necessary, but just in case you did, iOS 15 introduced a VoiceOver quick settings option that you can access with a two-finger quadruple tap. So you can access some of the VoiceOver settings quickly in that way, rather than by filling up your rotor with a lot of stuff. A while back, Google and Samsung together released a new co-developed TalkBack update that introduced a few significant changes. This offered a bit more consistency along all those different Android devices by making TalkBack the default Screen Reader on not only Pixels, but most Galaxy devices as well. So the way you interact with TalkBack may be different between the two versions, depending on your device or software, but in most post-2021 versions of TalkBack and Android, there is a TalkBack menu, which you can access by swiping down and then right, in an uppercase L type of shape. If you have a device that supports multi-finger gestures, you can also perform a three-finger tap to get the TalkBack menu. This menu includes some TalkBack settings and commands, such as read from the top, and you can just select whatever you need from the menu. And you can customize what appears in the TalkBack menu as well, from the TalkBack screen, then settings, and then customize menus. If you have any question about your version, you'll be able to tell what you're working with when you swipe down and then right, in that L shape. In older versions, it will be titled the global context menu instead. Getting to the different navigation options is done via reading controls, and you'll swipe down and then up, or up and then down, to cycle through those options. If your device supports the multi-touch gestures, you can also cycle through them using three-finger swipes, which can be three fingers swiped left or right, or up and down, and like the VoiceOver rotor, there will be both a visual and an audio indication of which option you've activated. Also like the VoiceOver rotor, once you have the option you want, you can then swipe down or up with one finger to move forward or backward through that navigation option on the screen, and then you can easily switch back to swiping right or left to do the default reading order, item by item. You can also customize what appears in these reading controls, including a very small handful of settings, like verbosity or hiding the screen display. Like customizing the TalkBack menu, that's from the TalkBack screen, then settings, and then customize menus. A feature that you might like, whether you're just getting the hang of mobile Screen Readers or you've decided to learn more gestures, is the practice area, which is available on both operating systems. On iOS, after you activate VoiceOver, a VoiceOver practice button appears just beneath the VoiceOver toggle. It opens a scratch pad, and as you make gestures on it, it will repeat the gesture you made and tell you what it does.
On Android, there is a tutorial and help section in the TalkBack settings that gives options for a guided, hands-on tutorial, or a practice gestures area, much like VoiceOver practice. So if you can make your way through your content with just the basic navigation gestures that I shared, you may get lucky and not ever need to use a single other gesture. Even if you aren't that lucky, which you probably won't be, you really don't need to learn and memorize every single possible gesture right away. As you get more comfortable with navigating and encounter different kinds of content, or more complicated content that requires some other gestures, you'll have a good foundation already to build on and some good resources to learn more. Some common issues you might find when you test with a Screen Reader are images with missing or incorrect text alternatives. Note that Screen Readers are getting smarter, so both VoiceOver and TalkBack are now able to describe many graphics, even photographs, based on whatever context they can pick up. It isn't perfect, it does identify my dog as a horse, and I do recommend you pay close attention to text alternatives when testing, because they may not have been provided by a person. Visible information, structure, and relationships that are not communicated to users through a Screen Reader announcement; this includes things like visible text that's completely skipped over, or headings that haven't been defined as headings. Content that is visibly hidden and intended to be completely hidden, but is still announced by the Screen Reader. An illogical reading order. Interactive elements that are not able to be operated while the Screen Reader is enabled. Content updates that are not communicated, like error or status messages. And interactive elements with missing or incorrect names, roles, states, or values, as appropriate.
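Most of those findings trace back to a handful of properties a developer sets, or forgets to set. As a rough illustration, here's a minimal UIKit sketch; the labels and strings are made up for the example.

```swift
import UIKit

// Text alternative for an informative image.
let avatar = UIImageView()
avatar.isAccessibilityElement = true
avatar.accessibilityLabel = "Profile photo" // hypothetical label

// A visible heading that's also announced, and navigable, as a heading.
let sectionTitle = UILabel()
sectionTitle.text = "Order history"
sectionTitle.accessibilityTraits = .header

// A decorative image the Screen Reader should skip entirely.
let divider = UIImageView()
divider.isAccessibilityElement = false

// A status message with no visible focus change still gets announced.
UIAccessibility.post(notification: .announcement,
                     argument: "Item added to cart")
```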
To make your life easier, you can make use of Accessibility shortcuts. I mentioned earlier that Android has the option to add an Accessibility menu shortcut. You can also set other Accessibility shortcuts on either device to quickly toggle different frequently used settings and turn assistive technologies on or off. For iOS, there's an Accessibility shortcut option at the bottom of the Accessibility settings screen. On this Accessibility shortcut screen, you can choose any number of Accessibility options to include in the shortcut, including VoiceOver, some color and contrast settings, and others. You can then triple-click the side button on the right side of the device, for an iPhone X or later; triple-click the home button, if your device has that; or add a button to the control center. If you've selected one option, it will just toggle that one thing on or off. If you've selected multiple, it will bring up an action sheet where you can choose which one you'd like to turn on or off. Android also has Accessibility shortcuts, but instead of that one-stop shop for choosing what you want to be included in the shortcut, you choose it from the settings screen where you would have initially enabled the setting or assistive technology. So, for example, from the TalkBack screen, or from the Voice Access screen, there will be a control to enable the shortcut. And then you'll be offered a few different options to choose from on how to trigger it, depending on what the setting is. One is pressing the physical up and down volume buttons together at the same time, to toggle one or choose from several. Another is the Accessibility button, which is the same place where the Accessibility menu lives, and you can use that to toggle one shortcut, or long press it to choose from multiple. As of Android 12, you can also display the Accessibility button as a floating button, or a series of floating buttons if you have multiple shortcuts. One of the shortcuts that has its own unique trigger is magnification. So if you enable the magnification shortcut, you can choose any of those options I just mentioned, or you can choose to just triple tap the screen with one finger to turn the magnification on and off. So for the sake of offering up as much settings information as I can, these are still not all of them, but here are some other options for color and contrast. iOS has reduce transparency, differentiate without color, color filters, and reduce white point. Android has color correction; high contrast text; extra dim, in Android 12 and up; and, new to Android 14, a contrast setting for material theme colors, which had kind of been buried in the developer options. Some additional settings for visual support, for iOS: zoom, to magnify the screen or part of the screen; magnifier, which uses the camera to magnify things that you see in person, and which became a default, standalone app in iOS 15 and has been receiving updates with each major iOS release; spoken content, which includes speak selection and speak screen; bold text; button shapes, to add some visual cues that differentiate button controls from plain text; on-off labels, to make it clear whether a switch control is set to on or off; and reduce motion, which was updated in iOS 17 to include some auto-play preferences and the ability to automatically dim the screen if flashing is detected. For Android, there's magnification, which magnifies the screen or part of the screen, and was updated with new customization options in Android 14; select to speak; remove animations; large mouse pointer, for if you've paired a mouse with your device; bold text; and audio description, as of Android 13, which automatically turns on a verbal description of onscreen content for videos that support it. Additional settings for mobility and touch for iOS include Voice Control, which will let you control the device with spoken commands, and which just received a new onboarding experience. Switch Control, which allows you to control the device with a switch, such as an external device with a physical button, or by using the camera to capture movements, like a tilt of the head, or, in iOS 15 and up, with simple mouth sounds, to interact with elements as they're highlighted on the screen. AssistiveTouch lets you replace swipe gestures, pressing physical buttons, or moving the device with customized touch actions, or pair a pointer device, like a mouse. Reachability, to help make the top corners of the screen content easier to reach. Haptic touch and touch accommodations options, to adjust things like hold duration. And back tap, which runs a shortcut when you tap the back of the device, and is available in iOS 14 and up. For Android: Voice Access and Switch Access, which are similar to Voice Control and Switch Control on iOS. Autoclick, or dwell timing, which is tied to using a physical mouse. Touch and hold delay, for hold duration. Time to take action, or Accessibility timeout, to dictate how long temporary messages might stay on the screen. And vibration and haptics options. And some additional settings for hearing and audio, for iOS: hearing devices, specifically to pair Made for iPhone hearing aids or a sound processor, as opposed to going through general Bluetooth; hearing control center, which is a new widget for the iOS control center; sound recognition, which listens for important audio in the environment, like babies crying or a fire alarm; LED flash for alerts, so there's an additional visual method of communicating notifications; subtitles and captioning; and live captions, which will provide live caption transcription for things like FaceTime calls, and which was new to iOS 16 but is technically still in beta. For Android: sound amplifier, which affects the way you receive both environmental and device sounds; live transcribe, for transcribing environmental speech, like people talking around you; sound notifications, like sound recognition on iOS; live caption, for captioning audio that's being played by the device; hearing devices, formerly hearing aids, to manage paired devices; and flash notifications, new in Android 14, for that additional visual method of communicating notifications. Usually I stop there, but Apple kept going, so I will too. As of iOS 17, there's also Live Speech and Personal Voice, to speak typed content aloud through system voices or a recording of your own personal voice. And Assistive Access, which uses customized simple layouts and visuals to make it easier to complete tasks.
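One testing-relevant aside before moving on: many of these settings are exposed to apps as flags they can check at runtime, which is exactly how well-behaved apps manage to respond to them. Here's a minimal UIKit sketch, just to illustrate the mechanism:

```swift
import UIKit

// Check a couple of the settings mentioned above from app code.
if UIAccessibility.isReduceMotionEnabled {
    // Swap animated transitions for simple cross-fades.
}
if UIAccessibility.isVoiceOverRunning {
    // For example, keep transient status messages up longer.
}

// Settings can change while the app is running, so observe them too.
let token = NotificationCenter.default.addObserver(
    forName: UIAccessibility.reduceMotionStatusDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    // Re-evaluate animations here; keep `token` to remove the observer later.
}
```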
So if you do wanna learn more after this, either because you forgot everything I just said, or because you're ready to learn more about the Screen Reader gestures, or you want to learn about some of those other settings I just rattled off, Apple and Google both, of course, have Accessibility support pages with more information about how to use some of these settings and assistive technologies. Additional testing tools: these are those bonus tools that I mentioned at the beginning. They're bonus in that they aren't built into an iOS or Android device, but they're not bonus in the sense that you can skip them. They're still important. First up is a physical keyboard. This is likely to be Bluetooth rather than wired, although it could be wired. Testing with a physical keyboard is more important than you might initially think, because it's not at all uncommon for a person to pair one with their mobile device. It can be used as an input method for Screen Reader users, or for users with limited mobility or dexterity that makes it difficult to use gestures full time, or at all. Lots of folks like to use them specifically for working on tablets or for tasks that require a lot of typing. In iOS, first you'll need to turn on Full Keyboard Access to support keyboard use. This was introduced in iOS 13, so while it's no longer new, per se, you might find a few wrinkles in the keyboard support for some iOS apps. Full Keyboard Access can be found from the Accessibility settings screen, then keyboards, and then Full Keyboard Access. On this screen you'll find not only the control that enables the setting, but also some information about the key commands you'll use to navigate an iOS device. You can customize these commands if you'd like, but I strongly recommend reviewing them to see how they're different from common commands for web content.
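On the development side, hardware keyboard support in a UIKit app usually comes by way of the responder chain and key commands. This is just a rough sketch with made-up names, to show the shape of it, not a claim about how any particular app does it:

```swift
import UIKit

// A view controller exposing one custom hardware keyboard command.
class GalleryViewController: UIViewController {
    override var keyCommands: [UIKeyCommand]? {
        [UIKeyCommand(title: "Next Photo", // hypothetical command
                      action: #selector(showNextPhoto),
                      input: UIKeyCommand.inputRightArrow,
                      modifierFlags: [])]
    }

    @objc private func showNextPhoto() {
        // Advance to the next photo here.
    }
}
```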
You can also use a keyboard to navigate with VoiceOver, though there are even different key commands for that than those for Full Keyboard Access, which are documented in Apple's iPhone user guide online. But I do recommend going out of your way to turn Full Keyboard Access all the way off whenever you turn VoiceOver on, because it may sometimes interfere with VoiceOver focus and touch gestures. But Full Keyboard Access is one of the settings you can add to the Accessibility shortcut, so it can be toggled easily enough. Android has had keyboard support a bit longer, and you shouldn't need to turn anything on to use it. Just pair the keyboard. The commands are unfortunately largely undocumented. Currently, if you have a large-screen device, you can find a list of system keyboard commands in settings, then system, then keyboard, and then physical keyboard, I believe it is. Otherwise, a small handful of system and context-specific commands can be brought up using your OS key together with a forward slash. The OS key may be different depending on what keyboard you're using. It's often called search. Some keyboards map it to start or command. So try a few different keys with the forward slash to see which one works for your device. And if you're going to be testing on multiple devices, say one iOS device and one Android device, treat yourself to a multi-device or easy-switch keyboard. Pairing and unpairing a keyboard to alternate between devices is super annoying, so you're welcome. Some common issues you might find when you test with a physical keyboard are interactive elements that are not operable; an illogical focus order when using tab or the arrow keys, or one of the keyboard commands identified in the documentation; or being unable to see where the focus is, either because the focused element is obscured or hidden, or because the element is missing a focus indicator. For focus indicators, iOS generally uses a blue outline or shadow by default, although users can change this in the Full Keyboard Access settings so that it has more visual contrast. And Android uses a gray shadow or background. Automated testing: it can be done on mobile content. Of course, just like for web-based or desktop content, automated testing tools are there to supplement manual testing, never to replace it. So you may choose to start your testing with an automated test, just to dip your toe in. I like to run them at the end for a final check, but either way, they are particularly handy for identifying a few issues that may be difficult to test manually with total accuracy. For iOS, if you have access to a computer running macOS and Xcode, and a free developer account, you have access to the Accessibility Inspector. If you have access to macOS but don't have Xcode, you can download it for free in the App Store. Once you have Xcode and have opened it, you can find the Accessibility Inspector in the menu bar at the top of the screen, under Xcode, then open developer tool, and then Accessibility Inspector. You'll need to physically connect your testing device to the computer, and then you'll have access to a few options that are helpful. One is inspecting by element. You can select elements on the device screen to see a very basic rundown of some of the properties that have been set on them by the developer, such as the label or the traits, and you'll get a better idea of what is making VoiceOver announce the things it does for any particular element. You can also run an audit. This will show a list of potential issues for the screen content, and it's where you can identify elements that might have insufficient color contrast, and target sizes for buttons and controls that may be too small to comfortably select with a tap. As a side note, if you haven't been paying attention to touch target sizes, I recommend you start, especially if you are aligning to WCAG. Anyway, those are two things that can be tricky to accurately test when doing more general compatibility testing. It also shows things like missing labels and text that may not support Dynamic Type, but personally, I've found that those are just easier to test manually.
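Related, if you do have the app's project and are set up for UI testing, recent Xcode releases can run this same kind of audit from test code. A minimal sketch, assuming Xcode 15 or later and an existing XCUITest target:

```swift
import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testMainScreenAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // Flags issues such as insufficient contrast, missing labels,
        // and text that doesn't support Dynamic Type.
        try app.performAccessibilityAudit()
    }
}
```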
Android has some automated scanners that you can download directly to your device. The one I like to use is called Accessibility Scanner, which is a Google app and is available for free in the Play Store. Accessibility Scanner lets you enable a floating button right on the screen, which you can move around wherever you'd like. You can then run a snapshot scan of the screen, or capture an interaction, and it will provide you with a list of potential issues, like the insufficient color contrast for text and images, or the too-small touch target sizes, and also issues with labels and descriptions, each of which you can select to get more information about the issue. I did already mention all of these, but to recap, some of the more common issues and most useful information you're likely to find in your automated test results are cases of insufficient color contrast or target size. An automated scan will give you real, actual numbers for those things, which is very useful for a tester. You may also find the information about missing or incorrect names and labels useful, if it will help you determine why a Screen Reader has announced what it has. These poor last few didn't quite fit into a cute category, but here they are. Screen mirroring: if you're testing an app, chances are very good you'll need to communicate your findings to another person. And while each device will, of course, let you take screenshots or record yourself going through a process right on the device, screen mirroring can be really helpful when you need to get those screenshots or recordings onto your computer and into an email and into a document. Or maybe you need to hop on a call and share your screen with someone in a way that's not holding your device awkwardly up to the webcam, which I have done. Screen mirroring can just help get everything handy, all in one place for your workflow. Another thing it can be helpful for is testing color contrast using the Colour Contrast Analyser tool, which is the next slide, so I will come back to that. I mostly use a Windows machine for this. So one free option for iOS is LonelyScreen, which uses AirPlay to wirelessly cast the screen onto your laptop or computer. For Android, I use scrcpy, or "screen copy," which does require you to be physically plugged in to your computer. Actually, both of these are free to download, but there are several other paid options, if you use it a lot. Back to the Colour Contrast Analyser: if you aren't already familiar with it, this is a free tool from TPGi that does exactly what it says, it analyzes color contrast. You can pluck colors straight from the screen using an eyedropper tool. You can also use sets of sliders to find a specific color value. Or if you do already know the actual color values that are being used, which is the best-case scenario, you can enter them directly into the tool. Either way, you'll get a contrast ratio and some guidance related to WCAG, and you can use all of this data when you document your findings.
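For the curious, the ratio it reports comes from the WCAG formula: compute each color's relative luminance, then divide the lighter-plus-0.05 by the darker-plus-0.05. Here's a small self-contained sketch of that math; the example colors are just illustrative.

```swift
import Foundation

// Relative luminance of an sRGB color (channels in 0...1), per WCAG.
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    func linearize(_ c: Double) -> Double {
        c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
}

// WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05).
func contrastRatio(_ l1: Double, _ l2: Double) -> Double {
    (max(l1, l2) + 0.05) / (min(l1, l2) + 0.05)
}

// White text on a pure blue background: about 8.6:1, which passes
// WCAG AA's 4.5:1 threshold for normal-size text.
let white = relativeLuminance(r: 1, g: 1, b: 1)
let blue = relativeLuminance(r: 0, g: 0, b: 1)
print(contrastRatio(white, blue))
```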
This tool can be helpful if you see a color combination in your app that feels like it's going to be trouble, but isn't being picked up by an automated scan. Or, if you don't have access to one of those automated scans, in a pinch, you can send a screenshot to yourself, or use your screen mirroring setup, and use the eyedropper tool to grab a little sample of the colors and verify the contrast. Now, this is not perfect. It will be a screen displayed on another screen with pixels picked up by an eyedropper, which is kind of a long journey. So it probably won't spit out the exact colors, but it can be really helpful for checking color combinations that just might need more attention. And again, that's available from TPGi, TPGi.com/Color-Contrast-Checker. But you can just Google TPGi color contrast and it'll pop right up. And that is it. I'm hoping that if you didn't know where to start before, you do now, or if you thought you knew, now you're more sure, or maybe you even picked up some new tips. So when you're able to break that huge list from earlier down into a manageable set of tools and start building your knowledge of how to use them, it should feel much more comfortable to start testing any content on mobile devices. So I did see some questions come in, so I'm gonna try to be helpful there. For iOS, the question was about the largest text size that corresponds, sort of, to WCAG, so that's the 200% text resizing that we would be looking for. If you wanted to reach that 200%, you would probably want to look at Accessibility Extra Large as the text size setting, which is like the third notch from the end. That will take your body text and most of your text styles up to 200%, but even the largest text size doesn't resize the largest text styles to 200%. So Accessibility Extra Large, third notch from the end, is what I recommend for iOS. And the top level for Android is intended to be 200%, as of Android 14. There's a question of, is it an issue if the text is cut off with ellipses on mobile? It is if there's no other way to access the content. So if it's, like, an email subject line, and then you click on it and it opens the whole email with the whole subject line, that's fine. But if something like a next step button is completely shortened with the ellipses, that's gonna be an issue for your text resizing. So you wanna make sure that any information that's being provided in text is still available in some way once the text is resized. Stephanie, I'm not sure how many more you want me to- - [Stef Cushchnir] Well, we have about three minutes, so let's try to get through. There's quite a lot in the chat. I've got them recorded and I will be sending them to Laurie after this. So she'll be answering all the questions at some point. If you wanna answer one or two more, Laurie, on the Q and A, that would be great. - [Laurie Pagano] Okay. I am going to try to find a three-minute question. - [Stef Cushchnir] I saw one that said, "Do any tools exist to be able to add alternative text to images sent in text messages?" Is that a short- - Oh. - [Stef Cushchnir] Well, we can leave it at this. We can leave it at this, and we'll give Laurie time to think about the answers to all these amazing questions. Thank you everybody for joining the webinar today. Like I said, I will send all the questions from the Q and A box, as well as there were some in the chat that I tried to capture.
I may have missed a couple, but I'll send them to Laurie and we'll send them out with the recorded session. And if you have any questions, you can email us at IDA@TPGi.com. Thanks everyone. - [Laurie Pagano] Thank you.