Working with ALS – Insights from the Ability Summit

A few weeks ago I attended the 14th annual Ability Summit, a global event hosted by Microsoft that presents the latest technology innovations and best practices for accessibility and inclusion. The event has three main session tracks: Imagine, Build, and Include. Each track examines a different aspect of how technology can empower people with disabilities and make the world more inclusive. The event is free, anyone can register online to attend, and all sessions are recorded so they can be watched on demand at any time.

Ability Summit 2024 Highlights

As we think about our enduring commitment and goal at Microsoft, which is to build that culture of accessibility and embed it into everything we do, grounded always by the insights of people with disabilities. – Jenny Lay-Flurrie

In the first keynote, Microsoft CEO Satya Nadella and Chief Accessibility Officer Jenny Lay-Flurrie talked about how AI can remove obstacles and create more accessible experiences, while also addressing the challenges and concerns of responsible AI. The keynote showed several examples of how AI can help people with disabilities, such as voice banking for people with ALS, descriptive audio for people with low vision, and Copilot for people with diverse speech patterns. It was very impressive to see Team Gleason featured as a partner with Microsoft to work on AI to help the ALS community preserve their voice.

Team Gleason and Microsoft Team Up to Give a Person with ALS His Voice Back

As a platform company, we have to absolutely lean into that and make sure that everything we’re doing, whether it’s Copilot and Copilot Extensibility or the Copilot stack in Azure is all ultimately helping our ISVs, our customers, our partners, all achieve their own goals around innovation, around accessibility. – Satya Nadella

Build Session: Bridging the Disability Divide with AI

The conference had many sessions and keynotes, but this session on the disability divide and AI was the one that interested me most. These are the three main points I took away from it: 1) how people with disabilities are benefiting from AI in their personal and professional lives; 2) advice on how to begin and advance an AI journey with accessibility as a priority; 3) the significance of accessibility as a core value when developing technologies that empower everyone.

This session also provided some resources and opportunities for us to learn more about AI and accessibility, such as the Accessibility Bot, which is a chatbot that can answer questions about Microsoft’s products and services regarding accessibility topics; the AI Studio, which is a platform that allows users to explore and build AI applications using various cognitive services and SDKs; and the AI Platform Team, which is a group of developers and researchers who work on making AI more accessible and inclusive.

In Real Life

I belong to the ALS community (I have ALS), and I rely on a lot of accessible technology, both hardware and software, to accomplish my work. I used a combination of Voice Access in Windows 11, a Stream Deck foot pedal, a foot pedal joystick on my wheelchair, and Microsoft 365 Copilot to write this blog post. Voice Access helps me with dictation and specific commands such as selecting paragraphs or capitalization. The Stream Deck handles backspace and delete. The foot pedal joystick acts as a mouse. Copilot assists me with summarizing and rewriting content. As you can tell, we need a whole set of tools to suit our needs; no single tool or method works for all of us. I'm excited to see how AI will enhance accessibility for all of us. My goal is to keep sharing the tools and techniques I use to live and work with ALS through my blog and YouTube channel.

Demonstrating How I Use Voice Access

As part of my effort to show people how they can keep working by using tools suited to their disabilities, I created a series of videos demonstrating how I use Voice Access to perform a number of activities that support my work. As we mentioned in the previous post, we have launched a Data on Wheels channel on YouTube. Over the past few days, I have created six videos that walk you through various processes or types of work using Voice Access. You can get to the entire playlist here. I plan to add more videos as I find other ways to work with Voice Access that might be interesting to others.

Here is a list of the videos I created, each with a summary, in case you are interested in a single video rather than the whole playlist.

Working with ALS: Voice Access-Introduction and Setup

In this video I walk through what Voice Access does and how to set it up in a Windows 11 environment. If you are unfamiliar with the accessibility capabilities inside Windows 11, this is a good place to start.

Working with ALS: Voice Access-Using Grid Overlays

This video focuses on grid overlays, which are used to control the mouse location and click various spots on your screen. This and the number overlays, covered next, are key command sets for reducing the need for a physical mouse when working with Windows.

Working with ALS: Voice Access-Working with Number Overlays

Voice Access gives you the ability to click any meaningful, clickable location on the screen. It assigns numbers to all the clickable locations in a window or across your entire screen, depending on how you use it. This video walks through several examples of using number overlays to navigate a couple of different applications.

Working with ALS: Voice Access-Changing Modes

Voice Access allows you to issue commands and dictate at the same time when using the default mode. There are two other modes that can be helpful depending on what you are trying to accomplish: dictation and command. I demonstrate how to use both modes, along with some of the difficulties of using them, as they do not always work as expected.

Working with ALS: Voice Access-Navigating Web Pages

Whereas the first few videos covered specific functionality inside Voice Access, this one shows how to accomplish a specific task: navigating web pages. In it, I move through a couple of websites and interact with them using the Voice Access functionality demonstrated previously.

Working with ALS: Voice Access-Real World Dictation

In this video, I demonstrate how to use dictation in a real-world setting: creating a journal entry for my CaringBridge site using Voice Access. This gives the viewer the opportunity to understand what is required to use dictation and commands together to create content, and it shows how I typically create documents, blog posts, and Teams discussions. Voice Access is unique in that it allows us to use commands and dictation at the same time.

I realize not everyone needs Voice Access. But if you work with someone who has limited ability to navigate with a mouse or to create content with a keyboard, these videos may help them understand how Voice Access can improve their day-to-day work.