Video meetings: glimpses into the barriers 4

I’m going to go into more detail here, partly to serve my own need to let out some pent-up stress, and partly to give anyone who reads it a glimpse of what it is like for blind people using access technology, and of how a visually impaired person’s engagement in a meeting differs from a sighted person’s.

I’m sure that younger visually impaired people who have grown up with fast-moving technological progress may find all of the below much more manageable than I do, which is fantastic. I’m in my late 50s and, while relatively technically skilled, I am not as fast as some, especially while also trying to think and participate in complex discussions.

In the three previous posts, I’ve described the context of video meetings, shown that software/app updates can have an impact on accessibility, and explained how screen reader users take in all the information aurally through a single earphone (rather than taking some in aurally and some visually).

Here I want to show how hard visually impaired people have to work to keep up in such meetings.

I’ve had to choose a compromise in how I access work meetings. Using a laptop to manage video meetings with a screen reader can work well, so long as the laptop isn’t also needed to access documents and emails during the meeting. I do need to access documents, and often write notes, so I use an iPad for the meetings and the laptop for all my documents and writing. Both devices have screen readers, which speak whatever text I instruct them to read, in an electronic voice, each through its own earphone.

I know that many sighted people also find video meetings tiring: continually having to turn the mic on and off, raise and lower their hands, always being watchful and on show. This is particularly true of sighted people with specific learning difficulties (e.g. dyslexia), some mental health issues or low energy. But sighted people can see the entire screen, so they can simply tap on a button to engage its function. Blind people using a tablet, however, have to find the button first.

Without sight, this is done by moving around the buttons of the iPad (or other tablet or smartphone) with a set of screen gestures, using the fingers to move, tap or swipe in different ways on the screen. In Microsoft Teams, for example, to locate the main buttons, I move my finger left to right (or vice versa) just above the bottom of the iPad’s screen and, as I pass over the visual buttons, the screen reader speaks their function and I can ‘double tap’ to engage one. Unfortunately, due to an earlier update that maximised the screen space to accommodate more of the participants’ video feeds, these buttons now overlap the lower row of people’s videos. This means that as I move my finger across, I can hit the edge of a participant’s thumbnail and hear their name rather than ‘mute microphone’. Similarly, my screen reader will stay focussed on these buttons while the same person is speaking, but can be moved off them when a new person begins, which means that I have to find my place again.

I should say that in a small meeting of, say, five or six people who know each other and work well together, mics can be left on and there’s no need for raising hands. Such calls are easy and enjoyable, without any barriers for me. It is just in the bigger, fast-paced meetings that require a degree of etiquette that things get harder.

To illustrate, alongside the content of the meeting itself, I might hear the following:

- ‘A’ is currently speaking,

- ‘B’ has now joined the meeting,

- ‘C’ has raised their hand,

- suddenly, without warning, I hear that ‘D’ is now speaking, simply because ‘D’ accidentally left their mic on and made a noise in their room (perhaps a cough) which drew Teams’ attention to them,

- this is immediately followed by ‘A’ continuing to speak and a message that ‘A’ is currently speaking,

- ‘E’ has left the meeting.

I have a few options for limiting what I hear (including not hearing any of the chat), but if I turn off my screen reader completely, I have nothing to help me locate the buttons.

I should say that while this cacophony of spoken information is very confusing, it is the product of accessible design. The designers have given us the ability to receive as much information as possible, but one consequence, especially in a busy and detailed meeting, is that too much information comes through one earphone at once. I can manually tap the screen to stop the screen reader talking over something I’m trying to hear, but I have to be quick.

Alongside all this, to properly participate, I have to:

- keep my mic on ‘mute’ when I’m not speaking,

- use the button to raise my hand in order to indicate that I want to speak,

- when I’m eventually invited to speak, find the ‘mute’ button again to open my microphone,

- then lower my raised hand,

- and then mute my mic again when I’ve finished.

In meetings that I am very involved in, this is a continual process. The difficulty is compounded because a physical impairment makes it painful for me to keep my arm stretched out for prolonged periods, which is what staying ready on the buttons requires.

I’d like to hear about any solutions that other blind people have found in similar work situations. One would be to use two laptops (each with its own screen reader and earphone), one for the documents and writing and the other for the meetings, but this wouldn’t give me much more than I already have using an iPad and a laptop.

Too much information? Yes, of course it is, and that’s the point. I’d much rather not have to think about all this in meetings. But it is this level of audible detail that visually impaired people continually have to manage merely to participate.
