ACCESSIBILITY SHORTCOMINGS STILL OFTEN OVERLOOKED IN REMOTE WORK

The rise of remote work has expanded the talent pool companies can draw from, as people with disabilities who may have been unable to commute regularly to an office can now apply for a greater range of positions.

Following the work-from-home (WFH) trend, video conferencing tools became essential and were quick to introduce video captioning on their platforms to better serve a diverse workforce. However, according to Mindaugas Caplinskas, CEO at Go Transcript – which specializes in easy-to-use, quality-focused, human-powered transcriptions delivered online around the world – current solutions fail to properly support people with hearing disabilities.

Closed captions (CCs) provide more than a written record of what is being said: they also give the viewer a description of what is taking place on-screen. Despite progress in accuracy, auto-generated subtitles and CCs are often garbled and unable to convey context, register sarcasm, or mark word emphasis.

"Businesses should not be lulled into thinking that present-day video call accessibility tools are enough to offer inclusivity for people with hearing impairment," commented Caplinskas, cautioning against relying on AI-powered video captioning alone to ensure accessibility. "The video call has opened up many opportunities for inclusivity and that should not be taken for granted. However, transcribing important meeting information could bring additional benefits to businesses – not only because synchronous transcription is still not up to par with accessibility needs, but also because more crucial details can be captured and conveyed."

In addition, Caplinskas notes that, despite advances in the underlying voice-to-text technology, artificial intelligence-based solutions are still unable to handle foreign accents, multiple speakers, and specialist vocabulary.

"Live captioning on video calls is not just inaccurate in such cases; it can also make the conversation very difficult to follow. With multiple speakers in the meeting, the issues become more pronounced – for instance, when there are different accents, or when people use sign language, which these systems often do not recognize," he explained. "That's the main focus of our team – to render information taking all of these nuances into account. Detailed notes of strategic sessions can help businesses with accountability, accuracy, and, most importantly, inclusivity, as more people will be on the same page and will not have missed important details that transpired in the meeting."

Major market players have taken a more serious approach towards providing more sophisticated solutions for the hearing impaired. For example, Google has proposed a way for closed-captioning systems to recognize when a person is signing. As described in a research paper, the system would use a virtual audio source to generate a 20 kHz tone – outside the range of human hearing, but detectable by computer audio systems – to signal that a person is now using sign language.
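The signalling trick described above can be sketched in a few lines. This is a minimal illustration only, not Google's actual implementation; the 48 kHz sample rate, tone duration, and amplitude are assumptions chosen for the example:

```python
import math

SAMPLE_RATE = 48_000  # Hz; a common rate for video-call audio (assumption)
TONE_FREQ = 20_000    # Hz; above typical human hearing, below Nyquist (24 kHz)

def generate_marker_tone(duration_s=0.1, amplitude=0.1):
    """Return float samples for an inaudible 20 kHz sine tone.

    A conferencing client could inject this into its outgoing audio
    stream as a virtual source; listeners would not hear it, but the
    captioning system's audio analysis could detect it and switch to
    sign-language handling.
    """
    n = int(SAMPLE_RATE * duration_s)
    return [amplitude * math.sin(2 * math.pi * TONE_FREQ * i / SAMPLE_RATE)
            for i in range(n)]

samples = generate_marker_tone()
# The tone must sit below the Nyquist limit, or the audio pipeline
# would alias it into the audible band instead of carrying it cleanly.
assert TONE_FREQ < SAMPLE_RATE / 2
```

The key design constraint is that the marker frequency must fall in the narrow band above human hearing (~20 kHz) but below half the sample rate, so it survives the audio pipeline without being audible to participants.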

However, it may be a while until this or similar solutions are launched, so Caplinskas concluded by emphasizing that companies should aim to be more in tune with employee accessibility needs today. "A more properly fitted environment would reflect the employer's attitude towards their team, which, in turn, could increase employee loyalty and productivity," he commented. "For this, it is really important to make sure that any improvements are made with people with hearing impairment in mind – not what hearing people think they need." •