
Thursday 30 May 2024

A visit to Google’s Accessibility Discovery Centre

Kathryn Drumm, Educational Technologist at City, University of London, shares her experiences of visiting Google's Accessibility Discovery Centre.

A group of City staff and our Google host standing in front of the Google building. Everyone is smiling. Above the entry to the building is a large Google logo.
Just before Global Accessibility Awareness Day 2024, I was part of a group from City who visited Google's Accessibility Discovery Centre in the heart of London's King's Cross.

The group was made up of colleagues from across City, all of whom have an interest in digital accessibility. There were staff from the Library, the EDI team, academic staff from HCID (the Centre for Human-Computer Interaction Design), Marketing, the ERES team from IT and, of course, the Digital Education Team. If nothing else, it was great to see how many people are committed to expanding awareness of digital accessibility across the university.

We were greeted by Hans (to the far right in the group photograph) and taken up to the ADC to meet with Praneeth, our other host for the day. Praneeth explained that Google has three ADCs across the world, and their aim is to collaborate with people outside of Google, especially members of the disability community, to make Google’s services accessible for all.

Praneeth, a young Asian man, stands in front of a large wooden sign for the Accessibility Discovery Centre and a smaller copy.
The ADC sign, behind Praneeth in the photo above, was an interesting lesson in accessibility. It was created as decoration for the room, but was also meant to be accessible via touch to those with vision loss. However, the feedback from users was that the sign was too large for anyone to interact with, so a smaller, more user-friendly version was created and installed next to it. It's a reminder that while you can have good intentions, you always need to test your proposed solution with the intended users.

Speaking of testing, to further develop their products and services, Google invite testers to come to the ADC, where they take part in think-aloud, task-based tests using tools such as eye-tracking software to gather feedback. At the moment, they have to carry out this testing at Google's London HQ, but they intend to create a virtual lab for those who can't travel.

A video game with a wobble switch and pressure button controller
We then saw their accessible gaming arcade, the result of their work with the Manchester charity Everyone Can. Here, games could be controlled through pressure switches, wobble switches or eye-tracking. Hans impressed us all by only crashing his virtual car a few times when he raced it around the track using only his eyes.

We then looked in more detail at some of the accessibility tools aimed at those with sight or hearing loss.

The huge leaps in creating live captions were of particular interest to me, as I was a subtitler in a previous life and remember how it used to take 10 hours to create 20 minutes of subtitles. Now we are used to captions being generated automatically as a conversation takes place or a video plays. We also saw how Google Nests could be programmed to trigger different coloured lights to alert d/Deaf users to specific sounds, such as a red light for the doorbell or a blue light for a baby crying.

For those with sight loss, we saw specialised Braille displays. But Praneeth also demonstrated how improvements to Pixel phones can guide users via vibrations to frame objects in the camera. Not only does this allow people to take their best selfie, it also means they can centre the camera on labels or documents. The phone then recognises the text and can read it out. So it would be possible to scan a tin in your cupboard and work out if it contains tomato soup or pineapple.

Praneeth also showed us accessibility features built into Android phones, such as Action Blocks, which allow you to condense common multi-stage tasks into one action represented by an icon on the phone's home screen. For example, calling a loved one can be represented by their photo, meaning that those who struggle with complicated processes, for whatever reason, can now complete them with one tap.

I was saying to a colleague later that while the visit was interesting, I felt I'd seen a lot of the technologies, or similar versions, before. But thinking about this further, I realised that rather than feeling disappointed that my mind wasn't blown by some new technology, I should see it as a testament to how pervasive many accessibility tools now are. We are no longer astounded by automatically generated subtitles, but only notice where they go wrong. We expect the miniature computers in our pockets to read labels or translate conversations. And a centralised home hub, controlled through our voices, turning our lights on, playing us music or ordering our groceries online is no longer the wild imagining of sci-fi but the everyday. It reinforces the idea that we should always be aiming towards the point where we simply assume that technology will be accessible to everyone, where we no longer have to raise awareness through special weeks, and where accessibility stops being extraordinary and becomes the ordinary.
