Technology, when made well, can give people with disabilities choice and empower them to achieve their goals. Robin Christopherson MBE tells Martin Cooper MBCS why he loves smartphones and wants a driverless car.
‘A magic wand? A single wish?’ says Robin Christopherson MBE, Head of Digital Inclusion at the charity AbilityNet, when asked if he could magic into existence any piece of technology or change any digital product at will...
‘I wish-’ he says and then pauses. ‘I wish they’d make their products more accessible. That’d be the first one. Seriously. A recent study of two million home pages revealed that 98% of them weren’t accessible or didn’t meet the lowest standards. That’s the scale of the issue. The internet is a junky place. Number two on my wish list: driverless cars, please.’
A rucksack of empowering devices
Christopherson has been with the charity for 26 years and, as such, has a deep and wide view of how technology - in all its guises - can provide choice and help improve lives for people with disabilities. However, Christopherson’s knowledge isn’t theoretical: he started losing his sight in his school years and is now blind.
‘So, what do I use today?’ Christopherson says, addressing the accessibility technologies he uses to help him navigate, literally and figuratively, through his life. ‘It’s actually entirely mainstream,’ he reveals. ‘My iPhone, my Mac, my virtual machine that’s running Windows on the Mac; every device I use is mainstream and it is completely inclusive. That, to me, is the game changer and it’s 90% of what I use.’
He points to Microsoft’s Seeing AI as a particularly empowering app. It’s essentially a speaking camera. Install it on your smartphone, point the device at the world around you and the app narrates what its camera sees. It’ll speak text, identify products from barcodes, describe scenes and, if it has been set up correctly, even identify the faces of people in a room.
Smartphones are indeed smart
‘If I had lidar on my phone, it would tell me distances too,’ says Christopherson. ‘I’m holding out for body or head mounted cameras, which would mean I don’t have to hold my phone out. The distance [feature] is very good for social distancing. If you’re a long cane user and you’re approaching objects, you’ll want to be told how quickly you’re approaching them: “It’s at two o’clock and it’s four metres away”. That’s gold - that’s really useful information.’
Beyond competing with Christopherson’s ‘low-tech and effortless’ guide dog, Seeing AI has other skills too: ‘Point at your clothes and it’ll tell you what colour they are. That’s good for avoiding odd socks. And it’ll tell you if you have the lights on - if you don’t, your guide dog will spend the evening in the dark.’
The point is, he says: ‘All information is good information, as long as it’s accurate.’
Christopherson began losing his sight at secondary school and the only adjustment that was available was to move closer to the front of the class. At university, he couldn’t see the screens in the lecture halls or read the handouts. At that point, he began to get technology involved.
‘I got a talking laptop,’ he recalls. ‘It was a really big talking laptop. It was DOS-based, had a built-in hardware speech synthesiser and a great big volume controller on the front. That was my first use of tech that opened up everything for me and I’ve never looked back. But today - the real lightbulb moment - was the smartphone.’
Expanding, he says: ‘I can’t overemphasise how important the smartphone is. Yes, computers really changed life opportunities for people with disabilities. But, imagine taking all of a computer’s smarts and making them work hard for you; add loads of sensors - camera, compass, accelerometer - they’re all important when some of your own senses don’t work. The mobile phone was such a game changer.’
Careful interface design
Along with growing compute and sensing capabilities, smartphones have another key advantage over desktops: they tend to have much simpler user interfaces. How and where people use mobiles is, of course, very different from desktop working. Desktop software, and websites accessed via a desktop, can afford, to a degree, to be more complex. Mobile design and development, however, requires rigorous attention to simplicity across interfaces and experiences.
‘In iOS, the building blocks for apps mean you get a much better default accessibility experience,’ he says. ‘That’s out of the box, without the developers really needing to be so aware of how to build things, accessibly.’
Elsewhere in Christopherson’s rucksack, you’ll find a pair of Apple AirPods - specifically, the third generation. ‘The previous generation, the AirPods Pro, were fine,’ he explains. ‘But they went in your ear canal. They do have a transparency mode, but when you’re out and about it means you can’t hear properly. You can’t pinpoint exactly where, say, a car is coming from - like your real ears do.’
The low-tech difference is that the AirPod 3s don’t squash into your ears, so you can still hear sound from the outside world. This isn’t, of course, about listening to music while you’re walking. Rather, it’s about combining the spoken descriptions, directions and augmentations of assistive apps with keeping a self-preserving ear out for cars.
‘The AirPod 3s have a compass too. This means you’ve now got much better awareness of your surroundings,’ he explains. ‘Those compasses mean that if you’re using another Microsoft app, Soundscape, and you’re walking along, it can tell you what you’re passing. As you turn your head, points of interest pan around and you can hear exactly what’s around you. Your phone’s not guessing from just GPS.’
This richly-augmented world does, however, have a limit and, ironically, that’s at the end of your journey. Sadly, indoor navigation isn’t always as vivid and rich as outdoor. Again though, there are technologies at hand.
Labelling intelligently
One is NaviLens. There are two parts to the technology: an app which describes your environment, and a blocky, colourful label similar to a QR code. These labels can be placed in indoor environments or on objects.
The key difference between a NaviLens label and a QR code is the distance from which they can be read by your phone’s camera. Anybody who has done a COVID-19 lateral flow test or checked into a restaurant with the NHS’s Test and Trace app will know that QR codes are very distance-sensitive. NaviLens labels can be different sizes and, as such, be read from much greater distances.
Along with increased distance, NaviLens codes have a much wider angle of visibility. Again, with a QR code, your phone needs to be parallel with the code you’re scanning. NaviLens codes are much more forgiving when it comes to viewable angles and, better yet, can be used by their partner app to infer distance between the user and the object that has been labelled.
Why serious businesses are taking accessibility seriously
How people view websites and use digital tools is, of course, different from how businesses do. We consume the service and the business hopes that, when we do, it will make a profit from our activities. Which leads to the question: how do businesses see accessibility? Or, how should they see it - as a cost or as an investment?
‘Up until maybe five years ago, businesses were asking “Why?” which was depressing,’ says Christopherson. ‘Today, the conversation has definitely moved on. If businesses are still asking why, they’re way behind the curve. If they are still asking that question, there are juicy carrots and sticks.’
The sticks are legal requirements. There’s the Public Sector Bodies (Websites and Mobile Applications) Accessibility Regulations 2018, which defines the public sector’s responsibilities around accessibility. And there’s the Equality Act 2010.
‘Is making things accessible ethical?’ asks Christopherson. ‘Of course it is. Is it the right thing to do? Obviously. And from a brand value perspective? … You don’t want to be a company whose value and trustworthiness is eroded by very obviously not prioritising disability - that becomes apparent very quickly. And, of course, staff turnover for companies that prioritise accessibility and inclusion is much lower; all the wellbeing KPIs go up.’
Finally, Christopherson emphasises the ROI: ‘If you don’t build in accessibility, you’re going to be missing out on a large sector of the population. The “Purple Pound” is estimated to be around £274bn a year of disposable income. 75% of people with a disability instantly click away or go somewhere else if they find an accessibility problem. If you’re in a competitive market, accessibility is a differentiator.’
Born accessible is best
The key, Christopherson advises, is building accessibility into a project before the first line of code is written. Teams could use diverse personas. An accessible component library is a good idea, as is good documentation and training. When it comes to software development, the key is to shift-left: perform testing early in the lifecycle (move the task to the left on the project timeline).
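Shifting accessibility testing left often means automating simple checks so they run on every build, long before manual audits. As a minimal, hypothetical sketch (not AbilityNet's own tooling), here is one such check in Python, using only the standard library: it flags any image in an HTML fragment that lacks the `alt` text a screen reader depends on.

```python
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    """Collects <img> tags that are missing a textual alternative."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            # An empty alt="" is valid for decorative images;
            # an absent alt attribute is an accessibility failure.
            if "alt" not in attributes:
                self.missing_alt.append(attributes.get("src", "<no src>"))


def check_alt_text(html: str) -> list:
    """Return the src of every image with no alt attribute."""
    checker = AltTextChecker()
    checker.feed(html)
    return checker.missing_alt


# A check like this can run in CI, failing the build early:
page = '<img src="logo.png" alt="Company logo"><img src="hero.jpg">'
print(check_alt_text(page))  # only hero.jpg is flagged
```

Real-world pipelines would use a full audit tool rather than a hand-rolled parser, but the principle is the same: the earlier the check runs, the cheaper the fix.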
‘That’s when products are born accessible and there are no nasty surprises,’ he states. And, finally, if you’re worried about cost, don’t be. Christopherson advises that building in inclusivity adds between 2% and 5% to a project’s costs.
‘If you create something for people with extreme needs, it’s going to be usable by everybody,’ he continues. ‘Finally, throw in mobile. If your users are going to be using mobile - which most are - it’s a mobile-first world. Anybody who uses a mobile is an extreme user. Think of that small sheet of shiny glass on a sunny day, a bumpy bus, a noisy café: good colour contrast and a good font size mean that anybody and everybody will benefit from those decisions.’
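Good colour contrast, usefully, is not a matter of taste: WCAG publishes a formula for it. A short Python sketch of that published calculation shows how a design system could verify a colour pairing automatically (the 4.5:1 threshold is WCAG's AA level for normal body text):

```python
def relative_luminance(rgb):
    """WCAG relative luminance of an sRGB colour given as 0-255 channels."""
    def linearise(channel):
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def contrast_ratio(foreground, background):
    """Contrast ratio between two colours, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(foreground), relative_luminance(background)),
        reverse=True,
    )
    return (lighter + 0.05) / (darker + 0.05)


# Black text on a white background gives the maximum ratio of 21:1;
# WCAG AA requires at least 4.5:1 for normal body text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A pairing that passes this check on a designer's screen will also hold up on that sheet of shiny glass in the sun.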
Inside a driverless car
Finally, back to those magic wand wishes. What about the driverless cars? ‘My friend has a Tesla,’ he says. ‘He got out, I got in the driver’s seat and he pressed the fob. It set off and parked itself. That was weird.’
Robin Christopherson’s final words as he ends our call are probably the best summary. After painting a picture of a future where car ownership is outmoded and app-enabled fleets of driverless cars circulate cities picking people up - each with careful and deliberate UX which embraces accessibility - he says: ‘I’m optimistic.’ And he certainly is.
Picture: Robin Christopherson receives his MBE from the Duke of Cambridge