A human hand touching a glossy round surface with cloudy blue texture that resembles a globe

How do blind people imagine AI? An interview with programmer Florian Beijers

Florian Beijers

Note: We acknowledge that there is no one way of being blind and no one way of imagining AI as a blind person. This is an individual story. And we’re interested in hearing more of those! If you are blind yourself and want to share your way of imagining AI, please get in touch with us. This interview has been edited for clarity.

Alexa: Hi Florian! Can you introduce yourself?

Florian: My name is Florian Beijers. I am a Dutch developer and accessibility auditor. I have been fully blind since birth and I use a screen reader. I also give talks, write articles and give interviews like this one.

Alexa: Do you have an imagination of Artificial Intelligence?

Florian: I was born fully blind, so I have never actually learned to see images, nor do I see them in my mind or in my dreams. I think in modalities I can somehow interact with in the physical world: sound, tactile images, sometimes even flavours or scents. When I think of AI, it really depends on the type of AI. If I think of Siri, I just think of an iPhone. If I think of (Amazon) Alexa, I think of an Amazon Echo.

It really depends on what domain the AI is in

I am somewhat proficient in knowing how AI works. I generally see scrolling code or a command line window with responses going back and forth. Not so much an actual anthropomorphic image of, say, Cortana, or of characters from Japanese anime. It really depends on what domain the AI is in.

Alexa: When you read news articles about AI and they have images there, do you skip these images or do you read their alt text?

Florian: Often they don’t have any alt text at all, or only a very generic alt like “image of computer screen” or something like that. Actually, it’s so not on my radar. When you first asked me that question about a week ago – “Hey, we’re researching images of AI in the news” – I was like: Is that a thing?

(laughter)

Florian: I had no clue that that was even happening. I had no idea that people make up their own images for AI. I know in Anime or in Manga, there’s sometimes this evil AI that’s actually a tiny cute girl or something.

I had no idea that people make up their own images for AI

Alexa: Oh yes, AI images are a thing! Especially the images that come from these big stock photo websites make up such a big part of the internet. We as a team behind Better Images of AI say: These images matter because they shape our imagination of these technologies. Just recently there was an article about an EU commission meeting about AI ethics and they illustrated it with an image of the Terminator …

(laughter)

Alexa: … I kid you not, that happens all the time! And a lot of people don’t have the time to read the full article and what they stick with is the headline and the image, and this is what stays in their heads. And in reality, the ethical aspects mentioned in the article were about targeted advertisements or upload filters. Stuff that has no physical representation whatsoever and it’s not even about evil, conscious robots. But this has an influence on people’s perception of AI: Next time they hear somebody say “Let’s talk about the ethics of AI”, they think of the Terminator and they think “I have nothing to add to this discussion” but actually they might have because it’s affecting them as well!

Florian: That is really interesting because in 9 out of 10 times this just goes right by me.

Alexa: You are quite lucky then!

Florian: Yes, I am kind of immune to this kind of brainwashing.

Alexa: But you know what the Terminator looks like?

Florian: Yeah, I mean I’ve seen the movie. I’ve watched it once with audio description. But even if I am not told what it looks like I make it a generic robot with guns…

Alexa: Do you own a smart speaker?

Florian: Yes. I currently have a Google Home. I am looking into getting an Amazon Echo Dot as well. I enjoy hacking on them too, like creating my own skills for them.

Alexa: In the past, I did some research on how voice assistants are anthropomorphised and how they’re given names, a gender, a character and whole detailed backstories by their makers. All this storytelling. And the Google Assistant stood out because there’s less of this storytelling. They didn’t give it a human name, to begin with.

Two smart speakers: A Google home and an Amazon Echo. Image: Jonas Nordström CC BY 2.0

Florian: No it’s just “Google”. It’s like you are literally talking to a corporation.

Alexa: Which is quite transparent! I like it. Also in terms of gender: they have different voices, at least in the US, and they are colour-coded instead of being labelled “female” or “male”.

Florian: It’s a very amorphous AI, it’s this big block of computing power that you can ask questions to. It’s analogous to what Google has always been: The search giant, you can type things into it and it spits answers back out. It’s not really a person.

Alexa: Yeah, it’s more like infrastructure.

Florian: Yeah, a supercomputer.

Alexa: I wondered if you were using a voice assistant like Amazon Alexa that is more heavily anthropomorphised and has all this character. How would you imagine this entity then?

Florian: Difficult. Because I know kind of how things work AI-wise, I played with voice assistants in the past. That makes it really hard to give it the proper Hollywood finish of having an actual physical shape.

Alexa: Maybe for you, AI technology has a more acoustic face than a visual appearance?

Florian: Yes! The shape it has is the shape it’s in. The physical device it’s coming from. Cortana is just my computer, Siri is just my phone.

The shape AI has is the shape it’s in

Alexa: Would you say that there is a specific sound to AI?

Florian: Computers have been talking to me ever since I can remember. This is essentially just another version of that. When Siri first started out it used the voice from VoiceOver (the iOS screen reader). Before Siri got its own voice it used a voice called Samantha, that’s a voice that’s been in computers since the 1990s. It’s very much normal for devices to talk at me. That’s not really a special AI thing for me.

A sound example of a screen reader

Alexa: When did you start programming?

Florian: Pretty much since I was 10 years old when I did a little HTML tutorial that I found on the web somewhere. And then off and on through my high school career until I switched to studying informatics. I’ve been a full-time developer since 2017.

Computers have been talking to me ever since I can remember

Alexa: I think I first got in touch with you on Twitter via a post you did about screen readers for programmers. There was a video, and I was blown away by how fast everything is.

Florian: It’s tricky! Honestly, I haven’t mastered it to the point where other blind programmers have. I use a Braille display, which is a physical device that shows you line by line in Braille. I use that as a bit of a help. I know people, especially in the US, who don’t use Braille displays. Here in Europe it’s generally a bit better arranged in terms of getting funding for these devices, because these devices are prohibitively expensive, like 4000-6000 Euros. In the Netherlands, the state will pay for those if you’re sufficiently beggy and blindy. Over in the US, that’s not as much of a given. A lot of people tend not to deal with Braille. Braille literacy is down as a result of that over there.

I use a Braille display to get more of a physical idea of what the code looks like. That helps me a lot with bracket matching and things like that. I do have to listen out for it as well otherwise things just go very slowly. It’s a bit of a combination of both.

Alexa: So a Braille display is like an actual physical device?

Florian: It’s a bar-shaped device on which you can show a line of Braille characters at a time. Usually, it’s about 40 or 80 characters long. And you can pan and scroll through the currently visible document.

I use a Braille display to get more of a physical idea of what the code looks like

Alexa: How do you get the tactile response?

Florian: It’s like tiny little pins that go up and down. Piezo cells. The dots for the Braille characters come up and fall as new characters replace them. It’s a refreshable line of Braille cells.

A person's hands using a Braille display on a desk next to a regular computer keyboard
A person using a braille display. Image: visualpun.ch, CC BY-SA 2.0, https://www.flickr.com/photos/visualpunch/

Alexa: Would that work for images as well? Could you map the pixels to those cells on a Braille display?

Florian: You could, and some people have been trying that. Obviously the big problem there is that the vast majority of blind people will not know what they’re looking at, even if it’s tactile, because they lack a complete frame of reference. It’s like a big 404.

(laughing)

Florian: In that sense, yes, you could. People have been doing that by embossing it on paper, which essentially swells the lines and slopes out of a particular type of thick paper, making them tactile. This is done for example for mathematical graphs and diagrams. It wouldn’t be able to reproduce colour, though.

Alexa: You are a web accessibility expert. What is some low-hanging fruit that people can pick when they’re developing websites?

Florian: If you want to be accessible to everyone, you want to make sure that you can navigate and use everything from the keyboard. You want to make sure that there is a proper heading hierarchy. Important images need to have alt text. If there’s an error in a form a user is filling out, don’t just make it red; indicate it in another way as well, for the sake of blind and colourblind users. Make sure your form fields are labelled. And much more!
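Two of the tips Florian mentions – labelled form fields and errors that aren’t conveyed by colour alone – can be sketched in HTML. This is a minimal illustration with made-up field names, not a complete pattern:

```html
<!-- The label is programmatically tied to the input via for/id,
     so a screen reader announces it when the field gets focus -->
<label for="email">Email address</label>
<input id="email" type="email"
       aria-invalid="true"
       aria-describedby="email-error">

<!-- The error is a visible text message linked to the field,
     not just a red border, so it is announced as well -->
<p id="email-error">Please enter a valid email address.</p>

<!-- Important images carry descriptive alt text -->
<img src="team-photo.jpg"
     alt="Four developers working together at a whiteboard">
```

Standard keyboard-focusable elements like `input` need no extra work for keyboard access; it is custom widgets built from `div`s that tend to break keyboard navigation.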

Alexa: Florian, thank you so much for this interview!


Links

Florian on Twitter: @zersiax
Florian’s blog: https://florianbeijers.xyz/
Article: “A vision of coding without opening your eyes”
Article: “How to Get a Developer Job When You’re Blind: Advice From a Blind Developer Who Works Alongside a Sighted Team” on FreeCodeCamp.org
Youtube video “Blindly coding 01”:  https://www.youtube.com/watch?v=nQCe6iGGtd0
Audio example of a screen reader output: https://soundcloud.com/freecodecamp/zersiaxs-screen-reader

Other links

Accessibility on the web: https://developer.mozilla.org/en-US/docs/Learn/Accessibility/What_is_accessibility
Screen reader: https://en.wikipedia.org/wiki/Screen_reader
Refreshable Braille display: https://en.wikipedia.org/wiki/Refreshable_braille_display
Paper embossing: https://www.perkinselearning.org/technology/blog/creating-tactile-graphic-images-part-3-tips-embossing

Cover image:
“Touching the earth” by Jeff Kubina from Columbia, Maryland, CC BY-SA 2.0 https://creativecommons.org/licenses/by-sa/2.0, via Wikimedia Commons