Facebook uses clever AI that will make images more accessible for the blind

With infinite streams of visual content dominating the majority of social platforms, we are all more than aware of the highly visual nature of online communications. However, this can often leave the visually impaired marginalised from the online experience.

Facebook wants to change this. In an effort to serve the blind community, the social media giant has launched a tool that helps visually impaired users enjoy photos shared on the platform.

The feature, now available to all, is trained to recognise and identify a variety of objects within photos. Each image is analysed and automatically translated by Facebook’s AI into a rich audio description of the image – also known as “automatic alternative text”, or “automatic alt text”.

Previously, if a blind user came across an image whilst navigating their newsfeed via a screen reader, the description would be minimal: the screen reader would announce the text within the update and the name of the person sharing it, followed simply by “photo”.

With automatic alternative text the user may now hear:

“Image may contain: two people, smiling, sunglasses, sky, outdoor, water”.

When this description is heard alongside the caption – “We finally made it!” – the user gains a much richer understanding of the story.

The rationale behind using the phrase “image may contain”

Facebook cannot guarantee that the audio description is fully accurate, hence the hedged wording. If the AI is less than 80% confident about a tag, it will not suggest that tag as part of the description.
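Facebook has not published its implementation, but the thresholding behaviour described above can be sketched in Python. The function name, tags, and confidence scores below are invented for illustration:

```python
def compose_alt_text(tag_confidences, threshold=0.8):
    """Keep only tags the model is at least `threshold` confident about,
    then build a screen-reader-friendly description."""
    confident = [tag for tag, score in tag_confidences if score >= threshold]
    if not confident:
        # No sufficiently confident tags: fall back to the minimal description
        return "Photo"
    return "Image may contain: " + ", ".join(confident)

# Hypothetical model output: (tag, confidence) pairs
predictions = [
    ("two people", 0.97),
    ("smiling", 0.91),
    ("sunglasses", 0.86),
    ("sky", 0.84),
    ("indoor", 0.42),  # below the 80% threshold, so it is dropped
]
print(compose_alt_text(predictions))
# → Image may contain: two people, smiling, sunglasses, sky
```

In this sketch, a low-confidence tag like “indoor” is simply omitted rather than risk describing the image incorrectly.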

At present, the feature is only available on iOS, and only for English-language users within the US, UK, Canada, Australia, and New Zealand.

If you would like to learn more about Facebook’s automatic alternative text, check out the video below:
