Twitter

Twitter now lets users add image descriptions to help the visually impaired

(Twitter Blog)

Twitter is working to make it easier for people with visual impairments to access and understand images uploaded to the site. Although screen readers and braille technology work well with text posts, images have been out of reach until now. Starting today, people using the iOS and Android apps will be able to add descriptions to the images they post on the site.

People can use up to 420 characters to describe the image they are tweeting. The description can then be accessed by visually impaired users and read like any other text post through their assistive technology.

Last year, research showed that tweets with photos drive 313 percent higher engagement. Twitter’s visually impaired users, meanwhile, have been left to use third-party workarounds like EasyChirp and Alt Text Bot for alternative text. By building the feature into the product itself, the company has simplified the process of accessing photo descriptions.

The feature not only helps with accessibility; search engines may also find it easier to identify specific tweets when they are labelled this way, VentureBeat reported. The descriptions could likewise prove useful within Twitter’s own search field or through API partners.

After CEO Jack Dorsey’s #HelloWorld initiative to reconcile differences between the company and developers, Twitter has been toying with many improvements, ranging from a higher character limit and the ability to edit tweets to accessibility features. To keep a healthy dialogue going, Twitter has extended its newest update beyond its users by updating its REST API and Twitter Cards so publishers and developers can benefit from the service too.
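For developers, attaching a description happens after the image itself is uploaded: the media ID returned by the upload step is paired with the alt text in a follow-up call. The sketch below shows what that request body might look like, assuming Twitter’s `media/metadata/create` REST endpoint; the media ID and description are illustrative placeholders.

```python
import json

# Endpoint for attaching metadata (including alt text) to uploaded media,
# assumed here based on Twitter's REST API update described above.
METADATA_URL = "https://upload.twitter.com/1.1/media/metadata/create.json"

# Matches the in-app limit reported for image descriptions.
MAX_ALT_TEXT_CHARS = 420

def build_alt_text_payload(media_id: str, description: str) -> str:
    """Build the JSON body pairing an uploaded image with its description.

    `media_id` would come from a prior media upload call; the description
    is capped at 420 characters, matching the limit in the iOS/Android apps.
    """
    if len(description) > MAX_ALT_TEXT_CHARS:
        raise ValueError("alt text is limited to 420 characters")
    return json.dumps({
        "media_id": media_id,
        "alt_text": {"text": description},
    })

# Hypothetical example: the payload would be POSTed to METADATA_URL
# with an OAuth-signed request before the tweet referencing the media is sent.
payload = build_alt_text_payload(
    "710511363345354753",  # placeholder media ID from a prior upload
    "A golden retriever catching a frisbee in a park",
)
```

Keeping the description in a separate metadata call, rather than in the tweet body, is what lets assistive technology surface it without consuming any of the tweet’s character count.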

And although it’s unavailable on the web for now, the new feature is still likely to reach the majority of Twitter’s users: of the site’s 320 million monthly active users, 80 percent are active on mobile.
