Seeing AI app from Microsoft

The Seeing AI app, which is optimized for use with VoiceOver, uses artificial intelligence and the iPhone's camera to perform a number of useful functions:

  • Short Text – Speaks text as soon as it appears in front of the camera.
  • Documents – Provides audio guidance to capture a printed page, and recognizes the text, along with its original formatting.
  • Products – Scans barcodes, using audio beeps to guide you; hear the product name and package information when available.
  • People – Saves people’s faces so you can recognize them and get an estimate of their age, gender, and emotions.
  • Scenes (early preview) – Hear an overall description of the scene captured.
  • Images in other apps – Just tap “Share” and “Recognize with Seeing AI” to describe images from Mail, Photos, Twitter, and more.

Seeing AI YouTube clip

Vision and Hearing Impaired Access for A History of the World in 100 Objects

Media release

Monday 10 October, 2016

Vision and Hearing Impaired Access for A History of the World in 100 Objects

National Museum Offers Suite of Disability Access Features for the First Time

For the first time in a major exhibition, the National Museum of Australia is offering a suite of special features for blind, vision-impaired and hearing-impaired visitors to the “A History of the World in 100 Objects” exhibition from the British Museum.

Specially commissioned audio tours with Auslan (Australian Sign Language) / Conexu video, braille label text and a Touch Table have been developed by the National Museum to help blind, vision-impaired and hearing-impaired visitors get the most out of A History of the World in 100 Objects.

In its only east coast venue, A History of the World in 100 Objects uses items from around the globe to explore the last two million years of human history, sourcing the oldest objects from the British Museum’s collection and incorporating those from the present day.

From stone to gold, clay to plastic, the exhibition traces human experience through objects people have made, including a 1.6-metre-tall Assyrian relief; the famous Assyrian clay Flood Tablet (from modern Iraq), inscribed with the story of a great flood and an Ark; and a small but exquisite gold llama from Peru.

National Museum director Dr Mathew Trinca said he was committed to greater disability access at the cultural institution.

“The National Museum is keen to ensure that blind, vision-impaired and hearing-impaired visitors can enjoy exhibitions like A History of the World in 100 Objects alongside other Australians,” said Dr Trinca.

National Museum Diversity and Wellbeing Support Officer Scott Grimley, who is himself vision-impaired, said, “As technology makes it easier for people with a disability to access the world around them, the Museum is showing a commitment to include everyone in the exhibitions it provides.”

The National Museum is offering two audio tours, which are linked to apps that can be downloaded on iPhones and Android devices.
Once downloaded, these apps offer an Auslan video tour and audio descriptions of 19 objects featured in A History of the World in 100 Objects.

The 19 objects have braille and large print identification numbers that blind, vision-impaired and hearing-impaired visitors can read and then type into the handheld devices to trigger the audio or video tours.

Replica objects, including the Flood Tablet, several different Lewis Chessmen, the Astrolabe and the bust of Sophocles, are available on a Touch Table, duplicating the sensory experience of touching the original objects in the exhibition.

www.nma.gov.au
Free general entry | Open 9 am to 5 pm daily (closed Christmas Day) | Acton Peninsula Canberra | Freecall 1800 026 132
Donations (tax deductible) are welcome, visit www.nma.gov.au/support_us

The National Museum of Australia is an Australian Government agency. For more information, please contact Tracy Sutherland on (02) 6208 5338 / 0438 620 710 or media@nma.gov.au.

Demonstration of a DOS Screen Reader

This is a blast from the past for screen reader users: a demonstration of a DOS screen reader.

This is how screen readers used to be.

The demonstration is in two parts. Part one can be found here: DOS Screen Reader Part 1

Part two can be found here: DOS Screen Reader Part 2

Captioning a Video Using YouTube

I am currently doing a Professional Certificate in Web Accessibility Compliance through the University of South Australia. Part of the second assignment is to caption a video. Here is how I did it.

A brief summary of the captioning process:

  • Created a video, just over 2 minutes in length.
  • Listened to the audio of the video file, transcribed the audio, and created a transcription text file.
  • Created a Google/YouTube account.
  • Created a channel within that account – GAM Industries.
  • Uploaded my video to the YouTube GAM Industries channel.
  • Once uploaded, went to the video manager and selected my video for editing.
  • Chose the Subtitles and CC tab within the video manager.
  • Chose to upload a file and uploaded my transcription file (a plain text transcript; see the caption-file sketch after this list).
  • The transcribed text was then inserted into the video as captions.
  • Synchronisation of the audio and captioning was good. Surprisingly, when checking the video for audio and caption synchronisation, my screen reader read out the captions at the same time the video was playing.
  • When tagging the file within the YouTube GAM Industries channel, I used yt:cc=on as the last tag to force captions on when the video is viewed. I am not sure this is a good idea though: if you are a screen reader user and view the video, you get both the video’s audio and your screen reader reading the captions at the same time.
  • Saved and published the video.
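
An aside on the transcription file: what I uploaded was a plain text transcript, and YouTube worked out the timings against the audio itself. YouTube can also take a caption file that already carries its own timings, such as an SRT file. The short Python sketch below is only an illustration of how a plain transcript could be turned into a rough SRT file; the file names, the fixed four-second duration per caption and the helper function are my own assumptions for the example, not part of the process I actually followed.

    from datetime import timedelta

    SECONDS_PER_CAPTION = 4  # assumed fixed duration for every caption line

    def srt_timestamp(seconds):
        # Format a number of seconds as an SRT timestamp, e.g. 00:00:04,000
        total_ms = int(timedelta(seconds=seconds).total_seconds() * 1000)
        hours, rest = divmod(total_ms, 3_600_000)
        minutes, rest = divmod(rest, 60_000)
        secs, ms = divmod(rest, 1000)
        return f"{hours:02}:{minutes:02}:{secs:02},{ms:03}"

    def transcript_to_srt(transcript_path, srt_path):
        # Read one caption's worth of text per line from the transcript
        with open(transcript_path, encoding="utf-8") as f:
            lines = [line.strip() for line in f if line.strip()]

        # Write numbered SRT cues, each lasting SECONDS_PER_CAPTION seconds
        with open(srt_path, "w", encoding="utf-8") as out:
            for i, text in enumerate(lines):
                start = i * SECONDS_PER_CAPTION
                end = start + SECONDS_PER_CAPTION
                out.write(f"{i + 1}\n")
                out.write(f"{srt_timestamp(start)} --> {srt_timestamp(end)}\n")
                out.write(f"{text}\n\n")

    transcript_to_srt("transcript.txt", "captions.srt")

Real caption timings would of course come from listening to the audio rather than a fixed duration per line, which is why I let YouTube handle the synchronisation instead.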