Smartphones and tablets have been paramount in providing accessibility features for their users, from dictation for the blind to video chat for the hearing impaired. A wide range of functions has been implemented to create a seamless experience for users who rely on them. However, moving away from the device and into the real world, we fail to see as many accessibility features for those who need them.
Imagine you needed a new pair of shoes and your only way to pick a pair was to feel it. You couldn't see its colour, you didn't know for certain what material it was made of, or whether it featured a print or text. It's hard to imagine, isn't it?
Now imagine utilising features like dictation on a smartphone to get a full description of the item, making your decision so much easier. With the aid of beacons, this no longer needs to be imagined; it can now be put into place.
Step One : Designing the app
The most essential part of designing an app that offers accessibility functions is making it easy to use and available to everyone right from the download. Enabling dictation automatically is a simple way to ensure those with vision impairment can use the service immediately. Those who don't need dictation can turn it off instantly from the app's home screen.
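The default-on behaviour described above can be sketched as a simple settings object. This is a minimal illustration, not a real SDK; the class and key names (`AppSettings`, `dictation_enabled`) are assumptions for the sake of the example.

```python
# Sketch of a default-on dictation setting with a one-tap toggle.
# All names here are hypothetical, for illustration only.

class AppSettings:
    """Accessibility settings for the shopping app."""

    def __init__(self):
        # Dictation is on from first launch, so vision-impaired users
        # can navigate the app immediately without sighted assistance.
        self.settings = {"dictation_enabled": True}

    def toggle_dictation(self):
        # Exposed as a single control on the home screen so users who
        # don't need dictation can switch it off instantly.
        self.settings["dictation_enabled"] = not self.settings["dictation_enabled"]
        return self.settings["dictation_enabled"]

settings = AppSettings()
settings.toggle_dictation()  # a sighted user opts out
```

The key design choice is that the accessible path is the default: no setting has to be found and enabled before the app is usable.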
Letting users know the service exists in the first place is where real focus should be placed. Shop assistants can advise customers in-store, while advertisements online and on television can also build shopper awareness. Once awareness is built, functions like Siri on an iPhone can be used to download the app straight from the App Store via voice command.
Guidance on how to use the app and its built-in features will be offered when a user first opens it. On first use, verbal directions explain how to use speech to tell the device what the user would like to achieve: anything from the details offered for items to the store's trading hours.
Step Two : Beacons, beacons and more beacons
Now that an app has been created for shoppers to utilise, the next step is to add beacons, beacons and more beacons throughout the store.
Guiding the user
Beacons placed at entryways, stairs, escalators and registers will offer a different interaction than those placed on or around products. These beacons will guide users through the downloaded app. The app will showcase a map, offering step-by-step instructions in an array of formats specific to the user's accessibility needs. For anyone with colourblindness, the map will eliminate the use of red, green and blue, relying instead on a combination of bold and shaded colouring. Those who are blind will be able to use dictation and voice-command functions to request directions to a particular area of the store and receive audible directions.
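The per-profile guidance described above amounts to choosing an output format from the user's accessibility settings. The sketch below is a hedged illustration; the profile names and palette choices are assumptions, not part of any real beacon SDK.

```python
# Sketch: pick a wayfinding presentation based on the user's
# accessibility profile. Profile strings and palettes are
# illustrative assumptions.

def guidance_format(profile):
    """Return how wayfinding beacons should present directions."""
    if profile == "blind":
        # Audible, step-by-step directions via dictation/voice command.
        return {"output": "audio", "style": "turn-by-turn"}
    if profile == "colourblind":
        # The map avoids red, green and blue, relying on bold and
        # shaded tones instead.
        return {"output": "map", "palette": ["black", "grey", "white"]}
    # Default experience: a standard colour map.
    return {"output": "map", "palette": "standard"}

print(guidance_format("blind")["output"])  # → audio
```

In practice this decision would feed a map renderer or a text-to-speech engine; the point is that the same beacon data drives every format.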
Once a customer has made their way to a particular section of the store (indicated by confirmation from the device), beacons become more individualised. For anything stocked on shelves in high volumes, such as supermarket products, a unique beacon will be placed directly in front of the product. A user will then place their device over the beacon and receive audible information on the item, including name, price and weight or volume of contents.
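The shelf-beacon interaction boils down to mapping a beacon identifier to a catalogue entry and building the sentence the app would speak. The catalogue, field names and beacon IDs below are hypothetical placeholders, assumed for illustration.

```python
# Sketch: turn a scanned shelf beacon's identifier into the audible
# description the app would read out. Catalogue data is hypothetical.

CATALOGUE = {
    "beacon-001": {"name": "Wholegrain oats", "price": "$4.50", "size": "750 g"},
}

def describe(beacon_id):
    """Build the sentence spoken for a scanned beacon."""
    item = CATALOGUE.get(beacon_id)
    if item is None:
        # Fail audibly too: silence would leave a blind user guessing.
        return "Sorry, no product information is available here."
    return f"{item['name']}, {item['price']}, {item['size']}."

print(describe("beacon-001"))  # → Wholegrain oats, $4.50, 750 g.
```

Note the fallback message: for a vision-impaired user, an explicit spoken "no information" is far better than a silent failure.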
To further assist in identifying a product's features, individualised beacons will be placed on items that are not stocked in high volumes on the shop floor. For example, shoes will feature beacons on the sole; a simple hover of a phone next to one will generate audible information on the brand, colour and sizing options currently available in store. For clothing, beacons will be placed on coat-hangers, and information on colour, material, size and price will be offered in an audible format.
If needed, a customer will be able to request assistance from an employee through the app, which uses the customer's in-store location to send a request to the employee-facing app. Once received, an employee can assist with the transaction or provide further information, such as sizing.
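The assistance flow above can be sketched as a small request object routed to a shared queue, with the customer's last-known beacon zone attached so staff know where to go. Every name here (`AssistanceRequest`, `dispatch`, the zone labels) is an assumption for illustration, not a real employee-app API.

```python
# Sketch: route an in-store assistance request, tagged with the
# customer's beacon-derived location, to the employee app's queue.
# All identifiers are hypothetical.

from dataclasses import dataclass, field

@dataclass
class AssistanceRequest:
    customer_id: str
    zone: str          # section inferred from nearby beacons
    note: str = ""     # optional detail, e.g. "sizing help"

def dispatch(request, employee_queue):
    """Queue the request so a nearby employee can claim it."""
    employee_queue.append(request)
    return f"Help requested in {request.zone}."

queue = []
print(dispatch(AssistanceRequest("c-42", "footwear", "sizing help"), queue))
# → Help requested in footwear.
```

Attaching the zone rather than exact coordinates keeps the sketch honest about what beacon proximity actually gives you: a section of the store, not a pinpoint position.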
Step Three : Utilise
In the past, beacons have given us quick and efficient ways to shop, dine and collect, but this new addition shows the true potential behind these tiny devices. Vision-impaired shoppers gain full control to shop with ease, confident that they are purchasing exactly what they want.