Our towns and cities are much better set up for people who use wheelchairs than in the past. Dedicated parking spaces have become commonplace, while modern buses and trains often include areas for wheelchairs. A large number of public places like restaurants and bars also have wheelchair-friendly toilets.
With added help from improvements in wheelchair technology, people with physical disabilities are therefore enjoying ever greater levels of independence. Yet they still face significant obstacles to activities that are routine for the rest of us. A good example is shopping. Shelves that are beyond the reach of a wheelchair are a constant problem.
This profoundly affects their sense of autonomy, as a group of wheelchair users confirmed when we interviewed them. They didn’t like asking for help unless strictly necessary, such as to climb stairs or to get over a high kerb. They wanted to go to normal stores and interact with products like any other person, and they liked the idea of a technology that could help. This is what we set out to design.
We piloted a system for people of three different levels of impairment. The first group had full use of their hands. The second group had low hand mobility as a result of problems such as tremors. The third group could only use their hands for a limited set of actions, such as driving their wheelchair, and generally faced severe communication problems.
The system for the first group involved an app for their smartphones or tablets, taking advantage of the fact that most people own such a device. We created a DVD/CD/bookstore to trial the system on campus at Pompeu Fabra University in Barcelona, where I was doing my PhD. Users opened the app as they entered the store, which brought up a virtual shop designed to look like the store entrance.
Users would then make their way through the store like any other customer. When they came to a shelf with something they potentially wanted to buy, they had to point their device at it. The shelves were all fitted with augmented-reality technology that could communicate, via radio frequencies, information about the shelves' approximate dimensions and the products they contained.
The user’s screen would display the shelf and they had to touch the area where the product was located. The app would then list the items in that area. The user could choose a product and either get information, such as price or expiry date, or make a purchase.
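The touch-to-product step can be illustrated with a minimal sketch. This is not the trial's actual code, and all names, zone layouts, and product data here are hypothetical; it only shows the idea of mapping a tapped point on the shelf image to the items stored in that area.

```python
# Hypothetical sketch: map a tap on the on-screen shelf image to the
# products located in that area of the physical shelf.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    price: float   # assumed currency: euros

@dataclass
class ShelfZone:
    # Zone bounds as fractions of the shelf image: (x0, y0, x1, y1).
    bounds: tuple
    products: list

def products_at(zones, x, y):
    """Return the products in the first zone containing the tapped point."""
    for zone in zones:
        x0, y0, x1, y1 = zone.bounds
        if x0 <= x <= x1 and y0 <= y <= y1:
            return zone.products
    return []

# Illustrative shelf with two zones (top-left and top-right quarters).
shelf = [
    ShelfZone((0.0, 0.0, 0.5, 0.5), [Product("CD: Greatest Hits", 9.99)]),
    ShelfZone((0.5, 0.0, 1.0, 0.5), [Product("DVD: Documentary", 12.50)]),
]

# A tap in the top-left quarter lists the products stored there.
for p in products_at(shelf, 0.25, 0.25):
    print(f"{p.name} - {p.price}")
```

From the returned list, the app can then show details such as price or let the user add the item to a purchase.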
The shelves were set up so that all the relevant information was updated in real time, which was vital to ensure users weren't misled about what they could potentially buy. Staff would assemble a basket of purchases for the user, ready for when they reached the checkout.
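The real-time requirement can be sketched in a few lines. Again this is a hypothetical illustration, not the trial's implementation: the point is simply that each purchase immediately decrements the shelf's stock count, so later shoppers never see items that are no longer available, while a per-user basket records what staff should assemble for checkout.

```python
# Hypothetical sketch: real-time stock updates plus a staff-assembled basket.
inventory = {
    "dvd-0231": {"name": "DVD: Documentary", "stock": 2, "price": 12.50},
}
baskets = {}  # user_id -> list of item ids for staff to assemble

def purchase(user_id, item_id):
    """Record a purchase, decrementing stock so the app stays accurate."""
    item = inventory[item_id]
    if item["stock"] == 0:
        return False  # the app would hide or grey out sold-out items
    item["stock"] -= 1
    baskets.setdefault(user_id, []).append(item_id)
    return True
```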
Tablet/smartphone app in practice. Zulqarnain Rashid.
The people in our second group of users are generally unable to use a smartphone unaided, so we developed a prototype of a fixed touchscreen, mounted at a suitable height adjacent to the relevant shelf. The screen used the large fonts and interfaces recommended by previous studies for people in this category.
The third group contained people too impaired for either of the first two solutions. Yet as our initial interviews confirmed, they have the same desire and motivation to use regular stores. We came up with a system that combined the smart shelves with a smart glass product like Google Glass. The system worked in a similar way to the first one, except that users selected either by voice command or by touching the side of the device.
Fixed screen option (left/centre) and smart glasses option (right). Zulqarnain Rashid.
We trialled the systems with 18 users: seven each from our first and second categories and four from the third category. A preliminary evaluation showed promising results in improving their independence. It gave them an experience close to online shopping, but in the context of a bricks-and-mortar store.
The main smartphone/tablet system was the most successful. It benefited from the fact that most people already use smartphones and are familiar with the technology. One user said:
These interfaces are helpful to me to do shopping by myself without asking or requiring the assistance of other people. I would like to have it available at real shops, and think that getting used to something like this is very easy, and it is an opportunity to be more independent.
The touchscreen system was slightly less successful as it required some explanation and training, but participants were excited by the possibility. The smart glasses system, on the other hand, required a lot of training and adjustment. We concluded that users would need to be familiar with smart glasses in advance, so this option may have more potential in the future than it does at present.
If some version of this system is to go mainstream, cost will be an obvious question. Smart shelves are the biggest outlay, but major retailers are beginning to use them anyway for purposes like stock counting and theft prevention. The idea is that our system could piggyback on top. It would then be for retailers to decide whether to use all three solutions or just one or two of them.
We’re not the only ones looking at solving this problem, though nobody has cracked it yet. Our next step is to establish a pilot shop in a public place where wheelchair users can access it easily. That will give us an opportunity to make further improvements, at which point we will be looking to roll the system out. Hopefully in the relatively near future, we will see strides in wheelchair shopping to match what has happened with parking and toilets.