Our group tested a prototype of the "GlassHook" application, a mobile app that aims to guide visually impaired users through a building. The test had two parts: usability testing, to be done with users from the intended target population, and a usability evaluation, which we performed ourselves as usability experts by using the application.
As per the testing requirements set out by the developing team, we were to test 3-5 users from two categories: partially blind users and completely blind users. This was not feasible as there was not enough time, so the other team provided us with one user, from the completely blind cohort. We visited him at the MAB Mackay Rehabilitation Centre and carried out the testing procedure by asking him to complete a simple task: using the application to navigate to a room by saying "Comp 202", which should guide him to the room where COMP 202 is being taught. The results are summarized in the following documents:
Pre Test Questionnaire
Post Test Questionnaire
For the usability evaluation, all three members of our group tried out the application separately and filled out the usability evaluation forms, as follows.
A more detailed analysis of our results is in the next section.
Results and Analysis
User Testing Summary
Since we tested only one user, the quantitative results are simply provided in the data sheet above. For the same reason, we are not able to provide any comparison between users.
Our analysis of the results and observations is given below. We have divided the usability issues by stage of the application, and within each stage the issues are ordered by importance.
The rationale behind the importance ranking is as follows:
- High: These usability issues block the main functionality of the application; the user could not proceed without someone else's help.
- Medium: These usability issues are important, but the user can continue the process without help.
- Low: These issues are less important than the High and Medium ones, but addressing them would still be helpful.
Starting the application
"Once the application starts, you'll be prompted to tap and hold to activate your phone's Bluetooth transmitter."
1. After the user taps and holds, a pop-up message like the one below appears and blocks the functionality of the app, because the user does not know that it has appeared. [Importance: High]
2. At first, the phone was set to turn the screen off automatically every 15 seconds. The user did not know the screen had turned off, so it took him some time to work out why the application was not responding. When the user starts this application, it would be better to keep the display on for a long time; to limit battery consumption, the display brightness, which is the main source of battery drain, can be set to the lowest level. [Importance: High]
3. The user found it difficult to tap and hold, so he attempted the tap-and-hold gesture many times. [Importance: Medium]
"If you tap and hold the screen for an instant, a vibration cue should occur to signal that everything is set up."
1. The vibration cue did not work; the application always used a sound cue, even in silent mode. [Importance: Medium]
2. The user could not hear the sound cue well, so we raised the volume to maximum, but the sound was still very low. [Importance: Medium]
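The fallback behaviour we would expect from the confirmation cue can be sketched as follows. This is our own illustration, not part of the prototype; the function and parameter names are assumptions:

```python
def choose_feedback_cues(has_vibrator: bool, silent_mode: bool) -> list:
    """Pick confirmation cues for a visually impaired user.

    Prefer haptic feedback; use sound only when silent mode is off,
    so the phone's settings are respected.
    """
    cues = []
    if has_vibrator:
        cues.append("vibrate")
    if not silent_mode:
        cues.append("sound")
    if not cues:
        # Last resort: a full-screen flash for partially sighted users.
        cues.append("screen_flash")
    return cues
```

With this logic, silent mode on a vibration-capable phone would still yield a vibration cue rather than a sound cue, which is the behaviour our test user needed.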
"After feeling the confirmation vibration event, you should be prompted to say the room number or the course number associated with your destination. Provided that the speech recognition engine understands you and that the course code / room number is in the database, the application should reply back that a route is being calculated."
1. At first, we thought the beacons were not working and that the calculated route was not accurate. So we placed all the beacons in a straight line, but the route was always the same (even without any beacons, the route was the same). [Importance: High]

Navigation
"Once the route is chosen, the application will first read an overview of the trip, and then go over each waypoint slowly. Upon approaching one of the waypoints on the correct route, a vibration will be triggered and the next waypoint's location will be revealed."
1. The user followed the route, but there was no cue to tell him how much farther he had to go or the distance remaining. [Importance: High]
2. The user walked a long way along the calculated route, but there was no cue to tell him whether he had arrived at a waypoint. [Importance: High]
3. The user deviated from the calculated route, but there was no cue to warn him that he was going the wrong way. [Importance: Medium]
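The three missing cues above could all be driven by one simple progress check. The sketch below is our own illustration of the idea, assuming 2D positions from the beacons; the thresholds and names are assumptions, not taken from the prototype:

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) points in metres."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def dist_to_segment(p, a, b):
    """Shortest distance from point p to the route segment a-b."""
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return dist(p, a)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((p[0] - ax) * dx + (p[1] - ay) * dy) / (dx * dx + dy * dy)))
    return dist(p, (ax + t * dx, ay + t * dy))

def progress_cue(position, waypoints, next_index,
                 arrive_radius=1.0, deviation_radius=3.0):
    """Return 'arrived', 'off_route', or a remaining-distance announcement."""
    if dist(position, waypoints[next_index]) <= arrive_radius:
        return "arrived"
    if dist_to_segment(position, waypoints[next_index - 1],
                       waypoints[next_index]) > deviation_radius:
        return "off_route"
    remaining = dist(position, waypoints[next_index]) + sum(
        dist(waypoints[i], waypoints[i + 1])
        for i in range(next_index, len(waypoints) - 1))
    return "about %.0f metres remaining" % remaining
```

Each result would map to a spoken or haptic cue: "arrived" to the waypoint vibration, "off_route" to a wrong-way warning, and the distance string to a periodic announcement.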
"This continues until the final waypoint, the intended destination, is reached, at which point the application automatically shuts down."
If an error occurred, the user was supposed to press 'OK', but the user did not know whether an error had occurred. It would be better to announce errors by sound for visually impaired users, and once the user has been notified of the error, to continue automatically rather than requiring a button press. [Importance: High]
Usability Evaluation Result Summary
After completing the evaluation separately, we all discussed our results. Our biggest concern was that most of the evaluation criteria are not applicable because the functionality of the prototype is very limited.
In further detail, a summary of our joint consensus on each heuristic which the developing team listed as important is as follows:
- Visibility of system status: The system should always keep the user informed about where they are, through navigational cues updated within a reasonable time. At no point in the application is it clear where the user currently is, either physically or within the application. A textual display would not be useful to a visually impaired user, so there should be some non-visual way for the user to verify which part of the app they are in. Physically, the user should be able to find out their location or next steps at any time.
- Match between system and the real world: The flow of the VUI should be intuitive to the user and the system should lead the user to their destination. The system should assist the user through comprehensive directional instructions, where the wording and concepts should be natural to the user.
There is only one output that the prototype gives during navigation, i.e. “in 3 meters turn left”. While this was not enough to make a good judgement about this heuristic, we believe that such instructions, given consistently, should be enough to guide a user through a building.
- User control and freedom: Users should be able to tailor their experience based on how experienced they are with the application. The VUI gives expert users the ability to skip instructions and inexperienced users to repeat instructions with ease.
1. There was no option to ask the application to repeat the instructions. After the user says a destination, the app confirms it to make sure it heard the right thing, which is helpful in case he said the wrong thing.
2. There was no option to skip an instruction.
- Consistency and standards: The vocabulary and flow of the instructions should be consistent throughout the whole system. Once again, the prototype says only two sentences in total, so this was not enough for a fair test.
- Help users recognize, diagnose, and recover from errors: Error messages are comprehensive and clearly indicate the problem to the users. The system will then suggest a new route starting from the user’s new position. There was no option to change a route once it had been set, to go back, or to reset the application.
Usability Test Critique
The main problem we had in conducting the test was that additional equipment (the beacons) was required to use the application, and there was no information in the user manual or elsewhere on how to connect the beacons or how the app worked with them, so a member of the developing team had to be present to set up the application.
- The user manual was lacking overall and seemed to have been written before the actual prototype, so important information on its functionality was missing.
- There was no official data collection sheet on the developer’s website, so we had to create our own.
- Regarding the usability goals, it is not clear what the developers’ main focus in the user testing is, i.e. whether it is ease of learning, ease of use, etc.
- The first goal, “the user should be able to start navigation in less than 60 seconds to a desired destination”, is ambiguous because it is not clear when the 60 seconds start (after Bluetooth has been successfully connected, or after the app is opened for the first time).
- The biggest problem the user faced was enabling Bluetooth and getting past the Android confirmation messages, so a measure of users completing that step would have been useful as well.
- Another useful measure would have been the total number of times the user was completely stuck and had no idea how to proceed.
Usability Evaluation Critique
We think that all five of these criteria are important for evaluating an application with this intended functionality. However, they should have been selected more carefully to align with the capabilities of the prototype the team had prepared. Several of the criteria, e.g. “User control and freedom”, “Help users recognize, diagnose, and recover from errors”, and “Consistency and standards”, were not measurable because the prototype did not have enough functionality.
An important measure that we think the team should also be evaluating is “Recognition rather than recall”. This matters for such an application because the user has to know at all times where they are and which screen of the application they are on; e.g. if at any time the user needs to hear an instruction again or know their location, there should be some way of doing that.
- A suggestion for the overall design structure would be to have only one screen available to the user at all times. Connecting to Bluetooth should be available as a voice command.
Instructions to user
- “Tap and Hold” instructions were unclear to the user: he tapped the screen multiple times before holding down. A suggestion would be to change the wording to “Keep screen pressed”, “Hold”, or something similar.
- The user should be able to repeat the instructions that the application provided. Suggestions would be to use volume buttons or shaking the phone.
- The user should have the ability to turn off instructions. After a set number of uses, the user will become familiar with the interface and will not require the instructions to be read out to them, so the user should be able to turn off instructions for successive uses. A suggestion would be to add a command at the voice input, e.g. “Instructions OFF”. In this scenario, as there are currently only two screens in the app, the phone could vibrate in a set pattern to indicate the currently active screen.
- Currently, the display of the application does not address the needs of its user population. The application is targeted at blind or partially blind users. The display does not affect the completely blind, but it should be optimized to provide maximum information to the partially blind with minimum effort.
A suggestion would be to use large icons representing the current task, e.g. a Bluetooth symbol that covers the screen, a microphone symbol that covers the screen, etc.
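The set-pattern vibration suggestion above can be sketched as a simple mapping from screens to patterns. The screen names and timings below are our own assumptions, not taken from the prototype; the pattern format follows the alternating pause/vibrate millisecond convention used by Android's Vibrator API:

```python
# Each app screen gets a distinct vibration pattern, expressed as
# alternating [pause_ms, vibrate_ms, ...] durations. Timings are
# illustrative assumptions, not values from the prototype.
SCREEN_PATTERNS = {
    "bluetooth_setup": [0, 200],            # one short buzz
    "voice_input":     [0, 200, 100, 200],  # two short buzzes
}

def pattern_for_screen(screen):
    """Return the vibration pattern identifying the active screen.

    An unknown screen gets a long buzz so the user is never left
    without feedback.
    """
    return SCREEN_PATTERNS.get(screen, [0, 800])
```

Because the patterns are distinguishable by count and length alone, a completely blind user can identify the active screen without any audio.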
Knowledge of system state
- The phone locks the screen after a certain amount of time. The app should prevent the phone from doing so, as there is no way for a blind user to know that the phone has locked and that he has been taken out of the application screen.
There should be some way for the user to check whether the application is currently active. If he interacts with the application and repeatedly receives no response, he should be able to perform some action, such as holding down the volume button or shaking the phone, and receive haptic feedback if the application is running.
- When the user requests to connect to Bluetooth, an external pop-up opens asking whether Bluetooth may be activated. A suggestion would be to state in the user manual that Bluetooth should always be turned on before starting the application. A simple reminder at the start of the application would also be helpful, e.g. “Go to home and say ‘Okay Google! Turn on Bluetooth’.”
Feedback to user
- The application gives an audio earcon as feedback to indicate that the user's action has been registered. This might prove to be a problem in a noisy environment. Another issue is that the test user would tap and hold repeatedly, uncertain as to when his input had been registered. Supplemental haptic feedback would be useful.
- The application did not provide route-correction functionality. After stating the initial route, it did not correct the user if he strayed from it. A suggestion would be to use a combination of voice and haptic feedback.