[openstack-dev] [Horizon] [UX] Summary of Horizon Usability Testing and plan for Summit session

Liz Blanchard lsurette at redhat.com
Thu Apr 24 15:10:48 UTC 2014


Hi All,

One of the sessions that I proposed for the Horizon track is to review the results of the usability test that was run on Horizon in early March. I wanted to share some of the background of this test and the high-level results with you all so that we can start the conversation on this list and then agree on next steps during Summit. There will be a few follow-ups to this e-mail from me and Jacki Bauer proposing some potential solutions to the high-priority findings, so be on the lookout for those :)

---Quick overview of Usability Testing...What is it? Why do it?---
Usability testing is a technique used in user-centered interaction design to evaluate a product by testing it on users. This can be seen as an irreplaceable usability practice, since it gives direct input on how real users use the system.

---Who was involved? What did we need to do to prepare?---
A number of user experience engineers from a bunch of different companies got together and helped plan a usability test focused on self-service end users and the ease of use of the Horizon Dashboard as it exists in the Icehouse release. This effort spun off from the Persona work that we've been doing together. Some folks in the group are just getting into contributing to the design of OpenStack, and doing a baseline usability test of Horizon was a great introduction to how the usability of the Horizon UI could continue to be improved based on users' direct feedback.

What we needed to get done before actually jumping into the testing:
    1) Agree on the goals of the testing.
    2) Create a screener and send it out to the OpenStack community.
    3) Create a list of tasks that the user would complete and give feedback on during the testing.

---Who did we test?---
Six people whom we considered to be "self-service end users" based on their screener responses.

---What were the tasks that were tested?---

Scenario 1: Launching an instance
Individual Tasks:
-Create a security key pair.
-Create a network.
-Boot from cirros image.
-Confirm that instance was launched successfully.
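
For anyone who wants to map these UI tasks onto the underlying APIs, here is a rough python-novaclient/python-neutronclient sketch of the same steps. Everything below (credentials, endpoint, and resource names like 'demo-key' and 'demo-net') is made up for illustration, and an Icehouse-era client is assumed:

    from novaclient import client as nova_client
    from neutronclient.v2_0 import client as neutron_client

    # Hypothetical credentials and endpoint -- adjust for your cloud.
    nova = nova_client.Client('2', 'demo', 'secret', 'demo-project',
                              'http://controller:5000/v2.0')
    neutron = neutron_client.Client(username='demo', password='secret',
                                    tenant_name='demo-project',
                                    auth_url='http://controller:5000/v2.0')

    # Task: create a security key pair.
    keypair = nova.keypairs.create(name='demo-key')

    # Task: create a network (plus a subnet so instances can get an address).
    net = neutron.create_network({'network': {'name': 'demo-net'}})
    neutron.create_subnet({'subnet': {'network_id': net['network']['id'],
                                      'cidr': '10.0.0.0/24',
                                      'ip_version': 4}})

    # Task: boot from the cirros image (assumes an image literally named 'cirros').
    image = nova.images.find(name='cirros')
    flavor = nova.flavors.find(name='m1.tiny')
    server = nova.servers.create(name='demo-instance', image=image,
                                 flavor=flavor, key_name=keypair.name,
                                 nics=[{'net-id': net['network']['id']}])

    # Task: confirm that the instance was launched successfully
    # (real code would poll until the status reaches 'ACTIVE').
    print(nova.servers.get(server.id).status)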
 
Scenario 2: Understand how many vCPUs are currently in use vs. how much quota is left.
Individual Tasks:
-Check out the Overview page to review current quota use/limit details.
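
The Overview page is essentially a view onto the Nova limits API. A minimal sketch of pulling the same numbers programmatically, with the same hypothetical credentials as in the Scenario 1 sketch:

    from novaclient import client as nova_client

    # Hypothetical credentials -- adjust for your cloud.
    nova = nova_client.Client('2', 'demo', 'secret', 'demo-project',
                              'http://controller:5000/v2.0')

    # Task: review current quota use/limit details (same data the page shows).
    usage = dict((l.name, l.value) for l in nova.limits.get().absolute)
    print('vCPUs in use: %s of %s' % (usage['totalCoresUsed'],
                                      usage['maxTotalCores']))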
 
Scenario 3: Take a snapshot of an Instance to save for later use.
Individual Tasks:
-Either launch an instance successfully, or identify a running instance in the Instances view.
-Choose to take a snapshot of that instance.
-Confirm that the snapshot was successful.
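
At the API level this scenario boils down to a single call. A minimal sketch, assuming a running instance named 'demo-instance' and the same hypothetical credentials as above:

    from novaclient import client as nova_client

    # Hypothetical credentials -- adjust for your cloud.
    nova = nova_client.Client('2', 'demo', 'secret', 'demo-project',
                              'http://controller:5000/v2.0')

    # Task: identify a running instance.
    server = nova.servers.find(name='demo-instance')

    # Task: take a snapshot of that instance (returns the new image's id).
    snapshot_id = nova.servers.create_image(server, 'demo-snapshot')

    # Task: confirm that the snapshot was successful ('ACTIVE' once complete).
    print(nova.images.get(snapshot_id).status)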
 
Scenario 4: Launch an instance from a snapshot.
Individual Tasks:
-Choose to either create an instance and boot from a snapshot, or identify a snapshot to create an instance from.
-Confirm that the instance was started successfully.
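
Under the covers a snapshot is just another Glance image, which is why an API sketch of this scenario looks almost identical to a normal boot (same hypothetical names and credentials as above):

    from novaclient import client as nova_client

    # Hypothetical credentials -- adjust for your cloud.
    nova = nova_client.Client('2', 'demo', 'secret', 'demo-project',
                              'http://controller:5000/v2.0')

    # Task: identify a snapshot to create an instance from.
    snap = nova.images.find(name='demo-snapshot')
    flavor = nova.flavors.find(name='m1.tiny')

    # Task: boot from the snapshot, then confirm that it started
    # (real code would poll until the status reaches 'ACTIVE').
    server = nova.servers.create(name='from-snap', image=snap, flavor=flavor)
    print(nova.servers.get(server.id).status)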
 
Scenario 5: Launch an instance that boots from a specific volume.
Individual Tasks:
-Create a volume.
-Launch an instance using Volume X as a boot source.
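
And a minimal sketch of the boot-from-volume scenario via the API. The block_device_mapping value format here is '<source-id>:<source-type>:<size>:<delete-on-terminate>'; names and credentials are again made up:

    from novaclient import client as nova_client

    # Hypothetical credentials -- adjust for your cloud.
    nova = nova_client.Client('2', 'demo', 'secret', 'demo-project',
                              'http://controller:5000/v2.0')

    # Task: create a volume (1 GB); real code would wait for it to
    # become 'available' before booting from it.
    volume = nova.volumes.create(1, display_name='boot-vol')

    # Task: launch an instance using the volume as a boot source.
    flavor = nova.flavors.find(name='m1.tiny')
    bdm = {'vda': '%s:::0' % volume.id}  # keep the volume on terminate
    server = nova.servers.create(name='from-volume', image=None,
                                 flavor=flavor, block_device_mapping=bdm)
    print(nova.servers.get(server.id).status)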

---When and how did we run the tests?---
These hour-long tests were run over the first two weeks of March 2014. We focused on the latest bits that could be seen in the Icehouse release. The moderator (a UX researcher from HP) would ask the questions and the rest of the group would vigorously take notes :) After all of the testing was complete, we spent some time together debriefing and agreeing on a prioritized list of updates that would be best to make to the Horizon UI based on user feedback.

---What were the findings?---

High Priority
* Improve error messages and the error message catalog.
* Fix the Launch Instance workflow for end users and power users.
* Improve the informational help for form fields.
* Fix terminology. (e.g. launch instance, boot, shutoff, shutdown, etc.)
* Show details for key pair and network in Launch Instance workflow.
* Recommend a new Information Architecture.
 
Medium Priority
* Create UI guidelines (of best practices) for developers to use.
* Improve Online Help.
* Provide a clearer indication that the application is working when the user clicks a button and the application doesn't respond immediately.
* Ensure consistency of network selection. (Drag-and-drop of networks is very inconsistent with the other pieces of the Launch Instance modal.)
* Make the visualizations and the selection of recommended action buttons on the Instances table consistent.
* Suggest defaults for form entry fields.
* Provide Image information details during image selection.
 
Low Priority
* Allow users to edit the network of an instance after launching it.
* Resolve confusion around the split inline actions button.
* Explain what the Instance Boot Source field in the Create Instance modal means.
* Provide descriptions/high-level information about flavors for flavor selection.
* Make sorting clearer visually.
* Provide solution for subnet checkbox to improve usability.
 
Nice to Have
* Provide “Save as Draft” option in the wizard.
* Change security group default name to “Default security group”.

Well, if you've read this far, thank you for your interest in this topic!! We look forward to sharing some design proposals over the next week and continuing the discussion on tackling some of these items at Summit. Please let me know if anyone has any questions or concerns.

Best,
Liz
