Akademy/2013/UsabilityWorkshop
Latest revision as of 09:08, 17 August 2013

==User Usability Testing==

The goal of this exercise is to get input from a user about the applications being tested. The test works by putting the user in front of the application, giving them a task and letting them execute it, guided by the developer asking questions.

==Process==

The process works as follows: the user is put in front of the opening screen of the application or tool. The developer or usability expert guiding the process gives them a goal, something like 'play a song by Coldplay' or 'connect to the wireless network'. Then the following protocol is executed by the person guiding the process, asking the user:

# What is your first impression?
# What do you think you can do here?
# What would you do to reach your goal?
# What do you expect to happen when you do it?
# Please do it... (let the user execute the action)
# Did it do what you expected?
# If not: was it better or worse?
# If the task is not completed yet, go back to step 1.

When the task is finished, ask the user what they thought of it all and discuss possible improvements.
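The protocol above is essentially a loop over a fixed list of questions. As a purely illustrative sketch (not part of the workshop material), it could be scripted like this to prompt the guide through a session and log the answers; the function name, log format, and input handling are all hypothetical conveniences:

```python
# Sketch of the usability-test protocol as a scripted question loop.
# The questions and loop structure mirror the protocol above; the
# session-log format and prompts are invented for this example.

QUESTIONS = [
    "What is your first impression?",
    "What do you think you can do here?",
    "What would you do to reach your goal?",
    "What do you expect to happen when you do it?",
]

def run_session(goal, ask=input):
    """Walk the user through the protocol until they report the task done."""
    log = [("goal", goal)]
    done = False
    while not done:
        for q in QUESTIONS:
            log.append((q, ask(q + " ")))
        # Step 5: the user executes a SINGLE action here.
        log.append(("action", ask("Please do it... (one action) ")))
        expected = ask("Did it do what you expected? (y/n) ")
        log.append(("as expected", expected))
        if expected.strip().lower().startswith("n"):
            log.append(("better or worse",
                        ask("If not: was it better or worse? ")))
        done = ask("Is the task completed? (y/n) ").strip().lower().startswith("y")
    # Debrief once the task is finished.
    log.append(("debrief", ask("What did you think of it all? ")))
    return log
```

Passing a custom `ask` callable makes it easy to replay a recorded session or test the loop without a live user.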

Notes:

* This kind of testing is very good for finding grave issues in applications; it is not very good for fine-tuning the interaction with your application. Be a bit wary of feedback from a single user: repeat the process at least once with somebody else to make sure that what is an issue for one person really is a problem. Ideally, run the test a few times with various people to increase the reliability of the feedback.
* Try to stick closely to the protocol, without skipping steps. Of course, if a user has already answered a question while answering another one, there is no need to ask that question again.
* Make clear that the user is ONLY to execute a SINGLE action in step 5. This means that most of the time, you are checking the user's expectations and ideas instead of watching them click around.
* As the guide, you are not supposed to give any hints as to which button the user should click or what they should do, other than perhaps reminding them of the end goal they have been given.
* Despite the above point, there is of course no reason to let the user helplessly waste their time. If the user gets stuck, help them get unstuck.
* Yes, it can be a bit hard to stay disciplined, not giving hints or telling the user what to do, and to stick to the protocol instead of just quickly letting the user click to the end. Try to keep to the structure anyway; it works best.
* Extra tip: it is very valuable to record the sessions. Often you see things later on that you were not aware of during the session.

==Results==

We did 2 sessions, one in the morning and one in the afternoon (together spanning most of Tuesday). At any time, between 10 and 20 people participated.

* We tested the new KScreen. It turned out to be very confusing; unfortunately, the fixes and changes won't make it into the Workspaces 4.11 release.
* Plasma Media Center also got a good workout. See [[Plasma/Plasma_Media_Center/Akademy2013#PMC%20Usability%20test%20feedback|the PMC feedback]].
* The network manager plasmoid session was even put on video: [http://www.youtube.com/watch?v=FjuVND3VAoM part 1] and [http://www.youtube.com/watch?v=5BJ_e4jGpRA part 2].
* And several other applications and tools were tried out.