Monday, April 11, 2016

Seminar #2 notes

The reading for this seminar presented concepts and tools for evaluating designs. A cornerstone of design evaluation is that the earlier you discover issues, the smaller the problem they are likely to pose, and the more likely it is that the design requirements will mature properly, meaning you make the right trade-offs. The reason isn't that designs are made to be faulty, but that the design process can narrow one's focus too much, and this tunnel vision often lets obvious design flaws slip through. One example that comes to mind is the famous Cisco router whose reset button lined up perfectly with where a network cable's protective boot sits:



(Image: http://cdn1.tnwcdn.com/wp-content/blogs.dir/1/files/2015/09/fn63697_01.jpg)

This meant that users who were extra cautious and used tab protectors on their networking cables ended up resetting the whole system. Probably not the intention at all, and one may ask how even almighty Cisco could miss this.

How can such failures be avoided? The chapters present some methods, most of which involve bringing in people from outside the team.

Asking users
Through structured and semi-structured interviews, people from outside the development team can offer their two cents. These could be random people, intended users, or field experts. It is also possible to conduct these interviews in groups. It's very important to avoid introducing bias or direction; the point is to discover design flaws, and that requires as neutral a setting as possible.

Questionnaires
Questionnaires are based on the same principles as interviews - they make it possible to ask a much larger group, but remove the opportunity to ask follow-up questions on the spot. Care must be taken in how the questionnaires are worded, so as not to steer respondents toward particular answers. The sample group must also be representative of the design's intended users.

Inspections
Heuristic evaluation is a technique where an expert assesses the design through an informal usability inspection. Criteria for this evaluation can include how well the design complies with established design standards, how helpful its error messages are, and so on. Another technique is the cognitive walkthrough, where an evaluator steps through the sequence of actions a user would take, in order to find issues or room for improvement.

In the end, the goal is to get good data to analyze. Design evaluation is also constant work throughout the process: iteration cycles that take evaluation data into account can produce much better prototypes for the next evaluation rounds, which in turn yields even more relevant data. Finally, failing is an important part of design - the reset button placement probably has a dedicated task group at Cisco by now.
