Friday, March 11, 2011

Reading Response 8—Evaluation issue (cont'd)

If we want to know what users think of a site after using it, that is best assessed by asking them (Nielsen, 2010).

The DECIDE framework suggests identifying goals and questions before selecting techniques for the study, because the goals and questions determine which data is collected and how it will be analyzed (Preece, p. 379).
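
Just to make that ordering concrete for myself, here is a minimal sketch (in Python, entirely my own illustration and not anything from Preece's text) of an evaluation plan in which techniques cannot be chosen until the goals and questions exist:

    from dataclasses import dataclass, field

    @dataclass
    class EvaluationPlan:
        goals: list[str]          # D: determine the goals
        questions: list[str]      # E: explore the questions
        techniques: list[str] = field(default_factory=list)  # C: choose the approach

        def add_technique(self, name: str) -> None:
            # Techniques only make sense once goals and questions are set,
            # since those determine what data is collected and how it is analyzed.
            if not self.goals or not self.questions:
                raise ValueError("Set goals and questions before choosing techniques")
            self.techniques.append(name)

    plan = EvaluationPlan(
        goals=["Evaluate the effectiveness and ease of learning of the tutorials"],
        questions=["To what extent do the tutorials help users operate the watch?"],
    )
    plan.add_technique("observing users")
    plan.add_technique("interviewing users")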

The DECIDE framework really does give us a useful guideline for the evaluation phase. As you may remember, I quoted the DECIDE framework in my last blog post, which prompted me to connect it with my group project Feed Me Well (FMW). Here I once again review my group's initial ideas and considerations for the evaluation phase, which helps me examine my understanding of the framework and reinforces my ability to use it in the real world. These considerations were included at the end of our production paper:
Based on the tutorial-focused tasks we have done, our goals in the next evaluation phase would be to evaluate the effectiveness, ease of learning, and functionality of the tutorials, and to examine how well the tutorials help users understand and operate the watch when they need this kind of assistance. As for the questions my group wants answered, we have not reached agreement yet, but I have personally come up with several questions that might fit the FMW evaluation of the tutorials. This is neither a final nor a complete list; I just want to record the questions while they are still fresh in my mind. (My apologies to my group :) for showing them here.) If applicable, the questions might look like these:
  1. In general, how do users like the tutorials?
  2. What are users’ favourite features or parts of the tutorials?
  3. Are there any parts of the tutorials that users don’t like?
  4. To what extent do users think the tutorials help them operate the watch?
  5. Are there any improvements for the tutorials? If so, what are they?
Thus, the questions shown above need to be answered in the evaluation phase. Following Nielsen’s suggestion, we can get the answers by asking the tested users, learning what they think of the tutorials from both macro and micro perspectives. We could carry out interviews and convert the questions above into a questionnaire format, as sketched below. The main evaluation techniques we plan to use are therefore observing users, testing users’ performance, and asking their opinions (interviewing users). As for the detailed procedure and implementation, I’d better leave those to our final evaluation paper. :)
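
To make the questionnaire idea concrete, here is a hypothetical sketch of how the closed questions above (1 and 4) could become Likert-style items and be summarized; the 1-to-5 scale, the item wording, and all the names are my own assumptions, not my group’s final design. Open-ended questions like 2, 3, and 5 would stay as free-text interview prompts.

    from statistics import mean

    # Likert items: 1 = strongly disagree ... 5 = strongly agree
    ITEMS = [
        "In general, I like the tutorials.",
        "The tutorials help me operate the watch.",
    ]

    def summarise(responses: dict[str, list[int]]) -> dict[str, float]:
        # Average the 1-5 ratings per item across all tested users.
        return {item: mean(scores) for item, scores in responses.items()}

    # Example: ratings from two tested users.
    print(summarise({ITEMS[0]: [4, 5], ITEMS[1]: [3, 4]}))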

I just came across an online book related to this EDER course when I googled for evaluation materials. The book is User-Centered Design Stories: Real-World UCD Case Files, written by Carol Righi and Janice James in 2007. It is the first user-centered design casebook and follows the Harvard case-study method. I attached it here to give myself more material and references that might be useful in my future practice. Please click here for a direct link to the book.
