Summary of all lecture slides, the articles on the reading list, and additional notes from the teacher.

USABILITY & USER EXPERIENCE
EVALUATION
TILBURG UNIVERSITY 2020/2021 – HENDRIK ENGELBRECHT




CONTENTS

Lecture 1: Introduction
  What we measure
  Introduction to evaluation
  Setting up a study
  Intermediate level knowledge
  The future of evaluation: five potential research directions
  Triangulation
  Reading material lecture 1

Lecture 2: Quantitative methods & implicit measures
  Performance measures
  User testing
  Log data & analytics
  Implicit behavioral & emotional measurements
  Reading material lecture 2

Lecture 3: Qualitative methods
  Say & think methods
  Do & use methods
  Expert review
  Prototypes vs. products
  Analysis of qualitative data
  Reading material lecture 3

Lecture 4: Self-report methods
  Brief overview of self-report methods
  Which methods are quantitative and/or qualitative?
  Presenting your findings
  Reading material lecture 4





LECTURE 1: INTRODUCTION

WHAT WE MEASURE


USABILITY

Usability is a measure of how well a specific user in a specific context can use a product/design to achieve a
defined goal effectively, efficiently, and satisfactorily.

• The user can do what he or she wants to do the way he or she expects to be able to do it, without
hindrance, hesitation, or question
• Absence of frustration
• Usability is only an issue when it’s absent

The absence of usability sometimes only becomes visible once innovation takes place.

Three core concepts of usability

• Effectiveness (task completion)
• Efficiency (time spent)
• Satisfaction (no discomfort, positive attitude)
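To make these three concepts tangible, here is a minimal measurement sketch in Python (not from the slides; the function and field names are illustrative assumptions): effectiveness as the task completion rate, efficiency as the average time on task, and satisfaction as an average post-task rating.

```python
from statistics import mean

def usability_metrics(sessions):
    """Summarise the three core usability concepts from a set of test sessions.

    Each session is a dict with (illustrative) fields:
      'completed'    - bool, did the participant finish the task?
      'time_seconds' - float, time spent on the task
      'rating'       - int, post-task satisfaction rating (e.g., 1-7)
    """
    return {
        "effectiveness": mean(1 if s["completed"] else 0 for s in sessions),  # task completion rate
        "efficiency": mean(s["time_seconds"] for s in sessions),              # average time on task
        "satisfaction": mean(s["rating"] for s in sessions),                  # average rating
    }

# Example with three made-up sessions
sessions = [
    {"completed": True,  "time_seconds": 42.0, "rating": 6},
    {"completed": False, "time_seconds": 90.0, "rating": 3},
    {"completed": True,  "time_seconds": 55.0, "rating": 5},
]
print(usability_metrics(sessions))
```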

Bailey’s human performance model

Bailey’s Human Performance Model states that there are three parts to be
considered: the task, the user, and the context.

The Human Performance Model (HPM) is an attempt to integrate and study the factors and
aspects that influence the performance of a human while performing a job.

People performing in systems have in common that they are each somebody, doing
something, someplace.

All these factors are constantly in motion.



USER EXPERIENCE (UX)


Period | Phase | Description

1940s – 1950s | System reliability phase | Evaluation efforts tended to focus on system reliability, mostly in terms of how long a system would function without failure (stability).

1950s – 1960s | System performance phase | Users shifted from engineers to programmers and computer scientists. Evaluation efforts focused on issues related to system performance, typically in terms of processing speed.

1960s – 1970s | User performance phase | The introduction of non-specialists into the equation was a major shift because it forced evaluators to be more interested in evaluating the speed of the user rather than the speed of the system.

1970s – 2000s | Usability phase | Evaluators started to develop methods for evaluating the usability or "ease of use" of computer systems because "if a system is not easy to learn, it would not be used". Accordingly, evaluation efforts shifted to focus more specifically on aspects of learnability in addition to speed and efficiency.

2000s – now | User Experience (UX) phase | Shift from task-based performance to user affect and the value of computer interaction in everyday life → designing for pleasure rather than for absence of pain. Popular conceptual frameworks: Hassenzahl & Norman → a UX evaluation approach should address both the hedonic and pragmatic dimensions of system use.


User Experience (UX) | "A person's perceptions and responses resulting from the use and/or anticipated use of a
product, system or service." – "UX is a consequence of a user's internal state (predispositions, expectations, needs,
motivation, mood, etc.), the characteristics of the designed system (e.g., complexity, purpose, functionality, etc.)
and the context within which the interaction occurs."


USABILITY & USER EXPERIENCE

Two common ways to frame the relationship:

• Usability is a part of User Experience
• User Experience is the "satisfaction" part of Usability

Usability is concerned with the “effectiveness, efficiency, and satisfaction with which specified users achieve
specified goals in particular environments” whilst user experience is concerned with “all aspects of the user’s
experience when interacting with the product, service, environment or facility”

Defined as a process: User experience (UX) design is the process of creating products that provide meaningful
and relevant experiences to users. This involves the design of the entire process of acquiring and integrating the
product, including aspects of branding design, usability, and function.


USER EXPERIENCE (UX) EVALUATION

User Experience evaluation refers to a collection of methods, skills and tools utilized to uncover how a person
perceives a system before, during, and after interacting with it.

UX time spans
Experiencing a UX design involves several time spans, such as the span of time of an experience during usage
and the span of time during episodes of usage or non-usage. Four types of time spans are commonly distinguished in UX:




Anticipated UX | The time span before actual usage of the product, while imagining the product.
Momentary UX | Experiencing the change of feelings or emotions during interaction.
Episodic UX | Evaluation of a specific episode of usage.
Cumulative UX | Perspective on the entire system as a whole unit.




USEFULNESS

“… the degree to which a product enables a user to achieve his or her goals, and is an assessment of the user’s
willingness to use the product at all.”

Unless something is useful, all other usability dimensions become mostly redundant!

Designing the right thing vs. designing things right




Doing things right → means doing things in the manner they are expected, needed, or required to be done.
Doing the right thing → means making the right choice, choosing the right path, and displaying high ethics and
moral rectitude.

Danger of early usability testing
Fixing problems instead of expanding the solution space

Apply the right methods for the right phase
- Concept drawings do not provide deep insights
- Task-centered evaluation focuses on the negative

INTRODUCTION TO EVALUATION


WHY EVALUATE?

• Inform design decisions (= answer questions)
• Identify and fix design problems
• Money (cost reduction, profit gains)
• Create a sense of innovation
• Generation of scientific / intermediate level knowledge → generalize


EVALUATION AND USER RESEARCH

There is considerable overlap between doing evaluations and doing user research:

• Evaluations of existing (competitor) products can be a part of research
• Other than unearthing existing issues, evaluations also give input for future directions
• Shared methods (especially qualitative, e.g., interviews)


FORMS OF USABILITY TESTING

• Lab vs. field testing
• Remote evaluations – (a)synchronous
o Any usability testing method where the evaluator and user participants are not in the same
location
• Incidental vs. long-term




INFORMAL EVALUATIONS

• Fast & cheap
• Less planning
• Loose recruitment (e.g., even colleagues)
• Less structured (e.g., talk about existing usability issues)
• Less formal output (report)

SETTING UP A STUDY


IN SHORT

What to consider

• Planning | Approach, methods & time
• Sample | Sampling technique, kind of users, size
• Ethics | Consent, data transparency, privacy (GDPR)


PLANNING

It’s important to come up with an overall plan:

• Which approach and methods at what time
• How to integrate results (triangulation)
• Distribute resources (money, time, participants) accordingly


SAMPLE

Participants

• Convenience sampling | recruiting whoever you can get hold of from the general population
• Purposive sampling | recruiting people with one or more special characteristics (see the screening sketch after this list)
o Matching your target demographic (e.g., based on personas)
o Matching a particular skill you currently need (e.g., design skills)
o Novices versus experts
• Depends on the question you want to answer
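A minimal illustration of purposive sampling as screening against explicit criteria (the recruitment pool, fields, and criteria below are invented for the example, not taken from the lecture):

```python
# Hypothetical recruitment pool; all fields and values are invented for illustration.
pool = [
    {"id": "P01", "occupation": "nurse",   "expertise": "novice"},
    {"id": "P02", "occupation": "student", "expertise": "expert"},
    {"id": "P03", "occupation": "nurse",   "expertise": "expert"},
]

def purposive_sample(pool, **criteria):
    """Keep only candidates whose attributes match every screening criterion."""
    return [p for p in pool if all(p.get(key) == value for key, value in criteria.items())]

# e.g., recruiting expert nurses to match a clinical persona
print(purposive_sample(pool, occupation="nurse", expertise="expert"))
```

Convenience sampling, by contrast, would simply take whoever in the pool happens to be available.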

“Friendly” users

• Ambassadors of your product
• Committed users, generally known to your commercial teams
• Take pride in being involved
• Easy to approach (frequently)

Sample size

• Depends on the purpose of your evaluation, guidelines are:
o Summative/assessment study, emphasis quantitative: 5-7
o Formative/exploratory study, emphasis qualitative: 10-20
o Comparison study: around 30 participants per group/product
• Available resources (budget, time) have an influence





ETHICS

• Research must be beneficial and cause no harm.
• Participants should never feel uncomfortable, physically or psychologically.
o Your participants will usually be slightly nervous, no matter what.
• Assess potential risks, real and perceived, and mitigate these.
o Consider the study from the participants’ perspective.
o Notify participants of potential (perceived) risks.
• Stress that you are evaluating the product, not the participant.
• Acknowledge the participants’ expertise.

Informed consent

• Participants have the right to be informed.
o Purpose of the study, procedure, risk, participants’ rights.
• Using deception, or working with children or other vulnerable target groups, is not allowed without permission
from an ethics board.
• Debrief participants about nature, results, and conclusions.

Right to withdraw

• Participants always have the right to withdraw.
o Without penalty.
o Without giving you a reason.
• Do not trick/persuade people into participation, especially if you know them personally.

Privacy & confidentiality

• Keep participation confidential
• No identifying information should be kept with the actual data
o Names → use a participant ID (and de-identify as soon as possible; see the sketch after this list)
o Contact details
o ANY identifiers
• Store data securely, with no links between participant IDs and names/contact details
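A minimal sketch of the participant-ID idea above, assuming a simple two-file setup (the file name and ID format are invented for illustration): the key table linking IDs to names is stored separately under restricted access, and only the ID is kept with the study data.

```python
import csv
import uuid

def pseudonymize(names):
    """Assign a random, non-identifying participant ID to each name.

    Returns (key_table, ids): key_table maps ID -> name and must be stored
    separately from the study data (and deleted once it is no longer needed).
    """
    key_table = {f"P-{uuid.uuid4().hex[:8]}": name for name in names}
    return key_table, list(key_table)

key_table, participant_ids = pseudonymize(["Alice Example", "Bob Example"])

# Store the key table in a separate, access-restricted location ...
with open("key_table_restricted.csv", "w", newline="") as f:
    csv.writer(f).writerows(key_table.items())

# ... and keep only the participant IDs with the actual study data.
print(participant_ids)
```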

Valid and reliable data

• Always ensure that the data you collect are accurate, valid, and reliable
• Never collect data you know are invalid or unreliable
• Be transparent about this

INTERMEDIATE LEVEL KNOWLEDGE

The foundational observation to start from is this: the essence of
research is to produce knowledge, and the essence of design is to
produce artifacts. Artifacts themselves can be said to be knowledge
in the very simple sense that they answer the research question: “How
would you design an …”.

A complete artifact is a particular response to a particular situation,
and strictly speaking it is not necessarily meaningful in its entirety
outside that situation → abstraction has to take place from the level
of particular artifacts to a higher level, in order to produce a
knowledge yield that is applicable across a broader range of
situations.


