REQUIREMENTS: NARRATIVE AS A DESIGN TOOL, ENVISIONING SOLUTIONS WITH PERSONA-BASED DESIGN

Lecture 23. Requirements
Learning Goals
The aim of this lecture is to introduce you to the study of Human Computer
Interaction, so that after studying it you will be able to:
·  Understand the narratives and scenarios
·  Define requirements using persona-based design
It has already been discussed how to capture qualitative information about users.
Through careful analysis of this information and synthesis of user models, we can get
a clear picture of our users and their respective goals. It has also been explained how
to prioritize which users are the most appropriate design targets. The missing piece to
the puzzle, then, is the process of translating this knowledge into coherent design
solutions that meet the needs of users while simultaneously addressing business needs
and technical constraints.
Now we shall describe a process for bridging the research-design gap. It employs
personas as the main characters in a set of techniques that rapidly arrive at design
solutions in an iterative, repeatable, and testable fashion. This process has three major
milestones: defining user requirements; using these requirements in turn to define the
fundamental interaction framework for the product; and filling in the framework with
ever-increasing amounts of design detail. The glue that holds the process together is
narrative: the use of personas to tell stories that point to design.
23.1 Narrative as a design tool
Narrative, or storytelling, is one of the oldest human activities. Much has been written
about the power of narrative to communicate ideas. However, narrative can also,
through its efficacy at engaging and stimulating creative visualization skills, serve as
a powerful tool in generating and validating design ideas. Because interaction design
is first and foremost the design of behavior that occurs over time, a narrative structure,
combined with the support of minimal visualization tools such as the whiteboard, is
perfectly suited for envisioning and representing interaction concepts. Detailed
refinement calls for more sophisticated visual and interactive tools, but the initial
work of defining requirements and frameworks is best done fluidly and flexibly, with
minimal reliance on technologies that will inevitably impede ideation.
Scenarios in design
Scenario is a term familiar to usability professionals, commonly used to describe a
method of design problem solving by concretization: making use of a specific story to
both construct and illustrate design solutions. Scenarios are anchored in the concrete,
but permit fluidity; any member of the design team can modify them at will. As
Carroll states in his book, Making Use:
Scenarios are paradoxically concrete but rough, tangible but flexible ... they
implicitly encourage 'what-if?' thinking among all parties. They permit the
articulation of design possibilities without undermining innovation ....
Scenarios compel attention to the use that will be made of the design product.
They can describe situations at many levels of detail, for many different
purposes, helping to coordinate various aspects of the design project.
Carroll's use of scenario-based design focuses on describing how users accomplish
tasks. It consists of an environment setting and includes agents or actors that are
abstracted stand-ins for users, with role-based names such as Accountant or
Programmer.
Although Carroll certainly understands the power and importance of scenarios in the
design process, there can be two problems with scenarios as Carroll approaches them:
·  Carroll's scenarios are not concrete enough in their representation of the
human actor. It is impossible to design appropriate behaviors for a system
without understanding in specific detail the users of the system. Abstracted,
role-oriented models are not sufficiently concrete to provide understanding of or
empathy with users.
·  Carroll's scenarios jump too quickly to the elaboration of tasks without
considering the user's goals and motivations that drive and filter these tasks.
Although Carroll does briefly discuss goals, he refers only to goals of the
scenario. These goals are somewhat circularly defined as the completion of
specific tasks. Carroll's scenarios begin at the wrong level of detail: User
goals need to be considered before user tasks can be identified and prioritized.
Without addressing human goals, high-level product definition becomes
difficult.
The missing ingredient in scenario-based methods is the use of personas. A persona
provides a sufficiently tangible representation of the user to act as a believable agent
in the setting of a scenario. This enhances the designer's ability to empathize with
user mental models and perspectives. At the same time, it permits an exploration of
how user motivations inflect and prioritize tasks. Because personas model goals and
not simply tasks, the scope of the problem that scenarios address can also be
broadened to include product definition. They help answer the questions, "What
should this product be?" and "How should this product look and behave?"
Using personas in scenarios
Persona-based scenarios are concise narrative descriptions of one or more personas
using a product to achieve specific goals. Scenarios capture the non-verbal dialogue
between artifact and user over time, as well as the structure and behavior of
interactive functions. Goals serve as a filter for tasks and as guides for structuring the
display of information and controls during the interactive process of constructing the
scenarios.
Scenario content and context are derived from information gathered during the
Research phase and analyzed during the modeling phase. Designers role-play
personas as the characters in these scenarios, similar to actors performing
improvisation. This process leads to real-time synthesis of structure and behavior--
typically, at a whiteboard--and later informs the detailed look and feel. Finally,
personas and scenarios are used to test the validity of design ideas and assumptions
throughout the process. Three types of persona-based scenarios are employed at
different points in the process, each time with a successively narrower focus.
Persona-based scenarios versus use cases
Scenarios and use cases are both methods of describing a digital system. However,
they serve very different functions. Goal-directed scenarios are an iterative means of
defining the behavior of a product from the standpoint of specific users. This includes
not only the functionality of the system, but the priority of functions and the way
those functions are expressed in terms of what the user sees and how he interacts with
the system.
Use cases, on the other hand, are a technique that has been adopted from software
engineering by some usability professionals. They are usually exhaustive descriptions
of the functional requirements of the system, often of a transactional nature, focusing
on low-level user actions and the accompanying system responses. Because the precise
way the system responds is not, typically, part of a conventional or concrete use case,
many assumptions about the form and behavior of the system to be designed remain
implicit. Use cases permit a complete cataloguing of user tasks for different classes of
users, but say little or nothing about how these tasks are presented to the user or how
they should be prioritized in the interface. Use cases may be useful in identifying edge
cases and for determining that a product is functionally complete, but they should be
deployed only in the later stages of design validation.
23.2 Envisioning solutions with persona-based design
It has already been discussed that the translation from robust models to design
solutions really consists of two major phases. Requirements Definition answers the
broad questions about what a product is and what it should do, and Framework
Definition answers questions about how a product behaves and how it is structured to
meet user goals. Now we look at the Requirements Definition phase in detail.
Defining the requirements
The Requirements Definition phase determines the what of the design: what functions
our personas need to use and what kind of information they must access to accomplish
their goals. The following five steps comprise this process:
1. Creating problem and vision statement
2. Brainstorming
3. Identifying persona expectations
4. Constructing the context scenario
5. Identifying needs
Although these steps proceed in roughly chronological order, they represent an
iterative process. Designers can expect to cycle through steps 3 through 5 several times
until the requirements are stable. This is a necessary part of the process and shouldn't
be short-circuited. A detailed description of each of these steps follows.
Step 1: Creating problem and vision statement
Before beginning any process of ideation, it's important for designers to have a clear
mandate for moving forward, even if it is a rather high-level mandate. Problem and
vision statements provide just such a mandate and are extremely helpful in building
consensus among stakeholders before the design process moves forward.
At a high level, the problem statement defines the objective of the design. A design
problem statement should concisely reflect a situation that needs changing, for both
the personas and for the business providing the product to the personas. Often a
cause-and-effect relationship exists between business concerns and persona concerns.
For example:
Company X's customer satisfaction ratings are low and market share has
diminished by 10% over the past year because users don't have adequate tools
to perform X, Y and Z tasks that would help them meet their goal of G.
The connection of business issues to usability issues is critical to drive stakeholders'
buy-in to design efforts and to frame the design effort in terms of both user and
business goals.
The vision statement is an inversion of the problem statement that serves as a high-
level design vision or mandate. In the vision statement, you lead with the user's
needs, and you transition from those to how business goals are met by the design
vision:
The new design of Product X will help users achieve G by giving them the
ability to perform X, Y and Z with greater [accuracy, efficiency, and so on],
and without problems A, B, C that they currently experience. This will
dramatically improve Company X's customer satisfaction ratings and lead to
increased market share.
The content of both the problem and vision statements should come directly from
research and user models. User goals and needs should derive from the primary and
secondary personas, and business goals should be extracted from stakeholder
interviews.
Step 2: Brainstorming
Brainstorming performed at this early stage of Requirements Definition assumes a
somewhat ironic purpose. As designers, you may have been researching and modeling
users and the domain for days or even weeks. It is almost impossible that you have
not had design ideas percolating in your head. Thus, the reason we brainstorm at
this point in the process is to get these ideas out of our heads so we can "let them go,"
at least for the time being. This serves the primary purpose of eliminating as much
designer bias as possible before launching into scenarios, preparing the designers to
take on the roles of the primary personas during the scenario process.
Brainstorming should be unconstrained and uncritical: put all the wacky ideas you've
been considering (plus some you haven't) out on the table and be prepared to
record them and file them away for safekeeping until much later in the process. It's
not likely that many of them will be useful in the end, but there might be the germ of
something wonderful that will fit into the design framework you later create.
Holtzblatt & Beyer describe a facilitated method for brainstorming that can be useful
for getting a brainstorming session started, especially if your team includes non-
designers.
Step 3: Identifying persona expectations
The expectations that your persona has for a product and its context of use are,
collectively, that persona's mental model of the product. It is important that the
representation model of the interface--how the design behaves and presents itself--
should match the user's mental model as closely as possible, rather than reflecting the
implementation model of how the product is actually constructed internally.
For each primary persona you must identify:
·  General expectations and desires each may have about the experience of using
the product
·  Behaviors each will expect or desire from the product
·  Attitudes, past experience, aspirations, and other social, cultural, environmental,
and cognitive factors that influence these desires
Your persona descriptions may contain enough information to answer some of these
questions directly; however, you should return to your research data to analyze the
language and grammar that user subjects use to describe the objects and actions that
are part of their usage patterns.
Some things to look for include:
·  What do the subjects mention first?
·  Which action words (verbs) do they use?
·  Which intermediate steps, tasks, or objects in a process don't they mention?
After you have compiled a good list of expectations and influences, do the same for
secondary and customer personas and crosscheck similarities and differences.
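While cross-checking expectations across personas, it can help to record the findings of this step in a simple structured form. The Python sketch below is only one illustrative way to do this; the class name, field names, and example entries are hypothetical (loosely paraphrased from the Salman persona used later in this lecture) and are not part of the method itself.

from dataclasses import dataclass, field
from typing import List

@dataclass
class PersonaExpectations:
    """Expectations and influences gathered in Step 3 for one persona."""
    persona: str
    general_expectations: List[str] = field(default_factory=list)  # desires about the overall experience
    expected_behaviors: List[str] = field(default_factory=list)    # behaviors expected or desired from the product
    influences: List[str] = field(default_factory=list)            # attitudes, past experience, cultural and cognitive factors

# Example entry for a primary persona (illustrative only):
salman = PersonaExpectations(
    persona="Salman",
    general_expectations=["checking e-mail should be quicker than booting up a computer"],
    expected_behaviors=["the phone should know which client he is currently talking to"],
    influences=["balances work and home life", "wants each client to feel like his only client"],
)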
Step 4: Constructing context scenarios
Scenarios are stories about people and their activities. Context scenarios are, in fact,
the most story-like of the three types of scenario we employ in that the focus is very
much on the persona, her mental models, goals, and activities. Context scenarios
describe the broad context in which usage patterns are exhibited and include
environmental and organizational considerations. Context scenarios establish the
primary touch-points that each primary and secondary persona has with the system
over the course of a day, or some other meaningful length of time that illuminates
modes of frequent and regular use. Context scenarios are sometimes, for this reason,
called day-in-the-life scenarios.
Context scenarios address questions such as the following:
·  What is the setting in which the product will be used?
·  Will it be used for extended amounts of time?
·  Is the persona frequently interrupted?
·  Are there multiple users on a single workstation/device?
·  What other products is it used with?
·  How much complexity is permissible, based on persona skill and frequency of use?
·  What primary activities does the persona need to accomplish to meet her goals?
·  What is the expected end result of using the product?
To ensure effective context scenarios, keep them broad and relatively shallow in
scope. Resist the urge to dive immediately into interaction detail. It is important to
map out the big picture first and systematically identify needs. Doing this, and using
the steps that follow, prevents you from getting lost in design details that may not fit
together coherently later.
Context scenarios should not represent system behaviors as they currently are. These
scenarios represent the brave new world of goal-directed products, so, especially in
the initial phases, focus on the goals. Don't yet worry about exactly how things will
get accomplished--you can initially treat the design as a bit of a magic black box.
Sometimes more than one context scenario is necessary. This is true especially when
there are multiple primary personas, but sometimes even a single primary persona
may have two or more distinct contexts of use.
An example context scenario
The following is an example of a first iteration of a context scenario for a primary
persona for a PDA/phone convergence device and service: Salman, a real-estate agent
in Lahore. Salman's goals are to balance work and home life, cinch the deal, and
make each client feel that he is Salman's only client.
Salman's context scenario might be as follows:
1. Getting ready in the morning, Salman uses his phone to check e-mail. It has a
large enough screen and quick connection time so that it's more convenient
than booting up a computer as he rushes to make his daughter, Alia, a
sandwich for school.
2. Salman sees an e-mail from his newest client, who wants to see a house this
afternoon. Salman entered the client's contact info a few days ago, so now he can call
him with a simple action right from the e-mail.
3. While on the phone with his client, Salman switches to speakerphone so he
can look at the screen while talking. He looks at his appointments to see when
he's free. When he creates a new appointment, the phone automatically makes
it an appointment with the client, because it knows with whom he is talking.
4. He quickly keys the address of the property into the appointment as he finishes his
conversation.
5. After sending Alia off to school, Salman heads into the real-estate office to
gather the papers he needs for the plumber working on another property. His
phone has already updated his Outlook appointments so the rest of the office
knows where he'll be in the afternoon.
6. The day goes by quickly, and he's running a bit late. As he heads towards the
property he'll be showing the client, the phone alerts him that his appointment is
in 15 minutes. When he flips open the phone, it shows not only the
appointment, but a list of all documents related to the client, including e-mail,
memos, phone messages, call logs to the client's number, and even thumbnail
pictures of the property that Salman sent as e-mail attachments. Salman
presses the call button, and the phone automatically connects to the client because
it knows his appointment with him is soon. Salman lets the client know he'll be there
in 20 minutes.
7. Salman knows the address of the property, but is a bit unsure exactly where it
is. He pulls over and taps the address he put into the appointment. The phone
downloads directions along with a thumbnail map showing his location
relative to the destination.
8. Salman gets to the property on time and starts showing it to the client. He hears
the phone ring from his pocket. Normally while he is in an appointment, the
phone will automatically transfer directly to voicemail, but Alia has a code she
can press to get through. The phone knows it's Alia calling, and uses a
distinctive ring tone.
9. Salman takes the call--Alia missed the bus and needs a pickup. Salman tells
his daughter that he will pick her up in 30 minutes.
Note how the scenario remains at a fairly high level, without getting too specific
about interfaces or technologies. It's important to create scenarios that are within the
realm of technical possibility, but at this stage the details of reality aren't yet
important.
Step 5: Identifying needs
After you are satisfied with an initial draft of your context scenario, you can begin to
analyze it to extract the persona's needs. These needs consist of objects and actions as
well as contexts. It is preferable not to think of needs as identical to tasks. The
implication is that tasks must be manually performed by the user, whereas the term
needs implies simply that certain objects need to exist and that certain actions on them
need to happen in certain contexts. Thus, a need from the scenario above might be:
Call (action) a person (object) directly from an appointment (context)
If you are comfortable extracting needs in this format, it works quite well; otherwise,
you can separate them as described in the following sections.
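If your team prefers to keep extracted needs in a lightweight structured list rather than prose, the same action/object/context triple can be captured as a small data record and later tagged with one of the categories described in the sections that follow. The Python sketch below is purely illustrative bookkeeping; the class, field, and category names are hypothetical and not part of the method.

from dataclasses import dataclass

@dataclass
class Need:
    """One persona need extracted from a context scenario."""
    action: str    # what the persona must be able to do
    obj: str       # the object the action operates on
    context: str   # the context in which the action must be possible
    category: str  # e.g. "data", "functional", or "contextual" (see the sections below)

# The need identified from Salman's context scenario above:
call_from_appointment = Need(
    action="call",
    obj="a person",
    context="directly from an appointment",
    category="functional",  # one plausible classification
)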
Data needs
Personas' data needs are the objects and information that must be represented in the
system. Charts, graphs, status markers, document types, attributes to be sorted,
filtered, or manipulated, and graphical object types to be directly manipulated are
examples of data needs.
Functional needs
Functional needs are the operations that need to be performed on the objects of the
system and which are eventually translated into interface controls. Functional needs
also define places or containers where objects or information in the interface must be
displayed.
Contextual needs and requirements
Contextual needs describe relationships between sets of objects or sets of controls, as
well as possible relationships between objects and controls. This can include which
types of objects to display together to make sense for workflow or to meet specific
persona goals, as well as how certain objects must interact with other objects and the
skills and capabilities of the personas using the product.
Other requirements
It's important to get a firm idea of the realistic requirements of the business and
technology you are designing for.
·  Business requirements can include development timelines, regulations, pricing
structures, and business models.
·  Technical requirements can include weight, size, form-factor, display, power
constraints, and software platform choices.
·  Customer and partner requirements can include ease of installation,
maintenance, configuration, support costs, and licensing agreements.
Now the design team should have a mandate in the form of the problem and vision
statements, a rough, creative overview of how the product is going to address user
goals in the form of context scenarios, and a reductive list of needs and requirements
extracted from your research, user models, and scenarios. Now you are ready to delve
deeper into the details of your product's behaviors, and begin to consider how the
product and its functions will be represented.