User-Centred Design
Status: Complete
Category: Design
Default enforcement: Soft
Author: PushBackLog team
Tags
- Topic: design, ux, research
- Skillset: design
- Technology: generic
- Stage: discovery, refinement
Summary
User-Centred Design (UCD) is a design philosophy and iterative process in which the needs, goals, behaviours, and constraints of end users are the primary driver of product decisions. Rather than building what is technically convenient or internally assumed to be useful, UCD grounds every decision in validated user understanding.
Rationale
Software that engineers find intuitive is frequently software that only engineers find intuitive. The assumptions embedded in a product by people who understand it deeply are often invisible to people encountering it for the first time. User-Centred Design makes those assumptions explicit and tests them against reality before they are built into production code.
The cost of correcting a design error found in user research is negligible. The cost of correcting it after it has been engineered, QA’d, and shipped is significant. UCD is fundamentally a cost management practice, not merely an empathy exercise.
Guidance
The UCD process
UCD is iterative rather than linear. The core loop:
- Understand — Research who the users are, what they need, and what their current experience looks like
- Define — Frame the problem to be solved in user terms, not feature terms
- Ideate — Generate design solutions against the defined problem
- Prototype — Build the minimum fidelity model needed to test the most important assumption
- Test — Put the prototype in front of users; observe, listen, and measure
- Iterate — Incorporate findings and repeat
The loop does not end at release. User research in production (analytics, session observation, support analysis) feeds back into the next cycle.
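As a minimal sketch of how production analytics might feed the next cycle, the drop-off between adjacent funnel steps is often the first signal worth reviewing. The step names and counts below are invented for illustration:

```python
def funnel_dropoff(funnel):
    """Return (from_step, to_step, drop_rate) for each adjacent pair of steps."""
    return [
        (a, b, 1 - n_b / n_a)
        for (a, n_a), (b, n_b) in zip(funnel, funnel[1:])
    ]

# Hypothetical counts from a production checkout funnel
steps = [
    ("viewed_checkout", 10_000),
    ("entered_payment", 6_200),
    ("confirmed_order", 4_900),
]

for a, b, drop in funnel_dropoff(steps):
    print(f"{a} -> {b}: {drop:.0%} drop-off")
```

A large drop at one step is not a diagnosis on its own; it tells the team where to point the next round of interviews or session observation.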
User research methods
| Method | Fidelity | Best for |
|---|---|---|
| Stakeholder interviews | Low | Establishing initial direction, organisational constraints |
| User interviews | Low-medium | Understanding mental models, workflows, pain points |
| Contextual inquiry | Medium | Observing actual use in real context |
| Surveys | Low | Quantifying attitudes and behaviours at scale |
| Usability testing | Medium-high | Validating specific interface decisions |
| A/B testing | High | Measuring behavioural impact of two design variants |
| Analytics review | High | Understanding aggregate behaviour patterns in production |
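To make the A/B testing row above concrete, here is a minimal sketch of comparing two variants with a two-proportion z-test. The counts are invented; in practice an experimentation platform or statistics library would handle this, including sample-size planning:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: variant B converts at 5.5% vs 5.0% for A
z = two_proportion_z(conv_a=500, n_a=10_000, conv_b=550, n_b=10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 is roughly significant at the 5% level
```

Note that even a visible uplift (here 0.5 percentage points across 10,000 users per arm) can fall short of significance — a reminder that A/B tests need adequate traffic before they answer anything.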
Personas and jobs-to-be-done
User research output should be synthesised into artefacts the product team can reason from:
User personas describe representative user archetypes — their goals, behaviours, pain points, and context. They prevent “the user” from meaning “someone like us.”
Jobs-to-be-done reframe features as user outcomes: “When [situation], I want to [motivation], so I can [expected outcome].” This framing keeps the team focused on what the user is trying to accomplish rather than the feature mechanism.
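As an illustrative sketch (the class and field names are our own, not a standard), the JTBD template can be captured as a small structured record so that stories stay in the "When… I want… so I can…" shape rather than drifting back into feature language:

```python
from dataclasses import dataclass

@dataclass
class Job:
    """One jobs-to-be-done story in the 'When / I want / so I can' shape."""
    situation: str   # When [situation]
    motivation: str  # I want to [motivation]
    outcome: str     # so I can [expected outcome]

    def render(self) -> str:
        return (f"When {self.situation}, I want to {self.motivation}, "
                f"so I can {self.outcome}.")

job = Job(
    situation="I'm investigating a suspected data issue",
    motivation="narrow the dataset to the time window where the problem occurred",
    outcome="quickly isolate which records are affected",
)
print(job.render())
```

Keeping the three parts as separate fields makes it obvious when a story is missing its situation or outcome — the usual sign that a feature has been smuggled in as a "job".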
Information architecture
Before any visual design begins, the structure of the product should be resolved:
- What are the primary tasks the user needs to accomplish?
- How do they navigate between them?
- How is information grouped and labelled?
- What is the hierarchy of content on each screen?
Card sorting and tree testing are lightweight methods for validating information architecture before it is built.
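As a hedged sketch of how card-sort results might be analysed (the participant data below is invented), a co-occurrence count shows how often participants grouped two cards together — the raw material for deciding which labels belong in the same navigation section:

```python
from collections import Counter
from itertools import combinations

def co_occurrence(sorts):
    """Count how many participants placed each pair of cards in the same group.

    `sorts` holds one list of groups per participant; each group is a set of cards.
    """
    counts = Counter()
    for groups in sorts:
        for group in groups:
            for a, b in combinations(sorted(group), 2):
                counts[(a, b)] += 1
    return counts

# Hypothetical results from three participants sorting four cards
sorts = [
    [{"Invoices", "Receipts"}, {"Settings", "Profile"}],
    [{"Invoices", "Receipts", "Settings"}, {"Profile"}],
    [{"Invoices", "Receipts"}, {"Settings", "Profile"}],
]
for pair, n in co_occurrence(sorts).most_common(2):
    print(f"{pair}: grouped together by {n}/3 participants")
```

Pairs grouped together by most participants are strong candidates for the same section; pairs that never co-occur probably should not share a menu.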
Prototype fidelity
Match prototype fidelity to the question being asked:
- Paper / whiteboard — layout and flow validation; fast, cheap
- Wireframe — structural and navigational validation
- Interactive prototype — interaction and task completion validation
- Functional prototype — performance and integration validation
Never build high-fidelity prototypes for questions that a whiteboard can answer.
Common failure modes
| Failure | Description |
|---|---|
| Assumed users | Product decisions made for a hypothetical user nobody has actually spoken to |
| Research theatre | User research conducted but findings not incorporated into decisions |
| Single-pass research | Research done at the start of a project and never revisited |
| Designing for power users | Interface optimised for people who use it all day; unusable for new users |
| Conflating usability with aesthetics | Visual polish addressed while fundamental usability problems remain |
Examples
User interview script skeleton
Interview guide — Checkout flow research
Participant: [role, context]
Duration: 45 minutes
1. Warm-up (5 min)
• Tell me about your role and how you use [product] day-to-day.
• How often do you [task under study]?
2. Current experience (15 min)
• Walk me through the last time you [task]. What were you trying to accomplish?
• Where did you start? What did you do next?
• Were there any moments where you felt stuck or uncertain?
• What would have made that easier?
3. Concept exploration (15 min)
[Show prototype or screenshot]
• Before you click anything — what do you notice first?
• What would you expect to happen if you clicked [element]?
• Try to [task]. Think aloud as you go.
4. Wrap-up (10 min)
• Is there anything else about this experience that we should know?
• What would make this 10x better for you?
Key interview principles: ask about behaviour, not opinions. “What did you do?” yields more signal than “What would you do?”. Ask “Why?” at least once per answer.
Jobs-to-be-done framing example
Feature framing (to avoid):
“As a user, I want to filter results by date so that I can find recent items.”
JTBD framing (preferred):
“When I’m investigating a suspected data issue, I want to narrow the dataset to the time window where the problem occurred, so I can quickly isolate which records are affected without wading through unrelated data.”
The JTBD framing surfaces the context, the motivation, and the expected outcome. A filter UI is one solution — a pre-built “recent issues” view might be better. The feature-first framing forecloses that conversation.
Prototype fidelity decision matrix
| Question to answer | Prototype needed | Why |
|---|---|---|
| Does this navigation structure make sense? | Paper / whiteboard | No visual polish needed; structure is the question |
| Can users complete this 3-step flow? | Clickable wireframe | Interactions matter; visual design does not yet |
| Does this microcopy reduce support requests? | High-fidelity HTML | Exact wording and context required |
| Does this feature increase conversion? | A/B test in production | Behavioural data at scale needed |
Related practices
Part of the PushBackLog Best Practices Library.