Case Study

Confirmation of Payee (CoP) Experience

Role

UX Researcher & Designer

Timeline

4 Months

Methods
  • Requirement Gathering
  • User Research & Understanding
  • Competitive Analysis
  • Wireframing
  • AI-Assisted Prototyping
  • Usability Testing
  • Iteration & Refinement
  • Stakeholder Review
  • Developer Handoff
Outcome

By implementing the Confirmation of Payee (CoP) feature, ANZ significantly reduced the risk of scams and mistaken payments by verifying that the payee's account name matches the receiving bank's records in real time.

Following the UX-led redesign, the product is now live with a streamlined, low-friction experience for adding payees and transferring funds. This resulted in a 30% increase in successful transactions, a 40% improvement in user engagement during the payment journey, and a measurable uplift in customer confidence and satisfaction.

The real-time validation not only reduces errors but also reinforces trust, ensuring customers feel secure that their payments are reaching the intended recipient.
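The real-time check described above can be sketched as a name-matching classifier. Everything below — the three match categories, the normalisation rules, and the token-reordering heuristic for close matches — is an illustrative assumption for this write-up, not ANZ's actual CoP logic:

```typescript
// Illustrative Confirmation of Payee (CoP) name check.
// Categories and heuristics are assumptions, not a real bank implementation.

type CopResult = "match" | "close_match" | "no_match";

// Normalise a name for comparison: lowercase, strip punctuation,
// collapse repeated whitespace.
function normalise(name: string): string {
  return name
    .toLowerCase()
    .replace(/[^a-z0-9 ]/g, "")
    .replace(/\s+/g, " ")
    .trim();
}

// Compare the name the customer entered against the account name
// held by the receiving bank. A "close match" here means the same
// name tokens in a different order (e.g. "SMITH, Jane" vs "Jane Smith") —
// an assumed heuristic for the sketch.
function checkPayeeName(entered: string, held: string): CopResult {
  const a = normalise(entered);
  const b = normalise(held);
  if (a === b) return "match";
  const tokensA = a.split(" ").sort().join(" ");
  const tokensB = b.split(" ").sort().join(" ");
  if (tokensA === tokensB) return "close_match";
  return "no_match";
}
```

In the live experience, a `close_match` result is where the UX carries the most weight: the customer is shown the held name and asked to confirm before the payment proceeds.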

From Ambiguity to Clarity

I translated broad conceptual models into precise interface designs, validating decisions at every stage of the process.

  • Discovery: Conducted 5 stakeholder interviews and 20 user card-sorting sessions to uncover mental models and expectations.
  • Definition: Mapped As-Is and To-Be user journeys to identify gaps, opportunities, and alignment points.
  • Validation: Tested rapid prototypes through real-world scenarios, focusing on task success rather than screens alone.
01 — The Challenge

Delivering CoP Through Global Collaboration

Designing the Confirmation of Payee (CoP) experience was challenging because of limited access to local users: ANZ does not operate in India, and end users were based overseas. To address this, I coordinated with counterparts in Melbourne to run remote user testing. I shared prototypes covering multiple scenarios, and testing was conducted with seven participants across different age groups and industries, including non-tech-savvy users. Reviewing the recorded sessions helped me identify key pain points, inform design iterations, and deliver the final solution to stakeholders. The experience is now live.

02 — The Process

Hypothesis & Conceptual Modeling

Before touching pixels, I needed to understand the user's mental model. I recruited 20 participants for an open card-sorting study. The key insight: users categorize by "Task Frequency" rather than "Department".

Conceptual Model: Task-Based Hierarchy vs. Org-Based Hierarchy

I sketched a new navigation concept focusing on a 'Mega Menu' for rare tasks and a 'Quick Access' dashboard for daily workflows. This separated "Monitoring" from "Management" functions.

Connecting the Dots: User Flows

To ensure there were no dead ends, I mapped out the entire report-generation flow. The goal was to prove that a user could get from "Dashboard" to "Export PDF" in three clicks or fewer.

Lo-Fi User Flow Diagram: Dashboard -> Filter -> Export
03 — Validation

Testing & Iteration

We A/B tested the new navigation against the old one. The new structure reduced time-to-value by 40%, and users reported feeling 'less overwhelmed'.

04 — Handoff & Specifications

Detailed Annotations

I don't just hand over screens; I hand over logic. My final delivery included a component-based breakdown and detailed behavior specs for developers.

  • Error States: Defined behavior for API timeouts during report generation.
  • Empty States: "What happens if a user has zero transactions?"
  • Interaction Specs: Defined simple CSS transition curves for the menu reveal.
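The error- and empty-state behavior above can be sketched as a small state machine of the kind a handoff spec might describe. The state names, timeout value, and copy strings below are illustrative assumptions, not the shipped specification:

```typescript
// Illustrative state model for report generation, covering the loading,
// empty, error (timeout), and ready states named in the handoff notes.
// All names and copy are assumptions for this sketch.

type ReportState =
  | { kind: "idle" }
  | { kind: "loading" }
  | { kind: "empty"; message: string }   // user has zero transactions
  | { kind: "error"; message: string }   // e.g. API timeout
  | { kind: "ready"; rows: number };

type ReportEvent =
  | { type: "request" }
  | { type: "success"; rows: number }
  | { type: "timeout" };

const TIMEOUT_MS = 10_000; // assumed wait before the error state is shown

// Pure transition function: given the current state and an event,
// return the next state the UI should render.
function nextState(current: ReportState, event: ReportEvent): ReportState {
  switch (event.type) {
    case "request":
      return { kind: "loading" };
    case "timeout":
      return { kind: "error", message: "Report timed out. Try again." };
    case "success":
      return event.rows === 0
        ? { kind: "empty", message: "No transactions to report yet." }
        : { kind: "ready", rows: event.rows };
  }
}
```

Modeling the spec as a pure transition function like this makes each annotated behavior ("what happens on timeout?", "what if there are zero transactions?") a single, testable branch for developers.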