Mini Case Study

Dashboard Redesign

2x upgrade rate from a two-day dashboard redesign

Background

When I joined NewRetirement (now Boldin), design was new territory. We needed early momentum, but data and research practices were still maturing. I started with what we did have—common sense and heuristics—and targeted the dashboard as a high-visibility first win.

Before

The original dashboard design (BEFORE)

Problems

The dashboard was full of information… and short on answers.

Unclear primary takeaway

The dashboard showed lots of high-value metrics, but didn’t answer the user’s core question: Am I on track—and what matters most right now?

High cognitive load

Dense terminology, crowded layouts, and legend-dependent visuals assumed financial expertise—making the experience intimidating for many users.

Insight without action

The dashboard surfaced insights, but rarely translated them into clear next steps—leaving users unsure how to improve their plan.

Solution

I ran a two-day design sprint with the CEO, CTO, and Head of Product to define a dashboard iteration that was simple to implement and ready for A/B testing.

New design

Features

Outcomes

As engineering built out the new dashboard, our PM implemented the tracking needed to measure the impact of our A/B test. Once the experiment launched, we saw an early leading indicator: more users navigated to My Plan to finish their plan.

A/B test results

Plan completion

12%

increase in people completing their plan

+7pp; 58% → 65%

Coach engagement

70%

increase in people interacting with coach suggestions

+17pp; 24% → 41%

Upgrade Rate

2X

increase in PlannerPlus upgrades on day one

+1.7pp; 1.7% → 3.4%

The test delivered meaningful lifts across engagement and revenue: +12% plan completions, +70% interactions with coach suggestions, and 2x PlannerPlus subscriptions. We rolled the redesign out more broadly—and used the win to unlock bigger design investments.

© 2026 Rachel Diesel
