Optimizing complex data entry

Redesigning the information input process to make it 75% easier to use.

Background

At the start of my tenure at NewRetirement (now Boldin), it became clear to me that one of the core issues with our product was the way users edited information in their financial plan. The existing experience, called "My Plan," was impossible to use on a mobile device. The team knew there were issues, but they weren’t sure how to fix them.

Original Data Entry Experience

Deciding what to optimize for

As we looked at the opportunities in front of us, we decided to optimize for a couple of things:

Mobile traffic

Could a user on their mobile device easily edit their financial plan?

Plan completion

Could users easily understand how to complete their financial plan and did they feel motivated to complete their plan?

Core user acceptance

Would our existing user base view this change as a positive enhancement?

Initial testing

Before starting any design work, I tested the existing UX flow of the product to suss out any other UX issues that needed to be addressed. Our initial task ease rating was an average of 2.8/5.

Establishing the page structure

The stakeholder team held a couple of strong beliefs about the user experience:

  • Seeing the output of your edits (in the form of multiple charts and graphs) after making them was essential

  • All information on the page was valuable. They didn’t want to remove much.

As the founding designer, I had to balance well-intentioned opinions with design best practices. I leaned into the choices that genuinely improved the user experience, then used those wins to help the team make tougher tradeoffs elsewhere.

2 Column Structure

Rather than removing the right (output) column, I opted to make it larger so that desktop users (who were in the majority) could easily see the impact of plan edits — something users liked in the initial testing of this part of the site.

This made the left (input) column tighter, which allowed me to make a strong case for progressive disclosure and, in turn, enabled us to make the UX fully responsive on mobile.

Summary Rows

Summarizing data and having an explicit edit function allowed us to responsively display information to users, moving us away from the table-based structure we had previously.

They also created a clear line between viewing mode and editing mode. This solved a major usability issue by enabling us to give clearer user feedback about when information was being saved and applied to their plan.

Before

After

Add and Edit Flows

To reduce cognitive overload and provide additional contextual education, I proposed breaking our add and edit flows into a couple of steps (similar to what we did successfully with onboarding). This allowed us to present only relevant information to users and be more supportive as they set up their financial plan.

Additionally, some dimensions of financial planning required us to capture more information to ensure accuracy (for example, employer 401(k) contributions). This structure allowed us to do that without breaking the interface.

Looking at the information architecture

Tuning the experience

Now that the page structure was established, I had to take a look at how My Plan was organized.

Reorganization & Relabeling

We wanted to reduce the number of pages in My Plan and group similar information to help solve some of the category overlap issues.

By relabeling some categories we were able to make it easier for users to find what they were looking for. We went from 11 My Plan pages to 8.

Handling Ages and Dates

Unlike many other financial products, our planner was focused on projecting financial information. We relied on users inputting dates accurately to produce a solid calculation, but the way we labeled dates was inconsistent and confusing.

Date ranges

Our team aligned around having a set way to label date ranges (Start age and Stop age). This tested well. We also tested to determine if a user’s mental model was to include the entire month or stop at the first of the month so we could make calculations in a way that was aligned with what our users thought.

Communicating End of life

While Retirement age was easily understood by users, the concept of Goal age was not. Some users thought it was a proxy for Retirement age when in reality it was how long a user expected to live. Our team didn’t want this term to be morbid, so we opted to change Goal age to Longevity age which was more easily understood by users.

Multi-category Mapping

To address split mental models around categories like rent, we designed a system that intelligently surfaced the same data across multiple sections. This ensured users could find information wherever they expected to, while maintaining a single source of truth behind the scenes.
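
The mapping described above can be sketched as a simple lookup layer: each entry is stored once, and sections resolve to the shared data. This is a minimal illustration of the idea, not the actual product schema; the entry, section, and field names here (`rent`, `housing`, `expenses`) are assumptions.

```python
# Minimal sketch of multi-category mapping: each data item is stored once
# (single source of truth), but registered under every section where users
# might look for it. Names are illustrative, not the real product schema.

entries = {
    "rent": {"amount": 1800, "cadence": "monthly"},  # stored exactly once
}

# One entry can be surfaced in several My Plan sections.
section_index = {
    "housing": ["rent"],
    "expenses": ["rent"],
}

def items_in_section(section: str) -> list[dict]:
    """Resolve a section name to the shared underlying entries."""
    return [entries[key] for key in section_index.get(section, [])]

# Editing the shared entry updates every section that surfaces it.
entries["rent"]["amount"] = 1900
assert items_in_section("housing")[0]["amount"] == 1900
assert items_in_section("expenses")[0] is items_in_section("housing")[0]
```

Because both sections resolve to the same underlying object, an edit made in either place stays consistent everywhere it appears.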

Embedded Contextual Support

The original help experience pulled users out of context, sending them to a separate page instead of layering guidance where they needed it. This interruption in flow led to frustration and task drop-off. We designed embedded, contextual support that kept users in the experience.

Elevating Plan Completion

In early testing, users responded well to seeing how updates affected their forecast. To build on that, I added Plan completion to the right-hand column so they could clearly track progress through the planning flow.

We also added Start buttons to any incomplete plan dimensions, making it obvious what they still needed to do to finish their plan.

Testing the new experience

While I tested more discrete UX concerns in Figma prototype form, I knew that seeing their own data in the planner would greatly impact people’s perception of the user experience. Our team decided to do the bulk of our testing during development. I worked closely with engineering to spin up a test environment where we could test the most stable version of the latest code to ensure we were delivering a great experience.

New users

Because it was important to us to appeal to people who were interested in financial planning but not necessarily highly financially literate, we designed a series of tests in which new users would go through onboarding first, then complete a series of tasks in the My Plan section of the planner.

User Testing Flow

We conducted 120 unmoderated user tests.

Desirability

79%

wanted to use the planner for their financial planning

Task completion

>80%

completed their tasks successfully

Users appreciated being able to see the impact of data changes on their plan; several felt the product was better than the tools they had used before.

The biggest issue in completing tasks was our existing navigation.

Existing users

Before fully launching we put the experience into beta on our site. This meant our current users could opt-in to the new experience for a period of time. While the experience was in beta we invited 11 users to participate in a more formal test of the new experience.

Seven viewed the change as positive, two were neutral, and two viewed it as negative.

Users liked the bigger right column graphs, summaries felt easier to read, and information felt like it was organized more logically. They also liked that they could see changes more clearly. Many folks commented that it felt much cleaner and they liked the layout.

One last change

Updating our navigation

Navigation was seen as the biggest usability issue we had when we tested the updated My Plan functionality. We knew we had to address this before calling this project “done”.

We decided to shift from a horizontal navigational structure to a vertical, left navigation that made the links much more apparent to users.

Old Navigation

New Navigation

This change had a very positive impact on usability.

  • Users found things 30% faster

  • Users had a 100% task success rate

  • Task ease went from 3.5/5 with the old nav to 4.9/5 with the new nav

Impact

Plan completion

95%

of users were able to successfully complete their plan

Subscription rate

114%

increase in subscription rate to PlannerPlus

Core user acceptance

81%

of existing users viewed the change as positive or neutral

Ease of Use

75%

increase as measured by average task ease rating

+2.1 points; 2.8 → 4.9

Mobile usage

83%

of mobile users rated their ease of use as 4+ (of 5)

© 2026 Rachel Diesel
