Reimagining Enterprise Field Tools
Translated fragmented workflows into a unified execution model, aligning product architecture with real user behavior and future scalability.
Challenge
Field teams struggled with a fragmented platform, slow updates, complex navigation, and unclear task guidance. These issues surfaced in negative NPS scores and recurring critical feedback. As a result, adoption remained low and nearly 90% of platform features were underutilized.
My Role
As the lead designer on this initiative, I also led the research effort to better understand how the platform supported real field workflows. My responsibilities included:
- Defining and leading a multi-level research initiative across field teams and regional leadership
- Conducting 30+ in-depth user interviews and contextual inquiries to understand behaviors, motivations, and workflow challenges
- Facilitating Jobs to be Done (JTBD) workshops with 100+ participants to identify gaps between task importance and satisfaction
- Synthesizing qualitative insights, behavioral analytics, and quantitative usage data to uncover adoption patterns and workflow inefficiencies
- Running prototype validation sessions with 50+ users to test and refine proposed workflow improvements
Research Overview

I began by analyzing NPS feedback and internal user input through affinity mapping, which highlighted consistent issues around usability, navigation complexity, and delayed updates. I then mapped the application architecture to understand how the structure of the platform contributed to daily workflows.
To validate these patterns, I conducted in-depth interviews with field reps and team leads to uncover motivations, behaviors, and pain points in their daily routines. Click-path analysis confirmed where users spent time, where they abandoned tasks, and which features were underutilized.
I then ran JTBD workshops with additional field teams and regional managers to quantify critical workflow gaps at every level, identify critical tasks, and prioritize opportunities for redesign. Early prototype testing of proposed solutions ensured improvements aligned with real user behavior before moving into design execution.
Problem Landscape
Although the platform offered broad feature coverage, NPS remained negative, with significantly more detractors than promoters. Feedback pointed to navigation complexity, delayed updates, and the need to switch between multiple systems to complete core tasks. For field teams operating in time-constrained environments, these inefficiencies compounded quickly.
Behavioral data reinforced the issue. While the platform contained a wide range of features, 85-90% of them saw little to no engagement. Activity was concentrated around compliance-driven workflows: account lists, objectives, visits, and surveys. Even then, interactions were often superficial. Reps would tap into features to register activity, then immediately switch to a separate system that offered more reliable functionality.
This created a structural tension: the platform was contractually required for compliance, but operationally secondary for execution.

Trust further deteriorated due to delayed updates and reporting lag. Field data surfaced days after submission, forcing managers and support roles to rely on manual interpretation and secondary communication. Reps frequently deferred to team leads or upper management for direction rather than relying on the tool itself.
As one sales representative noted:
"I mainly use it as a launching point to get into accounts."
And from a leadership perspective:
"My sales team is not so hot on it. They see it as an extra step between them and selling the account."
To compensate, teams created workarounds: manually tracking objectives, visits, and surveys outside the system. Supporting materials were often rebuilt locally to avoid friction. Instead of enabling execution, the platform had become an administrative layer.
The core issue was not a lack of features. It was a misalignment between product structure and real-world workflow.
Research Strategy
I approached the investigation with a layered, mixed-methods strategy to identify both behavioral and systemic pain points, triangulating insights across qualitative and quantitative sources.
Affinity Mapping
I synthesized NPS feedback and internal input, revealing that users experienced low adoption, delayed updates for compliance-tracked items, long load times, and limited task guidance. Patterns suggested that platform complexity and performance issues were pervasive across field workflows.
Site Mapping
Site mapping clarified the structural problem: the application architecture was cluttered and required excessive clicks to reach critical workflows, including task management, survey completion, and compliance tracking, which amplified user frustration.
User Interviews
Interviews with 30 participants across field and team leadership roles deepened our understanding of user needs. Key priorities centered on minimizing extraneous features, centralizing task and objective tracking, and streamlining guidance to create a more seamless experience.
Click Path Analysis
Click-path analysis validated the behavioral patterns: users primarily navigated between core account views, while roughly 90% of other features went largely unused. This confirmed that low-value areas created friction and adoption gaps.
JTBD Workshops
Workshops with 100+ participants quantified the gaps between importance and satisfaction, highlighting workflow deficiencies in planning, goal preparation, execution, and post-activity review.
Prototype Testing
Prototype testing in Figma and Lyssna provided early validation. Redesigned dashboards and task-tracking screens scored highly in satisfaction; most users could quickly identify incomplete objectives and missing items, and most noted significant improvement over the previous workflow.
"This redesign encapsulates the changes needed to optimize task management. Including performance metrics and account insights is a welcomed addition that could help teams immediately by providing actionable information for each account." - Field User
By layering quantitative data, qualitative synthesis, and early prototype validation, I was able to identify precise workflow failures, confirm user behavior patterns, and prioritize interventions that would meaningfully improve adoption, efficiency, and data reliability.
Insights
Quantified Workflow Gaps
The largest gaps between importance and satisfaction were in route planning, task execution, and post-activity review, clearly highlighting where the platform failed to support field workflows and providing actionable priorities for improvement.
Core Feature Focus & Adoption
Users focus almost entirely on a few essential areas while most features remain unused, leading to workarounds and reliance on other tools. Adoption is strongly tied to perceived value and alignment with critical workflows.
Trust & Data Reliability Gaps
Delays in updates and reporting create distrust, forcing users to rely on supervisors or manual tracking. This systemic issue reduces efficiency and accountability across teams.
Learnings
What Worked Well
The multi-method research approach effectively uncovered critical workflow gaps, adoption patterns, and trust issues. By combining qualitative insights, click-path analysis, and JTBD validation, I synthesized a clear picture of user needs and priorities, providing a strong foundation for actionable recommendations. Early prototype testing further validated design directions and aligned improvements with real user behavior, reinforcing the value of iterative, evidence-driven research.
What I'd Do Differently
I would focus on building ongoing trust with users by maintaining continuous engagement throughout the project. Regular check-ins and feedback sessions would ensure that research not only addresses business priorities but also genuinely supports the field teams, creating stronger alignment and richer insights over time.
Key Takeaway
Research doesn't just validate assumptions; it challenges them. The gap between what we thought users needed and what they needed was significant. This project reinforced the importance of talking to users early and often, and using data to drive decisions rather than intuition alone.
Want to learn more?
I'd be happy to walk through this project in more detail and discuss how my approach could benefit your team.
Get in Touch