Nine surveys. One month. $200k in identified savings. When the Enterprise Architecture team needed to evaluate how software portfolios supported mission-critical work, I didn't just design a survey; I redesigned the research infrastructure itself.

My Roles
Lead UX Researcher · Lead Survey Designer
Timeline
1 month
Tools
Qualtrics · Teams · GitHub · MS Office · Mural
9→1
Separate surveys consolidated into one adaptive instrument
75%
Fewer questions per employee through adaptive filtering
$200k
Identified client savings from the first deployment cycle

The Enterprise Architecture team was operating with fragmented, low-quality data across nine separate survey instruments, creating survey fatigue, inconsistent outputs, and an inability to make confident portfolio decisions.

I was brought in as Lead UX Researcher and Lead Survey Designer to solve not just the survey design problem, but the underlying research operations problem.

I made an early strategic call: this wasn't a survey refresh project; it was a consolidation and systems design project. I mapped nine surveys to their underlying business questions, identified redundancies, and designed a single adaptive instrument that used filtering logic to route employees to only the questions relevant to their office and software context.
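
Conceptually, the routing works like the short sketch below. This is illustrative only: the production instrument was built with Qualtrics branching and display logic rather than code, and the office and application names are hypothetical placeholders.

```python
# Illustrative sketch of the consolidation idea, not the Qualtrics build.
# Office and application names are hypothetical placeholders.

# One portfolio map replaces nine separate surveys: each office is routed
# to the slice of the application portfolio it actually owns.
OFFICE_PORTFOLIOS: dict[str, list[str]] = {
    "Office A": ["App 1", "App 2", "App 3"],
    "Office B": ["App 2", "App 4"],
    # ...one entry per participating office
}

def applications_for(office: str) -> list[str]:
    """Return the applications a respondent from this office is asked about."""
    return OFFICE_PORTFOLIOS.get(office, [])

print(applications_for("Office B"))  # ['App 2', 'App 4']
```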

Suggested Next Steps

  • Set an auditing schedule
  • Review applications and adjust as needed

The following screens walk through the key design decisions embedded in the consolidated survey instrument. Each decision was made to reduce cognitive load while improving data quality and scalability.

Design Decision

Office Choice: Replacing Nine Surveys with One

Employees could choose their office to automatically narrow down the applications list, eliminating the need for nine separate surveys entirely. This single filtering mechanism was the foundational design choice that made everything else possible.

Office selection screen, top
Office selection dropdown, employees choose their unit
Office selection screen, bottom
Portfolio mapped to office selection
Employees selected the office they belong to, narrowing their application list instead of taking nine separate surveys.
Design Decision

"Do You Use This Software?", Progressive Filtering

After selecting their office, employees were shown only the software relevant to their portfolio. A simple yes/no per application further filtered the list, routing each person to only the questions that applied to their actual tool usage.
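
Building on the earlier sketch, this second-stage filter can be pictured as one more pass over the office portfolio. Again, this is purely illustrative; the live survey used Qualtrics display logic, and the names below are hypothetical.

```python
# Hypothetical continuation of the routing sketch: the yes/no screening
# answers prune the office portfolio again, so detailed question blocks
# appear only for software the respondent actually uses.
def applications_to_evaluate(portfolio: list[str],
                             uses_app: dict[str, bool]) -> list[str]:
    return [app for app in portfolio if uses_app.get(app, False)]

office_portfolio = ["App 2", "App 4"]        # narrowed by the office filter
screening = {"App 2": True, "App 4": False}  # "Do you use this software?"
print(applications_to_evaluate(office_portfolio, screening))  # ['App 2']
```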

Software use question, left panel
Software relevance filter, left panel
Software use question, right panel
Software relevance filter, right panel
"Do you use this software?", By using this technique the user is able to filter down their list of software even further, reducing cognitive load at every step.
Design Decision

Current Use vs. Future Use: Preventing Context Switching

To prevent context-switching friction, questions were broken into two clearly separated sections: current use and future use. Mixing these question types forces respondents to mentally shift frames repeatedly, increasing error rates and fatigue. Separating them keeps the cognitive mode consistent throughout each section.

Current use section header
Current use section, respondents evaluate software they use today
Future use section
Future use section, separated to prevent context-switching
To help prevent context switching, the questions were split into separate current-use and future-use sections.
Design Decision

Standardized Answer Scale: Consistent, Exportable Data

All answers used a consistent dropdown scale of 0–5. This standardization was intentional: it ensures every response is comparable across offices, software, and audit cycles, enabling clean data export and analysis without manual normalization.
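
To make the downstream payoff concrete, here is a hedged sketch of how uniform 0–5 responses aggregate cleanly; the field names are hypothetical, not the actual export schema.

```python
# Illustrative sketch: because every answer is a 0-5 dropdown, each exported
# response reduces to the same flat shape, so results from any office or
# audit cycle can be combined without manual normalization.
# Field names are hypothetical, not the actual export schema.
from statistics import mean

responses = [
    {"office": "Office A", "app": "App 1", "question": "current_use", "score": 4},
    {"office": "Office A", "app": "App 1", "question": "current_use", "score": 1},
    {"office": "Office B", "app": "App 1", "question": "current_use", "score": 2},
]

# Average score per application, directly comparable across offices.
by_app: dict[str, list[int]] = {}
for r in responses:
    by_app.setdefault(r["app"], []).append(r["score"])

for app, scores in by_app.items():
    print(app, round(mean(scores), 2))
```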

Survey answer scale header
Standardized 0–5 scale applied across all software evaluations
Survey answer scale in use
Dropdown implementation, consistent format across all questions
All answers used dropdowns on a 0–5 scale, enabling streamlined, consistent data export across all offices and audit cycles.
Design Decision

Automation Opportunity Questions: Surfacing Insider Knowledge

Employees often know when a software change is approaching, and they frequently have ideas about how to improve their own efficiency. I built in open-ended automation opportunity questions so that qualitative insight could be captured without requiring a separate research initiative.

Follow-up open-ended question
Automation opportunity questions, capturing employee-generated efficiency insights
Employees often know when a software change is approaching, so why not ask them? How would they improve their own efficiency if they could?
Design Decision

Opt-In Follow-Up: Enriching Quant Data Without a Separate Study

At the end of the survey, respondents were given the option to volunteer for a follow-up qualitative conversation. This opt-in mechanism enriches quantitative data with depth, without requiring a separate recruitment process or research initiative.

Survey completion screen
Survey completion confirmation
Can we talk with you? opt-in screen
"Can we talk with you?", voluntary follow-up opt-in
Follow-up detail confirmation
Detail confirmation for opted-in respondents
Added opt-in follow-up for qualitative depth, so quantitative data can be enriched without a separate research initiative.

The consolidated survey identified multiple applications suitable for sunset, resulting in $200k in identified client savings within the first deployment cycle.

The instrument was designed to scale: the same tool now serves multiple offices, reducing research overhead for every future audit cycle. Key outcomes of the design:

  • Identified client savings: $200k from applications identified for sunset
  • Cognitive burden reduced: 75% fewer questions per employee through adaptive filtering
  • Surveys consolidated: 9 instruments collapsed into 1 evergreen instrument
  • Evergreen design: Runs on a regular auditing schedule without rebuilding each cycle
  • Scalable across offices: Serves multiple offices with the same instrument
Due to the confidential nature of the project, supporting documentation is not provided within the case studies, but it can be viewed during an in-person interview upon request.