Information vs. Cognitive Overload
Designing a Survey That Doesn't Tax Employees
Nine surveys. One month. $200k in identified savings. When the Enterprise Architecture team needed to evaluate how software portfolios supported mission-critical work, I didn't just design a survey; I redesigned the research infrastructure itself.
The Enterprise Architecture team was operating with fragmented, low-quality data across nine separate survey instruments, creating survey fatigue, inconsistent outputs, and an inability to make confident portfolio decisions.
I was brought in as Lead UX Researcher and Lead Survey Designer to solve not just the survey design problem, but the underlying research operations problem.
I made an early strategic call: this wasn't a survey refresh project; it was a consolidation and systems design project. I mapped nine surveys to their underlying business questions, identified redundancies, and designed a single adaptive instrument that used filtering logic to route employees to only the questions relevant to their office and software context.
Suggested Next Steps
- Set an auditing schedule
- Review applications and adjust as needed
The following screens walk through the key design decisions embedded in the consolidated survey instrument. Each decision was made to reduce cognitive load while improving data quality and scalability.
Office Choice: Replacing Nine Surveys with One
Employees could choose their office to automatically narrow down the applications list, eliminating the need for nine separate surveys entirely. This single filtering mechanism was the foundational design choice that made everything else possible.
"Do You Use This Software?": Progressive Filtering
After selecting their office, employees were shown only the software relevant to their portfolio. A simple yes/no per application further filtered the list, routing each person to only the questions that applied to their actual tool usage.
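The two-step routing described above can be sketched in a few lines. This is a hypothetical illustration, not the survey platform's actual implementation; the office names, application lists, and function names below are invented for the example.

```python
# Hypothetical sketch of the survey's progressive filtering logic.
# Office names and application lists are illustrative, not the real portfolio.

PORTFOLIOS = {
    "Finance": ["LedgerPro", "ReportGen", "SharedCRM"],
    "Operations": ["FleetTrack", "SharedCRM"],
}

def route_questions(office, uses_app):
    """Return only the applications a respondent should answer questions about.

    office   -- the office the employee selected on the first screen
    uses_app -- dict mapping app name -> yes/no answer from the second screen
    """
    candidates = PORTFOLIOS.get(office, [])  # step 1: office filter
    # step 2: per-application yes/no filter
    return [app for app in candidates if uses_app.get(app)]

# An Operations employee who uses only SharedCRM is routed to questions
# for one application, not every application across nine separate surveys.
routed = route_questions("Operations", {"FleetTrack": False, "SharedCRM": True})
print(routed)  # ['SharedCRM']
```

The point of the sketch is that both filters compose: every downstream question set is derived from the office choice plus the usage answers, which is what made a single instrument able to replace nine.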
Current Use vs. Future Use: Preventing Code-Switching
To prevent code-switching friction, questions were broken into two clearly separated sections: current use and future use. Mixing these question types forces respondents to mentally shift frames repeatedly, increasing error rates and fatigue. Separating them keeps the cognitive mode consistent throughout each section.
Standardized Answer Scale: Consistent, Exportable Data
All answers used a consistent dropdown scale of 0–5. This standardization was intentional: it ensures every response is comparable across offices, software, and audit cycles, enabling clean data export and analysis without manual normalization.
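A minimal sketch of why the shared 0–5 scale matters downstream, using invented response data: because every answer is already on the same integer scale, cross-office aggregation and export need no per-survey remapping. The field names and records here are assumptions for illustration only.

```python
# Hypothetical sketch: with one 0-5 scale across all questions, responses
# from different offices aggregate and export directly. Data is illustrative.
import csv
import io
from statistics import mean

responses = [
    {"office": "Finance", "app": "SharedCRM", "current_use": 4, "future_use": 2},
    {"office": "Operations", "app": "SharedCRM", "current_use": 5, "future_use": 1},
]

# Scores are already comparable, so a cross-office average is one line --
# no normalization step between differently scaled instruments.
avg_future = mean(r["future_use"] for r in responses)
print(avg_future)  # 1.5

# Export is likewise a plain CSV write with a single shared schema.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["office", "app", "current_use", "future_use"])
writer.writeheader()
writer.writerows(responses)
print(buf.getvalue().splitlines()[0])  # office,app,current_use,future_use
```

With nine separate instruments, each scale difference would have forced exactly the kind of manual normalization this design avoids.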
Automation Opportunity Questions: Surfacing Insider Knowledge
Employees often know when a software change is approaching, and they frequently have ideas about how to improve their own efficiency. I built in open-ended automation opportunity questions so that qualitative insight could be captured without requiring a separate research initiative.
Opt-In Follow-Up: Enriching Quant Data Without a Separate Study
At the end of the survey, respondents were given the option to volunteer for a follow-up qualitative conversation. This opt-in mechanism enriches quantitative data with depth, without requiring a separate recruitment process or research initiative.
The consolidated survey identified multiple applications suitable for sunset, resulting in $200k in identified client savings within the first deployment cycle.
The instrument was designed to scale: the same tool now serves multiple offices, reducing research overhead for every future audit cycle. Key outcomes of the design:
- Identified client savings: $200k from applications identified for sunset
- Cognitive burden reduced: 75% fewer questions per employee through adaptive filtering
- Surveys consolidated: 9 instruments collapsed into 1 evergreen instrument
- Evergreen design: Runs on a regular auditing schedule without rebuilding each cycle
- Scalable across offices: Serves multiple offices with the same instrument
Due to the confidential nature of the project, supporting documentation is not included in this case study, but it can be viewed during an in-person interview upon request.