City of San Antonio - Web Team

Using Survey Insights to Make City Websites Work Better for Residents

Role: User Researcher, Data Analyst

Skills: Survey Research, Data Analysis, Quantitative Research, Qualitative Research (Thematic Coding of Open-Ended Responses), Data Visualization

Tools: Microsoft Excel, Microsoft Office, Google Suite

Executive Summary: 

In 2022, I partnered with Firecat Studio as a researcher to help the City of San Antonio (COSA) transform how it understands and responds to resident needs online. As the City prepared to launch a redesigned website, leaders needed a reliable, data-driven way to measure user experience and set a clear baseline for improvement across both SA.gov and SanAntonio.gov.

We reviewed resident surveys placed on high-traffic landing pages, calculated experience scores using a structured scoring system, and analyzed open-ended comments with COSA’s usability tagging method. I created data visualizations and scorecards that made the findings easy to understand, and helped document a repeatable five-step process for ongoing survey analysis.

The result was a clear, actionable view of how residents experience City services online and a sustainable process to keep improving. This work gave City teams the insights they needed to make services faster, easier, and more accessible, reduce the demand for staff assistance, and strengthen public trust in local government.

ABOUT THE CLIENT AND PROJECT

The COSA Web Team asked Firecat Studio to analyze user feedback collected through short surveys embedded across various landing pages on SA.gov and SanAntonio.gov. The goal was to develop a repeatable methodology for processing survey results and calculating baseline experience scores, both by topic and at the site level, that could help track progress as the city transitioned from its legacy site to a new platform.

One of the survey questions on the SanAntonio.gov website landing page

PROJECT GOALS

The goals of the project were to:

  1. Establish a baseline customer experience score for different content areas

  2. Enable regular reporting of survey findings

  3. Make qualitative and quantitative insights accessible through the City’s research repository

  4. Document a clear process for analyzing results and sharing them with City stakeholders

WHAT WE DID

SanAntonio.gov website homepage

SA.gov website homepage

We reviewed survey data collected across both websites between January and April 2022. These surveys included three quick multiple-choice questions and an open-ended feedback prompt. We:

  • Created a scoring system using weighted values to quantify responses

  • Calculated experience scores for each survey with 15+ responses

  • Developed a formula to calculate site-wide experience averages

  • Tagged and organized qualitative comments to identify recurring themes

  • Uploaded results and representative quotes to the City’s research repository

  • Documented a repeatable five-step process for exporting, analyzing, tagging, and sharing insights
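To give a sense of what the tagging step looks like in practice, here is a minimal sketch of keyword-based theme tagging and frequency counting. The tag names and keywords are hypothetical stand-ins; COSA's actual usability tag set (and our manual judgment calls) are not reproduced here, and the real coding was done by hand rather than by script.

```python
from collections import Counter

# Hypothetical usability tags and trigger keywords -- illustrative only,
# not the actual COSA tag taxonomy.
TAGS = {
    "navigation": ["find", "menu", "search", "link"],
    "content": ["information", "outdated", "unclear"],
    "performance": ["slow", "load", "error"],
}

def tag_comment(comment: str) -> list[str]:
    """Assign usability tags to one comment by simple keyword matching."""
    text = comment.lower()
    return [tag for tag, keywords in TAGS.items()
            if any(word in text for word in keywords)]

def theme_frequencies(comments: list[str]) -> Counter:
    """Count how often each tag appears across all open-ended comments."""
    counts = Counter()
    for comment in comments:
        counts.update(tag_comment(comment))
    return counts
```

In the actual project, a human reviewer applied the tags so that sarcasm, mixed feedback, and local context were handled correctly; a keyword pass like this can only serve as a first-cut triage.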

METHODOLOGY

Scorecard with survey data for SanAntonio.gov website page

We developed a scoring rubric that assigned numeric values (1–5) to survey responses and averaged results across three key questions: overall experience, ability to find information, and helpfulness of content. Open-ended feedback was coded using COSA’s usability tags, then summarized by frequency and key themes.
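The rubric above can be sketched roughly as follows. The response labels, their 1-5 values, and the response-count weighting for the site-wide rollup are assumptions for illustration; the actual weighted values and rollup formula lived in our project spreadsheet and are not reproduced here.

```python
# Hypothetical mapping from answer options to 1-5 values -- the real
# weighted values in the COSA rubric may differ.
RESPONSE_VALUES = {
    "very poor": 1, "poor": 2, "neutral": 3, "good": 4, "excellent": 5,
}

def question_score(responses: list[str]) -> float:
    """Average numeric value for one question's responses."""
    values = [RESPONSE_VALUES[r.lower()] for r in responses]
    return sum(values) / len(values)

def page_experience_score(question_responses: dict[str, list[str]]) -> float:
    """Average the three question scores (overall experience,
    findability, helpfulness) into one page-level experience score."""
    scores = [question_score(r) for r in question_responses.values()]
    return sum(scores) / len(scores)

def site_score(page_scores: dict[str, float],
               page_counts: dict[str, int]) -> float:
    """Roll page scores up to a site-wide average, weighted by each
    page's response count (one plausible formula; an assumption here)."""
    total = sum(page_counts.values())
    return sum(page_scores[p] * page_counts[p] / total for p in page_scores)
```

For example, a page whose three questions average 4.5, 4.0, and 3.5 would score 4.0, and weighting by response counts keeps a low-traffic page from skewing the site-wide number.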

Artifacts included:

  • Charts and scorecards for all surveys with ≥15 responses

  • A consolidated spreadsheet of experience scores

  • Comment collections uploaded to the City’s research repository

  • Recommendations for survey cadence, stakeholder sharing, and future improvements

OUTCOME & IMPACT

Standardized Measurement

Created a consistent way to measure how well the City’s websites meet resident needs. This gave leaders clear data to see what’s working, what isn’t, and where to make changes that save staff time, improve services, and build public trust.

Actionable Insights

Reviewed and organized resident feedback to find the most common user pain points. These insights helped the City fix problem areas, make information easier to find, and help more residents complete tasks on their own.

Foundation for Ongoing Improvement

Set up a repeatable process for reviewing survey results and sharing them across departments. This allows the City to track improvements over time, respond quickly to new issues, and keep services simple and accessible for everyone.

REFLECTION

This project taught me a lot about the value of structure, consistency, and clarity when working with large sets of feedback data. I learned how important it is to design analysis processes that not only make sense in the moment but can also be repeated and trusted over time, especially when different teams rely on them to guide future decisions.

Although AI tools are increasingly used for survey analysis today, they weren’t available to us at the time. Even if those tools had been available, we brought something AI can’t replicate: context, collaboration, and an understanding of how real people interact with city services. We made intentional decisions about how to interpret feedback and which patterns mattered most, drawing on empathy, domain knowledge, and thoughtful conversation.

This experience reminded me that while tools evolve, the human side of research (listening carefully, asking the right questions, and connecting insights to real-world impact) will always matter.