Passport Search Results
- Increased clickthroughs
- Increased statistics views
- Increased analysis views
- Decreased searches per user
- Decreased bounce rates
- Decreased time on page
- Decreased task time
- Analytic Benchmarking
- Contextual Inquiry
- Prototype Testing
Product design and development are different at every company. Some companies operate with a "fail fast" mentality, where quick, agile changes are the norm. Other companies have structured requirements that demand slow, deliberate changes. In the latter environment, it becomes crucial that your designs have a high impact on the user experience while requiring as little development effort as possible.
Passport, the flagship product for Euromonitor International, collected and displayed data from many different sources, each with varying infrastructures and capabilities. Most design changes required multiple sprints, and since each sprint was two weeks, even small changes would take a month or more to implement.
I was tasked with updating the search process. In this product, there were two ways to find specific content: a) build a query using a category tree or b) conduct a keyword search. Both methods led to a search results page.
Understanding the Problem
First, I wanted to understand what exactly was happening during this process and on this page specifically. Using Google Analytics, I was able to define and quantify benchmarks including:
- Bounce Rate (per page)
- Time on Page
- Clickthrough Rate
- Pageviews per Content Type
- Average searches per user per visit
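The benchmarks above can be sketched as simple calculations over session records. This is a minimal illustration, not the actual pipeline: the real figures came from Google Analytics, and the field names and sample values here are hypothetical.

```python
# Hypothetical session records; the real data came from Google Analytics.
sessions = [
    {"pages_viewed": 1, "clicked_result": False, "searches": 3, "time_on_page_s": 40},
    {"pages_viewed": 4, "clicked_result": True,  "searches": 1, "time_on_page_s": 95},
    {"pages_viewed": 2, "clicked_result": True,  "searches": 2, "time_on_page_s": 60},
]

# Bounce rate: share of sessions that left after a single pageview.
bounce_rate = sum(s["pages_viewed"] == 1 for s in sessions) / len(sessions)

# Clickthrough rate: share of sessions where a search result was opened.
clickthrough_rate = sum(s["clicked_result"] for s in sessions) / len(sessions)

# Average searches per user per visit.
avg_searches = sum(s["searches"] for s in sessions) / len(sessions)

print(f"bounce rate: {bounce_rate:.0%}")              # 33%
print(f"clickthrough rate: {clickthrough_rate:.0%}")  # 67%
print(f"avg searches/visit: {avg_searches:.1f}")      # 2.0
```

Defining each metric this concretely made it possible to compare the same numbers before and after the redesign.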
Once I had a good understanding of what exactly users were clicking in the system, I wanted to understand why that behavior was occurring. I conducted 10 contextual inquiries with users to understand the goals, needs, and pressures they encountered while looking for information in the system. I found that users:
- Often couldn't find the specific type of content they wanted
- Doubted accuracy of results (thought content was missing)
- Conducted the same search repeatedly in any given session
- Mostly only used 1 or 2 statistic types
- Were unaware that many content types existed
- Spent lots of time trying to parse the information on the page
Design and Testing
Once I had a better understanding of the problem, I created a set of goals by which to measure the effectiveness of the design.
- Increase clickthrough rate
- Decrease search time
- Decrease bounce rate
- Increase user confidence in results
- Increase awareness and usage of all content types
- Implement incrementally over 3-4 sprints
Since the system was complex and I was unfamiliar with the level of development effort, I engaged the project managers and developers on the team to help me understand how and why the existing design was created and which pitfalls should be avoided in a redesign.
Calculating Effort and Testing Impact
I created multiple concepts to address the pain points I had uncovered and consulted with the team to understand the feasibility and effort level of each design change. Then I conducted 10 usability tests to determine which changes were most effective.
With this information, I was able to weigh both the effort and the impact of each proposed change and select those that offered high impact at low effort.
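The effort-versus-impact triage described above can be sketched as a simple scoring pass. The change names, effort estimates, and impact scores below are illustrative, not the actual backlog; effort is in sprints (from the team's estimates) and impact is a 1-5 score (from the usability tests).

```python
# Hypothetical proposed changes with illustrative effort (sprints)
# and impact (1-5 usability-test score) values.
proposals = [
    {"name": "Group results by content type", "effort": 1, "impact": 5},
    {"name": "Inline statistic previews",     "effort": 4, "impact": 4},
    {"name": "Persistent search refinements", "effort": 2, "impact": 4},
    {"name": "Rebuild category tree",         "effort": 6, "impact": 3},
]

# Keep changes that fit the 3-4 sprint budget, then rank them by
# highest impact first, breaking ties in favor of lower effort.
budget = 4
shortlist = sorted(
    (p for p in proposals if p["effort"] <= budget),
    key=lambda p: (-p["impact"], p["effort"]),
)

for p in shortlist:
    print(f'{p["name"]}: impact {p["impact"]}, effort {p["effort"]}')
```

Ranking this way surfaces the high-impact, low-effort changes first and drops anything that cannot fit the sprint budget.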
This process resulted in a design that allowed for faster development and a greater understanding of how the changes would impact users. In addition, I was able to accurately quantify the changes and measure the effectiveness of the design after implementation.