We built a robust filter component to address long-standing user pain points when searching the video library.
I was responsible for iterating on filter interactions, supporting the UX Research team, and defining the filter patterns used in the final component.
The existing filters were restrictive and lacked many options that users had frequently requested. Users were sifting through an overwhelming amount of content, and the search process was time-consuming.
The team wanted to create a reusable filter component that could be configured to fit whichever context it was applied in.
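To make "configurable" concrete, here is a minimal sketch of what a per-context configuration for such a component could look like. The names (FilterConfig, FilterCategory, videoLibraryFilters) and the example categories are hypothetical illustrations, not the actual implementation.

```typescript
// Illustrative sketch only: the shipped component's API is not shown here,
// so these type and value names are assumptions.

interface FilterOption {
  id: string;
  label: string;
}

interface FilterCategory {
  id: string;
  label: string;
  options: FilterOption[];
  multiSelect?: boolean; // some categories allow several selections at once
}

interface FilterConfig {
  categories: FilterCategory[];    // which categories this context exposes
  maxVisibleCategories?: number;   // extras collapse into an overflow menu
  allowSavedFilters?: boolean;     // can users save and re-apply filter sets?
  onApply: (selected: Record<string, string[]>) => void;
}

// Example: configuring the same component for the video library context.
const videoLibraryFilters: FilterConfig = {
  categories: [
    {
      id: "duration",
      label: "Duration",
      options: [
        { id: "short", label: "Under 5 min" },
        { id: "long", label: "Over 5 min" },
      ],
    },
    {
      id: "topic",
      label: "Topic",
      multiSelect: true,
      options: [
        { id: "onboarding", label: "Onboarding" },
        { id: "training", label: "Training" },
      ],
    },
  ],
  maxVisibleCategories: 4,
  allowSavedFilters: true,
  onApply: (selected) => console.log("Applied filters:", selected),
};
```

Each workflow would supply its own configuration object to the shared component rather than building a bespoke filter UI.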
Based on previous user feedback, I mocked up a range of explorations under the senior designer's direction. We explored different ways of applying, editing, and displaying filter categories, and covered edge cases such as overflow states, responsiveness, and saving and re-applying previously set filter parameters.
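The save-and-re-apply edge case roughly amounts to persisting the selected filter state and restoring it on a later visit. A minimal sketch, assuming browser localStorage as the store (the actual persistence mechanism isn't specified here):

```typescript
// Hedged sketch of saving and re-applying previously set filter parameters.
// localStorage and the key name are assumptions for illustration.

type SavedFilters = Record<string, string[]>; // category id -> selected option ids

const STORAGE_KEY = "videoLibrary.savedFilters";

function saveFilters(selected: SavedFilters): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(selected));
}

function loadSavedFilters(): SavedFilters | null {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as SavedFilters) : null;
}

// On a return visit, previously set parameters can be restored and re-applied,
// e.g. by passing them back into the filter component's initial state.
const previous = loadSavedFilters();
if (previous) {
  console.log("Restoring saved filters:", previous);
}
```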
One exploration was selected to develop into a high-fidelity prototype for the user research sessions. I created two user journeys covering the tasks users would typically perform with the filters during their workday. The prototypes were built in Figma using its Smart Animate and component features.
After the UX Research team presented their findings, we translated them into changes to the final design. Users responded positively to the visible filter categories and the ability to easily edit filters. Most of the changes focused on filling gaps in the filter categories.
After launch, the time users spent searching decreased. Users now have more filter options, making content discovery much easier. The filter patterns are used across multiple workflows and set a standard for advanced filters.