Analytics

In order to continue addressing the needs of increasingly diverse event organizers, Eventbrite decided to create a brand-new analytics tool, designed and built from the ground up to deliver more control and greater insight to the organizers who need it most. I led user research, then collaborated with one other designer to explore, refine, and polish both the end-user experience of the new analytics tool and the back-end systems that now power it.

 

My Role

I had the great pleasure of working with Shakhina Pulatova (Product Manager), Josh Price (Product Design), and a small but impressive engineering team led by Nick Popoff on this project. As the sole UX Architect for the project, my role was to drive information architecture and usability, conduct user research, and help communicate how the system should be built to ensure it would meet user needs today and tomorrow.

 

Goals

The principal goal of this initiative was to give event organizers a robust, dynamic view of how their events were performing. Secondary goals included enabling organizers to export and share data with their partners, giving organizers sufficient control over how they "slice and dice" their information, and building responsive, reusable UI components.

 

Research

Pinning our survey of the landscape to the wall.

I performed all aspects of research, sometimes on my own and other times partnering with Shakhina (PM) or Josh (PD). Research could be divided into three types: discovery, iterative validation, and post-release.

 

Discovery

In the beginning, I worked with Shakhina (PM) to figure out what our organizers needed. We tag-teamed interviews with organizers of all stripes, often finding that their needs varied enormously. We also probed exactly what types of information were important to organizers, when they needed that information, and why. These interviews surfaced a huge range of data points the system would need to support, along with a clear need to keep everything simple enough that organizers could digest what they were seeing at a glance.

I also worked with Josh (PD) in these initial phases to survey the landscape of existing analytics tools and information displays. How did others display different types of complex data? What do Edward Tufte and Stephen Few say about this or that type of data? Before putting pen to paper at all, we pinned our findings to the wall (literally) and reviewed them with internal stakeholders and our engineering partners to see what insights and ideas they might have.

 

Iterative Validation

As we began to design in earnest, I set up regular feedback opportunities. Sometimes this took the form of inviting internal stakeholders to assess what we had thought up or ironed out; other times it meant assembling prototypes, in InVision or on actual paper, and running usability tests with real organizers. Validating designs as we iterated meant we could identify gaps and successes early and use that input to take the designs to the next level.

 

Post-Release

I describe post-release research in greater detail later in this post. However, going into the project, we knew we would want to release in stages and conduct user research on the live product to learn from each stage's release.

 

Design Exploration

In this project, Josh (PD) and I honed a design partnership that remains one of my favorite approaches. We set out by identifying project principles, things like "simple" and "powerful," which helped keep us on track from beginning to end. Over the course of the project, we worked side by side, balancing my focus on IA and usability with Josh's focus on visuals and interactions. In practice, this meant I led design during the early phases, while we were still getting our heads around the basic data types, modes, and layout. As the project went on and our focus shifted to the details of each interaction, color (and other signals), and typography, Josh gradually took the lead. But no matter which of us was holding the reins, we both remained respectful of, and receptive to, each other's perspective and insights.

Early on, while Josh (PD) got started exploring components like UI inputs and chart styles, I distilled all the different types of information into the basic data types and facets the system would need to handle. Much of this was done on whiteboards and in spreadsheets, turning loosely stated end-user needs into an organized structure:
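To make that concrete, here is a minimal, hypothetical sketch of the kind of structure this exercise produced, expressed in TypeScript. The names (Metric, Facet, supportedFacets) are my own illustrative labels, not identifiers from Eventbrite's actual system.

```typescript
// A metric is a quantity an organizer cares about ("what happened").
type Metric = 'sales' | 'attendees' | 'listingVisits';

// A facet is a way of slicing that quantity ("grouped by what").
type Facet = 'ticketType' | 'location' | 'trafficSource' | 'time';

// Each metric supports only the facets that make sense for it, which is
// what lets the UI offer simple, relevant controls for any given view.
const supportedFacets: Record<Metric, Facet[]> = {
  sales: ['ticketType', 'time'],
  attendees: ['location', 'ticketType'],
  listingVisits: ['trafficSource', 'time'],
};
```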

With any new UI component (i.e., one not already in our product styleguide), we started with the small screen and worked our way up to a fully fleshed-out vision. While organizers on Eventbrite's platform at the time were known to work primarily on desktop, our initial research had shown that getting information about their events mattered on the go as well. Even if our users had told us they only wanted to use the tool on large screens, I found that designing for the small screen first pushed me to give greater attention to the ease of use and clarity of the interface and its displays.

 

Defining the System

Once I had a firm grasp of the types and facets of data the analytics tool would use, and had verified these with our users, I set to work defining the system requirements with the engineering team that would build it. This meant working with both front-end and back-end engineers to make sure the infrastructure powering the tool would match the needs of its users. Getting conceptual alignment first was key, but we also made sure the implementation details of how data was rendered on the back end would be compatible with the user interface and all of its planned behaviors.
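As one way to picture that alignment, here is a hypothetical sketch of the sort of query/response contract the front end and back end had to agree on. The shape and field names below are assumptions for illustration, not Eventbrite's actual API.

```typescript
// What the UI asks for: a metric, how to group it, and over what scope.
interface AnalyticsQuery {
  metric: 'sales' | 'attendees' | 'listingVisits';
  groupBy: 'ticketType' | 'location' | 'trafficSource';
  eventIds: string[];
  dateRange: { start: string; end: string }; // ISO 8601 dates
}

// What the back end returns: one series per group (e.g., one line per
// ticket type), already shaped so the chart can render it directly.
interface AnalyticsResponse {
  series: Array<{
    label: string;
    points: Array<{ timestamp: string; value: number }>;
  }>;
}
```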

Meanwhile, Josh and I also made sure to define each of the modular components in a way that was meaningful to the engineers who would build them. By starting with rough, early versions and conveying intention, our engineering partners were able to start investigating various charting libraries and experimenting with different data presentation approaches. Josh led the charge on defining the front-end components with high precision, while I worked in more of a support role: heading off usability risks early and fleshing out corner cases when some behavior or state of a component was ambiguous.

With each iteration, Josh and I made sure we validated our design not only with users but also with these engineers, so that we could ensure we stayed realistic about what we could (and could not) build.

 

Building & Refining

There wasn't a moment when design stopped and development began. Because we had involved engineers in the design process early on, by the time I had the IA in a good spot, they had already laid out a basic framework for how to build the system. I spent significant time over several iterations exploring different approaches to layout and controls. Early research had indicated that the main focus would initially be on the chart, with the more detailed tabular data a lesser (but still very important) priority. The chart needed to be interactive, but slicing and dicing the data would require controls too complex to put in the chart itself.

The sample of designs shown here represents only one evolution of the layout and interactions. Nearly every design iteration also involved a great deal of divergent exploration that fell to the cutting-room floor once the "right" solution emerged.

 

Post-Release

In order to learn as much as possible as we built, the team decided to build and release the analytics tool in discrete phases, where each release would unlock a new data type and one or more dimensions for slicing that data. These phases were prioritized to align with the largest organizer needs first, starting with "Sales, grouped by Ticket Type" along with a time series multiple-line chart. Next on the list were "Attendees, grouped by Location" with a map chart, and "Listing Visits, grouped by Traffic Source" along with a donut chart.
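One way to think about these phases is as configuration: each release adds a metric, a grouping dimension, and a default chart type. The sketch below is a hypothetical illustration of that idea in TypeScript, not actual Eventbrite code.

```typescript
// Hypothetical phased-release configuration; names are illustrative only.
type ChartType = 'multiLine' | 'map' | 'donut';

interface ReleasePhase {
  metric: string;   // what is being measured
  groupBy: string;  // the dimension used to slice the data
  chart: ChartType; // the default visualization for this pairing
}

const releasePhases: ReleasePhase[] = [
  { metric: 'Sales',          groupBy: 'Ticket Type',    chart: 'multiLine' },
  { metric: 'Attendees',      groupBy: 'Location',       chart: 'map' },
  { metric: 'Listing Visits', groupBy: 'Traffic Source', chart: 'donut' },
];
```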

 

Continuing Validation Research

In the initial release, we rolled out to a select group of "alpha" organizers, whose first exposure to the tool was in a formal usability test. After that test, we granted them access to the tool on their real accounts and asked them to report back to us with any additional feedback. This short alpha period also allowed our engineering team to ensure that the infrastructure was performant — a significant aspect of the user experience for most organizers, especially when dealing with very large data sets.

After the initial alpha period, we opened the tool up to general use. In the product navigation, the tool was labeled "Analytics (Beta)" and, when accessed, was accompanied by a heads-up that the tool was still in development, along with a request to send us any feedback via a simple Google Form. This feedback was instrumental in supplementing our formal research. As a result of this input, we bumped up work on cross-event reporting and data export.

Update: A year later, the tool is much closer to completion. As long as it is in beta, we have decided to keep the feedback prompt in place. It continues to provide both confirmation and criticism that the team uses to assess how best to move forward.

 

Reflection

Following the initial release, Eventbrite decided to shift its focus to other priorities. This new analytics tool proved such a resounding success with organizers, however, that a dedicated team has continued to build out the capabilities of the tool. The backlog of implementation work to be done is huge, as we designed a complete event analytics system for our organizers.