AiTRK is an advertising data reporting and analytics platform that connects disparate network information to display a cohesive overview of campaign performance.
Ai Media Group needed a refreshed reporting platform for its advertising data to drive growth, sales, and client acquisition. The existing application was dated, slow, desktop-only, and often unreliable.
Our team was tasked with creating a modern and scalable application to serve the needs of Executives, Marketing Directors, Business Analysts, SEO Managers, and Sales Associates to support client and revenue growth.
It was projected that a new platform could help drive an $8M increase in annual revenue over three years by providing a clearer picture of the data, a more robust ad spend and ROI toolset, and a more engaging aesthetic for a technology- and data-focused audience.
Being a small team, we were granted a generous timeline to complete the project, and since we knew we'd be iterating on data structures and visualization strategies, we were allowed to define our roadmap in an agile way.
Provide enhanced UI tools to simplify, consolidate, and streamline data consumption to reduce user workload and friction.
Enhance reporting capabilities for employees and clients that will maximize ROI on ad spend.
Update customer journey attribution modeling to drive value and growth opportunities.
Simplify the navigation and reporting structures.
Develop a user-centered design foundation with a clean and modern design language.
Enhance data visualizations and graphics export functionality for presentations and QBRs.
Increase filtering capabilities and end-user control of data segmentation.
User interviews made it obvious there were numerous challenges to solve. Still, a few clear standouts illuminated common themes across the majority of responses I received. I divided the responses into user needs and pain points so I could establish a sense of what was and wasn't working inside the existing application. I also mapped these responses to a priority matrix so our team could get a sense of what the roadmap, effort, and constraints would look like. This guided our MVP target and helped align stakeholders on outcome expectations and timelines.
Whitespace and overall layout were poorly used. Users wanted related visualizations clustered together for better analysis, which could help uncover additional revenue opportunities.
Users needed a better display of data trends over time. Day over day and month over month comparisons with granular filtering would be helpful and make the experience of comparative analytics more insightful.
It was hard to process tabular views where data was too dense. Users needed better ways to summarize top-line KPIs that are more readable and empower them to dive deeper depending on their individual needs.
There was too much manual effort required to piece together various data points to tell a holistic advertising narrative. Care should be taken to aggregate the right pieces of data together, which would save vast amounts of time.
In addition to the user feedback, I wanted to examine the existing application to look for further improvements concerning user flow, site usability, layout, consistency in data presentation, and visual design. I used the existing application extensively and documented my audit findings so I could fully understand the new data I was working with and get a sense of navigating the dashboards.
I spent additional time interviewing our senior developer and CIO to get historical context on why this application was built, why they made the decisions they did, and a clear understanding of the tech stack that would support our new platform. I needed to know the constraints in order to design the best solution I could without constant pushback from the dev team or time wasted across numerous iterations of concepts.
The existing navigation was disjointed and required multiple interaction steps to change reports or filter date ranges within a report. This also made wayfinding difficult because it wasn't always apparent how a report was being filtered.
Many of the reports contained redundant or shared information. There were opportunities to reduce the redundancy and create a more focused approach to data storytelling.
There were inaccuracies between the data points in the report UI and those in the CSV downloads of the report. We could reconcile the discrepancies and make sure that what we display is always available in the download.
There wasn’t a clearly defined typographic system to support the underlying data and color use could be better optimized for continuous, divergent, or sequential data visualization applications.
My first goal was to reduce complexity by realigning the site map and reporting structure. The primary data object within AiTRK is a Program, which is composed of various network information. Users could also categorize Programs into Groups that aggregated the performance of media channel strategy-based campaigns.
One major obstacle was that there wasn’t a one-to-one match of reports; some reports that were available to single Programs weren’t always available in a Group view. I wanted to create more consistent top-level reports by using the concept of Programs and Groups as a filter and decoupling it from the report selection.
I hypothesized this would reduce complexity and cognitive load and lead to a more frictionless experience. By exposing all available data points in each view, users would have a better chance of selecting reports and finding and remembering data locations. This would also make analyzing data more efficient by saving users time.
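The decoupling described above can be expressed as a small data model. This is a minimal sketch, not the platform's actual code; the type and field names (`Scope`, `ReportQuery`, `describeQuery`) are illustrative assumptions.

```typescript
// A Program or Group becomes a filter on a report query rather than
// a separate navigation branch with its own report list.
type Scope =
  | { kind: "program"; programId: string }
  | { kind: "group"; groupId: string };

interface ReportQuery {
  reportId: string; // any report in the shared catalog
  scope: Scope;     // Program or Group applied as a filter
  dateRange: { start: string; end: string };
}

// With scope as a parameter, the same report list is available to both
// Programs and Groups instead of two partially overlapping tab sets.
function describeQuery(q: ReportQuery): string {
  const scope =
    q.scope.kind === "program"
      ? `program ${q.scope.programId}`
      : `group ${q.scope.groupId}`;
  return `${q.reportId} for ${scope} (${q.dateRange.start}..${q.dateRange.end})`;
}
```

The key design choice is that report availability no longer depends on whether a Program or a Group is selected; scope only narrows the data.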
The Program and Group reports were segmented out into different navigation tabs and required multiple click interactions to make a selection. Additionally, some dashboards were not exposed anywhere else within the navigation structure and only made available after clicking deep into other reports.
My primary goal was to rethink how the navigation was laid out and expose as much as I could within the UI without interaction while also considering how we could scale the application in the future when adding more reports.
I started by sketching and wireframing a few different concepts and presented them to the team and core users to get a sense of whether the direction was on target.
While both wireframes scored reasonably high on findability and task completion with our testing groups, the tabbed navigation ultimately won out due to its wayfinding enhancements, low screen-size cost, low interaction cost, and ease of implementation.
Now that the navigation and user flows were in place, it was time to start experimenting with ways to visualize and tell a data story. Additional interviews were conducted to extract ideas on which KPIs were most important, which specific kinds of visualizations were needed, and how users would like to interact with the data. I also tested various visualization options with user groups to confirm they were easy to understand, useful, and provided clarity.
I wanted to create a top-down storytelling concept that was data-driven, meaningful, relevant, and provided a compelling narrative. At the heart of any report is an aggregated object of data that joins disparate information or represents an overview of a topic and is typically shown using tabular views.
Since users were accustomed to treating tables as the main source of truth, with the ability to filter, sort, compare, and contrast the data, I wanted to ensure the table remained a key component of the dashboards. Every other visual component should treat this table as its parent object, maintaining a singular, trustworthy system of record.
I would create a repeatable, thematic pattern where each report would start with a summary extracting and visualizing the big-picture KPIs into aggregate, time, correlation, or comparison views. Next, we would give users granular control over details, allowing deeper dives as they select, interact with, and assemble data into customizable experiences. Lastly, each report dashboard would end with the most detailed and familiar view: the raw tabular data.
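The summary-to-raw-table pattern above can be sketched as a template where every visual layer derives from a single table of rows. This is an illustrative sketch, not the production code; `ReportTemplate`, `buildReport`, and the KPI aggregation are assumptions.

```typescript
type Kpi = { label: string; value: number };
type Row = Record<string, number | string>;

interface ReportTemplate {
  summary: Kpi[]; // big-picture KPIs, shown first
  details: Row[]; // interactive deep-dive views
  rawTable: Row[]; // tabular system of record, shown last
}

// Every section is derived from the same raw rows, so the table
// remains the single source of truth for the whole dashboard.
function buildReport(rows: Row[], kpiFields: string[]): ReportTemplate {
  const summary = kpiFields.map((field) => ({
    label: field,
    value: rows.reduce((sum, r) => sum + Number(r[field] ?? 0), 0),
  }));
  return { summary, details: rows, rawTable: rows };
}
```

Deriving the summary from the raw rows (rather than fetching it separately) also avoids the UI-versus-CSV discrepancies noted in the audit.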
In addition to this structure, my senior developer and I defined how data would flow into the report visualizations, what the dependencies would be, how they would be organized into templates and consume the design libraries, and what the data consumption and programming models would look like.
Before I came into the project, the team had decided to base the UI on the Material Design Lite framework. They wanted something that had established components, development patterns and was built for cross-device and cross-browser compatibility. Ai Media also has a strong advertising partnership with Google so it made sense for them to provide a consistent design alignment with one of their most trusted partners.
In addition, this decision would take some of the heavy UI design and development work off our plate and get us to the finish line faster while also allowing us to customize and provide our own personality and tone to the product.
At this point I started exploring various applications of material design with time and comparison series visualizations and tabular data to get a sense of how to tie the Ai Media branding into a new product identity. I also explored dark theme variations as well as client-branded, white-label dashboards.
In each of these visual iterations I also explored various layouts for the client, program and date selections. This included the ability to filter by active, disabled or inactive accounts, and calendar/date interactions. I also added the ability to compare data from different date ranges for trend analysis.
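The date-range comparison mentioned above reduces to summing a metric over two windows and reporting the change. A minimal sketch, with function and type names (`DailyMetric`, `periodOverPeriod`) that are assumptions rather than the platform's API:

```typescript
type DailyMetric = { date: string; value: number }; // ISO dates sort lexically

// Sum a metric over an inclusive ISO date range.
function sumRange(data: DailyMetric[], start: string, end: string): number {
  return data
    .filter((d) => d.date >= start && d.date <= end)
    .reduce((sum, d) => sum + d.value, 0);
}

// Percentage change between a current period and a comparison period,
// the basis for day-over-day or month-over-month trend views.
function periodOverPeriod(
  data: DailyMetric[],
  current: [string, string],
  previous: [string, string],
): number {
  const cur = sumRange(data, ...current);
  const prev = sumRange(data, ...previous);
  return prev === 0 ? 0 : ((cur - prev) / prev) * 100;
}
```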
I decided on a blue-green hue as my primary base color since it is a natural complement to Ai Media's brand red palette and reflects a tranquil, balanced, and reflective mood. It carries energy yet isn't overly powerful, so it gives way to the data presentation layer.
From there I developed accent shades and tints and established a greyscale palette for typography treatments that utilize various transparencies which create harmonious relationships between the background and text color.
We also decided to allow a user to select between a light and dark theme so the base color palette had to be modified slightly to account for contrast variations with regard to accessibility.
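Those contrast adjustments can be checked programmatically with the WCAG 2.x relative-luminance and contrast-ratio formulas, which are standard; how they were applied in this project is my sketch, not a record of the actual tooling.

```typescript
// Relative luminance of a #rrggbb color per WCAG 2.x.
function luminance(hex: string): number {
  const [r, g, b] = [0, 2, 4].map((i) => {
    const c = parseInt(hex.slice(i + 1, i + 3), 16) / 255;
    // Linearize the sRGB channel value.
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio (1:1 to 21:1); WCAG AA requires 4.5:1 for normal text.
function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}
```

Running every text/background pair in both the light and dark palettes through a check like this makes the accessibility adjustments verifiable instead of eyeballed.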
Finally, I selected system messaging colors for informational, warning, success and error states.
Selection of data visualization color was the final and most critical piece to the color puzzle. Not only did these colors have to find a relationship to the base UI Accent color, but they had to represent various data sets in meaningful ways for drawing distinctions, comparisons, and identifying trends.
I planned for this system to be flexible enough to satisfy most needs without being overly complex. Choosing bright, saturated colors helped tie the palette back to the base accent color and helps users focus on the most important aspect of a dashboard: the data, not the UI.
Starting with a 10-step Sequential Single-Hue group, these colors became a guidepost informing the Categorical, Multi-Hue, and Divergent palettes for different data visualization applications.
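One common way to build such palettes is to hold the hue fixed and step lightness for the sequential ramp, then rotate hue for categorical sets. This sketch illustrates that general technique; the specific hue, saturation, and lightness values are assumptions, not the project's actual palette.

```typescript
// Sequential single-hue ramp: fixed hue, lightness stepped from light to dark.
function sequentialRamp(hue: number, steps = 10): string[] {
  return Array.from({ length: steps }, (_, i) => {
    const lightness = 90 - (i * 75) / (steps - 1); // 90% down to 15%
    return `hsl(${hue}, 65%, ${Math.round(lightness)}%)`;
  });
}

// Categorical palette: rotate hue evenly while holding saturation and
// lightness constant so every category carries similar visual weight.
function categoricalPalette(baseHue: number, count: number): string[] {
  return Array.from(
    { length: count },
    (_, i) => `hsl(${(baseHue + (i * 360) / count) % 360}, 65%, 50%)`,
  );
}
```

A divergent palette follows the same idea with two opposing hues ramping toward a neutral midpoint.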
After piecing together the base-level design components, it was time to move into report creation, prototyping, and testing. I started each report design by summarizing the KPIs, creating concepts and layouts, and building a mockup to showcase and test interaction patterns with the data presentation or features.
Throughout this process, I checked in multiple times with the development team and key stakeholders. It was important to me to get feedback early and often as I worked through report creation. I would often present two or three alternative variations to users, observe their successes or failures in using a feature, and have them walk me through their thoughts and emotions as they did so.