Inside Drivit

How to make data-driven design work for you

By Gonçalo Farinha

February 10, 2020 · 6 min read

As creatives, there are few things we hate more than hearing, “Make the logo bigger,” “Put everything above the fold,” and “Make the words pop!”

Whether you’re a product designer, a copywriter, an art director, or any creative role, if you’re good at your job, you have a rhyme and reason for doing what you do—every single time. You’ve thought through all potential angles. You can defend your choices. You can convince someone to back out of their feedback, or even better, help them make the feedback more constructive. But what do you do when you hear, “The data says it’s bad”?

You do better.

In this digital era, everything can be tested, and if your company is smart, it is. Google's Internal Data Report from 2018, as cited by Croud, tells us that data-driven creatives generated 50% more conversions. Fifty percent more. It's hard to compete with numbers like that, no matter how precious your work.

Whether you’re building a better onboarding flow, making updates to your checkout process, or creating the next big campaign, here are three ways to make data work for you.

Test continuously

Working at an analytics company, we have no shortage of data to tap into. When our team launches campaigns, we create the iterations we want to test and see what performs best. For our Guide to Product Metrics campaign, we launched three variations of the same banner ad, each with a different colored background, to see which would get the highest click-through rate. In a few days' time, the yellow was the clear winner, and we optimized our ad spend towards it. In the next round, we took the yellow and tested different creative elements, like adding in a textured background. These kinds of low-risk, high-reward tests can give you valuable insights to take into future work.
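Picking a winner in a test like this comes down to comparing click-through rates. Here is a minimal sketch of that comparison; the impression and click counts are invented for illustration and are not our real campaign numbers.

```python
# Hypothetical results for three banner-ad color variants.
# The counts below are made up for illustration.
results = {
    "yellow": {"impressions": 12000, "clicks": 264},
    "blue":   {"impressions": 11800, "clicks": 177},
    "green":  {"impressions": 12100, "clicks": 182},
}

def click_through_rate(stats):
    """Click-through rate: clicks divided by impressions."""
    return stats["clicks"] / stats["impressions"]

# Rank the variants by CTR; spend then shifts toward the top performer.
ranked = sorted(results, key=lambda name: click_through_rate(results[name]),
                reverse=True)
winner = ranked[0]

for name in ranked:
    print(f"{name}: {click_through_rate(results[name]):.2%}")
print("winner:", winner)
```

With these example numbers, yellow's 2.2% CTR beats the other variants, which is the kind of signal that justifies reallocating spend mid-campaign.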


Get feedback, faster

On our recently launched conversion campaign, we had a hypothesis that centering the headline and hero image would help guide the viewer down the page. But we needed data to prove it. So we created a second iteration where the headline and hero were inline at the top, all other elements remaining the same. We had two pages that were on-brief, on-brand, and work we were proud of, no matter the winner. At the start, our control page was the clear frontrunner. But as the test ran, we learned that the variant drove 26% more click-throughs. In just a couple weeks' time, our team had the data we needed to optimize our campaign for maximum impact.
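Before acting on a lift like that, it's worth checking the result isn't noise. One common approach is a two-proportion z-test on the two pages' click-through rates. The sketch below uses invented visitor and click counts chosen to mirror a roughly 26% lift; they are not our actual campaign data.

```python
import math

# Hypothetical control/variant results, chosen to show a ~26% lift.
control = {"visitors": 5000, "clicks": 400}   # CTR 8.0%
variant = {"visitors": 5000, "clicks": 504}   # CTR ~10.1%

def z_score(a, b):
    """Two-proportion z-test statistic comparing click-through rates."""
    p_a = a["clicks"] / a["visitors"]
    p_b = b["clicks"] / b["visitors"]
    # Pooled proportion under the null hypothesis of equal CTRs.
    pooled = (a["clicks"] + b["clicks"]) / (a["visitors"] + b["visitors"])
    se = math.sqrt(pooled * (1 - pooled)
                   * (1 / a["visitors"] + 1 / b["visitors"]))
    return (p_b - p_a) / se

z = z_score(control, variant)
lift = (variant["clicks"] / variant["visitors"]) \
       / (control["clicks"] / control["visitors"]) - 1

# |z| above ~1.96 means the difference is significant at the 95% level.
print(f"lift: {lift:.0%}, z = {z:.2f}")
```

With these example counts, the z-score comfortably clears the 1.96 threshold, so the lift would be unlikely to be a fluke; with smaller samples, the same percentage lift might not be.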

Source: https://www.pexels.com/photo/abstract-art-circle-clockwork-414579/


Simple, right? And, at its core, the idea is pretty basic. But, oddly enough, it's quite easy to both under-estimate its implications and over-estimate its complexity.

Billy Whited

