Saturday, 20 August 2011

Sugarshots Results: The Call to Action - iMediaConnection.com



The Test Objective

To determine whether featuring a call to action in the form of a button on a banner will increase response rates.

Campaign Details:
Client: Sugarshots, Inc.
Agency: Basement, Inc.
Ad Network: 24/7 Real Media
Ad Serving + Tracking: Atlas DMT
Site Analytics: Think Metrics
Despite its reputation for being a creative and innovative field, advertising has always had its share of conventional thinking. Advertising classes teach aspiring creatives the difference between the right way to create an ad and, if not exactly the wrong way, then the not-so-right way to create an ad. There are right places to put the logo, and wrong places. Good uses of type, and bad.
One of the better known rules, or conventions, is the call to action (CTA). To many direct marketing practitioners, not having a CTA is the equivalent of a yellow pages ad without a phone number.
At times, the CTA conveys a legitimate claim: supply is limited, or the offer ends soon. Often in online advertising, however, 'click here' or 'learn more' is tossed onto the ad just because conventional wisdom says to do so. But are those CTAs really bringing anything to the party? That's what our test this week will try to determine.
To recap the creatives, we're testing two ads that are identical except that one features a call-to-action button that states 'learn more,' and the other has no call-to-action button. To review the intro to this test, refer to Monday's article.
Charts & Analysis

This test is primarily a response measurement. Does our ‘learn more’ CTA add anything that will increase the banner’s response rate? The most accurate way to calculate this is with the click-visits to the landing page metric. In this case, we’re using the home page for the landing page. For more information on this metric and why we’re using it, please refer to the explanation of the metrics terms.
In the click-visits column of Chart 1, we see a 27 percent lift in performance for the ad without a button. That's not only a sizeable increase, but the impression counts and differences in responses make these results significant at a 99 percent confidence level. Clearly, from a response standpoint, the button not only isn't adding to the banner's performance, it's actually detracting from it considerably.
Chart 1: Home Page Click-visits

Creative Description     Impressions   Clicks   Clickthrough Rate (%)   Click-visits
Strategy 1: Button       391,750       297      0.076                   173
Strategy 2: No button    391,595       342      0.087                   220
TOTAL                    783,345       639      0.082                   393
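The lift and confidence figures above can be reproduced from the Chart 1 counts. The article does not name its statistical method, so the two-proportion z-test below is an assumption, but it is the standard choice for comparing two response rates and it matches both the 27 percent lift and the 99 percent confidence claim:

```python
import math

# Impressions and click-visits from Chart 1
imps_button, visits_button = 391_750, 173      # Strategy 1: button
imps_nobutton, visits_nobutton = 391_595, 220  # Strategy 2: no button

p_button = visits_button / imps_button
p_nobutton = visits_nobutton / imps_nobutton

# Relative lift of the no-button ad over the button ad (~27%)
lift = p_nobutton / p_button - 1

# Two-proportion z-test using the pooled click-visit rate
p_pool = (visits_button + visits_nobutton) / (imps_button + imps_nobutton)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / imps_button + 1 / imps_nobutton))
z = (p_nobutton - p_button) / se               # ~2.38

# One-sided p-value from the normal tail: ~0.009, i.e. ~99% confidence
p_value = 0.5 * math.erfc(z / math.sqrt(2))
```

At these impression volumes, even a difference of a few hundredths of a percentage point in response rate is enough to clear the significance bar.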
I mentioned that this test is primarily a response test. The button/no-button factor is almost entirely a matter of immediate visual impact, and that's exactly what the click-visit data captures.
Of course, we don’t want to take anything for granted, no matter how trivial it may seem. So we’ll take a look at some broader data, the page-views that each ad generated. Chart 2 shows us that the two ads are driving almost the same number of page views per click-visitor. Metrics-savvy readers will also notice that an average number of page views per visitor of 1.86 isn’t very high. In an upcoming test we’re going to try to increase that.
Chart 2: Click-visit Page View Totals

Creative Description     Impressions   Clicks   Clickthrough Rate (%)   Home Page Click-visits   Total Page Views   Page Views per Click-visitor
Strategy 1: Button       391,750       297      0.076                   173                      326                1.88
Strategy 2: No button    391,595       342      0.087                   220                      406                1.85
TOTAL                    783,345       639      0.082                   393                      732                1.86
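The per-visitor figures in Chart 2 are simple ratios of total page views to click-visits; a minimal check:

```python
# (click-visits, total page views) from Chart 2
strategies = {
    "Button": (173, 326),
    "No button": (220, 406),
    "TOTAL": (393, 732),
}

# Page views per click-visitor for each row
per_visitor = {
    name: page_views / visits
    for name, (visits, page_views) in strategies.items()
}

for name, ratio in per_visitor.items():
    print(f"{name}: {ratio:.2f} page views per click-visitor")
```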
I would expect two ads so similar, except for the CTA, to produce similar results once people are on the site. The expectations the ads have created are identical.
Now let’s look at purchase intent. Just to make sure our no-button visitors are as interested in purchasing as our button ad visitors, we’ll compare the visit to purchase page conversion rates for the button and no-button ads.
Chart 3 reveals that the banners are generating an almost equal purchase intent effect, as well. And while our purchase page visits data is still a little sparse, the results are on trend.
Chart 3: Click-visit Traffic to Purchase Page Conversion Rate

Creative Description     Home Page Click-visits   Purchase Page Click-visits   Visit to Purchase Page Conversion Rate
Strategy 1: Button       173                      25                           14%
Strategy 2: No button    220                      29                           13%
TOTAL                    393                      54                           14%
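The "almost equal purchase intent" reading can be checked the same way as the response rates. A sketch, again assuming a two-proportion z-test (not named in the article): at these small sample sizes the one-point gap between 14 percent and 13 percent is nowhere near significance.

```python
import math

# (home page click-visits, purchase page visits) from Chart 3
visits_button, purchases_button = 173, 25
visits_nobutton, purchases_nobutton = 220, 29

rate_button = purchases_button / visits_button        # ~0.145, i.e. 14%
rate_nobutton = purchases_nobutton / visits_nobutton  # ~0.132, i.e. 13%

# Two-proportion z-test on the conversion rates
p_pool = (purchases_button + purchases_nobutton) / (visits_button + visits_nobutton)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / visits_button + 1 / visits_nobutton))
z = (rate_button - rate_nobutton) / se                # ~0.36: no real difference
```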
This test has shown that the ad without a button performed at a considerably higher rate than the banner with the ‘learn more’ button for our key metric in this test, the click-visit response rate. Additional metrics further validated the findings.
Here’s a quick recap of what we’ve learned.
  • Response Rates: The no-button ad produced an increase in performance of 27 percent above the button ad, at a confidence interval of 99 percent.
  • Depth of Visit: The two ads performed almost identically in page views per visitor. This would be expected for ads so similar strategically and emotionally.
  • Purchase Intent: The two ads converted click-visitors to the purchase page at nearly identical rates, mirroring the Depth of Visit result.
  • Target Audience Continuity: The response rate data held consistent across the general target audience breakdowns. Data levels were sparse in some areas.
Why did the no-button ad perform better? There could be several reasons.
From a graphic design standpoint, the less information a piece of communication has, the more likely it is to pop off the page and get noticed. For that reason alone, the no-button ad would be more likely to capture someone’s attention, and thus more likely to draw a higher response rate.
In Mies van der Rohe’s immortal words, ‘Less is more.’
However, there’s another angle to this which is worth considering. We’re in an era where the consumer is increasingly in control of his media. Consumers are growing accustomed to reading articles when they want to, watching streaming videos when they want to, and downloading music when they want to. And of course, all of that is happening online.
Furthermore, today’s typical online viewer is quite different from many of the newbies of just a few years ago. He's far more sophisticated. So while button/no-button tests may have produced opposite results a few short years ago, a lot has changed since then.
Thus, when our web-savvy viewer sees an ad and then notices the ad’s spurious attempt at generating urgency, maybe he's a little turned off. Perhaps we’re talking down to him, in a subtle but recognizable way.
The viewer knows you can click on a banner. And he knows that there's more information readily available online. Which means that when we add a call to action that states 'click here' or 'learn more,' we're quite possibly putting a parenthetical 'stupid' right next to it.
