In early June 2012, the Yoga-pedia user base surpassed the 60,000-user mark and, in spite of the app’s apparent popularity, I had to admit that I was not being particularly effective at monetizing this growing community. The Mobile Yogi’s business model has always been about white-labeling the fee-based “A Pose for That” for yoga studios, retailers and other wellness-centered businesses; Yoga-pedia was conceived simply to accelerate adoption of A Pose for That (see Increasing App Sales with Analytics: Free Apps versus Trials). While Yoga-pedia does generate some ad revenue, the ads themselves were really intended to be little more than an irritant and, therefore, another reason to upgrade (the paid app is ad-free, of course). Still, I had a growing sense that this community was an under-utilized (and therefore undervalued) asset. But how to turn that hunch into revenue?
I needed to find something that I could market other than an
app upgrade.
First, I took stock of what I already knew about my users. While the analytics I had been collecting up to this point had certainly helped to improve user experience and software quality, they gave me very little insight into my users’ interests and desires beyond their interactions with my app’s specific features.
In fact, all I knew for sure about my users was:
a) that they all have an interest in yoga (they use my app), and
b) that, when they are in my app, they are actually thinking about yoga at that moment in time.
But how to parlay this limited insight into something actionable?
It was time to make the leap into A/B testing. A/B testing is about randomly showing a user one of two versions of a page – the (A) version or the (B) version – and tracking how behavior changes based on which version they saw. The (A) version is the existing design; the (B) version is the “challenger” with one element changed.
The element I chose to vary was the contextual ad being served up, in this case, by Microsoft adCenter. The (B) “challenger” was a set of three faux ads promoting yoga clothing, yoga training, and yoga retreats.
I randomized the displays by generating a random integer between 0 and 3 – one value for the adCenter control and one for each of the three faux ads – as follows:
// Returns 0, 1, 2, or 3 – selecting either the adCenter control or one of the three faux ads.
public static int MyAdOrServeAd()
{
    Random randomObj = new Random();
    return randomObj.Next(4); // Next(4) yields an integer from 0 through 3
}
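One refinement worth noting (my own aside, not part of the original code): creating a new Random on every call can return repeated values when the method is called in quick succession, because instances created within the same clock tick share a seed. A minimal sketch that reuses a single static instance, wrapped in a hypothetical AdPicker class, looks like this:

public static class AdPicker
{
    // Shared instance so that rapid, successive calls do not reuse the same seed.
    private static readonly Random randomObj = new Random();

    // Returns 0, 1, 2, or 3 – selecting either the adCenter control or one of the three faux ads.
    public static int MyAdOrServeAd()
    {
        return randomObj.Next(4);
    }
}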
I then set System.Windows.Visibility for each of the four controls (the adCenter control and my three test ads above) to .Visible or .Collapsed accordingly, e.g.
if (RandomAd == 3)
{
    adControl3.Visibility = System.Windows.Visibility.Visible;
    My_AdRetreat.Visibility = System.Windows.Visibility.Collapsed;
    …
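For completeness, here is a minimal sketch of what the full four-way switch could look like on a single page. The mapping of values to controls and the control names My_AdClothing and My_AdTraining are my assumptions; only adControl3 and My_AdRetreat appear in the snippet above:

// Illustrative sketch – assumes value 3 selects the adCenter control (adControl3)
// and values 0–2 select the three faux ads.
private void ShowAdVariant(int randomAd)
{
    // Collapse all four controls first...
    adControl3.Visibility    = System.Windows.Visibility.Collapsed;
    My_AdClothing.Visibility = System.Windows.Visibility.Collapsed;
    My_AdTraining.Visibility = System.Windows.Visibility.Collapsed;
    My_AdRetreat.Visibility  = System.Windows.Visibility.Collapsed;

    // ...then reveal the one control the random draw selected.
    switch (randomAd)
    {
        case 0: My_AdClothing.Visibility = System.Windows.Visibility.Visible; break;
        case 1: My_AdTraining.Visibility = System.Windows.Visibility.Visible; break;
        case 2: My_AdRetreat.Visibility  = System.Windows.Visibility.Visible; break;
        case 3: adControl3.Visibility    = System.Windows.Visibility.Visible; break;
    }
}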
The result
is the following A/B behavior:
And of course, I used Dotfuscator to inject analytics to track:
- which “ad” was displayed on which page, and
- what action (if any) users took with each flavor of ad.
I added a method, WhatAdWhen, that takes three string parameters: the page being shown, the ad being displayed, and the event type, e.g. New Ad, Ad Engaged (clicked), or Ad Error (applicable only to adCenter-served ads). The method itself does nothing; Dotfuscator injects the instrumentation that grabs the parameters and sends them to the Runtime Intelligence Service.
The method is:

// Empty marker method – Dotfuscator injects the analytics instrumentation around calls to it.
private void WhatAdWhen(string page, string ad_type, string event_type)
{
    return;
}
The call
looks like: WhatAdWhen("Main", "Clothing",
"New Ad");
I then attached the following extended attribute with Dotfuscator to track each call:
The * is
all I needed to include to have Dotfuscator send all three values to the
Runtime Intelligence Service… So what did I see after a few days in production?
Perhaps not
surprisingly, yoga-focused promotions outperform contextual ads by almost 10X –
given my current volumes, I expect to generate over 3,000 highly qualified
clicks per month.
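For those wondering where a number like “almost 10X” comes from: it is presumably the ratio of click-through rates (clicks divided by impressions) for the faux ads versus the adCenter ads. A minimal sketch of that calculation – using an assumed dictionary format rather than the actual Runtime Intelligence export – might look like this:

using System.Collections.Generic;

public static class AdReport
{
    // Computes click-through rate per ad variant from impression ("New Ad")
    // and click ("Ad Engaged") counts pulled from the analytics service.
    public static Dictionary<string, double> ClickThroughRates(
        IDictionary<string, int> impressions, IDictionary<string, int> clicks)
    {
        var ctr = new Dictionary<string, double>();
        foreach (var pair in impressions)
        {
            int clickCount;
            clicks.TryGetValue(pair.Key, out clickCount);
            ctr[pair.Key] = pair.Value == 0 ? 0.0 : (double)clickCount / pair.Value;
        }
        return ctr;
    }
}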
Now, when I last checked, Google estimated that it would charge $150 per month for 175 clicks on “yoga clothing” – roughly $0.86 per click. Put another way, Google would charge roughly $3,000 per month for the leads I can generate for my yoga business clients today (counting only US-based users, not global). Now that’s the seed of a business: build well-defined, highly sought-after communities (like yoga consumers) and then find ways to connect those communities to the businesses that value them most. …and this complements the original plan of delivering white-labeled yoga apps perfectly. With this evidence, we have already recruited some small yoga businesses to reduce their Google ad spending and divert those funds to pay for both white-labeled yoga apps and the promotion of their businesses to my growing cadre of mobile yogis.
If you don’t
measure, you can’t improve – and with in-app A/B testing, you can measure more
than just software adoption – you can glimpse inside the mind and motivations
of your users – and that is the key to any successful venture.