
Case Study: A/B Testing

During my consulting stint for an online electronics retailer, we came across a "situation" where we needed to run an A/B test. Below is a description of the test and its results.

Background

The online retailer had three websites relevant to this case.

1. Site A - The main portal site (the branded site with the most visitor traffic), with direct links to the company's other sites

2. Site B - The retail site where visitors could shop for and purchase the electronics

3. Site C - The educational site where visitors could learn about the products before they made a decision to purchase

In this case, the marketing team decided that they wanted to push awareness of a particular new camera product. To do this, the team created an internal banner and placed it on Site A (the high-traffic portal site). Upon clicking this banner, visitors were sent to Site C so that they could learn about the camera in more detail.

Once on Site C, if the visitor was interested in purchasing the product, they could click on a "Buy Now" link that took them to the camera's product detail page on Site B.

The "Situation"

After the banner had been running for a short time, a major question arose: were we losing revenue by sending visitors to Site C (the educational page) first instead of sending them directly to Site B (the purchase page)?

The assumption was that, by sending the visitor first to Site C, the visitor had to take an extra step in their browsing path to reach the purchase site. Why not just send the visitor directly to Site B from Site A and save the extra step?

The reason was that Site B and Site C were run by two different groups within the company. The business goal for Site C was to drive as much visitor traffic as possible to its educational pages, while the business goal for Site B was to generate as much revenue as possible from its pages.

This, of course, led to some internal corporate head-butting. The group that ran Site C wanted a direct link from Site A because it helped them meet their traffic goals, but the group that ran Site B felt that revenue and conversions were being shortchanged if visitors went to Site C first.

Resolution

To resolve this problem, we decided to run a test and let the data speak for itself. The two groups agreed on a two-week test: the first week, the camera banner on Site A linked directly to Site C (the educational site); the second week, the banner linked directly to Site B (the purchase site).

Hypothesis: with this test, we wanted to see whether total revenue was significantly affected if visitors went through Site C first (A --> C --> B) instead of going directly to Site B (A --> B).
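
To make the setup concrete, here is a minimal sketch of how the two banner destinations and the "Buy Now" link might have been tagged. The URLs and the "trk" parameter name are hypothetical stand-ins; the post doesn't specify the actual tagging syntax the analytics tool used.

```python
# Hypothetical tagged URLs for the two test weeks. The domains and
# the "trk" parameter are illustrative, not the retailer's real ones.
BANNER_DESTINATIONS = {
    "week_1": "https://site-c.example.com/learn/camera?trk=banner_a_to_c",  # A --> C
    "week_2": "https://site-b.example.com/shop/camera?trk=banner_a_to_b",   # A --> B
}

# The "Buy Now" link on Site C carried its own tracking code so that
# revenue reaching Site B through the educational path could be isolated.
BUY_NOW_LINK = "https://site-b.example.com/shop/camera?trk=site_c_buy_now"
```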

During the first week, I measured the revenue generated from the Site C "Buy Now" link to Site B (we tagged the "Buy Now" link with tracking code). This was a little tricky because I had to measure the percentage of traffic that came from the camera banner alone (there were other ways visitors could reach that camera learning page), and I factored that percentage into the revenue total from the "Buy Now" link.

Site C --> Site B

The second week, I measured the revenue generated from the banner on Site A to Site B (we tagged the camera banner with a different tracking code). Again, I had to measure the percentage of traffic that came from the camera banner alone (there were other ways visitors could reach that camera purchase page).

Site A --> Site B
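
The attribution step in both weeks boils down to the same arithmetic: take the total revenue recorded at the destination and scale it by the share of traffic that actually arrived via the tagged link. Here is a minimal sketch of that calculation; all figures are made up for illustration, as I don't have the retailer's actual numbers.

```python
def banner_attributed_revenue(total_revenue: float,
                              banner_visits: int,
                              total_visits: int) -> float:
    """Scale destination revenue by the fraction of traffic that arrived
    via the tagged banner/link (other entry paths also existed)."""
    banner_share = banner_visits / total_visits
    return total_revenue * banner_share

# Hypothetical week-by-week figures, purely for illustration.
week1 = banner_attributed_revenue(total_revenue=50_000.0,
                                  banner_visits=8_000,
                                  total_visits=20_000)   # A --> C --> B
week2 = banner_attributed_revenue(total_revenue=60_000.0,
                                  banner_visits=12_000,
                                  total_visits=25_000)   # A --> B

print(f"A --> C --> B attributed revenue: ${week1:,.2f}")
print(f"A --> B attributed revenue:       ${week2:,.2f}")
```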

Findings

With my results in hand, I found that sending visitors through Site C first (A --> C --> B) did produce a drop in overall revenue compared with sending them directly to Site B (A --> B). However, it was not the significant drop-off that the managers of Site B were expecting. The assumption that sending visitors through Site C was killing conversions and sales was false.

Not only was there no significant drop-off in revenue, but I also found that the AOV (Average Order Value) was actually higher when visitors were sent through Site C first (A --> C --> B) than in the A --> B test. Visitors bought more items, or higher-priced items, when sent through the educational site first.
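
For readers curious how the AOV comparison, and the significance question behind it, might be checked, here is a minimal sketch using hypothetical per-order values. The Welch's t-test via `scipy` is my choice for illustration, not necessarily what was used at the time, and with real data you'd have thousands of orders rather than a handful.

```python
import numpy as np
from scipy import stats

# Hypothetical per-order revenue samples for each path; the real
# analysis used the retailer's tagged order data, which I don't have.
orders_via_c = np.array([220.0, 540.0, 310.0, 480.0, 650.0, 290.0])          # A --> C --> B
orders_direct = np.array([180.0, 210.0, 350.0, 240.0, 400.0, 190.0, 260.0])  # A --> B

# Average Order Value = total revenue / number of orders.
print(f"AOV via educational site: ${orders_via_c.mean():,.2f}")
print(f"AOV direct to purchase:   ${orders_direct.mean():,.2f}")

# Two-sample t-test (Welch's): is the difference in average order
# value more than noise, or could it have arisen by chance?
t_stat, p_value = stats.ttest_ind(orders_via_c, orders_direct, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```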

Conclusion

In optimizing a website, you're going to run into situations where fellow employees hold assumptions about what currently works and what doesn't. The great thing about web analytics is that it allows you to test those assumptions and turn the results into actionable answers. Make sure that you constantly test and make changes based on the data, not on assumptions.

In this case, upon reviewing my findings, the company decided that sending visitors through the educational site did not have a significant negative impact on sales and conversions. We also found that sending visitors through the educational site actually helped increase average order value.

Comments

Anonymous said…
Hi Matt,
Nice story. Thanks for writing it.

Interesting problem with three sites.

Did your site have enough traffic and conversions to make the results statistically valid? Did you have any trouble with the participants accepting the results?
Matt Lillig said…
Hi Jim,

Great question.

The site definitely had enough traffic (millions of visits a day) to make the results valid.

There wasn't much of a problem with the participants accepting the data because both groups respected the results. The company had spent a lot of money on the analytics tool and knew it could provide answers. They just didn't know how to pull out the right data and turn it into actionable information.

I think they were actually relieved to put all of the assumptions to rest when I showed them the results. We also learned something new about the average order value going up when sending visitors through the educational site first.
