Tuesday, November 14, 2006

Case Study: A/B Testing

During my consulting stint for an online electronics retailer, we came across a "situation" where we needed to run an A/B test. Below is a description of the test and its results.

Background

The online retailer ran three websites relevant to this case.

1. Site A - A main portal site (the branded site with the most visitor traffic) with direct links to their other sites

2. Site B - The retail site where visitors could shop for and purchase the electronics

3. Site C - The educational site where visitors could learn about the products before they made a decision to purchase

In this case, the marketing team decided that they wanted to push awareness of a particular new camera product. To do this, the team created an internal banner and placed it on Site A (the high-traffic portal site). Upon clicking this banner, visitors were sent to Site C so that they could educate themselves about the camera in more detail.

Once on Site C, if the visitor was interested in purchasing the product, they could click on a "Buy Now" link that took them to the camera's product detail page on Site B.

The "Situation"

After the banner had been running for a short time, a major question arose: were we losing revenue by sending visitors first to Site C (the educational site) instead of sending them directly to Site B (the purchase site)?

The assumption was that by sending the visitor to Site C first, we were forcing an extra step in the browsing path before the visitor could reach the purchase site. Why not just send the visitor directly from Site A to Site B and save the extra step?

The reason was that Site B and Site C were run by two different groups within the company. The business goal for Site C was to drive as much visitor traffic as possible to its educational pages, while the business goal for Site B was to generate as much revenue as possible from its pages.

This, of course, led to some internal corporate head-butting. The group that ran Site C wanted a direct link from Site A because it helped them meet their traffic goals, but the group that ran Site B felt that revenue and conversions were being shortchanged when visitors went to Site C first.

Resolution

To resolve this problem, we decided to run a test and let the data speak for itself. The two groups agreed to run a two-week test: for the first week, the camera banner on Site A linked directly to Site C (the educational site), and for the second week, the banner linked directly to Site B (the purchase site).

The test question: would total revenue be significantly affected if visitors went through Site C first (A --> C --> B) instead of going directly to Site B (A --> B)?

For the first week, I measured the revenue generated from the Site C "Buy Now" link to Site B (we tagged the "Buy Now" link with tracking code). This was a little tricky because I had to isolate the percentage of traffic that came from the camera banner (visitors could reach that camera learning page in other ways). I then applied that percentage to the revenue total from the "Buy Now" link (the math is sketched below).

Site C --> Site B

For the second week, I measured the revenue generated from the Site A banner to Site B (we tagged the camera banner with a different tracking code). Again, I had to isolate the percentage of traffic that came from the camera banner (visitors could reach that camera purchase page in other ways).

Site A --> Site B
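
To make the attribution math concrete, here is a minimal sketch of the calculation for both weeks. Every figure, percentage, and variable name below is a hypothetical placeholder, not the retailer's actual data:

    # Hypothetical figures only -- a sketch of the attribution math,
    # not the retailer's actual data.

    # Week 1: banner -> Site C -> "Buy Now" -> Site B
    buy_now_revenue = 95_000.00  # total revenue tracked via the "Buy Now" tracking code
    banner_share_c = 0.60        # fraction of Site C camera-page traffic that came from the banner
    week1_revenue = buy_now_revenue * banner_share_c       # 57,000.00

    # Week 2: banner -> Site B directly
    product_page_revenue = 80_000.00  # total revenue tracked via the banner's tracking code
    banner_share_b = 0.75             # fraction of product-page traffic that came from the banner
    week2_revenue = product_page_revenue * banner_share_b  # 60,000.00

    print(f"Week 1 (A --> C --> B): ${week1_revenue:,.2f}")
    print(f"Week 2 (A --> B):       ${week2_revenue:,.2f}")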

Findings

With my results in hand, I found that sending visitors through Site C first (A --> C --> B) did produce a drop in overall revenue compared with sending visitors directly to Site B (A --> B). However, the drop was not the significant drop-off that the managers of Site B were expecting. The assumption that sending visitors through Site C was killing conversions and sales was false.
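
If you want to check that a week-over-week difference like this is more than noise, one standard approach is a two-proportion z-test on the conversion rates. A minimal sketch, with made-up counts:

    from math import sqrt

    def two_proportion_z(orders_1, visits_1, orders_2, visits_2):
        """z statistic for the difference between two conversion rates."""
        p1, p2 = orders_1 / visits_1, orders_2 / visits_2
        pooled = (orders_1 + orders_2) / (visits_1 + visits_2)
        se = sqrt(pooled * (1 - pooled) * (1 / visits_1 + 1 / visits_2))
        return (p1 - p2) / se

    # Hypothetical counts -- not the retailer's actual numbers.
    z = two_proportion_z(orders_1=950, visits_1=50_000,    # week 1: A --> C --> B
                         orders_2=1_020, visits_2=50_000)  # week 2: A --> B
    print(f"z = {z:.2f}")  # |z| > 1.96 would be significant at the 5% level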

Not only was there no significant drop-off in revenue, but I found that the AOV (Average Order Value) was actually higher in the A --> C --> B week than in the A --> B week. Visitors bought more items, or higher-priced items, when they were sent through the educational site first.
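
For reference, AOV is simply total revenue divided by the number of orders. A quick sketch, again with hypothetical figures:

    # AOV = total revenue / number of orders (hypothetical figures for each week)
    aov_through_c = 57_000.00 / 950    # A --> C --> B: $60.00 per order
    aov_direct = 60_000.00 / 1_020     # A --> B:       ~$58.82 per order
    print(f"AOV via Site C first: ${aov_through_c:.2f}")
    print(f"AOV direct to Site B: ${aov_direct:.2f}")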

Conclusion

In optimizing a web site, you're going to run into situations where fellow employees make assumptions about what currently works and what doesn't. The great thing about web analytics is that it allows you to test those assumptions and turn the answers into actionable solutions. Make sure that you constantly test and make changes based on the data, not on assumptions.

In this case, upon reviewing my findings, the company decided that sending visitors through the educational site did not have a significant negative impact on sales and conversions. We also found that sending visitors through the educational site actually helped increase average order value.

2 comments:

Anonymous said...

Hi Matt,
Nice story. Thanks for writing it.

Interesting problem with three sites.

Did your site have enough traffic and conversions to make the results statistically valid? Did you have any trouble with the participants accepting the results?

Matt Lillig said...

Hi Jim,

Great question.

The site definitely had enough traffic (millions of visits a day) to make the results valid.

There wasn't much of a problem with the participants accepting the data because both groups respected the results. The company had spent a lot of money on the analytics tool and knew it could provide answers; they just didn't know how to pull out the right data and turn it into actionable information.

I think they were actually relieved to put all of the assumptions to rest when I showed them the results. We also learned something new about average order value going up when visitors were sent through the educational site first.