A/B testing on drivetime.com

Summary: I've helped design more than 75 experiments for drivetime.com, working with our analytics, development, and product teams to draft hypotheses and interpret results.

My role: Strategy, user research, wireframing, UI design, UX design

Improving above-the-fold engagement on product details page

Objective

The goal of this experiment was to redesign our vehicle detail pages to increase conversion among our more vehicle-focused customers, while also bringing the page up to our current design standards.

Process

I used customer data and feedback to identify the most important elements on the page and prioritized those in the redesign. These included the year/make/model, photos, price, mileage, and key features. I also discussed our business goals with stakeholders to determine what actions we wanted the user to take.

Competitor research was another major focus. We had fallen behind our competitors on functionality and information display on our vehicle details pages, so there was plenty of room for improvement. My research centered on the core functionality and design of Carvana, CarMax, and CarGurus.

Results

All six variations outperformed the control by a wide margin. While variation 3 had the highest click-through rate, variation 6 had the highest lead capture rate and a significant increase in session-to-sale rate, so it was declared the winner.

20% increase in lead capture (15% → 18%)

5.7% increase in session-to-sale (.52% → .55%)
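Lift figures like these are relative (not absolute) changes, and whether a lift is significant can be checked with a two-proportion z-test. The sketch below uses hypothetical session counts, since the actual sample sizes aren't published here:

```python
from math import sqrt

def relative_lift(p_control: float, p_variant: float) -> float:
    """Relative (not absolute) change between two conversion rates."""
    return (p_variant - p_control) / p_control

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """z-statistic for the difference between two observed proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# A 15% -> 18% move in lead capture is a 20% relative lift:
print(f"{relative_lift(0.15, 0.18):.0%}")  # 20%

# Hypothetical sample sizes -- the real counts aren't in the write-up:
z = two_proportion_z(x1=1500, n1=10_000, x2=1800, n2=10_000)
print(f"z = {z:.1f}")  # z ≈ 5.7, well above the 1.96 threshold for 95% confidence
```

At these (assumed) volumes the difference is comfortably significant; with much smaller samples the same rates could fail the test.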

Testing a step-by-step conversion form

Objective

DriveTime's primary conversion path is our online financing form that allows users to see their down payment, monthly payment, and APR. This form spans two pages, each of which counts as a separate type of lead submission (step one as a contact lead and step two as a financing lead).

The goal of this test was to restructure the form into a step-by-step flow, with each question on its own page. We wanted the process to feel more natural and conversational while also learning which questions were major drop-off points.

Design

The first step was to examine our current question breakdown and compare it to user feedback about which questions customers didn't feel comfortable answering. We used that analysis to define the order and grouping of the questions on the form.

I took the approach of easing the user into the more invasive questions, starting with low-impact questions and working toward those identified as the biggest pain points. The form ended with the contact info section, because users had told us those questions were the most likely to make them exit, as they didn't want to be contacted.

I designed this as a multivariate test so that we could fully understand the effect of the changes we were making without overcomplicating the development process. I combined different options for progress trackers, verbiage, and CTAs.
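A multivariate setup like this is typically a full cross of the option sets. The sketch below uses Python's itertools.product with purely illustrative option names (the actual trackers, verbiage, and CTAs tested aren't listed in this write-up):

```python
from itertools import product

# Illustrative option sets -- stand-ins, not the real test cells.
progress_trackers = ["bar", "step-count", "none"]
verbiage = ["conversational", "formal"]
ctas = ["Continue", "Next question"]

# Full factorial cross: every tracker x verbiage x CTA combination.
variants = [
    {"tracker": t, "verbiage": v, "cta": c}
    for t, v, c in product(progress_trackers, verbiage, ctas)
]
print(len(variants))  # 3 * 2 * 2 = 12 cells
```

The cell count grows multiplicatively, which is why keeping each option set small matters: it limits both development effort and the traffic needed to reach significance per cell.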

Results

Though we did see an increase in total form completions, it was not enough to make up for the loss in step-one (contact) lead submissions from the control form, so this test was not considered a winner.

1.3% increase in form completion (44.5% → 45.1%)
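Why a completion lift can still lose comes down to lead accounting: in the control, step one fires a contact lead early, while the step-by-step variant defers contact info to the end, so a contact lead requires nearly finishing the form. The sketch below walks through that trade-off with entirely hypothetical numbers (the write-up reports completion rates but not raw lead counts):

```python
# All numbers hypothetical -- illustrating the trade-off, not reproducing it.
starts = 10_000

# Control: contact info on step one, so most starters become contact leads.
control_contact = int(starts * 0.80)
control_financing = int(starts * 0.445)   # 44.5% completion

# Variant: contact info at the end -> completion ticks up, contact leads fall hard.
variant_contact = int(starts * 0.50)
variant_financing = int(starts * 0.451)   # 45.1% completion

control_total = control_contact + control_financing
variant_total = variant_contact + variant_financing
print(control_total, variant_total)  # 12450 9510 -- the variant loses on total leads
```

Under these assumed rates the variant wins on completions but loses on total leads, which mirrors the verdict above.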

Increasing click through rate on marketing banners

Objective

On our vehicle search results page (VSR), banners are placed at regular intervals among the vehicle listings to direct users to different conversion paths. The three CTAs invite users to get their financing terms, get a trade-in value for their car, or schedule a visit to a dealership.

The goal of this test was twofold: 1) determine if an updated banner design would increase conversions and 2) gauge which CTAs were most effective.

Design

I created two new banner designs for this test, along with a new experience that excluded the schedule-a-visit banner. I wanted to see whether different CTA styles (button vs. link) and different imagery affected user interaction rates.

Results

All of the variations we tested performed better than the control experience. The final winner improved both the financing form conversion rate and the session to sale rate for users originating from VSR.

16% increase in financing form conversion (5.12% → 5.99%)

6.2% increase in session-to-sale (.28% → .30%)