I designed a new conversion form for DriveTime.com, transforming it into a step-by-step "wizard" experience. After testing, we ultimately kept our control form, but the new design produced a 1.3% increase in completions and gave us insight into which fields were causing friction for our users.
My role: User Research • Competitive Research • UI/UX Design
One of DriveTime's core product offerings is real online approval for vehicle financing. Delivering that approval, however, required a somewhat daunting lead form: two pages, multiple questions, and requests for sensitive information (including Social Security number). As you can imagine, that was a tough sell.
Going into this experiment, our goal was twofold: increase form completions, and learn which fields were causing friction for our users.
Our hypothesis was straightforward: if we make the lead form more conversational, with each question on its own page, more users will complete the form and get their financing approval. The primary KPI for the A/B test was form completions; secondary KPIs included individual page completions (progress through the form), exit rate, and user drop-off points.
Before digging into more specific user or competitive research, I wanted to make sure I had a solid grasp of the current landscape around conversion form best practices. I looked into case studies, think pieces, and UX community forums to pull out some of the top tips.
For competitive research, I looked at banking apps, other vehicle financing flows (like LendingTree), and other buy-here-pay-here dealerships (Carvana, CarMax). I was specifically focused on how they talked to their users, how they indicated progress, and how they broke their forms into separate steps.
All of the questions in our form were necessary for building a credit profile and giving our customers an online approval, so we needed to dig into which questions in particular might be giving them pause. We gathered some key pieces of information from our data and user insights.
Our first step was deciding how to order the form questions. Since we knew that phone number and SSN were the questions that gave customers the most trouble, I moved those to the end and decided to start with ZIP code to give the form a low barrier to entry.
Beyond field order, the main elements we wanted to work on were the buttons, the progress bar, and the verbiage. We decided to run a multivariate experiment so we could test multiple versions of each element and find the best-performing combination.
I created different designs for the step-by-step form and worked with our copywriter to come up with verbiage options. We gathered feedback from our senior leadership and product teams and pared down the concepts until we had our final multivariate setup.
We wanted to test the following: progress bar styles, headline and supporting-copy options, and CTA variations (with and without a back button).
We saw a 1.3% increase in form completions. While this improved on the number of people who completed both pages of the previous form, we also had to account for leads captured by its first page alone. With the new setup, we lost the roughly 3.3% of people who had completed only the first page but could still be contacted as leads. Because the 1.3% improvement didn't clear that 3.3% threshold, the new form was not considered a winner.
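The lead accounting behind that decision can be sketched with back-of-the-envelope numbers. The 1.3% lift and the 3.3% partial-lead loss come from the test itself; the traffic volume and baseline completion rate below are hypothetical, purely for illustration:

```python
# Illustrative lead accounting for the A/B test described above.
# Assumption: the 1.3% and 3.3% figures are percentage-point changes
# against the same visitor base; traffic and rates are made up.

visitors = 10_000                 # hypothetical traffic per variant

# Control: two-page form. People who finish only page 1 still count as leads.
control_completion_rate = 0.20    # hypothetical full-form completion rate
control_page1_only_rate = 0.033   # partial completers lost in the new design

control_leads = visitors * (control_completion_rate + control_page1_only_rate)

# Variant: step-by-step wizard. Completions rise 1.3 points,
# but there is no equivalent of a contactable partial lead.
variant_completion_rate = control_completion_rate + 0.013
variant_leads = visitors * variant_completion_rate

print(f"Control leads: {control_leads:.0f}")
print(f"Variant leads: {variant_leads:.0f}")
```

Under these assumptions the variant nets fewer total leads than the control despite its higher completion rate, which is why the lift had to clear the 3.3% threshold to win.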
The best combination of elements ended up being a percentage-based progress bar, a short headline paired with supporting context, and the CTA without a back button.
One of the most important things we learned was exactly where people were dropping out of the form, which helped us confirm (and refute) some of our earlier assumptions. The first step of the form continued to be the highest drop-off point, and SSN was also quite high. However, phone number and email were among the lowest, which could be attributed to our moving them to the last step.
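A drop-off analysis like the one described above can be sketched as a simple funnel computation: for each step, the exit rate is the share of users who reached that step but never advanced. The step order mirrors the final form (ZIP first, phone/email/SSN last); every count and step name below is hypothetical, chosen only to illustrate the pattern we saw:

```python
# Hedged sketch of a per-step drop-off analysis. Counts are invented;
# only the shape (first step and SSN high, phone/email low) reflects
# what the case study reports.

funnel = [
    ("ZIP code", 10_000),     # users who started the form
    ("Housing", 7_400),       # users who reached each later step...
    ("Income", 6_900),
    ("Email", 6_600),
    ("Phone number", 6_450),
    ("SSN", 6_300),
    ("Submitted", 5_200),     # users who completed the form
]

# Drop-off at a step = (entered - advanced) / entered
drop_offs = {}
for (step, entered), (_, advanced) in zip(funnel, funnel[1:]):
    drop_offs[step] = (entered - advanced) / entered

for step, rate in drop_offs.items():
    print(f"{step:>12}: {rate:6.1%} drop-off")
```

With these made-up numbers, the first step shows the largest drop-off, SSN the second largest, and email and phone number the smallest, matching the pattern we observed.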
We used the drop-off points and best-performing UI elements to ideate on further experiments for our conversion path, including a single-page form and moving the phone number to the second page.