I am going hog wild with A/B testing now that I have an easy way to do it. I thought I’d share some of the results:
- Login as guest link vs. no login as guest link: There is no significant difference in the number of trial signups I get whether or not I put a discreet login-as-guest link on the registration form. There is, however, a massive decrease in the number of guests (obviously: half of visitors never see the link at all, so guest signups have to decline by at least 50%, right?). While I get almost no economic value out of guests, I reverted to allowing them.
- Asking for permission to email versus not: There is no significant difference in the number of trial signups I get whether I show or hide the two checkboxes for signing up for the mailing list. Note that I still ask people for their email address (which I use as the username, figuring this is easiest for non-technical people to remember), and that I, stupidly, included the “We do not spam you” verbiage in the block excised for the test. That should have been left in both alternatives. Ahh well, I’ll do it again properly later.
- Reordering buttons: Reordering the bingo card pages (see example) to put the Make Your Own Cards button (which leads to a trial registration dialogue) above the Download These Cards button increases conversion to the trial from 11.90% to 19.82%, which was significant at the 90% confidence level. Yay, a positive result! I ended the test and adopted that behavior as the default. (The significance arithmetic is sketched just after this list.)
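For the curious, here is a minimal sketch of the kind of significance check behind a result like that last one, done as a two-proportion z-test. Only the conversion rates (11.90% vs. 19.82%) come from my test; the visitor counts below are hypothetical numbers I picked to reproduce those rates, and the function name is mine, not anything from my testing library.

```python
# Two-proportion z-test: is variant B's conversion rate significantly higher than A's?
from math import sqrt
from statistics import NormalDist

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and one-sided p-value for B beating A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return z, 1 - NormalDist().cdf(z)

# Hypothetical sample sizes chosen so the rates match 11.90% and 19.82%.
z, p = z_test_two_proportions(conv_a=25, n_a=210, conv_b=45, n_b=227)
print(f"z = {z:.2f}, one-sided p = {p:.3f}")  # significant at 90% confidence if p < 0.10
```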
What I’m testing currently:
- A new purchasing page, specific to the online version, shown to people who are logged into the online version (versus the current purchasing page). Obviously I’m looking for actual sales as the conversion here.
- The effect of the text “It is quick and easy to get started!” versus “There is nothing to download and no credit card required.” on the landing page. (Quick and easy are two concepts that have worked very well in my AdWords campaigns before.)
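If you’re wondering how a copy test like that gets split, here is a minimal sketch of one common approach: hash a stable visitor id into one of two buckets so the same visitor always sees the same sentence. The function name and hashing scheme here are purely illustrative, not how my testing library actually does it.

```python
# Deterministic bucketing: the same visitor id always maps to the same variant.
import hashlib

VARIANTS = [
    "It is quick and easy to get started!",
    "There is nothing to download and no credit card required.",
]

def variant_for(visitor_id: str, test_name: str = "landing_copy") -> str:
    """Assign a visitor to one of the copy variants, stably across visits."""
    digest = hashlib.md5(f"{test_name}:{visitor_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

print(variant_for("visitor-12345"))
```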