Building a website that attracts enough traffic to generate sales is no easy task. And if you’ve been growing your ecommerce business for a few months now, you’ve likely brought in a decent amount of revenue. But the work doesn’t stop there. Once you have some proof of concept, it’s time to maximize your sales by running A/B tests to increase your conversions. In this article, we’ll share what A/B testing is, why you should run these tests, our very own experimentation workflow, an example we’ve run here at Spocket, and much more. So, let’s dive in.
A/B testing, sometimes called split testing, is an experimentation method in which you compare a new variation to a control. The control is the current layout, color, subject line, or anything else you’d like to see a conversion lift on. The new variation is something that’s drastically different from the control.
Split tests are typically performed on subject lines in emails, buttons on your website, copy changes on a landing page, color schemes, or layouts of a website design.
For example, you might’ve heard that red is the best color for call-to-action buttons. That theory came out of a split test. And while it can be true on some websites, others might find other colors convert better depending on the layout. You just have to test it and see for yourself, as what works for one site might not work for your own. There are too many variables at play.
A/B tests have a direct impact on conversions. By split testing things that affect conversions, such as your website layout, copy, or call-to-action buttons, you can maximize conversions on your website. For example, you might notice that your call-to-action button converts at only 0.04% and want to run an experiment to see if different wording could lift it higher. You can try another call-to-action and run an experiment to see which wins out. If there’s a lift, you’d use the new variation. You can then run another experiment comparing new copy against that winner to keep lifting the conversions on your page.
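The comparison above boils down to simple arithmetic. Here is a minimal sketch of how you'd compute conversion rates and the relative lift between a control and a variation; the visitor counts and button labels are made up for illustration, not from a real experiment:

```python
def conversion_rate(conversions, visitors):
    """Fraction of visitors who converted."""
    return conversions / visitors

# Hypothetical numbers: two versions of a call-to-action button.
control_rate = conversion_rate(40, 100_000)   # original CTA copy
variant_rate = conversion_rate(55, 100_000)   # reworded CTA copy

# Relative lift: how much better (or worse) the variation performed.
relative_lift = (variant_rate - control_rate) / control_rate

print(f"Control: {control_rate:.4%}")    # 0.0400%
print(f"Variant: {variant_rate:.4%}")    # 0.0550%
print(f"Lift:    {relative_lift:+.1%}")  # +37.5%
```

Note that a 0.04% baseline looks tiny, but even a modest absolute improvement on it can be a large relative lift, which is why experiments are usually judged on relative lift rather than raw percentage points.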
A few years ago, a popular experiment to run was whether adding a video to your landing page could drive more conversions. Considering how popular video content is these days, it’s no surprise that video content on landing pages converts better, especially videos with good production value. You can run experiments on so many things, which will help you better understand what customers really want and need to feel comfortable buying from your website. The more experiments you run, the more you’ll understand what helps customers with their buying decisions, so that you can sell to a greater pool of people.
It’s so common for us to change copy, designs, and layouts based on our personal preference. Sometimes things just look or sound better with our personal touch. Unfortunately, that nice-sounding copy doesn’t always convert better. A/B testing helps remind us that what matters at the end of the day is the data: does it bring more sales? By split testing, you’ll be making data-informed decisions to ensure your business is as profitable as it can be. By reviewing analytics tools, looking at sales reports, and seeing the results of your A/B test, you’ll have a better overall understanding of whether a change on your website impacts your key performance indicators (KPIs).
Here at Spocket, we have our own experimentation workflow that we use to run A/B tests. You can use this workflow to help you run split tests efficiently.
The first stage of your experimentation workflow for A/B testing is research. During this period, you’ll collect insights that you can use to generate test ideas, then research each idea to determine whether it’s worth testing.
After compiling all of your ideas, you need to decide what to prioritize. If an experiment will affect your entire website, you’ll only be able to run one experiment at a time to ensure the best results. During this phase, you’ll create a roadmap that will help you determine which A/B tests to start with. You might also look into which requirements or resources are needed for each experiment before going ahead with it.
During this phase of split testing, you’ll need to round up all your stakeholders. Depending on the size of your company, there may be multiple people working on these split tests with you. For example, in most organizations, the three groups that work on A/B tests tend to be design, engineering, and content. You’ll review the test with all stakeholders during this process.
During this part of A/B testing, the split test is live. This isn’t the time to make a quick change to an area of a page, as you might affect the experiment. You’ll want to block people from making changes to the page during this time for the most accurate results. Inform your team not to make changes to any page the experiment is running on, and give them a firm timeframe.
In the final stage of A/B testing, you’ll determine which of the two options is the winner. If you find your control is the winner, great, it means you don’t need to make any changes. If the variation is the winner, you’ll likely want to implement that option to benefit from the boost in conversions. You might also want to create a document where you keep all of your experiments and results in one place, so that as new team members join, you can give them the context they need.
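The article doesn't prescribe a statistical method for declaring a winner, but a common approach is a two-proportion z-test, which tells you whether the observed difference is likely real or just noise. A minimal sketch using only the Python standard library, with illustrative numbers:

```python
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a two-proportion z-test.

    conv_a/n_a: conversions and visitors for the control,
    conv_b/n_b: conversions and visitors for the variation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical experiment results, for illustration only.
p = ab_test_p_value(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
if p < 0.05:
    print(f"p = {p:.4f}: difference is statistically significant")
else:
    print(f"p = {p:.4f}: no reliable difference, keep the control")
```

A p-value below 0.05 is the conventional (if somewhat arbitrary) threshold; below it, you can be reasonably confident the variation's lift isn't a fluke.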
A/B tests should run for one to two weeks. It helps to cover every day of the week, at minimum, so you don’t have any abnormalities. However, not every week is the same. For example, the week of Black Friday is typically not a good time to run an experiment. Many ecommerce companies run a blackout period during that week where no development changes are made during their peak season, to prevent any possible sales loss. So aim to run your experiment during a one to two-week period outside of any big holidays like Christmas, Easter, Mother’s Day, or the 4th of July, sales events, or any other abnormalities.
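If you want a more principled duration estimate than a flat one to two weeks, a standard power calculation tells you roughly how many visitors each variant needs before the result is trustworthy. The baseline rate, detectable lift, significance level, and power below are illustrative assumptions, not recommendations from this article:

```python
from statistics import NormalDist

def sample_size_per_variant(baseline, mde, alpha=0.05, power=0.8):
    """Visitors needed per variant to detect a relative lift of `mde`
    over a baseline conversion rate (normal-approximation formula)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # significance
    z_beta = NormalDist().inv_cdf(power)           # statistical power
    p1 = baseline
    p2 = baseline * (1 + mde)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int(((z_alpha + z_beta) ** 2 * variance) / (p1 - p2) ** 2) + 1

# e.g. a 2% baseline conversion rate, hoping to detect a 20% relative lift
n = sample_size_per_variant(baseline=0.02, mde=0.20)
print(f"~{n:,} visitors per variant")
# Divide by your average daily traffic per variant to get a duration;
# even if that lands under a week, run at least seven full days so
# every day-of-week pattern is covered.
```

The takeaway: small baselines and small expected lifts demand a lot of traffic, which is why low-traffic stores often need the full two weeks (or longer) before calling a winner.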
At Spocket, we’ve run countless A/B tests to better understand dropshippers and marketing as a whole. What we’ve learned might interest you too!
In one experiment, we wanted to see a lift in conversions on the Shopify App Store. So, we implemented changes to our images to detail the profit margin earned on premium products, highlight custom invoices using your own brand’s logo, highlight shipping times and discounts on annual plans, and mention our extensive collection of products. Overall, each image highlighted a key benefit of why people should use Spocket. The result? We were able to increase conversions from 35% per week to 40% per week.
With countless Shopify Apps available in the App Store, you can easily run an A/B test using apps that help simplify the process for you. Choose the Shopify App with the features you need. Each app offers something different, so do your own due diligence when choosing a tool that’ll help you perform accurate split tests.
Using Shogun’s page builder, you can do A/B testing on your Shopify store. You can publish up to 500 pages, depending on your plan, each with a custom layout. The drag-and-drop page builder allows for maximum customization. And so, it’s only natural that you’d want to see which layouts actually convert the best. The A/B testing feature is available on Shogun’s Measure plan for $99/month. Plus, with an analytics suite to boot, you’ll be able to measure your experiments easily and with real data. If you’d like to learn how to run A/B tests with Shogun, the team has created a tutorial just for this feature that you can check out here.
Approach your online store like a scientist. With PreviewX’s app, you’ll be able to determine which product version or theme converts better. If you’re thinking of investing in a new theme, you can first split test the theme to see if it converts better than what you have now. You can make sure that you only use the theme that’s proven to convert better. You can also change product attributes like images to improve conversions on product pages. So, if your product comes in multiple colors, you can highlight the one that converts best on your collection and product pages to help generate more sales for that product.
FigPii’s app is all about better understanding your customers using data. This app goes beyond split testing, and includes heat maps, visitor replays, polling, and more. When it comes to A/B testing, you won’t need to code a single line thanks to their user-friendly interface. You can launch A/B tests on your site directly and target your homepage, cart page, product page, or any landing page on your site. You can also segment by device. Lastly, you’ll be able to choose the metrics you want to base your experiment on, such as conversions, average order value, or revenue per visitor.
Trident’s app specializes in A/B testing on your Shopify store. You can easily test products, images, titles, product descriptions, and more. This no-code friendly app allows you to understand the impact of your experiments without needing a stats degree. The tool is equipped with analytics, so you can make informed decisions about your experiment results.
A/B testing can level up the conversions on your website without requiring you to increase your website traffic. It’s all about trying new things to help increase conversions. Whether you’re running a split test to choose the best product image or to increase conversions on your homepage, a two-week A/B test can help you achieve a higher conversion rate. And once you’ve increased it, you can add more traffic knowing that you aren’t losing more sales than you should. If you’re interested in learning more about split tests, feel free to leave a comment.
Shreyas is a Director of Product & Data at Spocket. He has 12+ years of Engineering, Program, Product, and CX experience. He has built multiple 0 to 1 products for startups to enterprise companies with expertise in the marketplace, subscriptions, and e-commerce products. At Spocket, he leads the global product organization, including product management, design, and data.