Today I'm sharing something special.
Since you all loved my previous A/B test breakdown a few weeks ago, I had to bring you more split-testing ideas!
My friend Casey Hill (CMO at DoWhatWorks) gave me a sneak peek at his company's database of A/B test results from companies like Apple and Spotify. In this week's newsletter, we get to see a few of their most powerful test recommendations.
Let's go 💡
Guest post by Casey Hill
A/B testing is mostly a waste of time.
Between 2020 and 2024, a shocking 89% of all tests failed to beat the control. That's not a typo.
Why does it happen? 👇
Most teams are obsessed with running tons of tests instead of the right tests. They trust opinions over evidence. They guess instead of knowing.
Let's fix that right now.
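One simple way to know instead of guess: before you crown a winner, run a basic significance check on the result. Here's a minimal sketch using a standard two-proportion z-test; the visitor and conversion counts are made up purely for illustration.

```python
# Minimal significance check for an A/B test: did the variant really beat the control?
# The counts below are purely illustrative -- swap in your own numbers.
from math import sqrt
from statistics import NormalDist

control_visitors, control_conversions = 10_000, 420   # control arm
variant_visitors, variant_conversions = 10_000, 465   # variant arm

p_control = control_conversions / control_visitors
p_variant = variant_conversions / variant_visitors

# Pooled rate and standard error under the "no difference" null hypothesis
pooled = (control_conversions + variant_conversions) / (control_visitors + variant_visitors)
se = sqrt(pooled * (1 - pooled) * (1 / control_visitors + 1 / variant_visitors))

z = (p_variant - p_control) / se
p_value = 1 - NormalDist().cdf(z)  # one-sided: chance of seeing this lift by luck alone

print(f"Control {p_control:.2%} vs variant {p_variant:.2%} | z = {z:.2f}, p = {p_value:.3f}")
# A p-value above ~0.05 means your "winner" may just be noise -- which is how
# so many tests end up failing to beat the control.
```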
Here are 7 surprising A/B tests that reveal what actually works in 2025 ↓
1. Clarity beats simplicity ✂️
We've all heard "keep it simple," but the data shows a different story:
Clarity wins.
Take this test from Peloton. The winning version doesn't have a simpler design… It has a clearer one.
The winner has two specific CTAs instead of one generic one:
"See Details" (get specs, read reviews, etc.)
"Quick Add" (add this exact item to cart)
Why it works: "Shop now" is ambiguous when you're already on a product page. The winning version creates clear expectations about what happens when you click.
2. Carousels actually work 🖼️
Contrary to popular belief, carousels in hero sections consistently outperform static images.
The New York Times saw this in action:
Turns out, carousels reduce cognitive load by showing one element at a time. They also create a sense of progression, guiding users on a journey that deepens engagement.
Both manual click-through and auto-toggle versions perform well (with auto-toggle potentially reducing friction on mobile).
3. "Benefit-first" CTAs actually lose 🕹️
This shocked me.
I've always told clients to focus on benefits in button text. I was wrong.
Look at Apple's test:
"Start Listening" (the benefit) lost to "Try it Free" (the clear action).
Here's why: you can't actually start listening until you've started the trial. There's a gap between expectation and reality.
The same pattern shows up across other tests. Buttons with benefit-focused text like "Elevate your marketing" or "Save time now" consistently lose to crystal-clear options:
✅ "Start Free Trial"
✅ "Book a Demo"
✅ "See Pricing"
The bottom line: CTAs should set clear expectations about what happens next, not sell benefits.
4. Messaging is going back to basics ⏪
Remember when Mailchimp's homepage said simply "Send better email"?
Around 2020, they shifted to complex messaging: "Do it all with Mailchimp - Bring your audience data, marketing channels, and insights together so you can reach your goals faster, all from a single platform".
Oof.
This pattern is everywhere in SaaS: A company starts with a simple product that solves one problem well. Then they scale, add more features, expand globally, and their messaging gets muddier.
But in 2025, we're seeing a major reversal:
Top companies are returning to simple language focused on their core value prop.
For Mailchimp, "Get down to business and grow sales" lost to the much clearer "Turn emails into revenue".
5. Competitor comparisons kill conversion 🔥
Here's a shocking test from Square:
They wanted to see if competitor comparisons on pricing pages would help conversion.
It tanked. 📉
The problem is that nobody trusts you when you talk about competitors. Those comparison grids where you check every box and your competitors miss key features don't drive conversions.
Instead, try these alternatives:
Be strategic about placement (pricing pages are the wrong place)
Focus on your unique category position (like Klaviyo does with "the only CRM designed for B2C")
Highlight "defensible" features your competitors don't have
If you must mention competitors, show the value of all the tools in the category and work hard to keep the comparison objective.
6. Customer logos actually hurt conversion 😰
This one is wild:
Nearly 70% of the top 100 SaaS brands include customer logos on their sites.
Yet when Asana tested it head-to-head, the version without the logos won:
What works instead:
Specific references to press coverage ("As seen in TechCrunch") with links to the articles
Detailed testimonials with context ("Since implementing Webflow, our monthly spend on web dev has decreased 70%")
Case studies organized by industry with detailed problem/solution breakdowns
7. Strikethrough pricing breaks trust 🙅🏻‍♀️
Strikethrough prices (a crossed-out $99.99 next to $29.99) feel manipulative to modern consumers.
Spotify tested this extensively and found a clear winner:
Why? When you show you can slash prices by 70%, customers question the real value. If you can make it $30 today, why not $20 tomorrow? The moral of the story is: don't show the original price.
If you enjoyed these testing insights (I know I did), follow Casey and his team!
See you next week ✌️
Tom
If you enjoyed this guest article, please tap the Like button below ♥️ Thank you!