What we learn from demo-form A/B tests at Rippling/Zendesk/Brex and others
Rippling recently wrapped up an interesting A/B test on their demo form ⬇️ They wanted to know whether a centered form or a right-aligned form with content on the left would win.

At DoWhatWorks, we find that an isolated form on the page (vs. a form with additional side-by-side content) tends to win. Besides Rippling, we see this from Zendesk, Brex, Elevenlabs, and dozens of others.

Having tested this myself (and studied heatmaps), and having talked with teams that have seen lift from the centered form, I think a lot of this comes down to time-to-completion. With additional content on the page, folks spend more time digesting your framing and reading your talking points (and potentially spotting flags that worry them, like 4.4 stars on a social proof icon) before they get to filling out the form.

Obviously, with any of these trends, there is a lot of nuance: what the traffic source is, how results vary by industry, how multi-step forms factor in, and so on. And for individual tests like this Rippling one, changing multiple variables at once can muddy the results. But this is where a large data pool, spanning dozens (even hundreds) of tests on the same website elements, helps us reduce the noise and increase the signal.

Have you tested this on your demo form before? What did you find?