
Optimizing an Exit Point: A Shocking Surprise


As a conversion expert who has managed numerous campaigns, I enjoy being surprised by results. I especially like the good kind of surprise, the huge unexpected win, but I also appreciate the shocking kind that comes when my assumptions turn out to be wrong. It gets my wheels turning and forces me to think about things in a different way, which I love. I've been immersed in a campaign that did just that, so I wanted to share the experience with you.


We work with one of our partners to optimize a user account page on their site. This was the first time this funnel had been tested, so it was somewhat uncharted territory. Though there were assumptions, nobody was truly sure of the "who, what, when, where and why" of this traffic. Who comes here, what do they do, what is their intent when they get here, where do they come from, and why were they prompted to come here? Users have many options to change account information (upgrade, keep their plan, downgrade, cancel) once they arrive, so finding the answers to these questions was very important.

This created an awesome testing opportunity for a few different reasons:

1) All ideas were on the table to be tested, and

2) We'd discover new, data-based insights to share with the organization that could change the way they do business.

The challenge was that our initial strategy would be guided by assumptions instead of actual data. This made it imperative that the test architecture allow us to learn what makes this audience tick, as it was very possible that our assumptions would be proven wrong once the data rolled in.

So, we got started on strategy development. My eyes lit up at the opportunity this page presented: it was analytical eye candy, with miles and miles of testing ideas. We've got experience in this arena, so we incorporated some proven, winning ideas along with simple best practices. It was strictly a design test (with no changes to copy) so that our changes could be isolated and evaluated.

Here is a page from DIRECTV that is similar to our client's page:

Strategy and creative were approved, the HTML was built, and the campaign was set up, QAed and ready to launch! Data started pouring in, confidence levels were reached, and it was time for the big reveal. A no-brainer, right?

And then we saw that conversion data was…FLAT. What?! The Challenger was much better than the Control – or was it? Cue the exact moment of surprise that inspired this post.
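For readers curious what "flat" means in practice: it is a statistical judgment, not an eyeball one. Here is a minimal sketch of the kind of check behind that call, using a standard two-proportion z-test. All of the traffic and conversion numbers below are hypothetical, not the actual campaign's data.

```python
from statistics import NormalDist

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: is the challenger's conversion rate
    significantly different from the control's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both rates are equal
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed
    return z, p_value

# Hypothetical numbers: control converts 500/10,000, challenger 510/10,000
z, p = two_proportion_z(500, 10_000, 510, 10_000)
# p is far above 0.05 here, so the difference is noise: the result is "flat"
```

A tiny lift on a big sample can still be indistinguishable from noise, which is exactly the "flat" verdict described above.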

Why was it flat?  We incorporated ideas that we knew would have a positive impact and we incorporated best practices that the users would have expected to see.  It’s not that it was prettier, as we know pretty doesn’t always win.  It was just technically and fundamentally a better page.  So, we dove into the data to see what was going on.

Data revealed that this page is a huge exit point for users.  They come here with a high intent to take a negative action (like cancelling or downgrading their account).  By using UX principles and best practices, we inadvertently made it easier for users to take a negative action on this page.

This campaign is just so interesting to me because it makes me think about things in a different way. My clients typically focus on increasing conversions. In this case, the goal was to reduce those negative actions. To increase conversions, you optimize the page. What do you do to reduce conversions? De-optimize? So, what do conversion experts do when they don't fully understand the results? Keep digging!

Further analysis revealed that though results were technically flat in terms of conversions, our format was actually costing the company less money. We determined that this page, by its nature, is always going to be a source of negative actions. Users were still taking negative actions, but on the challenger experience they took less costly ones. This means we were able to roll out an entirely new page without hurting any metrics (a pretty big feat), and the best practices we put in place were truly helping to stop the bleeding. Though conversions were flat, this was huge.
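The idea of "flat counts, lower cost" can be made concrete by weighting each account action by its revenue impact. The sketch below uses entirely hypothetical action names, dollar impacts, and traffic mixes (none of these figures are the client's real data) to show how two experiences with the same number of negative actions can differ in expected cost.

```python
# Hypothetical per-action revenue impact (dollars per user per month).
IMPACT = {"cancel": -50.0, "downgrade": -15.0, "keep": 0.0, "upgrade": 20.0}

def expected_impact(action_counts: dict) -> float:
    """Average revenue impact per user, given a mix of observed actions."""
    total = sum(action_counts.values())
    return sum(IMPACT[a] * n for a, n in action_counts.items()) / total

# Both experiences see 500 "negative" actions out of 10,000 users,
# so conversion metrics look flat...
control    = {"cancel": 300, "downgrade": 200, "keep": 9300, "upgrade": 200}
challenger = {"cancel": 220, "downgrade": 280, "keep": 9300, "upgrade": 200}

# ...but the challenger shifts users from cancels to cheaper downgrades:
# control ≈ -1.40 dollars per user, challenger ≈ -1.12 dollars per user.
print(expected_impact(control), expected_impact(challenger))
```

This is one way to see why a "flat" test can still be a win: the metric that matters shifted from action counts to the cost of those actions.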

Knowing the "who, what, when, where and why" of this page allowed us to refocus our strategy, and thorough analysis of the data showed that the challenger was the better page to proceed with and continue optimizing. I continue to reflect on this campaign and love the surprise it brought me. I hope it gets your wheels turning, too!