A/B Testing Success with NCTM


The National Council of Teachers of Mathematics (NCTM) brought on McKinley Advisors in 2014 to help with various marketing efforts, including events marketing for its year-round conferences and professional development opportunities. The campaigns McKinley developed all rely on consistent email messaging, and an essential part of successful email marketing is the ability to track and analyze the metrics of deployed messages.

By using marketing automation through NCTM’s email service provider, McKinley has been able to develop more efficient email campaigns for NCTM’s initiatives. Marketing automation also plays an integral role in A/B testing implementation.

A/B testing is a marketing research method used to determine the most effective way to communicate a message. By testing split-audience groups in real time, it offers a non-disruptive and accurate way to learn what works best.

PROBLEM

NCTM needed to determine how best to reach members and potential event attendees: grab their attention, entice them to seek more information and, ultimately, get them to take the call to action. While NCTM was regularly sending communications to its members, there was no set standard for the “from:” field or the subject lines, so it was difficult to determine why any given email succeeded.

Before McKinley developed a testing strategy, NCTM was mainly sending one-off messages to its audience without measuring what was most effective. Because specifics were neither tested nor tracked, the results weren’t measurable and common variables couldn’t be defined.

STRATEGY

McKinley developed a plan to test the quality of subject lines based on the open rate of the emails. Using NCTM’s service provider, McKinley began scheduling messages with two different subject lines. In a clean A/B subject line test, the content of the email stays the same; only the subject lines differ, in a way that can directly affect the open rates of the respective emails.

Breakdown of an A/B subject line test:

– Create an email message for the campaign, with a specific call to action.
– Make two versions of that message with two different, relevant subject lines, keeping the body of the email exactly the same for a clean test.
– Determine the main variable that differs between the two subject lines; this forms a specific hypothesis. For instance, is the test about overall length, or about whether including the association’s name increases open rates?
– Decide what portion of the list to use for the initial A/B round of deployment. For example, send the A test to 10% of the overall list and the B test to another 10%.
– Using the metrics from that initial round, determine the winning version by open rate, and schedule it to go to the remaining 80% of the list to maximize impact.
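The split-and-measure logic above can be sketched in a few lines of Python. This is a minimal illustration, not NCTM's actual ESP workflow; the function names and the 10/10/80 split are assumptions drawn from the steps described here, and the actual send and open tracking would be handled by the email service provider.

```python
import random

def ab_subject_line_split(recipients, test_fraction=0.10):
    """Randomly split an email list for an A/B subject line test.

    Returns two equal test groups (one per subject line) and the
    holdout that will later receive the winning version.
    """
    pool = list(recipients)
    random.shuffle(pool)  # randomize so the groups are comparable
    n_test = int(len(pool) * test_fraction)
    group_a = pool[:n_test]            # receives subject line A
    group_b = pool[n_test:2 * n_test]  # receives subject line B
    holdout = pool[2 * n_test:]        # receives the winner
    return group_a, group_b, holdout

def pick_winner(opens_a, sends_a, opens_b, sends_b):
    """Choose the winning version by open rate (opens / sends)."""
    rate_a = opens_a / sends_a
    rate_b = opens_b / sends_b
    return ("A", rate_a) if rate_a >= rate_b else ("B", rate_b)
```

For a 1,000-address list, `ab_subject_line_split` yields two 100-address test groups and an 800-address holdout; after the initial sends, `pick_winner` identifies which subject line to deploy to the holdout.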

RESULTS

McKinley tested several hypotheses during the 2017 Annual Meeting campaign. One difference tested was the effect of naming the event in the subject line versus using something more general. For example:

A Test: “The Clock is Ticking on Early-Bird Savings for NCTM Annual!”
B Test: “The Clock is Ticking! Register Soon to Save”

This experiment showed a higher open rate for the A test, which named NCTM Annual: 29.2%, compared with 27.2% for the B test.

Another test that yielded the same result used the following subject lines:

A Test: “Solving Math Ed’s Only “Real World” Problem at NCTM Annual”
B Test: “The Only “Real World” Problem We Face in Math Education”

With this test, NCTM saw a 34.5% open rate for the A version and 31.1% for the B version.

McKinley also suspected that the “from:” field influenced open rates in combination with the subject line. The hypothesis: when the “from:” field was simply “NCTM,” including “Annual Meeting” in the subject line was important for achieving the highest possible open rates. Multiple tests early in the campaign bore this out.

A Test: “Find Out Who’s Speaking at the 2017 Annual Meeting” – 28.6% open rate
B Test: “Meet This Year’s Featured Speakers!” – 26.9% open rate

A Test: “2017 Annual Meeting: The Year’s Biggest Math Ed Event” – 34.3% open rate
B Test: “Don’t Miss 2017’s Biggest Event in Math Education” – 30.6% open rate
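The case study reports open rates but not test-group sizes, so it's worth noting how one could check whether a gap like 34.3% vs. 30.6% reflects a real difference rather than chance. Below is a standard two-proportion z-test sketch; the 2,000-recipient group sizes are purely hypothetical, chosen only to make the example concrete.

```python
import math

def two_proportion_z(opens_a, n_a, opens_b, n_b):
    """Two-proportion z-test for the gap between two open rates.

    Returns (z, two_sided_p). A small p-value suggests the observed
    difference in open rates is unlikely to be chance alone.
    """
    p_a = opens_a / n_a
    p_b = opens_b / n_b
    p_pool = (opens_a + opens_b) / (n_a + n_b)  # pooled open rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical sizes: 34.3% and 30.6% open rates over two
# 2,000-recipient test groups (686 and 612 opens).
z, p = two_proportion_z(686, 2000, 612, 2000)
```

With these assumed group sizes the gap comes out statistically significant at the usual 5% level; with much smaller test groups the same percentage gap could easily be noise, which is one reason the size of the initial test split matters.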

Creating these tests not only helped McKinley and NCTM understand the bigger picture of email metric success but also lifted overall open rates for the event campaign. The average open rate for Annual Meeting 2017 emails was 10% higher than the average for Annual Meeting 2016 messages.

Below is a graph depicting the average open rate of all Annual Meeting 2016 emails (without A/B testing) alongside the average open rate of all Annual Meeting 2017 emails (with implementation of A/B testing):

 

CONCLUSION

The A/B tests McKinley conducted during the Annual Meeting campaign confirmed the stated hypotheses, and with that information McKinley can now strategically create more effective emails using the results of the subject line and “from:” field tests. The testing also paved the way for more in-depth experiments, with newly established standards for success.

There is immense value in A/B testing throughout campaign management. Research is necessary to ensure your messaging reaches your audience effectively, and A/B testing provides a way to accurately evaluate the quality of that messaging with a non-invasive, real-time test on your audience.