How A/B Website Testing Can Improve Student Recruitment

Last updated on: October 10, 2023

9 minute read

If your college is evaluating its student recruitment efforts, your website may be a good place to start your assessment. Your school’s website presents program offerings and delivers key information to prospective students, but is it effectively converting website visitors? Or are improvements needed?


Implementing A/B website testing is one way to find out. Also referred to as split testing, A/B testing is a method of comparing two versions of a web page to discover which one is more effective at improving your student conversion rates and other goal metrics. It works by randomly serving two versions of your website to different visitors, with just one variation between them (e.g., button color, headline copy, form location). Fifty percent of your website visitors interact with version A, and the other fifty percent interact with version B, allowing you to measure the results in real time and see which version generates the most desirable results.
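Most testing platforms handle the traffic split for you, but the mechanics are straightforward. Here's a minimal Python sketch of deterministic 50/50 bucketing (the visitor IDs and experiment name are illustrative), which ensures a returning visitor always sees the same version:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str) -> str:
    """Deterministically bucket a visitor into version A or B.

    Hashing the visitor ID together with the experiment name yields a
    stable 50/50 split: the same visitor always gets the same version.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Illustrative visitor IDs for a hypothetical CTA-button experiment
for vid in ["visitor-101", "visitor-102", "visitor-103"]:
    print(vid, assign_variant(vid, "cta-button-copy"))
```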

To give you a better understanding of the impact A/B testing can have on your student recruitment efforts and what the implementation process entails, we spoke with Rafael Zorrilla, our Director of Digital Experience, about how to test website effectiveness via this method.


Q: Why should universities A/B test their websites? What can they learn and is the ROI worth the effort it takes?

A: There are two key reasons universities should A/B test their websites: users are changing, and their demands—how they engage with the internet—are changing, which means schools need to evolve to stay competitive.

Today, you have a lot of people considering mobile first—that’s user demand. People are using smartphones and tablets over desktops, and that makes for a very different website experience. To stay relevant, you should evolve with your target students’ behaviors and preferences so you’re not left behind on your user interface. You don’t want to only offer desktop experiences when many people (especially younger generations) are using a mobile phone to access your website. You’ve got to meet that user demand, and to do that effectively you need to change the way you present information and design your website experience, which is where A/B testing comes into play. With testing, your school can gain insight into how prospective students navigate and use your site in both the desktop and mobile formats.

The second thing that makes A/B website testing so critical is the need for a competitive edge. One of the biggest drivers for an organization is having a unique or differentiating point, and higher education is no different. Testing and experimentation offer a data-driven way to inform the evolution of your site, capture missed opportunities, reduce friction points, and deliver more value to your students. You’re not going to do that by standing still. You need to keep moving forward and looking to improve – even when you have a great brand and reputation. A/B testing can act as a guide to help you uncover missed opportunities to better engage with prospective students. Unlike multivariate testing, A/B testing allows you to test incrementally to see how individual changes affect user behavior.

Q: How long should a typical A/B test last? What’s the ideal sample size for the best results on an A/B test?

A: In general, the larger the sample size, the greater the test’s accuracy. My recommendation for colleges and universities is to avoid tests shorter than two weeks, because in the education industry user behavior varies considerably by day of the week. People behave differently on weekdays than on weekends, so you want to run the test through a couple of Mondays and Sundays, plus any holidays. Yes, you want a large sample size, but you also need to temper that with a little time. Also important: both versions of your website must be tested in the market at the same time so they are subject to the same market conditions. Running both versions simultaneously normalizes external factors that would otherwise skew a sequential, non-A/B comparison.
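To make the sample-size point concrete, here is a hedged Python sketch using the standard two-proportion power calculation; the 4% baseline conversion rate and one-point lift are made-up numbers for illustration:

```python
from statistics import NormalDist

def sample_size_per_variant(baseline: float, lift: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate visitors needed per version for a two-proportion test.

    baseline: current conversion rate (e.g., 0.04 = 4% of visitors submit a form)
    lift:     smallest absolute improvement worth detecting (e.g., 0.01)
    """
    p1, p2 = baseline, baseline + lift
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance level
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2) + 1

# Example: detecting a lift from a 4% to a 5% conversion rate
print(sample_size_per_variant(0.04, 0.01))  # roughly 6,700 visitors per version
```

Note how quickly the requirement grows: smaller lifts or lower baseline rates demand far more visitors, which is exactly why smaller schools often need longer test windows.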

Q: What are some examples of specific features on university websites that can be tested and improved with A/B testing?

A: The whole point of A/B testing a website is to make sure it’s meeting the users’ needs and achieving your goals. You need to make the immense amount of content usually found on university websites more digestible and intuitive for the prospective student to process, which can reduce their confusion and website drop-off. You need to look at the smallest of details and the most complex. For example, this can translate to the positioning, wording, and design of the call-to-action (CTA) button on a request for information (RFI) form. In A/B tests of CTA button wording that we’ve run on our university partners’ websites, we’ve seen that the words “Request Now” or “Learn More” generate more clicks than the word “Submit.” These kinds of words resonate with users by helping them recognize that there’s something more than just a transactional exchange going on with the form, and this reduces user anxiety around why they should click.

We’ve also A/B-tested the design of RFI forms to determine whether the number of input fields (the number of questions asked of users) affects how often a prospective student submits a form. From the tests, we’ve learned that the number of input fields does matter: more students submitted forms when the forms were shorter. With that information, we advise our university partners to keep the number of fields in an RFI form to a minimum—around seven or eight, depending on the school—and to cut unnecessary questions. When a user sees a short form of just seven fields, he or she thinks, “Oh, I can do this really quickly.” Through testing, we have proof (not just hunches) that prospective student conversion rates increase when RFI forms are shorter.

Q: Let’s say I’m a university looking at my own website. How would you recommend that I perform an A/B test?

A: There’s a three-step process in website testing: ideation, building the experiment, and analyzing the results. Ideation is probably the most crucial part, because you want to start with a hypothesis that can be tested and that is grounded in data analytics. You’ll have to sort through your analytics platforms, like Google Analytics, to see how users are engaging with your website. All of these factors will help you understand what users are doing, who they are, and where their friction points lie. Universities need that analytical foundation to identify which website elements are worth comparing in a test. Once you have it, you can run experiments using a platform, like Optimizely, that can launch an A/B test quickly.
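As one hedged illustration of the ideation step, the sketch below mines an exported page-view log for high-exit pages, which are natural candidates for a test hypothesis. The CSV file name and column layout are assumptions; adapt them to whatever your analytics platform actually exports:

```python
import csv
from collections import Counter

views, exits = Counter(), Counter()
with open("pageview_export.csv", newline="") as f:   # hypothetical export file
    for row in csv.DictReader(f):                    # assumed columns: page, is_exit
        views[row["page"]] += 1
        if row["is_exit"] == "1":
            exits[row["page"]] += 1

# Pages where many sessions end are candidate friction points to test
for page, v in sorted(views.items(), key=lambda kv: -kv[1])[:10]:
    print(f"{page}: {exits[page] / v:.0%} exit rate over {v} views")
```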

Q: In your experience, why might schools struggle to A/B test?

A: I think there are a couple of reasons. I think the mistake that many organizations make—and this isn’t just in higher education—is that they isolate testing and experimentation to one department or single resource. Instead, a testing culture needs to be instilled throughout the whole organization from the top down because it’s going to take a team effort and doesn’t work in a vacuum. Testing is dependent on resources and access to the right technology, and without organizational support for that, testing would be a monumental task.

The other issue is patience. We want results today, but tests always take time. A large sample size requires more time. Some schools, especially large universities, don’t have an issue with sample size. However, some smaller schools may need more time to obtain a large enough sample. This can be frustrating and lead to tests being cut short. But when organizations start cutting corners, they lose statistical significance, or confidence, in their tests. There are ways to rationalize around statistical confidence, but you don’t want to build on a false positive and put something out into the market that could be detrimental over time. You want to build on a solid, proven foundation.
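Here is a minimal sketch of the kind of statistical-confidence check a testing platform performs under the hood, using a two-proportion z-test. The conversion counts are invented, and they show how the same underlying rates that are significant at full sample size look inconclusive when a test is cut short:

```python
from math import sqrt
from statistics import NormalDist

def p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value from a two-proportion z-test.

    conv_*: form submissions per version; n_*: visitors per version.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

print(p_value(120, 3000, 156, 3000))  # ~0.03: significant at the usual 0.05 level
print(p_value(40, 1000, 52, 1000))    # ~0.20: same rates, smaller sample, no confidence
```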

Q: If you were talking directly to a school about A/B testing, is there anything you feel like they should know prior to doing it?

A: Objectivity can be a challenge. Everyone goes into a test with a bias, right? Staying unbiased during a test is a challenge, but you need to stay patient and let the data speak. Remove the subjectivity, and stick to data-driven objectivity.

Another thing to be aware of is that not all tests are winners. Sometimes you don’t have a favorable result. You may need to tweak the experiment several times to have a clear, favorable outcome. There’s a little bit of art and science involved: although testing is definitely data-driven, there is also a need for solid business judgment.

Last but not least, it’s important to “know when to fold ’em.” Sometimes, you’ve got to know when to end a test or give up on an idea if it’s not working out.

Q: When a university is conducting A/B testing on their website, which metrics should they focus on?

A: This is very important, as you need to identify specific success metrics to analyze your test results and inform decisions. You can look for something that impacts your college or department’s goals the most, like inquiries from prospective students. Most of the tests we run for our university partners’ websites are based on student inquiries and applications, which are great indicators of how well the website is doing its intended job and meeting student needs.

But often, you need to measure the less tangible results of your website test, like engagement and education. Are your website visitors more engaged? Are you better educating users? To determine this, you can look at secondary metrics like time on site and average pages per session, which are good indicators of how engaged your website users are. Typically, the longer they’re on your site, the more likely it is that they’re reading about and researching your school. Our tests have shown that the more pages a user visits, the more likely they are to convert. These are just a few of the metrics colleges should be tracking to improve the functionality and experience of their website.
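For illustration, here’s a hedged sketch of computing those secondary engagement metrics per version from raw session records; the session data is invented, and in practice these numbers come straight from your analytics platform:

```python
from statistics import mean

# (version, pages viewed, seconds on site, converted?) — made-up sessions
sessions = [
    ("A", 3, 95, False), ("A", 6, 240, True), ("A", 2, 40, False),
    ("B", 5, 180, False), ("B", 8, 310, True), ("B", 7, 260, True),
]

for version in ("A", "B"):
    subset = [s for s in sessions if s[0] == version]
    print(f"Version {version}: "
          f"{mean(s[1] for s in subset):.1f} pages/session, "
          f"{mean(s[2] for s in subset):.0f}s on site, "
          f"{sum(s[3] for s in subset) / len(subset):.0%} converted")
```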


Making A/B testing of your college’s website a priority can help you stay competitive, evolve to meet user needs, and convert more prospective students. For additional marketing, recruiting, and enrollment-related recommendations, like how to improve your prospective student conversion rate with website copy and design and how data can help with enrollment operations, visit our Resources page.
