Benchmarks for Higher Ed Admissions Websites: AI Visiting 1,500+ Websites - Part 1
The student experience has many different faces.
A few weeks ago, I visited over 1,500 college websites to collect deposit information for our 2025 College Deposit Database. And I realized something: navigating these websites to find information was often frustrating.
From the perspective of a prospective student, if I had this much trouble, what does that mean for the students just starting their college search? What benchmarks should universities be aiming for if this is their digital first impression?
I decided I wanted to dig deeper and try to answer that question.
So—with the help of a little technology (I’ll release a full post later walking through exactly how I built it)—I created an AI-powered crawler. It acts like a prospective student visiting those 1,500 college websites and attempting to find important admissions-related information on its own. Meaning there was no manual browsing on my part—just automated, scalable data collection.
This post is the first in a three-part series focused on the prospective student journey and the role UI/UX plays in that process. We’ll look at the data behind how institutions present themselves to students on their websites and examine how those design decisions could potentially affect both student experiences and enrollment outcomes.
Let’s dive in.
Hypothesis on User Experience
My initial hypothesis was basic, but then again, so should finding information on your site be for a prospective student:
The fewer clicks it takes to complete common action items, the more user-friendly the site.
The better the overall UI and UX, the more likely students are to apply organically.
Obviously, there’s no way to perfectly separate organic from paid application growth, but looking at changes in application numbers across both one-year and three-year timeframes can give a solid read on whether a school’s website reflects a student-centered philosophy.
The goal was to get information on three key metrics:
How many clicks it takes to Apply
How many clicks it takes to Schedule a Campus Visit
How many clicks it takes to Request Information
In addition to those interaction metrics, each institution’s User Interface design was scored—looking at whether the layout, content, and messaging prioritized prospective students.
In the final section, we’ll combine all of these results to get a holistic sense of how institutions are performing. Are these website flows helping or hurting engagement?
Let’s break it down.
Clicks to Find an Apply Button
The first metric tracked was how many clicks it takes for a student to reach the application button. The goal wasn’t to determine whether the button linked to the Common App or a school’s internal system—that varies too much from school to school—but instead, how quickly a prospective student can find an obvious call-to-action like "Apply Now."
To detect the button, the AI was instructed to look for any visible variation of an "Apply" button or link. When clicked, some of these buttons lead directly to a school’s internal application, some to the Common App, and some to an intermediate page. That meant there could still be extra clicks before actually applying, but it gave the crawler a consistent target. Regardless, the key is visibility: from a student’s perspective, having an easily accessible "Apply" button right on the landing page is a win.
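To give a sense of what "any visible variation" means in practice, here is a minimal sketch of that kind of text matching in Python. The real crawler relied on an LLM's judgment rather than a fixed pattern, so the regex and function names below are purely illustrative assumptions.

```python
import re

# Illustrative pattern for Apply CTA variations. The actual system used an LLM
# to judge visibility; this hard-coded regex is only a stand-in for the example.
APPLY_PATTERN = re.compile(
    r"\bapply(\s+(now|today|online))?\b|\bstart\s+your\s+application\b",
    re.IGNORECASE,
)

def has_apply_cta(link_texts: list[str]) -> bool:
    """Return True if any visible link or button text looks like an Apply call-to-action."""
    return any(APPLY_PATTERN.search(text) for text in link_texts)

# Example: text pulled from a homepage's nav and hero buttons
print(has_apply_cta(["Visit", "Request Info", "Apply Now"]))   # True
print(has_apply_cta(["Alumni", "Giving", "Athletics"]))        # False
```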
Across all the benchmarks analyzed, this was the most conclusive.
Clicks to Apply | Number of Schools | Median 1-Yr Application % Change | Median 3-Yr Application % Change |
0 | 1391 | 5.51% | 12.50% |
1 | 65 | 4.12% | 24.30% |
2 | 32 | 1.71% | 31.16% |
AI Error | 50 | 5.51% | 12.50% |
Most institutions had an apply button directly on their homepage. The median number of clicks required was 0, and the mean was just 0.06. Out of the 1,500 schools, 1,391 had the apply button immediately accessible when you loaded their website.
Clicks to Visit Campus
Campus tours are a key part of the college decision-making process, offering students a tangible sense of what life might be like at a given institution. So the second metric tested was how many clicks it takes to schedule a campus visit.
Initially, the goal was for AI to go deeper—evaluating the types of questions on the form, how long the forms were, and what kind of visit options were available. But it quickly became clear the forms were too inconsistent and the AI often got tripped up. So the focus got simplified: just count how many clicks it takes to reach a visit scheduling form or calendar.

Fortunately for my AI crawler, many schools using Slate or Salesforce had relatively consistent calendar interfaces that made detection easier.
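As a rough illustration of that detection step, here is a tiny sketch that flags URLs resembling a hosted visit-scheduling tool. The domain and path hints are my own assumptions for the example, not the actual list the crawler used.

```python
# Hypothetical URL hints for hosted scheduling tools (Slate event registration pages,
# Salesforce-based forms, etc.). These strings are assumptions for the example only.
SCHEDULER_HINTS = ("technolutions.net", "/register/", "/portal/", "force.com")

def looks_like_visit_calendar(url: str) -> bool:
    """Return True if a URL resembles a hosted visit-scheduling page."""
    lowered = url.lower()
    return any(hint in lowered for hint in SCHEDULER_HINTS)

print(looks_like_visit_calendar("https://apply.example.edu/register/campus-visit"))  # True
```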
Clicks to Schedule a Tour | Number of Schools | Median 1-Yr Application % Change | Median 3-Yr Application % Change |
1 | 244 | 6.06% | 10.12% |
2 | 718 | 6.25% | 13.56% |
3 | 345 | 3.77% | 13.91% |
4 | 148 | 4.43% | 14.45% |
5 | 73 | 1.66% | 9.21% |
AI Error | 10 | 10.11% | 38.28% |
Across all institutions, the median number of clicks to reach a visit scheduling tool was 2, and the mean was 2.39. This makes sense—most experiences follow a path like: click "Visit Us," choose your tour type, then arrive at the calendar.
Notably, no school had a visit calendar directly embedded on their homepage. However, schools that beat the average—requiring just 1 click—typically had a clear "Visit Campus" or "Schedule a Tour" button that brought users straight to a calendar or form, often optimized specifically for undergraduate admissions.
Clicks to Request Information
Inquiring and collecting prospective student data is a vital part of the multi-year engagement process. The goal is to get students into your communications funnel early—and that starts with making it easy for them to request more information.
This was the third major metric tracked: how many clicks does it take to reach a form where a student can submit their information?
Number of Clicks | Number of Schools | Median 1-Yr Application % Change | Median 3-Yr Application % Change |
0 | 89 | 4.24% | 4.68% |
1 | 485 | 6.73% | 15.73% |
2 | 455 | 6.31% | 13.42% |
3 | 194 | 3.31% | 13.93% |
4 | 67 | 7.02% | 16.24% |
5 | 33 | 5.10% | 13.67% |
6 | 10 | 2.12% | 27.40% |
AI Error | 205 | 4.25% | 4.68% |
The median number of clicks required was 2, but the mean was slightly lower at 1.81. This suggests more institutions had a smoother experience here compared to tour scheduling. Many schools had a simple “Request Info” button that linked directly to a form without intermediate pages. In fact, 89 institutions had their request info form placed directly on their homepage.
But here’s where it got interesting. When comparing this data to acceptance rates and application changes, there was no clear correlation. Just having the request form front and center didn’t automatically translate to more applicants.

But these metrics can’t be looked at in isolation. You have to layer them together to see the full picture. And it seemed like one-click access—not zero—was actually the sweet spot.
Digging deeper revealed that many of the schools that placed the form directly on the homepage were targeting nontraditional students, continuing education seekers, or certificate programs. Which made sense—they’re likely trying to reduce friction as much as possible for students who might be enrolling later in life or exploring options.
That’s not to say having a form front and center is bad. It’s just a reminder that your landing page’s primary job is to communicate clearly, and that includes making space for more than just the form itself. Striking the right balance between visibility and experience often varies based on audience.
UI Ratings for Higher Ed Sites
Out of all the metrics explored, this was the most subjective—but also the most revealing.
Institutions range widely in terms of design, branding, and visual experience. The goal here wasn’t to find perfect scores, but to see whether the AI could reasonably assess the quality of a school’s website from a prospective student’s point of view.
Surprisingly, it actually worked well. As results came in, the system started picking up on subtle but important cues: layout, structure, clarity of CTAs, student-centered language, and overall visual hierarchy. In fact, it even helped surface a few under-the-radar schools with standout user experiences.
Here’s how the AI approached it:
Load the homepage and give a UI score.
Navigate to an admissions or “prospective student” page and give a second score.
Repeat this 2 more times across different core pages.
Average the results for a final score, on a scale from 0 to 10.
UI Rating Scale (0–10, Prospective-Student Focus)
0: Cluttered legacy design; no apply CTA, confusing hierarchy
1: Outdated layout; minimal student focus or apply prompts
2: Basic CSS; inconsistent styling; apply path hidden
3: Generic modern style; application link present but not prominent
4: Clean interface; student info sections visible but subdued CTA
5: Standard responsive design; clear menu; apply CTA in header
6: Polished layout; highlighted CTA; student guidance sections
7: Engaging visuals; apply CTA on landing; intuitive student flow
8: Highly refined; clear guided paths; interactive student-focused elements
9: Near flagship quality; dynamic student prompts; strong visual hierarchy
10: Flagship UI; student-first experience; instant apply CTA; predictive guidance
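For the curious, here is a minimal sketch of the scoring loop described above. The score_page stub stands in for the actual LLM call (send the rubric plus the page to a chat model and parse the number back); the rubric text and function names are mine, not the exact prompt used.

```python
from statistics import mean

# Assumed rubric string; the real prompt referenced the full 0-10 scale above.
RUBRIC = "Rate this page from 0 to 10 for prospective-student focus, per the scale above."

def score_page(page_html: str) -> float:
    # Stand-in for the LLM call: in the real system, RUBRIC + page_html would be sent
    # to a chat model and the numeric reply parsed. A fixed value is returned here.
    return 6.0

def score_institution(pages_html: list[str]) -> float:
    """Average the per-page UI scores (four pages in this write-up) into one 0-10 score."""
    return round(mean(score_page(html) for html in pages_html), 1)

# Usage: homepage, admissions page, and two more core pages
pages = ["<html>home</html>", "<html>admissions</html>",
         "<html>visit</html>", "<html>majors</html>"]
print(score_institution(pages))  # 6.0 with the stubbed scorer
```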

Only 17 institutions received a score of 8 or higher. None scored a 9 or 10.
The median score was 7, and the mean was 6.8. This suggests that while most institutions have adopted clean, responsive design principles, very few have gone the extra mile to fully optimize for prospective students.
What really separated the top scorers was language and targeting. Were they optimizing for prospective students? Current students? Alumni? Schools that scored highly had clear value propositions, modern visuals, intuitive site structures, and strong CTAs for key actions like “Explore Majors” or “Visit Campus.”
This is where things got interesting.
Even when two schools had nearly identical UI designs, the messaging made a major difference. For example, I looked at two Penn State branch campuses—Penn State Lehigh Valley and Penn State Scranton. On the surface, their websites looked similar. Can you guess which scored higher?


The AI gave Scranton a score of 6.5, while Lehigh Valley scored just 6.
Why? Scranton's homepage was geared slightly more toward prospective students. It featured action-oriented buttons ("Apply Now" rather than just "Apply", "Contact Us" rather than "Giving") and immediate access to admissions resources. Lehigh Valley's tone was more neutral, with generic "Learn More" links instead of action items like registering for an open house. This highlights how wording plays into the scoring: the AI rewarded sites framed for prospective students rather than current students or alumni.
That subtle distinction made a measurable difference.
Combining Website Results
Part 1: Initial Results Were Inconclusive
After collecting all the data, each institution was assigned a percentile rank across the four categories: Apply Button, Visit Campus, Request Information, and UI Rating.
Here’s how the rankings worked:
For click-based metrics, fewer clicks = higher percentile
For UI scores, higher score = higher percentile
So, for example, if your school had an apply button on the homepage (0 clicks), you'd be ranked in the top tier for that category. If your UI score was above the median (which was 7), you'd fall into the top half.
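Here is a minimal sketch of that ranking step, assuming the results live in a pandas DataFrame. The column names and the simple average used to combine the four percentiles are illustrative assumptions, not the exact pipeline.

```python
import pandas as pd

# Toy data: one row per school with the four collected metrics (names are illustrative).
df = pd.DataFrame({
    "school":       ["A", "B", "C", "D"],
    "apply_clicks": [0, 1, 0, 2],
    "visit_clicks": [2, 1, 3, 5],
    "rfi_clicks":   [1, 0, 2, 4],
    "ui_score":     [7.5, 6.0, 8.0, 5.5],
})

# Fewer clicks = higher percentile (ascending=False); higher UI score = higher percentile.
pct = pd.DataFrame({
    "apply": df["apply_clicks"].rank(pct=True, ascending=False),
    "visit": df["visit_clicks"].rank(pct=True, ascending=False),
    "rfi":   df["rfi_clicks"].rank(pct=True, ascending=False),
    "ui":    df["ui_score"].rank(pct=True, ascending=True),
})
df["combined_percentile"] = pct.mean(axis=1)   # simple average of the four percentile ranks
print(df[["school", "combined_percentile"]].sort_values("combined_percentile", ascending=False))
```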
I was excited to dig into the results and see how this all played out. But initially, something didn’t add up.
Percentile Rank | Median 1-Yr Application % Change | Median 3-Yr Application % Change |
0%–10% | 1.40% | 34.02% |
10%–20% | 0.99% | 28.10% |
20%–30% | 6.28% | 31.55% |
30%–40% | 4.03% | 23.64% |
40%–50% | 2.83% | 10.95% |
50%–60% | 5.47% | 12.40% |
60%–70% | 5.50% | 13.71% |
70%–80% | 6.38% | 12.74% |
80%–90% | 5.30% | 5.92% |
90%–100% | 5.37% | 13.93% |
My hypothesis was that better UI/UX would correlate with higher growth in applications. And over the past year, that held true: applications were slightly up for schools with better UX scores.
Percentile Rank - Combined Buckets | Median 1-Yr Application % Change | Median 3-Yr Application % Change |
0-50% | 2.91% | 20.10% |
50-100% | 5.92% | 12.60% |
But when I expanded the view to a three-year window, there was no statistically significant difference in application growth between schools with strong UI/UX and those with weaker designs.
At first, I thought this disproved my original theory.
But then I noticed a pattern: many of the schools that scored poorly on the UI audit were large flagship universities and Ivy League institutions. These schools aren't optimizing their websites for more applications; they already have far more demand than they can admit.

Looking at the distribution of acceptance rates, approximately 90% of schools have an acceptance rate of 40% or higher. Similar to how I removed outliers when analyzing yield in the deposit database, I applied the same approach here to focus on institutions within a more representative range. By excluding extremely selective outliers, the data offers a clearer picture of the typical admissions landscape across most colleges.
Part 2: Filtering for Most Institutions
With this insight, the data was filtered again—this time removing schools with acceptance rates below 40%, since these institutions are typically not fighting for additional applicants.
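A quick sketch of that filtering and re-bucketing step, assuming the data sits in a pandas DataFrame with acceptance_rate (as a percentage), combined_percentile, and app_change_1yr columns; those names are assumptions for the example, not the real schema.

```python
import pandas as pd

def decile_medians(df: pd.DataFrame) -> pd.Series:
    """Median 1-yr application change by combined-percentile decile, after dropping
    highly selective schools. Column names are illustrative assumptions."""
    kept = df[df["acceptance_rate"] >= 40.0]                 # drop sub-40% acceptance outliers
    deciles = pd.cut(kept["combined_percentile"],             # 0-10%, 10-20%, ..., 90-100%
                     bins=[i / 10 for i in range(11)], include_lowest=True)
    return kept.groupby(deciles, observed=True)["app_change_1yr"].median()
```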
With the numbers adjusted, the trends became clear.
Percentile Rank - Combined Buckets | Median 1-Yr Application % Change | Median 3-Yr Application % Change |
0-50% | 4.96% | 10.13% |
50-100% | 5.98% | 11.31% |
The baseline median 1-Yr Application % Change is 5.3%, and the baseline 3-Yr Application % Change is 13.2%.
Schools above the 50th percentile in UI/UX scores saw application growth that was 1.02 percentage points higher over one year.
That may sound small, but it's actually a roughly 20% faster growth rate year over year: (5.98 - 4.96) / 4.96 ≈ 20.5%.
Over three years, these same schools had roughly 12% faster growth: (11.31 - 10.13) / 10.13 ≈ 11.6%.
In other words, once you remove schools that already have excess demand, the results support the hypothesis: better UI and smoother user experiences are associated with stronger application growth.
And this is likely just the beginning.
If institutions begin to layer these UI/UX principles beyond what has been examined—talking with counselors, connecting with other students, or other steps prospective students take—the compounding benefits could become even more visible.
This data gives us the first glimpse into how user experience design isn’t just a branding exercise. It can be a powerful enrollment tool.
Takeaways
So does this mean the magic solution to increasing applications is simply reducing the number of clicks it takes to complete key tasks? Not exactly. The real takeaway is that this is a philosophy shift. Every time a student interacts with your institution—whether on your website, via email, or in person—you should be asking: Is there a barrier preventing them from taking the next step forward?
This lines up with what we’ve found in our deposit data as well. In some cases, deposits themselves can create friction and discourage students from enrolling. You could argue that schools with better websites are just spending more on top-of-funnel marketing—and maybe that’s true in some cases. But in a world where name buys are declining and the enrollment cliff is looming, optimizing your yield percentages matters more than ever.
It's about fitting into students’ lives—meeting them where they are, not just on your website but across all the platforms and touchpoints they’re already using. Make it easy for students to apply, to visit, to connect, and to start seeing themselves as part of your community. Because if you don’t, another university will.
Audit your flows. Have your admissions team—and even better, real prospective students or high school counselors—walk through your website. Watch where they get stuck. What seems obvious to you might be completely missed by someone unfamiliar with your structure. I’ve seen it firsthand, even with seemingly simple actions like finding the Request Info form.
This isn’t about checking a box. It’s about rethinking how you support students before they become students.
The next part of this series will focus on the admitted student experience.
Methodology
A full tutorial breaking down the entire process will be linked here once it’s ready, but here’s a high-level overview of how the system was built.
To create the crawler, I used a combination of OpenAI’s GPT API and a developer tool called Browserbase, which allows you to control a browser using code. The AI served as the “brain,” navigating through websites like a student would.
Each run began with a clear goal (for example, "Find the Apply button" or "Locate the campus visit scheduling form"). The AI was told to begin clicking and was given a maximum number of clicks it could use to reach the goal. If the AI needed to hover over a navigation header to reveal a menu, that counted as a click.
If it didn’t succeed on the first try, it would attempt up to three alternate paths. If it still couldn’t reach the goal (or got stuck in a loop), the process would be aborted for that school. Interestingly, when the AI failed, it was usually due to confusing layouts or poor page structure—things that would confuse actual students, too.
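To make the flow concrete, here is a heavily simplified sketch of that loop. The helper functions are stubs standing in for the real OpenAI and Browserbase calls, and the click budget shown is an assumption; only the overall structure (a goal, a click cap, up to three alternate attempts) mirrors what's described above.

```python
MAX_CLICKS = 6      # assumed click budget per attempt; hovering over a nav menu also counted
MAX_ATTEMPTS = 3    # alternate paths tried before giving up on a school

def get_page_state() -> str:
    """Stub: would pull the current DOM from the Browserbase-hosted browser."""
    return "<html>...</html>"

def choose_next_click(page_html: str, goal: str) -> str | None:
    """Stub: would ask the GPT 'brain' which element to click next,
    returning None once the goal element is visible on the page."""
    return None

def crawl_for_goal(goal: str) -> int | None:
    """Return the number of clicks needed to reach the goal, or None on failure."""
    for _ in range(MAX_ATTEMPTS):
        clicks = 0
        while True:
            selector = choose_next_click(get_page_state(), goal)
            if selector is None:        # goal element is visible on the current page
                return clicks
            if clicks >= MAX_CLICKS:    # click budget exhausted; try an alternate path
                break
            # Stub: click or hover `selector` in the remote browser here.
            clicks += 1
    return None                          # aborted; recorded as an "AI Error"

print(crawl_for_goal("Find the Apply button"))  # 0 with the stubbed helpers
```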
So, while the AI wasn’t perfect, its limitations actually highlighted the same friction points that human users would experience.
Raw Data
I’m not releasing the full CSV file with all results—honestly, it’s messy, inconsistent, and wouldn’t be that helpful to sort through.
That said, if you’re curious about how your specific institution performed—especially on UI or click-based accessibility—feel free to reach out. I’m happy to share your individual scores and observations.
Ultimately, my goal with this project was to surface clear benchmarks that can help institutions audit their own admissions flows.
Some of the drop-off points and friction the AI encountered were very likely the same spots that real students get confused or leave. So whether you scored high or low, there’s always value in using this kind of feedback to refine your prospective student journey.