By Aimee Gruber, Senior Director of Outreach, The Enrollment Management Association
It is tempting to create a board report that paints a rosy picture of enrollment trends. After all, boards have occasionally been known to shoot the messenger! However, it is imperative that enrollment managers educate heads and trustees about the complete market landscape—including its challenges. Even if you think this does not apply to you (because your school is blessed with full enrollment and a wait list), external factors beyond your control are still likely to wreak havoc on your plans.
The admission funnel—a sales funnel—was created for one purpose: to facilitate predictive modeling to aid school planning. “[It was] first introduced in the 1970s as a way of looking at the recruitment and admission process on a more systemic level… it presents a static view of customers (or prospects) as they ‘fall out’ of interest in a product/service” (Admissions Lab). At a minimum, enrollment information gleaned from the funnel is needed to make critical resourcing decisions. School leaders must rely on funnel data to build budgets and to inform hiring: How many new teachers will be needed next year? Should we replace those who are retiring or leaving?
While the admission funnel remains the best way of tracking progress from one year to the next, external factors have changed the traditional sales funnel. Most notably, it is no longer necessary for families to reach out and inquire if they want information or an application. Independent schools, like colleges, are seeing more “stealth applicants,” i.e., students who do not identify themselves in the process until an application is submitted. The Enrollment Management Association published an article about this phenomenon in spring 2012, titled “Where Have All the Inquiries Gone?” In that article, we noted that in the last decade, independent school admission offices have seen a narrowing at the top of the funnel (inquiries). In 2001, independent day schools received eight inquiries per enrolled student. By 2009, that number had decreased to five (NAIS Tables, Admission Ratios and Percentages).
In higher education, institutions report that as many as 50% of students do not reach out prior to submitting an application. Noel-Levitz, a college consulting firm, believes that the inquiry-to-applicant conversion rate is becoming a meaningless metric and advises colleges to “adapt to the new ways that prospective and future students enter (or don’t enter) the admissions funnel, as well as evolving yield and admit rates.” Despite such challenges, it still advises colleges to “stick with it,” because “without funnel data everything becomes just a guessing game.” It also recommends using multiple funnels, “so you can analyze each of the pieces separately and further understand what is working well and what isn’t.” (How Do You Deal with a Changing Admissions Funnel? noellevitz.com)
Unfortunately, some Student Information Systems (SIS) require that every student record begin with an inquiry, even if the student has never technically inquired. This complicates independent schools’ quest for better data tracking, as does the focus on what The Enrollment Management Association likes to call “vanity metrics.” If families no longer have to “inquire” about our schools, why do so many heads and boards still view the number of inquiries as a measure of admission—and institutional—success?
In today’s environment, admission leaders need to parse demographic data by both those whose first point of contact was an application and those who inquired, in order to better understand family behavior. In the absence of the traditional inquiry, schools are looking for other data points to gauge interest; receipt of admission test scores is a great example. Campus visits and interviews, open house attendance, and Google analytics can all help. At a minimum, if I were still an independent school admission director, I would want to see a five-year history of inquiries, admission test scores, applications, campus visits/interviews, accepts, enrolled students, and the number of students lost after depositing. I would also want data on financial aid applicants: How many applied? How many qualify? What is their yield rate?
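The kind of multi-year history described above lends itself to simple, repeatable calculation. The sketch below is purely illustrative—the figures, field names, and function are invented, not an Enrollment Management Association tool—but it shows how stage-to-stage ratios such as accept rate, yield, and post-deposit attrition can be derived from a funnel history:

```python
# Hypothetical sketch: derive key ratios from a multi-year admission funnel.
# All counts below are invented for illustration.
funnel_history = {
    2013: {"inquiries": 400, "applications": 180, "accepts": 120,
           "enrolled": 75, "lost_after_deposit": 4},
    2014: {"inquiries": 360, "applications": 190, "accepts": 118,
           "enrolled": 80, "lost_after_deposit": 6},
}

def conversion_rates(year_data):
    """Return key funnel ratios for one admission year."""
    return {
        # Share of applicants who were offered admission.
        "accept_rate": year_data["accepts"] / year_data["applications"],
        # Yield: share of accepted students who actually enrolled.
        "yield_rate": year_data["enrolled"] / year_data["accepts"],
        # Share of enrolled (deposited) students lost before opening day.
        "melt_rate": year_data["lost_after_deposit"] / year_data["enrolled"],
    }

for year, data in sorted(funnel_history.items()):
    rates = conversion_rates(data)
    print(year, {name: round(value, 3) for name, value in rates.items()})
```

Tracking these ratios year over year—rather than raw inquiry counts alone—keeps the focus on conversion rather than on vanity metrics.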
Consider the two funnels below. The funnel on the left is the “traditional” admission sales funnel. The funnel on the right includes data points that allow for a more nuanced examination as students move through the application process in today’s world. As you can see, the funnel on the right provides more specific data, including those applicants labeled “DNI” (Did Not Identify), differences between numbers of complete and incomplete applications, and the number of students lost after deposits were received.
Given the critically important role data play in the modern admission office, it is also necessary to track month-over-month statistics to catch emergent issues. In a session at last year’s TABS conference, an admission director described her office’s “high alert”: it had received 30 fewer inquiries for 9th-grade boarding boys in October than in the same month the year before. The school was able to attribute the previous year’s gain to a feeder school it had visited, which generated those inquiries. This is an example of great data mining, but also a reminder that schools need to track and assess the funnel in real time. In a complex and increasingly competitive environment, data analysis must be ongoing in order to drive strategic enrollment management.
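A month-over-month comparison like the one in the TABS anecdote is easy to automate. The sketch below is a hypothetical illustration—the counts, the segment, and the alert threshold are all invented assumptions—of flagging any month whose count falls well below the same month in the prior year:

```python
# Hypothetical sketch: flag month-over-month funnel drops against the prior
# year. The threshold and all counts are invented for illustration.
ALERT_THRESHOLD = 20  # flag a drop of 20+ records vs. the same month last year

def monthly_alerts(this_year, last_year, threshold=ALERT_THRESHOLD):
    """Compare monthly counts for one segment (e.g., 9th-grade boarding boys).

    Returns (month, prior_count, current_count, drop) for each flagged month.
    """
    alerts = []
    for month, current in this_year.items():
        prior = last_year.get(month, 0)
        drop = prior - current
        if drop >= threshold:
            alerts.append((month, prior, current, drop))
    return alerts

inquiries_2014 = {"Sep": 42, "Oct": 15, "Nov": 38}
inquiries_2013 = {"Sep": 40, "Oct": 45, "Nov": 36}
print(monthly_alerts(inquiries_2014, inquiries_2013))
```

Here only October is flagged, with a 30-inquiry drop—the same kind of signal the admission director in the anecdote acted on.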
Creating and assessing different funnels for different categories of students is also essential. For example, many schools separately track students applying for financial aid, knowing that students who receive aid will likely yield at a higher rate than full-pay students. Boarding schools (and increasingly day schools) also do this for international and Chinese applicants (their own category!). Learning support might warrant a separate funnel. One metric that colleges track is the application-to-admit ratio by the kind of application submitted—paper, online, common, etc.
So, why does the funnel need saving? Two reasons. First, as already noted, the funnel is still the best vehicle for predictive modeling, yet it has changed significantly. If not appropriately captured and explained, funnel data is easily misinterpreted—and therefore misused—by school leaders and boards of trustees. If the funnel is not used to appropriately interpret the external environment, accurately gauge demand for a school’s program, and better understand student and parent motivation in the application process, then school leaders are inevitably blinded by vanity metrics. Here’s an example: Steep increases in inquiries and applications from China have buoyed funnel metrics for many schools. If not reported and recognized separately from other inquiry and application sources, these data can give heads and boards a false sense of enrollment health—as can misinterpreting a school’s total wait list number as a monolithic proxy for market demand from mission-appropriate students.
Second, the funnel needs saving because we lack common industry-wide data definitions. Without common data definitions, the entire independent school community lacks the ability to understand and anticipate market forces. If School A defines and reports an inquiry one way and School B another, or School A defines and reports an international student one way and School B another, then how can we possibly anticipate whether demand across independent schools is softening?
It is not surprising that at the college level, a model already exists from which the independent school community might learn. An initiative called the Common Data Set is a “collaborative effort among data providers in the higher education community and publishers as represented by the College Board, Peterson’s, and U.S. News & World Report. The combined goal of this collaboration is to improve the quality and accuracy of information provided to all involved in a student’s transition into higher education, as well as to reduce the reporting burden on data providers… Common Data Set items undergo broad review by the CDS Advisory Board as well as by data providers representing secondary schools and two- and four-year colleges. Feedback from those who utilize the CDS also is considered throughout the annual review process.” (www.commondataset.org) If all independent schools utilized the funnel appropriately and adhered to standard metrics for reporting and analysis, imagine the predictive and collective power they would have to understand market forces and mitigate their very real competitive threats! Now is not the time to continue asserting our “independence” or maintaining our current fragmented approach. The Enrollment Management Association is honored to collaborate with several colleague organizations—NAIS, AISAP, and ERB—to help raise awareness of this important issue and to convene enrollment thinkers around the common metrics and definitions our industry requires.