
Counting What Counts

Jonathan E. Martin
January 17, 2013

Not everything that can be counted counts, and not everything that counts can be counted.
— Albert Einstein

So much of what really matters in education just can’t be measured.
— Independent school educators everywhere


Count me in. The quotes above are words I’ve uttered not dozens or scores but hundreds of times during my 15 years of independent school administration—and I very much believe I am in good company. Indeed, how can I argue with Albert Einstein?

But perhaps I am wrong. This month I’ve been enjoying a book that shakes my conviction that much of value cannot be measured—and that gives very good guidance on how we can improve the way we capture in data just about anything we want to know more about. The book is How to Measure Anything by Douglas Hubbard—and although in my experience it is not much discussed in educational circles, I think it should be.

Grant Wiggins, author of the essential education book Understanding by Design, is a fan of this book; he directed me to Hubbard’s work in a blog post entitled “Oh You Can’t Measure That,” in which he writes:

Recently, I read a great book that might be of interest to anyone who wants to get beyond a knee-jerk reaction about what can and can’t be measured. The book makes the point from the git-go, in its title: How to Measure Anything: Finding the Value of Intangibles in Business, by Douglas Hubbard. Don’t let the “in Business” part throw you off. Almost everything in the book speaks to educational outcomes.

Hubbard writes with an axe to grind, and what becomes clear in the reading is that education is far from the only field or profession whose managers frequently express the view that something, or most things, can’t be measured. This is Hubbard’s bête noire, one he is determined to confront with this book:

Often an important decision requires better knowledge of the alleged intangible, but when an executive believes something to be immeasurable, attempts to measure it will not even be considered.

As a result, decisions are less informed than they could be. The chance of error increases. Resources are misallocated, good ideas are rejected, and bad ideas are accepted.

Hubbard grounds his argument in three genuinely inspiring and impressive stories of measurement—times when individuals devised creative, ingenious methods for measuring something thought to be immeasurable—most famously and wondrously, Eratosthenes’ uncannily accurate measurement of the circumference of the earth two hundred years before the Common Era, using nothing more than shadows cast by the sun.
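
For readers who want the arithmetic, here is the classic reconstruction of Eratosthenes’ reasoning in a few lines of Python; the 7.2-degree shadow angle and 5,000-stadia distance are the traditionally cited figures rather than numbers taken from Hubbard’s book.

```python
# Eratosthenes' method, as classically reconstructed: at noon on the
# summer solstice the sun cast no shadow at Syene, while at Alexandria
# a vertical stick cast a shadow at roughly 7.2 degrees. That angle is
# the arc of the earth's circumference separating the two cities.
shadow_angle_deg = 7.2    # traditionally cited figure, not from Hubbard
distance_stadia = 5_000   # traditional Syene-to-Alexandria estimate

# 7.2 degrees is 1/50 of a full circle, so the full circumference is
# 50 times the distance between the cities.
circumference_stadia = (360 / shadow_angle_deg) * distance_stadia
print(f"{circumference_stadia:,.0f} stadia")  # 250,000 stadia
```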

Fundamental to Hubbard’s argument is the particular definition he employs for measurement—and in thinking about what and how we assess applicants for admission, it seems critical to consider that definition carefully. For Hubbard, and he could not be more emphatic about this, measurement is a quantitatively expressed reduction of uncertainty based on one or more observations.

Hence, measurement never entails certainty or even near-certainty; it merely offers a reduction of uncertainty, which can then be used to improve a decision or analysis. Measurement doesn’t replace reasoned judgment; it informs it. Reducing uncertainty will contribute to better decisions, though it can never guarantee correct ones.
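
To make “reduction of uncertainty” concrete, here is a toy sketch in Python, with entirely invented numbers: progressively larger random samples narrow a 90% confidence interval around a quantity we could never observe in full.

```python
import random
import statistics

random.seed(1)

# A hypothetical population we could never observe exhaustively:
# minutes each of 10,000 students spends on an application essay.
# All numbers are invented for illustration.
population = [random.gauss(45, 15) for _ in range(10_000)]

def ci90(sample):
    """Approximate 90% confidence interval for the mean (normal approximation)."""
    mean = statistics.mean(sample)
    sem = statistics.stdev(sample) / len(sample) ** 0.5
    return mean - 1.645 * sem, mean + 1.645 * sem

# Each larger sample narrows the interval: every observation reduces
# uncertainty, which is all Hubbard's definition asks of a measurement.
for n in (10, 40, 160):
    low, high = ci90(random.sample(population, n))
    print(f"n={n:4d}  90% CI for the mean: {low:5.1f} to {high:5.1f} minutes")
```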

The next key step in measurement is to be very clear about exactly what the object to be measured is—a surprisingly easy step to trip over. In parallel thinking, Susan Brookhart explains in her book How to Assess Higher Order Thinking that the very first and most critical step is to “Specify clearly and exactly what it is you want to assess.”

Brookhart very deliberately breaks down terms such as critical thinking and creativity into their parts, explaining that only by doing so can you build tools for measuring student proficiency. Hubbard does the same: he shares many anecdotes of consulting for companies that ask him to measure “strategic alignment,” “flexibility,” or “customer satisfaction”—all of which his clients tell him can’t be measured—and his response is always: “What do you mean, exactly?”

When determining what we really want to measure, be sure its definition is driven by the “why”: “the purpose of the measurement is often the key to defining what the measurement is really supposed to be.” The why and the what are ultimately intertwined in a compelling dynamic.

As Grant Wiggins writes, so much of what we argue about when we discuss educational measurement has to do with these particular ambiguities: “We are either overly-simplifying complex aims or failing to measure something that matters because we think our measures must be perfect.”

With the goal of demystifying and democratizing measurement, Hubbard repeatedly reminds us that it is not that complicated. An array of measurement approaches is available: “measuring with very small random samples, measuring the population of things you will never see all of, measuring when many other even unknown variables are involved.”
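
On “very small random samples,” Hubbard offers what he calls the Rule of Five: there is a 93.75% chance that the median of any population lies between the smallest and largest values in a random sample of just five. A quick simulation (my sketch, not code from the book) bears out the arithmetic:

```python
import random
import statistics

random.seed(7)

# The rule is distribution-free, so any population will do.
# This one is invented for illustration.
population = [random.lognormvariate(3, 1) for _ in range(10_000)]
true_median = statistics.median(population)

# All five draws land above the median with probability (1/2)**5, and
# likewise below, so the sample range misses the median with probability
# 2/32 = 1/16 -- i.e., it contains the median 93.75% of the time.
trials = 10_000
hits = 0
for _ in range(trials):
    sample = random.sample(population, 5)
    if min(sample) <= true_median <= max(sample):
        hits += 1

print(f"empirical coverage: {hits / trials:.3f}  (theory: 0.9375)")
```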

These four assumptions—counterintuitive as they are—can, Hubbard explains, bring measurement closer to our everyday practice:

1. Your problem is not as unique as you think: it has almost certainly been measured before.
2. You have more data than you think.
3. You need less data than you think.
4. A useful measurement is much simpler than you think.

The roadmap to effective measurement is also simpler than you may think. In brief, Hubbard’s framework runs as follows:

1. Define the decision the measurement is meant to support.
2. Determine what you know now, expressing current uncertainty as calibrated ranges.
3. Compute the value of additional information (a quick sketch of this step follows the list).
4. Measure where the information value is highest.
5. Make the decision, act on it, and repeat.
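
Step 3 is the least familiar to most of us, and Hubbard reduces it to a simple expected-loss calculation: roughly, the chance of being wrong multiplied by the cost of being wrong. A back-of-the-envelope sketch with invented admissions numbers:

```python
# Hubbard's simplified expected value of information: the chance of
# being wrong times the cost of being wrong. All figures are invented
# admissions-flavored examples, not data from any school.
p_wrong = 0.30          # current chance our admissions signal misleads us
cost_of_wrong = 50_000  # cost, in dollars, of acting on the bad signal

evpi = p_wrong * cost_of_wrong
print(f"value of eliminating the uncertainty: ${evpi:,.0f}")  # $15,000

# Any measurement that costs less than this ceiling and meaningfully
# reduces the chance of error is worth considering.
```
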
Most of all: just do it. Measurement is an iterative process: we only get good at it by doing it and learning from each application.

Reports from the field

Every month or two I intend to share reports from the field, conveying to readers what has come to the Think Tank’s attention in the way of experiments in admissions assessment undertaken by SSATB members and others in the admissions field. If you would like your school’s unique or experimental program profiled here, contact me.


One such example comes from Westminster School (GA), whose Director of Admissions, Marjorie Mitchell, is a member of our SSATB Think Tank. Returning home from the Chicago SSATB Annual Meeting, Marjorie reports, she had an inspiration. Recognizing that her school’s application form had not changed in over 20 years, and knowing that she wanted to do more to differentiate her applicants, particularly with regard to their creative thinking, she decided to change the open-response part of the application.

Formerly, the only prompt offered was this: In the space provided on this and the next page, write about an experience you had which taught you a lesson.


Now, inspired and illuminated by the suggestions of Dr. Robert Sternberg in his keynote address at SSATB, Marjorie developed, along with a few of her admissions committee colleagues, these two new optional alternate prompts:

Write a creative story (150-200 words on the next page) or poem that includes one of the following sets of words.

Use your imagination to create and illustrate a scene from a story using one of the sets of words in item #2. While you will not need to write the story, please explain what you have drawn.

Marjorie reports her delight with the number of applicants choosing one of the new alternate prompts. She is getting new windows into the minds of these students—these can be “revealing self-portraits”—and reading applications has become much more fun and interesting.

Much more importantly, she sees the possibility that this part of the application may become a more useful differentiator than it has been in the past. Currently, she says, the interview is the key distinguisher among the many fully qualified applicants—but she believes the interview favors those who are especially socially savvy and extroverted.

Her goal is to challenge the favoritism the interview process shows toward the extroverted. We are in a moment of rising recognition of, and respect for, the important societal contributions introverts make—see especially Susan Cain’s Quiet: The Power of Introverts in a World That Can’t Stop Talking and the accompanying TED talk—and she believes that reinventing the application prompt will bring greater balance to the differentiating aspects of the admissions process.

----------

Exciting experimentation with admissions assessment at the college level can be found at the new Innovation Portal, developed by University of Maryland professor Dr. Leigh Abts, who has long been a leader dedicated to “improving the K-12 pipeline to engineering education.” Abts, he explained to me recently, has a vision of engineering that, while embracing the importance of math and science skills, very much emphasizes creativity and creative problem-solving—and he fears that many K-12 students who would be terrific contributors to the profession turn away because they perceive it as a non-creative endeavor.

Abts and his team interviewed college admissions officials and others about how to improve the methods of evaluating applicants’ creative problem-solving abilities, and found interest, but also hesitation. In language familiar to all of us in the world of admissions, they told him that both a systematic process and a standardized assessment tool would be needed:

Without a systematic process for reviewing original student design work there is no way to incorporate the value of the work into the algorithm of college admissions or any other recognition process.

Without a standardized assessment tool to organize and evaluate any submitted work there can be no systematic process.

What was needed was a well-structured and validated assessment rubric centered on the design process itself, coupled with a secure means for students to build portfolios and connect their work to potential reviewers.

Abts and his colleagues, rather than being deterred or depressed, took the opportunity to put this information to use, and have developed what I think is a very exciting new resource, the Innovation Portal. Take the time to check it out online.

As described on the website, the portal contains two key components. First, it offers an assessment rubric for the engineering design process, the “Engineering Design Process Portfolio Scoring Rubric” or EDPPSR, which will be “the central focus of a three year university study to validate and refine the work into a reliable assessment tool for wide spread use in the engineering education community.”

Second, it offers a “centralized hub where students and teachers can work in a secure environment to build detailed portfolios of their projects, [aligned to the] required standardization for organizing and displaying their portfolios.”

Abts believes there is no reason his portfolio assessment portal and platform couldn’t be used at the secondary level, and indeed he is entirely open to experimental collaborations with high schools. Please contact me if your school would like an introduction to explore such an exciting project.

Jonathan Martin has 15 years experience as an independent school head, most recently as Head of St. Gregory College Preparatory School (AZ). He holds degrees from Harvard University (BA, Government, cum laude); Starr King School for the Ministry (M.Div., Unitarian ministry preparation); and the University of San Francisco School of Education (MA, Private School Administration). In 2008, he was a Visiting Fellow at the Klingenstein Center at Teachers College, Columbia University. He previously headed Saklan Valley School (CA) and Maybeck High School (CA). In the first stage of his educational career, he taught History, Social Studies, and English at Maybeck, and served in a role equivalent to Dean of Students. From 2010-12 he was a member of the board, and Program & Professional Development Chair, of the Independent School Association of the Southwest (ISAS). He was a contributor to the new National Association of Independent Schools publication A Strategic Imperative: A Guide to Becoming a School of the Future.
