
FAQs from SSATB's January 2013 Testing Brief

EMA
April 1, 2013


From Memberanda, Spring 2013 

We hope you have reviewed the January 2013 Testing Brief and that it has been helpful in answering questions about the SSAT this year. To continue this important discussion, we’ve asked some of our members and colleagues, "What else do you want to know after reading the Testing Brief?" Here’s what they asked:

Q What makes the SSAT different from a placement test?
Placement tests are typically designed to measure a level of achievement. The SSAT is an admission test, meaning it assesses content that should already be familiar to students (read: achievement) as well as material that has not yet been presented in the students’ educational setting. This allows for an indication of academic aptitude and a fair prediction of success at the student’s next level of education.

Q Getting it right half of the time – how is that good?
Understanding this concept as an admission director is the easy part; the challenge is explaining admission testing to your applicants and their parents. First, explain the design of an admission test: each SSAT question is crafted so that roughly half of test takers answer it correctly. Items at this difficulty make the finest distinctions in performance, which is what allows admission professionals to predict success among applicants who have otherwise similar characteristics. Finally, reassure parents by reminding them that the SSAT is designed to be challenging, not for students to earn the equivalent of an "A" on an English paper.
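
One way to see why 50-percent items discriminate best: a question answered correctly by proportion p of test takers has score variance p(1 − p), which peaks at p = 0.5. The quick check below illustrates the statistics only; it is not SSATB's item-development process:

```python
# Variance of a single right/wrong (0/1-scored) item as a function of
# its difficulty p, the proportion of test takers answering correctly.
variances = {p / 10: p / 10 * (1 - p / 10) for p in range(1, 10)}

# The most informative difficulty is the one with the largest variance.
best = max(variances, key=variances.get)
print(best, variances[best])  # 0.5 0.25
```

Items near p = 0.5 spread examinees out the most, which is why a test built from them can separate otherwise similar applicants.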

Q The Testing Brief advises use of SSAT Scale Scores rather than percentiles. If our office has always used the SSAT Percentile Rank, what do we do now?
The SSAT Scale Scores are the numbers you should look to first when reading a student score report, as they reflect a student’s actual performance on the test. The Scale Score is calculated using a formula-scoring process that takes into account the test taker’s raw score of right and wrong answers. The SSAT Percentile Rank can (and does!) vary each year as the norm group changes, but the SSAT Scale Score does not; it will always be the most accurate reflection of an individual student’s performance on the test. If your school has always looked to the Percentile Rank when reading score reports, you may not have been taking individual performance into account, particularly as it relates to gender and to students falling near the mean of the scale score. It would be useful to reference the score tables in the Interpretive Guide on your Member Access Page to view how the Scale Scores relate to the Percentile Rank each year. You’ll also find the 2011-2012 Guide online to use as a comparison for year-over-year change.
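
Formula scoring, in general, credits each right answer and subtracts a fraction of a point per wrong answer. The sketch below assumes the common rule for five-choice items; the SSAT's actual scoring constants and scaling tables are not given here:

```python
def formula_raw_score(right, wrong, choices=5):
    """Common formula-scoring rule: one point per right answer minus
    1 / (choices - 1) per wrong answer; omitted items score zero.
    Illustrative only -- not SSATB's published constants."""
    return right - wrong / (choices - 1)

# A hypothetical section with 40 right and 8 wrong on five-choice items:
print(formula_raw_score(40, 8))  # 38.0
```

The raw score produced by a rule like this is then typically converted to the reported Scale Score through an equating table, which is what keeps Scale Scores comparable across different test forms and years.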

Q If the new norms no longer include international testers, how is their SSAT Percentile Rank calculated?
Previously, the SSAT Percentile Rank compared a student’s Scale Score with those of all students of the same grade and gender who took the SSAT over the last three years. Beginning August 1 (the start of the 2012-2013 test year), this changed: SSAT Percentile Ranks are still based on students of the same grade and gender, but now only scores from domestic (U.S. and Canada) testers who tested on one of the eight Standard dates are included. Further, for students who take the SSAT more than once, only the score of their first test is included in the norm group. International students and Flex testers do not have separate norm groups; while they are not part of the norm group, their scale scores are compared to the domestic/Standard/first-time tester group described above.
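
The comparison described above can be sketched as follows. This is a hypothetical illustration of a percentile-rank computation, not SSATB's published procedure, and the scores are made up:

```python
from bisect import bisect_left, bisect_right

def percentile_rank(scale_score, norm_scores):
    """Percent of the norm group scoring below the given Scale Score,
    counting ties as half. In the 2012-2013 scheme the norm group would
    hold only first-time, domestic, Standard-date testers of the same
    grade and gender; international and Flex testers are compared
    against it without being members of it."""
    norm_scores = sorted(norm_scores)
    below = bisect_left(norm_scores, scale_score)
    ties = bisect_right(norm_scores, scale_score) - below
    return round(100 * (below + 0.5 * ties) / len(norm_scores))

# A Flex tester's 650 compared against a tiny, made-up domestic norm group:
print(percentile_rank(650, [600, 650, 650, 700]))  # 50
```

Note that the student's own score never enters the norm group, which is exactly how international and Flex testers receive a Percentile Rank without being counted in the norms.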

The outreach team is happy to help you understand how these frequently asked questions might affect your office, so don’t hesitate to call!

Aimee Gruber 609-619-2672
Dave Taibl 609-480-7346
Kate Auger-Campbell 609-480-8372

 

