What Happens when a College Goes Standardized Test Optional

  • In late September, the National Association for College Admission Counseling (NACAC), through a commission led by William Fitzsimmons, dean of admissions and financial aid at Harvard, issued its annual report questioning the value of standardized tests in the admissions process. The report notes that admissions offices that begin analyzing standardized tests soon question their value. Among the schools cited in the analysis are the University of California (and its debate over the use of SAT Subject Tests); Hamilton College, which went test-optional in 2006 after a five-year experiment; Worcester Polytechnic Institute (Massachusetts), the first competitive science and engineering school to go test-optional; and now Wake Forest (number 30 on the US News and World Report list of top national universities).

After a campus goes SAT- or ACT-optional, what happens? Jill Tiefenthaler, provost at Wake Forest, clarifies the school's stance: “…this is not about sacrificing academic excellence.” (insidehighered.com, 25 September 2008, “After You Go SAT-Optional,” by Scott Jaschik). Rather, once standardized tests are optional, the admissions office must scrutinize applicants with much more rigor.

Technology can certainly factor into the process. Personal interviews were once an accepted part of most admissions decisions, but when it was determined that interviewing favored wealthier candidates who could travel to campus, most selective schools eliminated them. Wake Forest, which received 9,000 applications last year and previously interviewed 10 to 20 percent of its applicants, plans to interview virtually all candidates this year. How? Even candidates of modest means can get to a computer in their public library, add a webcam, set up a Skype account, and have a 30-minute interview. Wake has added two members to its admissions staff and plans to enlist faculty and retired admissions officers to handle the increased interview volume.

Many campuses deal better with the volume of information submitted by applicants by going paperless. Northeastern University in Boston already has: it received over 30,000 applications last year, more than three times Wake Forest's volume, and is able to review, route, and track candidate information meticulously. Harvard's admissions office is also in the process of becoming completely paperless within the next nine to ten months, and Wake is exploring the same path. Better access to candidate information gives admissions officers more time to ask questions of high school counselors, and thereby a better feel for what a high school's class rank means and just how rigorous its courses are.

Surprisingly, even with standardized tests optional, many students still submit their scores. Students who attend schools with weak curricula feel the scores give Wake Forest a better sense of their college readiness. Martha Allman, the director of admissions at Wake, advises students: “...if you feel the score is a good representation, then go ahead and submit. If you don’t, don’t.” (insidehighered.com, op. cit.)

While it’s too early to gauge the actual ramifications of Wake Forest’s decision to go test-optional, applications are up for the year. Additionally, anecdotal evidence indicates that more minority visitors have come to campus, and attracting more applications from minorities was one of Wake Forest’s objectives. Further, the list of test-optional institutions (see http://www.fairtest.org for a complete list of more than 775 such schools) keeps adding rather impressive names: the University of North Carolina at Chapel Hill, Mount Holyoke, Smith, Pitzer (one of the Claremont Colleges), and Holy Cross among them.

As I’ve mentioned in previous columns, the debate over whether standardized tests can predict college performance and sift out less capable applicants will never end. The State University of New York (SUNY) recently published data showing that the SAT is a good predictor of graduation (New York Times, 18 November 2008, “The Test Passes, Colleges Fail,” by Peter Salins, p. A23). Another study, covering 150,000 students, measured the connection between SAT performance for the high school class of 2006 and college grades; it found high school grades to be slightly better indicators of college performance than the SAT. This is hardly a science, which makes experiments like Wake Forest’s all the more interesting.