Putting Test Scores on Trial

by Molly McCracken on Sep 26, 2016 2:51:36 PM

GMAC’s Corporate Recruiter Survey found that 80 percent of recruiters cared about the quality of your graduates, while only 20 percent cared about the admissions requirements to get into your school. Yet many schools continue to compete for the elusive status of accepting “only top performers” on tests like the GMAT, GRE, and SAT.

With publications like U.S. News touting “scoring high is critical to gaining admittance to a top-ranked MBA,” students and admissions professionals alike are constantly hearing that test scores determine a) the quality of the school and b) the quality of the student.

However, the trouble with test scores is that they measure only certain quantifiable aptitudes, rather than the whole person: their ambition, approach, and abilities.

Have you taken the GMAT?

If so, you understand how the questions target very specific types of intelligence. If not, take our totally unscientific “What would YOU score on a GMAT test?” quiz to see a sample of how the assessment works.

(We also have “What would YOU score on an SAT test?” here.)

Let’s be clear: the GMAT and other aptitude tests like the GRE, LSAT, and MCAT have a role to play in admissions. GMAT scores have a proven correlation with academic success in your program. But they were never designed to assess applicants on the skills needed to succeed in the workplace.

We do not suggest that schools eliminate an IQ component from their application process. Instead, schools should reconsider the weight these tests carry in deciding whether or not an applicant is considered for their program.

Today, test scores go on trial.

We’ve identified five critical issues with relying heavily on high test scores in college admissions:

Allegation 1: Test Scores Do Not Determine Employability

In 2015, the Rotman School of Management studied the employment outcomes of over 1,000 of its MBA grads from 2008 to 2013. Looking for a link between entrance GMAT scores and graduates’ employability, the school found a candidate’s score on the test had essentially no correlation with employability.

While U.S. News & World Report’s MBA ranking puts substantial weight on a class’s average GMAT score, pushing schools toward enrolling applicants with high averages, there is no evidence that GMAT scores equate to strong business leadership.

“GMAC has never claimed the GMAT exam is predictive of employability. It’s predictive of your ability to perform in the program,” Rotman’s (then) managing director of the MBA program, Kevin Frey, told Poets&Quants in 2015.

Rotman, a Canadian business school competing on the world stage, took these findings and redesigned its admissions process to build better classes.

Rotman and other top business schools are realizing that test scores may predict how students will perform on exams, but they won’t tell you which students will create great classroom discussions, lead teams, close sales deals, persuade investors, or strategize award-winning campaigns.

As Frey explains:

“We need to identify and evaluate talent that others are discounting and not seeing the value in. A lot of people think we are in the sleepy education business, but we are also in the talent game. We have to be able to identify the talent, bring it in, develop it and then match it with employers. Rotman doesn’t win too many bidding wars with the Harvards, Stanfords and Kelloggs of the world, so we decided to find ways to identify hidden talent that those schools may not be able to identify or may not be fully valuing.”

In 2005, Malcolm Gladwell wrote a piece on “the social logic of Ivy League admissions” in The New Yorker called “Getting In.” He spotlighted the flaws in society’s approach to grades and test scores, giving examples of schools that relied too heavily on tests and, as a result, lacked real superstars in their classes.

"A law school that wants to select the best possible lawyers has to use a very different admissions process from a law school that wants to select the best possible law students,” claims Gladwell.

So, what is your program’s goal: to educate your students to be the best students, or to educate them to be the best business people?

Allegation 2: Test Scores Cannot Reflect Social Responsibility, Leadership, or Collaboration

Over the last year here at Kira, we have talked a lot about the Making Caring Common report, which argues that college admissions needs a redesign to better embrace social engagement and the common good. One of the report’s three core recommendations calls for schools to find ways to better assess ethical engagement and contributions to others, including family and community.

With admissions processes that rely heavily on test scores, what you’re really assessing is how much time a person was able to invest in test preparation. Students who do really well on tests have likely spent a lot of time on independent study, rather than going out and contributing to their community.

What if, instead, you could look at how much time applicants were dedicating to sitting on their undergraduate student council, coaching their sibling’s softball team, or working two part-time jobs to reduce their student debt?

Whether an applicant is engaged with his or her community can be a huge factor in classroom success and long-term employability. We wrote about this on our blog earlier this year, in The Socially Conscious MBA Student.

Collaboration and leadership are two more major competencies that business schools need to assess at the admissions stage; both were among the top five skills employers are looking for in The Bloomberg Job Skills Report 2016.

Tests cannot reflect an applicant’s ability to lead or work with others. Yet being able to lead, to follow, and to work well with others is vital to success in business. Applicants with high test scores can study well, but can they play well with others?

There’s no way to tell from a score, which is risky in a business program built on group work and in a career path where teams need to collaborate.

Allegation 3: Test Scores Gloss Over Importance of Communication Skills

Although sections of the GMAT like the Analytical Writing Assessment (AWA) can evaluate reading and writing skills, and the TOEFL can be great for assessing ESL candidates, the ability to be a strong business communicator cannot be quantified.

No computer algorithm or answer key can prove that an applicant will be a persuasive salesperson, a compelling presenter, or a manager who can articulate constructive feedback to her team.

Likewise, how a person deals with a written test question is significantly different from how he will respond to criticism of his company’s product or handle a media request. Aptitude tests often offer multiple-choice responses, asking candidates to “select the best option” rather than write their own variation of the best option. In the business world, there’s rarely a “best option” template to lean on in a critical situation.

Without hearing how applicants explain their ideas, defend their arguments, and solve problems on the fly, you cannot truly assess their communication capabilities. 

As text messages and social networking replace phone calls and in-person communication, today’s applicants are getting less day-to-day exposure to oral communication, which arguably leads to weaker interviews and presentations.

Shameless Plug Alert: Kira Helps Assess Competencies and Communication Skills

Kira’s consultative approach helps you identify the competencies students need to succeed in your program, and then assess for them in an asynchronous interview setting. Applicants who may previously have fallen into the “grey area” because of less-than-stellar test scores get an opportunity to excel. The end result is a stronger, more diverse cohort with the skills needed to succeed in your program.

Allegation 4: Test Scores Can Be Cheated 

Lu Xu wrote the GMAT over 500 times.

As he told Poets&Quants, “I am a walking legend in (the) GMAT world. The test is part of my life. What happened to me can be made into a movie like ‘Catch Me If You Can’ or 'The Wolf of Wall Street.’"

Leonardo DiCaprio fandom aside, Xu is an extreme, but true, example of the potential impact of admissions fraud. Last week, we talked about admissions fraud in admissions essays, but it can absolutely happen in the realm of standardized tests as well.

Hiring a test writer is one extreme, but there are a number of other paid prep services and tutors that claim to help people easily “hack” the GMAT and memorize their way to a good test score. Is this making our future business leaders smarter or better problem solvers?

It makes sense that students go to these lengths: with the competitive nature of top MBA programs, students may feel overwhelmed by the pressure to perform and turn to paid solutions to score higher on admissions tests, or to avoid writing them at all.

As we discuss in The Essential Guide to Preventing Admissions Fraud, “the application process in the United States is a notoriously complicated process. Requirements vary by school, by program, and by level of education for both domestic and international students.

“There’s a seemingly endless list of documents to submit: letters of recommendation, SAT scores, personal essays and statements (anywhere from one to four per application), and high school transcripts including GPA.”

Cultural and social factors can also motivate applicants to cheat. As Carrie Marcinkevage from Smeal College of Business told us: "For cultures whose educational system focuses highly on testing and little on writing or constructive argument, essays themselves are foreign; plagiarism is often unheard of."

She also added that if students perceive their competition in the applicant pool to be cheating, they may feel inclined to do so as well in order to keep up.

Allegation 5: Tests and Grades Have Inherent Bias

Earlier this year, the Jack Kent Cooke Foundation released a powerful report called True Merit, which explains that getting into a selective institution is more difficult for low-income students than for others. Not only do few low-income students attend highly competitive universities; few even apply.

Test scores are a direct contributor to this problem: students with higher socioeconomic status consistently do better on tests. Here’s a study from 2013: Socioeconomic status big factor in low scores for US on global exam. Here’s another from 2004: Socioeconomic Status and Intelligence: Why Test Scores Do Not Equal Merit. We could go on, but a quick Google search will surface several more studies that back up this correlation.

Test scores can be strongly influenced by socioeconomic status because preparing to write a standardized test requires a financial investment. Graduate school applicants need money to pay for the test itself, test prep materials, and sometimes even coaches. They also have to be able to afford to take the time to study, borrowing time from work, family, or other responsibilities.

Recently, an MBA applicant, Zoheb Davar, wrote in Poets&Quants about how scoring 700+ on the GMAT took him over 800 hours and close to $10,000 in tutoring materials. Although his story is just one case, it shows how standardized testing can lean in favour of those with time and money to prepare. An applicant working two jobs to support his or her family will have many more barriers to success than someone who can take the summer off or hire a tutor.

When low-income applicants feel they cannot access high quality education, everyone suffers.

As the authors of True Merit report, “selective institutions cultivate our nation’s leadership: 49 percent of corporate industry leaders and 50 percent of government leaders graduated from only 12 selective colleges and universities.”

This has to change. We need students from all communities in the classroom, discussing today’s issues, and solving tomorrow’s problems.

Closing Statement

Reviewing the evidence above, we find five problems with schools giving heavy weight to admissions tests:

• Test scores do not determine an applicant’s employability in the workforce.

• Test scores do not reflect key competencies like engagement, leadership, and collaboration.

• Test scores give a shallow evaluation of essential communication skills.

• Test scores may be cheated or fraudulent.

• Test scores have an inherent bias against students from low socioeconomic backgrounds.

The solution is not an easy one. Tests offer a straightforward, quantifiable way to look at a student’s aptitude, and they have a role to play in the admissions process, but they cannot be the factor that keeps students out of the programs where they’ll truly excel.

Schools need to level the playing field and assess applicants on more than test scores.

This article appears in Kira University. If you're interested in more like this, sign up for the Master of Business School Admissions, or the Master of Graduate School Admissions. 

 


Written by Molly McCracken

Molly is the Marketing Manager at Kira. She strives to create helpful articles, emails, and resources for admissions teams, and she is an avid supporter of the Oxford comma.