Making Higher Education Count

Ashley Thorne

How do we know whether American universities are really educating students? 

In 1996, the National Association of Scholars published a report, "The Dissolution of General Education: 1914-1993." Based on the hypothesis that general education requirements had dissolved since the 1960s, NAS studied the fifty top-ranked schools listed in U.S. News and World Report's "America's Best Colleges." We examined the structure, content, and rigor of required courses at each institution and found that most institutions were no longer seriously committed to ensuring that their students were exposed to broad surveys of basic subject matter.

In 2002 our Arizona affiliate examined how much graduates of Arizona's three public universities knew about history, science, math, literature, arts, civics, and other subjects, to see what taxpayers were getting in return for their support of the state's universities. The affiliate administered a 40-question survey to 167 college seniors. The question answered correctly most often was the pop culture item, "Identify Snoop Doggy Dogg"; the question answered correctly least often was, "Who was the father of the U.S. Constitution?"

This year, our sister organization, the American Council of Trustees and Alumni, launched a new website, WhatWillTheyLearn.org, which grades colleges based on the courses they require. It looks for requirements in seven "core subjects": composition, literature, foreign language, U.S. history, economics, mathematics, and science. Schools that require six or seven of the core subjects get an A; only seven of the 127 schools ACTA reviewed made the A list.

The assessment movement to quantify “student learning outcomes” is another approach to measuring higher education’s success. But one problem with tallying up outcomes in this way is that some educational outcomes cannot be easily quantified—nor should they be. Another problem is that having professors submit “student learning outcomes” gives them an incentive to aim low. A professor who achieves easy goals may look good on paper but will probably fail to teach students to grapple with complex concepts. NAS has examined the outcomes movement in “Seat Time at the AAC&U” and “LEAPs and Bounds.” 

Kevin Carey, policy director of the Washington-based think tank Education Sector, also has some ideas for measuring success in higher education. In an interview with Time magazine this week, he argues that colleges must be held accountable. For one thing, he says, colleges should publish the results of standardized tests such as the Collegiate Learning Assessment (CLA). For another, schools should graduate "a reasonable percentage of [their] students compared with other universities that have similar students." Perhaps not a percentage as high as Harvard's 98%, he adds. "That's probably too high. I'm pretty sure you'd have to shoot somebody not to graduate from Harvard."

Carey recommends that state governments should be the ones holding schools—public and private—accountable for these things. He concludes that, unlike K-12 education, higher education is generally assumed to be doing a good job and to need no change. "Colleges do more than anyone to perpetuate that myth," Carey says.

NAS has long called for greater transparency in American higher education, and on that point we join in Carey's accountability prescription. As for graduation rates, he is right that 98% is too high. Graduation inflation results from grade inflation. Students need to be challenged intellectually, to stretch the limits of their minds, and even to have a legitimate fear of failing if they don't put in good work. Targeting particular graduation percentages—albeit lower ones than Harvard's—is tricky because, like outcomes assessment, it incentivizes a decrease in rigor for the sake of the numbers. Carey raises some good points, but we need to be careful about how we quantify success. It's all too easy for attempts to ensure quality education to backfire and make things worse.

So how do we know whether students are really learning what they should? Should we: 

  1. Examine required courses,
  2. Publish test results,
  3. Assess "outcomes,"
  4. Measure graduation rates, or
  5. Something else?

Options 1 and 2 seem to be the best methods. What do you think?

