What Goes Into Education Rankings Affects What Comes Out

By Benita M. Dodd

In 2017, U.S. News & World Report’s ranking of the best states to live in featured not a single Southern state in the top 10. Georgia, at No. 32 overall, finished at No. 31 in the education rankings; Massachusetts was No. 1 in education and No. 8 overall.

When it comes to education, it’s nothing new for Georgia to end up in the bottom half of national rankings. But a new report by University of Texas researchers suggests the fault lies not necessarily with education in Georgia and other Southern states, but with ranking systems that fail to make an “apples to apples” comparison between states.

“Students arrive in class on the first day of school with different backgrounds, skills, and life experiences, often related to socioeconomic status,” write researchers Stan J. Liebowitz and Matthew J. Kelly in their study, “Fixing the Biased State K-12 Education Rankings.”  

“Assuming away these differences, as most state rankings implicitly do, may lead analysts to attribute too much of the variation in state educational outcomes to school systems instead of to student attributes.”

They add: “When these rankings fail to account for heterogeneity of student populations … they skew results in favor of states with fewer minority students.”

For example, the NAEP, known as “the nation’s report card,” provides average scores for various subjects at various grade levels. But ranking systems do states a disservice by focusing on the states’ average NAEP test scores instead of examining results by the demographic breakdown NAEP provides, including ethnic and socioeconomic factors. And, the researchers point out, spending more money (an “input”) is counted as a positive component in national rankings, “when that wasteful extra spending should instead be penalized in the rankings.”

To improve the accuracy of rankings, Liebowitz and Kelly developed a new “quality rank” for state education systems, using disaggregated achievement data and excluding “factors that are not directly related to learning.”

They compared state scores for three subjects (math, reading and science), four major ethnic groups (whites, blacks, Hispanics and Asian/Pacific Islanders) and two grades (fourth and eighth). They excluded graduation rates and pre-K enrollment, “which do not measure how much students have learned.”
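To see why disaggregation matters, consider a minimal sketch of the underlying statistical point. The states, scores and demographic shares below are invented for illustration; they are not NAEP data, and the simple two-group comparison only approximates the researchers’ fuller method across subjects, groups and grades.

```python
# Hypothetical illustration of aggregation bias (Simpson's paradox) in state rankings.
# All numbers are invented; they are not NAEP data or the researchers' results.

# Each state: per-group average test score and each group's share of students.
states = {
    "State A": {"scores": {"white": 240, "black": 220}, "shares": {"white": 0.90, "black": 0.10}},
    "State B": {"scores": {"white": 245, "black": 228}, "shares": {"white": 0.55, "black": 0.45}},
}

def overall_average(state):
    """Population-weighted average score -- what a naive ranking compares."""
    return sum(state["scores"][g] * state["shares"][g] for g in state["scores"])

def disaggregated_quality(state):
    """Average of per-group scores, weighting each group equally, so states
    are compared on how similar students perform rather than on demographics."""
    groups = state["scores"]
    return sum(groups.values()) / len(groups)

for label, metric in [("naive average", overall_average),
                      ("disaggregated quality", disaggregated_quality)]:
    ranking = sorted(states, key=lambda s: metric(states[s]), reverse=True)
    print(label, "ranking:", ranking)
```

In this toy example, State B’s students score higher in every group, yet State A tops the naive average simply because of its demographic mix; that composition effect is the distortion a disaggregated “quality rank” is designed to remove.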

With those adjustments, states with small minority population shares dropped in the rankings and states with large minority populations rose. In one striking example, Maine, with a 90 percent white student population, went from No. 6 in the U.S. News & World Report ranking to No. 49 in the researchers’ ranking. The researchers called it “astounding” that the magazine could rank Maine as highly as sixth, “given the deficient performance of both its black and white students … relative to black and white students of other states.”

Florida went from No. 40 to No. 3. Texas went from No. 33 to No. 6, and Georgia went from No. 35 to No. 8. Rankings did worsen for some Southern states, including Alabama and Louisiana.

The differences were even more marked when spending efficiency was taken into account, the researchers noted. Georgia was in the top five in the nation in bang for the education buck, behind Florida, Texas, Virginia and Arizona. “All of these states are southern or southwestern, with right to work laws and very low levels of unionization, the very opposite of the conventional narrative,” the researchers added.
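One simple way to think about the efficiency comparison is quality obtained per dollar of per-pupil spending. The sketch below is an assumption-laden illustration with invented figures, not the researchers’ actual metric or data.

```python
# Hypothetical efficiency comparison: quality score per thousand dollars of
# per-pupil spending. Figures are invented, not the researchers' data.
states = {
    "State C": {"quality": 236.0, "spending_per_pupil": 10_000},
    "State D": {"quality": 242.0, "spending_per_pupil": 18_000},
}

def efficiency(state):
    # Quality points achieved per $1,000 of per-pupil spending.
    return state["quality"] / (state["spending_per_pupil"] / 1_000)

for name, s in states.items():
    print(name, round(efficiency(s), 1))
# State C achieves nearly as much quality for far less money, so it ranks
# higher on efficiency even though State D's raw quality score is higher.
```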

West Virginia, Alabama and Maine were ranked least efficient in spending. While Massachusetts and New Jersey had excellent results, they were big spenders. And while researchers also found that class size, vouchers and the share of students in private school had little effect on the rankings, they noted there is “some evidence that charter schools may have a small beneficial impact on student achievement.”

The lessons from this study are worth close attention.

“If you consult the conventional state education rankings, you’d think Georgia was failing its students,” Kelly told the Foundation.
“This is just wrong; the result of flawed statistics. When a sounder statistical methodology, which focuses on student performance and takes account of diverse student populations, is used, the state actually is a leader. Furthermore, Georgia taxpayers get an exceptional bang for the buck.”

Georgia’s educators deserve kudos for reaching the students who need the most help; the ranking shows the state is doing the right things. Policymakers must not be distracted by ongoing demands for more education spending. Instead, Georgia must continue to target its education dollars where they do the most good: toward students who need the most academic assistance. The money must follow the child.


Benita Dodd is Vice President of the Georgia Public Policy Foundation.
