38% of Americans Say Colleges are Hurting America

College was this country's unofficial religion for the last three generations. The solution to everything was higher education. After decades in which higher education became a worthless and hideously expensive joke, a machine for generating student debt, retarding the economy, and churning out radicals, the country is beginning to turn on the educracy.

A new Pew Research Center survey finds that only half of American adults think colleges and universities are having a positive effect on the way things are going in the country these days. About four-in-ten (38%) say they are having a negative impact – up from 26% in 2012.

The share of Americans saying colleges and universities have a negative effect has increased by 12 percentage points since 2012. The increase in negative views has come almost entirely from Republicans and independents who lean Republican. From 2015 to 2019, the share saying colleges have a negative effect on the country went from 37% to 59% among this group. Over that same period, the views of Democrats and independents who lean Democratic have remained largely stable and overwhelmingly positive.

Of course they have.

College serves their agenda. Meanwhile, as the culture war heats up, more conservatives are rejecting the movement's standardized dogma and becoming more willing to challenge the destructive institutions that are wrecking this country.

Lefties insist that college should be taxpayer-subsidized. But many taxpayers no longer agree that it even serves a valid purpose.

Especially now that the liberal arts education has become extremely illiberal.
