Political Bias And Anti-Americanism On College Campuses
https://www.zerohedge.com, by Walter Williams

A recent Pew Research Center survey finds that only half of American adults think colleges and universities are having a positive effect on our nation.