What textbook taught in the United States takes an ideological stance that isn't at the very least revisionist toward making the US look better?

Most history taught in the States already waters down the historical facts to make us look better.



Depends on how you characterize the War of Northern Aggression, I suppose.


Also depends on how you characterize the genocide of the natives.



