What textbook taught in the United States takes an ideological stance that isn't at the very least revisionist in making the US look better?
Most history taught in the states already waters down the historical facts to make us look better.