Are Americans really as arrogant and ignorant as the media and the rest of the world portray them to be?

What do y’all think?