What state portrays US history as utopian in the past 20 years? I grew up in Kentucky and definitely got a lot of material about slavery, Native American betrayals and murders, civil rights, the Mexican-American War, and the colonization of Hawaii. The only reason they didn't cover the Vietnam War or the Philippine-American War is that they ran out of time from assigning pointless art projects.
Edit: other stuff that was covered: the Bonus Army, workers' rights, and child labor abuses.
So there's a chance of meeting someone from another state who really was taught a different version of history in school. How different probably depends on the state's political majority.
To be fair, I hear US history education varies wildly by state. Thanks, states' rights!
That is correct. That's what it's like living here every day, though largely it comes down to just two categories.
The people you meet are either ignorant religious nut jobs, or they're not from the South.