I have taken just about all I can of our current culture in the US. Everyone thinks they know what’s best for everyone else. Whatever happened to live and let live?
For example, my mom swoons for the Orange Clump. I love my mom, so I don’t talk to her about it. Sometimes she’ll bring it up, and things will get tense, but I’m not going to not have a relationship with my mother over it.
And yet, I have seen so many people trashing new shows and movies over their casting of non-white-male actors. Their motto is “Go woke, go broke,” as if Hollywood relies on the dollars of narrow-minded twats who can’t seem to realize that straight, white men only make up about 25% of the population of the United States, and therefore should only be represented about 25% of the time.
What kills me is that these same people claim that “the woke crowd” is always bringing up “identity politics.” How is being pissed that a mermaid is Black not about race? How is feeling emasculated by a strong female superhero who doesn’t wear skimpy outfits and who talks about the real bullshit that women deal with on a daily basis not about feeling threatened by women?
People are literally demonizing Disney because it’s trying to be more inclusive.
Remember the Great American Melting Pot? I remember being taught about it in grade school, but the only time I ever remember seeing any example of it was when I began teaching at my own alma mater, and our school hosted International Week. All nationalities were represented and celebrated.
The America that I learned about in the 70s does not exist. I’m not sure that it ever did.
I’m tired of it.
I miss the America I grew up believing in.