I'm a European, and literally everything I hear about the United States nowadays is very, very bad. I mean, you have school shootings, an incompetent President, police brutality, nationalism, wealth inequality, shitty gun laws, warmongering politicians, an awful healthcare system that leaves you in debt, antivaxxers, not even to mention the horrible handling of the current pandemic.
I get that the media usually portrays stuff like this as much worse than it is, but these problems never seem to stop, and that's why I'm wondering: is there genuinely anything good happening? I'm sure the US is a cool place, and I would love to visit, but as far as I'm concerned, I would never want to actually live there.