I’m really worried about the state of the US despite being a white male who will coast right through it. I’ll also accept “I don’t” and “very poorly” as answers.
From my point of view, the world is neither getting worse nor better; it has always been the way it is, and it doesn’t seem like that’s going to change. That’s just my opinion.
Edit: If anything is getting worse, it’s our economic system, but that is nothing new.
Weird that people insist climate change isn’t changing the world. Or at least not in a way that matters?
If we can’t say whether our actions are making the world better or worse, then there’s really no point in being involved in politics at all.
I did not deny climate change. I was referring to social issues, to people in general. Why the fuck do you think I’m a climate denier?
Is the climate changing the human world for the worse, or isn’t it? Maybe you don’t deny that climate change is happening, but this comment seems to deny its impact.