I guess my question is: who gave the Americans the right? I say this as an American. But wouldn't the world be a better place if we just minded our own business and quit nation-building and stoking nonexistent fires?
This drives me nuts about the news cycle. "The US won't get involved in X." The media shows how awful the fighting/revolt/etc. is in X. "Why won't the US do something about the horror in X!?" The US gets involved and, of course, some civilians die. This is guaranteed in war. The media then goes, "The US is awful for killing civilians in X!" The US pulls out of X. The media goes, "Why has the US abandoned X!?"