09-20-2012, 09:23 PM
Wars outside of our own borders used to be primarily about acquisition. Now, it seems most of them are about changing a portion of another country's culture, sometimes under the claim of "humanitarianism", but often for strategic political and/or economic benefit (imo). I don't know if it's possible to effect cultural change without formally ruling the territory, and maybe not even then.
It's very appealing to imagine the US staying out of other countries' internal conflicts; ideal, even. But should we really not intervene when ethnic cleansing is underway? If one of our allies is threatened or attacked, should we stay put and wish them the best? If terrorists supported by and headquartered in other countries come to the US and launch attacks (like 9/11), should we try to find their leaders and stop the aggressors, or isolate ourselves further and keep ramping up security in expectation that they'll try again?
I don't have the answers, but these are the questions I struggle with. Does the US have a solid identity and role with regard to foreign policy in times of conflict, or should it remain situational (as it has been for the last few decades, imo)? Essentially, I'm torn about whether international intervention/war is sometimes a necessary evil, or whether we should avoid it altogether unless it's in defense of American land/people on our own ground/air/water (as Ron Paul would have it, according to my understanding of his plans).