With all that is going on within the boundaries of the United States, it should not come as a complete surprise that Americans are not so interested in being a main fixture on the world stage as in years past (and, for that matter, do we need to be?). There is no more Cold War; the conflicts in which American forces have been used over the past 15 years have been primarily matters of self-interest (though America's engagement in the Balkans, as well as in Afghanistan, was warranted in this author's opinion); and modern warfare has changed in its tactics, theaters, and goals.
As I argued in an article a few weeks back, Americans are in a malaise: a widening economic divide, limits on opportunity, and the growth of China are all reasons that could contribute to this thinking. "The Myth of the American Decline" might not be believed by all, but it is certainly compelling enough to warrant a discussion. Still, we have to face facts: there is simply no global cultural appetite for American hegemony.
Certainly the United States remains one of the most influential global actors, but there is a belief that the nation, internally, is broken and needs fixing. It cannot be coincidental that Americans wish to see a retreat in their nation's global involvement at a time when so many of them want to see change happen inside its borders.
What do you think? Post a comment below: should America pull back its activities globally? If so, how, and where?