11-14-2004, 04:50 AM
A question of grandiose importance popped into my head today:
Let's assume that the United States would do whatever it took to make relations with the middle-east right. In your opinion, what would have to be done?
I personally believe that the greatest gift America could give the Middle East and the Arab world would be to simply leave them alone. Most people on earth simply want to be left alone and allowed to live their own way; however, the United States government has seemed to have a big problem with this since its pre-revolution era.
But what does everyone else think? What has to be done to make things right and friendly again? Or is it too late?