American Elephants


The Truth About The Vietnam War by The Elephant's Child

Did the United States win or lose the Vietnam War? We are taught that it was a resounding loss for America, one that proves that intervening in the affairs of other nations is usually misguided. The truth is that our military won the war, but our politicians lost it. The Communists in North Vietnam signed a peace treaty, effectively surrendering, but the U.S. Congress did not hold up America's end of the bargain.



