Did the United States Create Democracy in Germany?
By James L. Payne
Both advocates and opponents of nation building say that the United States played a key role in helping post-war Germany become a democracy. A close look reveals, however, that from the standpoint of democratic nation building, the U.S. occupation of Germany is actually a lesson in what not to do.