Democracy finally arrived in Germany in the wake of the civil war that immediately followed the country's collapse at the end of World War One. It arrived in the form of the Weimar Republic, named after the city where a new constitution was hammered out while Berlin descended into chaos. The Republic lasted only briefly, from 1919 until 1933, when Hitler came to power.
In its first years the Republic was challenged politically by both a Far Right and a Far Left, and economically by soaring inflation.
Finally, humiliated by both defeat and the loss of life during the war, many Germans sought to replace that reality with a mythology of German strength and to find excuses for the defeat. The Nazis developed this into their theory of 'the stab in the back', blaming Socialists and, especially, Jews for the 1918 disaster.
Rather surprisingly, the Weimar government had established a degree of stability and international respectability by 1925. Yet four years later Germany, along with the other leading nations, was engulfed by the Great Depression, and less than four years after that Weimar fell and Hitler and Nazism came to power.
In the last analysis, Weimar failed. Was this inevitable, or could it have saved Germany and the wider world from the horrors of Nazism, notably the Shoah, and the catastrophe of a Second World War?
Thank you so much. My grandfathers fought in World War I and my parents grew up during the time of the Weimar Republic. It gives me great intellectual stimulation and emotional satisfaction to hear you talk about it all in such fine and detailed lectures.
Barbara Thal-Hodes