Liberalism in the United States

What is political liberalism in the United States? The original concept was the protection of people from arbitrary power, support for the free market and advocacy of religious tolerance. But that started to change in the early twentieth century, when American liberals joined with progressives in advocating government intervention in the economy and social legislation. The presidency of Franklin D. Roosevelt from 1933 to 1945 confirmed that American liberalism would be based on using the market economy to deliver mass prosperity and active government to promote greater equality. FDR’s version of liberalism became America’s national creed, and for three decades the welfare state expanded massively. But in 1981 the new president, Ronald Reagan, declared, ‘Government is not the solution to our problem, government is the problem’. Most Americans seemed to agree and, despite some interruptions, a powerful surge from the right has dominated American politics ever since. The word ‘liberal’ is now a term of abuse in the country’s political discourse.

Join us to discuss the origins, development and challenges of American liberalism with Helena Rosenblatt (Professor of History at The Graduate Center, City University of New York, and author of The Lost History of Liberalism) and James Traub (journalist and author of What Was Liberalism? The Past, Present and Promise of a Noble Idea).

Chair: Layla Moran MP (Liberal Democrat Foreign Affairs spokesperson)

This will be an online meeting, held over Zoom. You must register in advance to participate; register here.