This is a note that I wrote to a Facebook "friend":
It seems to me, as someone who spent 25 years as a Democrat on the progressive left, that the American Right should not allow the Democrats and the Left to keep the word "liberal." This is an argument over semantics, but an important one, particularly since the Left is in the process of shedding its liberalism anyway.
The American Left is trading universal human rights for the multicultural ideal, a process it has been undergoing since 9/11. As we see in the streets of Paris and Berkeley, it is throwing the tradition of Enlightenment liberalism into the toilet. Thus Western feminists no longer care about women's rights under Islam. They no longer care about freedom of speech if that speech contradicts their ideological worldview. To be liberal is to oppose racism, yet the progressive left is the most racist political ideology in the West today outside of political Islam.
Because the Left is in the process of deliberalization, the Right should take up the mantle of Enlightenment liberalism.
I am not a Republican, but if you want the American center, stand with the tradition of Western Enlightenment Liberalism.
Let the other side stand with Black Lives Matter and Sharia fan Linda Sarsour.
We will stand with Thomas Jefferson and Martin Luther King, Jr.