In recent years, it has become unfashionable to identify as a liberal in many parts of the West, especially in intellectual circles. As I have previously said, the Right started this trend by making 'liberal' a dirty word in the 1980s, but more recently the Left appears to be outdoing them, deriding anyone who disappoints them as a 'lib'. In a sense, the Left and the Right have become allies on this issue, burying liberalism together even though they agree on almost nothing else. I believe the problem with this decline of liberalism is that it will necessarily lead to a decline in morality, or even its total collapse, in the Western world. This will likely be accompanied by increasing dysfunction in our political systems, which I fear we are already beginning to see. Let me explain.