Hey, has anybody noticed a subtle shift in recent years in the way laws are written, and in the kinds of things they aim to accomplish?
Historically, most law told us what we couldn't do; as long as we refrained from such actions, we were left alone.
But the new trend in law, throughout the Western world, is to tell us what we must do. And we are not left alone.
You must purchase health insurance, whether you want it or not, or else pay a fine.
You must pay for someone else's contraceptives, and even for abortions, no matter what violence it does to your conscience.
You must actively join in promoting homosexuality (by catering a sodomite parody of marriage, or allowing your child to be taught that sodomy is a virtue, or hiring a freaky "transgender" waitress for your restaurant, etc.), even if you are convinced that it's an abomination and a sin.
It’s getting so the only ones who are safe from the law are criminals.