Hey, has anybody noticed a subtle shift in recent years in the way laws are written, and in the kinds of things they aim to accomplish?
It has long been the case that most law told us what we couldn't do, and as long as we refrained from those actions, we were left alone.
But the new trend in law, throughout the Western world, is to tell us what we must do. And we are not left alone.
You must purchase health insurance, whether you want it or not, or else pay a fine.
You must pay for someone else’s contraceptives, and even pay for abortions, no matter what violence it does to your conscience.
You must actively join in promoting homosexuality–by catering a sodomite parody of marriage, or allowing your child to be taught that sodomy is a virtue, or hiring a freaky “transgender” waitress for your restaurant, etc.–even if you are convinced that it’s an abomination and a sin.
It’s getting so the only ones who are safe from the law are criminals.
Good observation, Lee. I’ll begin to use this filter as we see these changes.
Your last sentence says it all.
I saw something the other day, and I wish I could remember where, so I could give proper credit to whoever said it. The gist of the statement was this: the government has nothing to give us until it takes things away. How true!
An amazing truth.
I’m afraid that most people simply don’t understand that government has no money of its own. It only has what it takes from us in taxes.
The same goes for rights: the right to keep and bear arms, the right to free speech, religious freedom, any and all God-given rights. If the government can regulate them somehow, then it can dole them out in little bits and pieces while taking away the whole.
Laws are supposed to be negative. Check out the Ten Commandments: they are all "shalt nots" except "honor the Sabbath," which implies that if you do not honor it there are negative consequences.