What happened to the American work ethic? Not that long ago, Americans took pride in earning a living and making their communities better places to live. Serving our country was considered an honor as well as a duty, not just a means to make a living and learn a skill.
It seems today we have more people who feel they are entitled to a home, an education, and a job that pays a living wage. They are somehow owed these things simply because they live in America, be they citizens or not. What will happen when more people are taking money out of the pot than are working for a living while having their wages reduced by ever-increasing taxes?
We will get a chance in November to pop the giant festering zit that is today's Federal Government, but are we collectively smart enough as a nation to do the right thing?
I can explain it to you, but I can't understand it for you.
"Any man who thinks he can be happy and prosperous by letting the Government take care of him; better take a closer look at the American Indian." - Henry Ford
"Corruptissima re publica plurimae leges." When the Republic is at its most corrupt, the laws are most numerous. - Publius Cornelius Tacitus
I think they want everything to be fair, which is fine; so do I. The difference is that I don't believe the government can make our lives fair by redistributing wealth. Life is never fair, never was, and never will be. But we keep trying to force it anyway, making more people dependent on the productive.
The left is angry because they are now being judged by the content of their character and not by the color of their skin.