There's something wrong with America. The problems confronting our nation can seem, at times, intractable and insurmountable: divorce, unending wars, rampant drug addiction, senseless violence, serious health issues, racial discord, and the list goes on. Morals, etiquette, and manners seem to be lost arts these days. Our leaders promise us the world, then fail to deliver. They appear incapable of grasping the nature of the problems that plague us.
Is this the way things are supposed to be? Is this the way it has to be?