According to the article "Personal responsibility waning, experts say," people around the world are failing to take responsibility for their actions. The experts observe that responsibility is assigned to them only when a jury or judge decides the outcome (as in corporate lawsuits), or when Americans blame the government for failures such as the intelligence lapses surrounding the September 11th attacks. Historians, philosophers, political scientists, and sociologists cite many reasons for the decline of an ethic of responsibility in America over recent decades, including:
- A culture of narcissism or self-absorption;
- The rise of celebrity worship and entitlement;
- The distractions of the war on terrorism.
They claim that we hold people responsible in a legal sense, but that in a moral sense no one takes responsibility. Are Americans becoming more and more unethical in their practices, or is this simply an accepted social norm these days?