(Following up on an article I posted the other day, here is one of the pieces it referenced. It is by Dan Ariely and was published in the Wall Street Journal on May 26. Original link: http://online.wsj.com/article/SB10001424052702304840904577422090013997320.html#articleTabs%3Darticle)
We like to believe that a few bad apples spoil the 
virtuous bunch. But research shows that everyone cheats a little—right 
up to the point where they lose their sense of integrity.
Not too long ago, one of my students, named Peter, told me a story 
that captures rather nicely our society's misguided efforts to deal with
 dishonesty. One day, Peter locked himself out of his house. After a 
spell, the locksmith pulled up in his truck and picked the lock in about
 a minute.
"I was amazed at how quickly and easily
 this guy was able to open the door," Peter said. The locksmith told him
 that locks are on doors only to keep honest people honest. One percent 
of people will always be honest and never steal. Another 1% will always 
be dishonest and always try to pick your lock and steal your television;
 locks won't do much to protect you from the hardened thieves, who can 
get into your house if they really want to. The purpose of locks, the 
locksmith said, is to protect you from the 98% of mostly honest people 
who might be tempted to try your door if it had no lock.
We tend to think that people are either honest or dishonest. In the 
age of Bernie Madoff and Mark McGwire, James Frey and John Edwards, we 
like to believe that most people are virtuous, but a few bad apples 
spoil the bunch. If this were true, society might easily remedy its 
problems with cheating and dishonesty. Human-resources departments could
 screen for cheaters when hiring. Dishonest financial advisers or 
building contractors could be flagged quickly and shunned. Cheaters in 
sports and other arenas would be easy to spot before they rose to the 
tops of their professions.
But that is not how dishonesty works. 
Over the past decade or so, my colleagues and I have taken a close look 
at why people cheat, using a variety of experiments and looking at a 
panoply of unique data sets—from insurance claims to employment 
histories to the treatment records of doctors and dentists. What we have
 found, in a nutshell: Everybody has the capacity to be dishonest, and 
almost everybody cheats—just by a little. Except for a few outliers at 
the top and bottom, the behavior of almost everyone is driven by two 
opposing motivations. On the one hand, we want to benefit from cheating 
and get as much money and glory as possible; on the other hand, we want 
to view ourselves as honest, honorable people. Sadly, it is this kind of
 small-scale mass cheating, not the high-profile cases, that is most 
corrosive to society.
Much of 
what we have learned about the causes of dishonesty comes from a simple 
little experiment that we call the "matrix task," which we have been 
using in many variations. It has shown rather conclusively that cheating
 does not correspond to the traditional, rational model of human 
behavior—that is, the idea that people simply weigh the benefits (say, 
money) against the costs (the possibility of getting caught and 
punished) and act accordingly.
The basic matrix task goes as follows:
 Test subjects (usually college students) are given a sheet of paper 
containing a series of 20 different number matrices and are told to find in each matrix the two numbers that add up to 10. They have five minutes to solve as many of the matrices as possible, and they get paid based on how many they solve correctly. This straightforward setup serves as the control condition. When we want to make it possible for subjects to cheat on the matrix task, we introduce what we call the "shredder condition." 
The subjects are told to count their correct answers on their own and 
then put their work sheets through a paper shredder at the back of the 
room. They then tell us how many matrices they solved correctly and get 
paid accordingly.
[Graphic caption: In a variety of experiments, Dan Ariely and his colleagues have identified many factors that can make people behave in a more or less honest fashion.]
What happens 
when we put people through the control condition and the shredder 
condition and then compare their scores? In the control condition, it 
turns out that most people can solve about four matrices in five 
minutes. But in the shredder condition, something funny happens: 
Everyone suddenly and miraculously gets a little smarter. Participants 
in the shredder condition claim to solve an average of six matrices—two 
more than in the control condition. This overall increase results not 
from a few individuals who claim to solve a lot more matrices but from 
lots of people who cheat just by a little.
Would putting more money on the line 
make people cheat more? We tried varying the amount that we paid for a 
solved matrix, from 50 cents to $10, but more money did not lead to more
 cheating. In fact, the amount of cheating was slightly lower when we 
promised our participants the highest amount for each correct answer. 
(Why? I suspect that at $10 per solved matrix, it was harder for 
participants to cheat and still feel good about their own sense of 
integrity.)
Would a higher probability of getting 
caught cause people to cheat less? We tried conditions for the 
experiment in which people shredded only half their answer sheet, in 
which they paid themselves money from a bowl in the hallway, and even one in
 which a noticeably blind research assistant administered the 
experiment. Once again, lots of people cheated, though just by a bit. 
But the level of cheating was unaffected by the probability of getting 
caught.
Knowing that most people cheat—but just by a little—the next logical question is what makes us cheat more or less.
One thing that
 increased cheating in our experiments was making the prospect of a 
monetary payoff more "distant," in psychological terms. In one variation
 of the matrix task, we tempted students to cheat for tokens (which 
would immediately be traded in for cash). Subjects in this token 
condition cheated twice as much as those lying directly for money.
Another thing that boosted cheating: 
Having another student in the room who was clearly cheating. In this 
version of the matrix task, we had an acting student named David get up 
about a minute into the experiment (the participants in the study didn't
 know he was an actor) and implausibly claim that he had solved all the 
matrices. Watching this mini-Madoff clearly cheat—and waltz away with a 
wad of cash—the remaining students claimed they had solved double the 
number of matrices as the control group. Cheating, it seems, is 
infectious.
Other factors that increased the 
dishonesty of our test subjects included knowingly wearing knockoff 
fashions, being drained from the demands of a mentally difficult task 
and thinking that "teammates" would benefit from one's cheating in a 
group version of the matrix task. These factors have little to do with 
cost-benefit analysis and everything to do with the balancing act that 
we are constantly performing in our heads. If I am already wearing fake 
Gucci sunglasses, then maybe I am more comfortable pushing some other 
ethical limits (we call this the "What the hell" effect). If I am 
mentally depleted from sticking to a tough diet, how can you expect me 
to be scrupulously honest? (It's a lot of effort!) If it is my teammates
 who benefit from my fudging the numbers, surely that makes me a 
virtuous person!
The results of these experiments 
should leave you wondering about the ways that we currently try to keep 
people honest. Does the prospect of heavy fines or increased enforcement
 really make someone less likely to cheat on their taxes, to fill out a 
fraudulent insurance claim, to recommend a bum investment or to steal 
from their company? It may have a small effect on our behavior, but
 it is probably going to be of little consequence when it comes up 
against the brute psychological force of "I'm only fudging a little" or 
"Everyone does it" or "It's for a greater good."
What, then—if anything—pushes people toward greater honesty?
There's a joke
 about a man who loses his bike outside his synagogue and goes to his 
rabbi for advice. "Next week come to services, sit in the front row," 
the rabbi tells the man, "and when we recite the Ten Commandments, turn 
around and look at the people behind you. When we get to 'Thou shalt not
 steal,' see who can't look you in the eyes. That's your guy." After the
 next service, the rabbi is curious to learn whether his advice panned 
out. "So, did it work?" he asks the man. "Like a charm," the man 
answers. "The moment we got to 'Thou shalt not commit adultery,' I 
remembered where I left my bike."
What this little joke suggests is that 
simply being reminded of moral codes has a significant effect on how we 
view our own behavior.
Inspired by the thought, my colleagues
 and I ran an experiment at the University of California, Los Angeles. 
We took a group of 450 participants, split them into two groups and set 
them loose on our usual matrix task. We asked half of them to recall the
 Ten Commandments and the other half to recall 10 books that they had 
read in high school. Among the group who recalled the 10 books, we saw 
the typical widespread but moderate cheating. But in the group that was 
asked to recall the Ten Commandments, we observed no cheating 
whatsoever. We reran the experiment, reminding students of their 
schools' honor codes instead of the Ten Commandments, and we got the 
same result. We even reran the experiment on a group of self-declared 
atheists, asking them to swear on a Bible, and got the same no-cheating 
results yet again.
This experiment has obvious 
implications for the real world. While ethics lectures and training seem
 to have little to no effect on people, reminders of morality—right at 
the point where people are making a decision—appear to have an outsize 
effect on behavior.
Another set of our 
experiments, conducted with mock tax forms, convinced us that it would 
be better to have people put their signature at the top of the forms 
(before they filled in false information) rather than at the bottom 
(after the lying was done). Unable to get the IRS to give our theory a 
go in the real world, we tested it out with automobile-insurance forms. 
An insurance company gave us 20,000 forms with which to play. For half 
of them, we kept the usual arrangement, with the signature line at the 
bottom of the page along with the statement: "I promise that the 
information I am providing is true." For the other half, we moved the 
statement and signature line to the top. We mailed the forms to 20,000 
customers, and when we got the forms back, we compared the amount of 
driving reported on the two types of forms.
People filling out such forms have an 
incentive to underreport how many miles they drive, so as to be charged a
 lower premium. What did we find? Those who signed the form at the top 
said, on average, that they had driven 26,100 miles, while those who 
signed at the bottom said, on average, that they had driven 23,700 
miles—a difference of about 2,400 miles. We don't know, of course, how 
much those who signed at the top really drove, so we don't know if they 
were perfectly honest—but we do know that they cheated a good deal less 
than our control group.
Such tricks aren't going to save us 
from the next big Ponzi scheme or doping athlete or thieving politician.
 But they could rein in the vast majority of people who cheat "just by a
 little." Across all of our experiments, we have tested thousands of 
people, and from time to time, we did see aggressive cheaters who kept 
as much money as possible. In the matrix experiments, for example, we 
have never seen anyone claim to solve 18 or 19 out of the 20 matrices. 
But once in a while, a participant claimed to have solved all 20. 
Fortunately, we did not encounter many of these people, and because they
 seemed to be the exception and not the rule, we lost only a few hundred
 dollars to these big cheaters. At the same time, we had thousands and 
thousands of participants who cheated by "just" a few matrices, but 
because there were so many of them, we lost thousands and thousands of 
dollars to them.
In short, very few people steal to a 
maximal degree, but many good people cheat just a little here and there.
 We fib to round up our billable hours, claim higher losses on our 
insurance claims, recommend unnecessary treatments and so on. 
Companies also find many ways to game 
the system just a little. Think about credit-card companies that raise 
interest rates ever so slightly for no apparent reason and invent all 
kinds of hidden fees and penalties (which are often referred to, within 
companies, as "revenue enhancements"). Think about banks that slow down 
check processing so that they can hold on to our money for an extra day 
or two or charge exorbitant fees for overdraft protection and for using 
ATMs. 
All of this means that, although it is
 obviously important to pay attention to flagrant misbehaviors, it is 
probably even more important to discourage the small and more ubiquitous
 forms of dishonesty—the misbehavior that affects all of us, as both 
perpetrators and victims. This is especially true given what we know 
about the contagious nature of cheating and the way that small 
transgressions can grease the psychological skids to larger ones.
We want to install locks to stop the 
next Bernie Madoff, the next Enron, the next steroid-enhanced all-star, 
the next serial plagiarist, the next self-dealing political miscreant. 
But locking our doors against the dishonest monsters will not keep them 
out; they will always cheat their way in. It is the woman down the 
hallway—the sweet one who could not even carry away your flat-screen TV 
if she wanted to—who needs to be reminded constantly that, even if the 
door is open, she cannot just walk in and "borrow" a cup of sugar 
without asking.
—Mr. Ariely is the James B. Duke Professor of 
Behavioral Economics at Duke University. This piece is adapted from his 
forthcoming book, "The (Honest) Truth About Dishonesty: How We Lie to 
Everyone—Especially Ourselves," to be published by HarperCollins on June
 5.