Related to: Rationalists Should Win, Why Our Kind Can't Cooperate, Can Humanism Match Religion's Output?, Humans Are Not Automatically Strategic, Paul Graham's "Why Nerds Are Unpopular"
The "Prisoner's Dilemma" refers to a game theory problem developed in the 1950s. Two prisoners are taken and interrogated separately. If either of them confesses and betrays the other person - "defecting" - they'll receive a reduced sentence, and their partner will get a greater one. However, if both defect, they'll both receive higher sentences than if neither of them had confessed.
This puts each prisoner in a strange position. The best choice individually is to defect. But if both take the individually best choice, they'll both end up worse off than if they had cooperated. This has wide-ranging implications for international relations, negotiation, politics, and many other fields.
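The structure of the dilemma can be made concrete with a small sketch. The sentence lengths below are illustrative assumptions (the classic problem doesn't fix exact numbers), chosen only to satisfy the dilemma's structure: defecting is always individually better, yet mutual defection is worse for both than mutual cooperation.

```python
# Illustrative payoff matrix for the Prisoner's Dilemma.
# Values are years in prison (lower is better); the exact numbers
# are assumptions chosen to produce the dilemma's structure.
# Key: (my_move, partner_move) -> (my_sentence, partner_sentence)
SENTENCES = {
    ("cooperate", "cooperate"): (1, 1),   # both stay silent
    ("cooperate", "defect"):    (10, 0),  # I stay silent, partner betrays me
    ("defect",    "cooperate"): (0, 10),  # I betray my silent partner
    ("defect",    "defect"):    (5, 5),   # both confess
}

def best_response(partner_move):
    """Return the move that minimizes my own sentence, given my partner's move."""
    return min(["cooperate", "defect"],
               key=lambda my_move: SENTENCES[(my_move, partner_move)][0])

# Defecting is the dominant strategy - the best response no matter
# what the partner does...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"

# ...yet if both follow it, the outcome (5, 5) is worse for both
# than mutual cooperation (1, 1). That's the dilemma.
```

Whatever the partner chooses, defecting shaves years off your own sentence - yet two players each reasoning this way land on the jointly worst stable outcome.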
Members of LessWrong are incredibly smart people who tend to like game theory, and who debate, explore, and try to understand problems like this. But does knowing game theory actually make you more effective in real life?
I think the answer is yes, with a caveat - you need the basic social skills to implement your game theory solution. The worst-case scenario in an interrogation would be to "defect by accident" - to blurt out something stupid because you didn't think it through before speaking. This might result in you and your partner both receiving higher sentences... a very bad situation. Game theory doesn't come into play until the basic skill conditions are met - until you can actually execute whatever plan you come up with.
Sam DeCesare and I continue to have smart exchanges over email. He's kindly allowed me to share another set of his thoughts -
On an unrelated note, I just realized that you wrote the Defecting by Accident article. I've noticed the same behavior among technical people. It's interesting that someone will accept that to get a computer to do what you want you have to tell it things in a very specific way, but won't accept that you have to do the same thing with people. I'm sure they wish that they didn't have to tell the computer things in an obtuse language, but they don't refuse to learn how to program because of it. Computers just are that way; there's nothing to do but deal with it. People are the same; we're just wired to respond positively to sincere praise and kindness and respond poorly to insult and criticism.

I suspect the problem is that we tend to blame and condemn people when they do things we don't like, and the purpose of blame is to make yourself feel better and absolve yourself of responsibility. If it's other people's fault that they don't respond to your frank style, then they need to change, not you. As it happens, though, blame is completely unproductive; no one's ever gotten people to change by blaming them. So the frank and direct person never makes any progress with people.
Really, that whole excerpt is brilliant, and I'd recommend you re-read it thoroughly if you're skimming. This was the biggest insight for me -
"It's interesting that someone will accept that to get a computer to do what you want you have to tell it things in a very specific way, but won't accept that you have to do the same thing with people. I'm sure they wish that they didn't have to tell the computer things in an obtuse language, but they don't refuse to learn how to program because of it."
Sam and I previously had a little discussion on the topic of, "How much do people make their own decisions?"