Shrug. He sweet-talked some guy on the internet into stupidly parting with ten bucks. Big fricking whoop. Used-car salesmen pull off far more impressive (and lucrative) feats of manipulation every day. I'm not sure why he's still straining his arm to pat himself on the back over it, especially since he eventually started losing and then gave up.
Sorry, Eliezer, but the self-congratulatory tone of these posts is pretty grating. You've never done anything impossible, and if winning a "let's pretend I'm an AI" role-playing game is the most impossible thing you've ever done, then you've never done anything hard either.
The point of the post wasn't the AI-box game. I agree that, at least to me, convincing someone to let you out of the box doesn't really sound impossible. But in the post he says he chose it as his example of something "impossible" precisely because it was about the easiest thing he could think of that seems impossible to many people yet that he could actually achieve and then write about. He's just aiming to give people a real sense of what it means to attempt an impossible problem with the genuine intent of succeeding.
I also think that formulating a provably friendly AI really is an extremely hard problem. I'm glad he's taking on the challenge.