The point of the post wasn't the AI box game itself. I agree that, at least to me, convincing someone to let you out of the box doesn't really sound impossible. In the post he says he chose it as his example of something "impossible" precisely because it was about the easiest thing he could think of that seems impossible to many people, yet is achievable and can then be discussed. His aim in the post is just to give people a real sense of what it means to try to solve an impossible problem, and to try with the actual goal of succeeding.
I also think formulating a provably friendly AI probably actually is an extremely hard problem. I'm glad he's undertaking the challenge.