I don't find it surprising that a single network can do all those things with appropriate formatting of the data. In itself it just means the network has a large enough capacity to learn all the different tasks.
The interesting question imo, which they studied, is what kind of added generalization takes place by learning across the different tasks. For example, does learning multiple tasks make the model better at a given task than a model trained on just that one task, and can it generalize to new tasks (out of distribution)?
They looked at how it performed on held-out tasks (see fig 9 in the paper). I'm still getting my head around the result though, so I can't summarize their findings yet.
Yeah, Figure 9 is the money figure in this paper, and it actually splashes some cold water on the claims in the rest of the paper. While Gato generalizes OK to some held-out tasks, it does pretty poorly on the Atari boxing task, which the authors openly admit is quite different from the others. Gato seems more like a competent attempt at brute-forcing our way toward weak general AI, which is a valid approach, but the question then will always be: how does it do on something it's never seen before, and how do you possibly brute-force every possible situation? I think we're heading more toward a constellation of very capable expert machines for particular tasks that may be wrapped into a single package, but that are not strong AI.
Edit: the paper is here https://storage.googleapis.com/deepmind-media/A%20Generalist...
There is currently another submission on the front page that links to it directly.