This isn't a rollback of the API-oriented architecture; it's a rescue from a bunch of wrong-headed design mistakes, like forcing the rendering of individual status pages through a JavaScript function via a hash-bang URI.
One of the major drivers behind the constraints in a RESTful architecture is user-perceived response time. Routing all content through "code on demand" — which is effectively what you're doing when you force a JavaScript function to do all the rendering instead of the browser itself — means not taking advantage of the fast, incremental rendering of plain old server-generated HTML.
One can still design good API-oriented URIs with a server-side approach; you're just providing different serializations of the same resource (a nice HTML one and a JSON one for API access)... so there's nothing fundamental that Twitter has lost or abandoned here. They're just using the web better.
On the initial request, Twitter has no way of knowing which resource was intended to be rendered when it's hidden behind a fragment identifier in the URL. There is no reason why having JavaScript render the page has to be perceptibly slower than shipping down the page as HTML. Hashbang URLs are the sole issue here, not JavaScript.
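To make the hashbang problem concrete: the browser strips the fragment before the request goes out, so `https://twitter.com/#!/jack/status/20` arrives at the server as a bare `GET /`, and the client has to recover the intended resource from `location.hash` after the page loads. A minimal sketch of that client-side step (the route shape and field names here are hypothetical, not Twitter's actual scheme):

```javascript
// The fragment is never sent to the server, so the client must parse
// location.hash itself to figure out which resource to fetch and render.
function parseHashbangRoute(hash) {
  // hash is location.hash, e.g. "#!/jack/status/20"
  if (!hash.startsWith("#!/")) return null;
  const parts = hash.slice(3).split("/"); // e.g. ["jack", "status", "20"]
  if (parts.length === 3 && parts[1] === "status") {
    return { user: parts[0], statusId: parts[2] };
  }
  return null; // unrecognized route
}
```

Only after this parse can the client even begin fetching the JSON for the status — which is exactly the extra round trip being complained about.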
Server-generated HTML: the browser GETs it and renders it as it streams through.
Client-generated HTML: the browser GETs an HTML shell with a script link, then GETs the .js file, then executes it; the JS GETs the JSON, then translates that into HTML at the DOM level.
The latter is arguably something that will lead to a better user experience once the page is in a "steady state", i.e. all dependent representations are loaded into the browser and rendered. But relying on it for "first render" makes for a slow experience when (e.g.) clicking on a link to an individual Twitter status.
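The last step of the client-generated chain above — "translates that into HTML at the DOM level" — can be sketched as follows (the template and JSON field names are hypothetical):

```javascript
// Minimal escaping so untrusted tweet text can't inject markup.
function escapeHtml(s) {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}

// Turn a fetched JSON payload into an HTML fragment for insertion
// into the DOM (e.g. via innerHTML or a DOM builder).
function renderStatus(json) {
  return "<article><strong>@" + escapeHtml(json.user) + "</strong>" +
         "<p>" + escapeHtml(json.text) + "</p></article>";
}
```

Every one of those steps (script fetch, parse, execute, JSON fetch, render) sits between the user and the first paint, which is the "first render" cost being described.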
Getting that JavaScript is a one-time operation, and a 304 from then on. Also, the HTML can include bootstrapped data, saving the roundtrip for JSON.
Also, with client-side rendering you execute more code on the client but less on the server, so in an environment like Twitter where heavy caching isn't possible (everybody sees something different), you're simply trading time on the server for time on the client. Not faster, not slower.
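The "bootstrapped data" idea mentioned above can be sketched like this — the server embeds the initial JSON payload in the page so the client-side code can render immediately without a second round trip. This is a hypothetical pattern, not Twitter's actual markup; the `"bootstrap"` id is made up:

```javascript
// Server emits something like:
//   <script id="bootstrap" type="application/json">{"id":"20","text":"just setting up my twttr"}</script>
// The client reads that embedded payload first, and only falls back to an
// XHR/fetch when it's absent (e.g. on in-app navigation to a new resource).
function readBootstrappedData(doc) {
  const el = doc.getElementById("bootstrap");
  return el ? JSON.parse(el.textContent) : null; // null → fetch over the wire
}
```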
Server-side HTML generation is not a magical 0ms process.
Don't forget that JavaScript has to be parsed and interpreted on every page load, even when it's served from cache. If it's a large chunk of code, that time is not trivial.
I'm not quite sure it's as much of a zero-sum game as you present. I can easily think of scenarios where rendering on the server is much faster (e.g. a compiled language vs. JS, more powerful hardware, granular caching, etc.) and has far more consistent latency.
"Getting that JavaScript is a one-time operation, and a 304 from then on."
In theory that sounds right. In practice, though, users end up re-downloading the JS far more often than they should: since most of the logic lives in the JS file(s), those files change and get pushed out with every deploy, busting the cache each time.
Also, I'm not sure what percentage of new users land on Twitter pages, but they all have to download the JS before seeing anything.
Probably because the browser doesn't have to generate the HTML before it can render it?
And generally, from a user's perspective, you merge the steps "show site" and "show content" back into "show site with content".
So even if the server takes just as long to generate the HTML (and I don't think it does), the perceived speed is higher when the site loads once, complete, than when it loads, shows a shell, and loads again to fill in the rest.
edit: parasubvert was faster and said the same thing with fancier words ;)
Why would you assume that client-side HTML generation is slower than server-side generation?
Also, it's perfectly possible with client-side rendering to show a blank page until you have all the data. Would it be perceived as being faster? Well, you can't really say that until you test it, can you?
Simple: there are fewer synchronous/blocking steps involved. The server can get the data, render the HTML, and stream it straight to the browser, so the user has the content the instant the response arrives. A client-side approach has to wait for the JS to load (even if it's cached), be interpreted, and then run the "onload" code that renders HTML through some template — plus whatever time it takes to fetch extra data over the wire. As a case study, take a look at GitHub's source: the time to render the page is typically in the 100ms range, and it takes another ~2s or so for the page to fully initialize (according to Firebug). If GitHub rendered everything on the client, I can pretty much guarantee it would not feel as snappy.
On a side note, it almost seems like you're trolling - given that your own site seems to render content on the server. ;)
You talk about a "test" as if there weren't an article linked directly above in which one of the largest web properties in the world explicitly states that server-side page generation is much faster, based on real-world results.
> Why would you assume that client-side HTML generation is slower than server-side generation?
Some possible reasons:
* server hardware is assumed to be faster than my smartphone
* servers can share caches
* HTTP conditional GETs can now apply to the rendered content (as opposed to templates and data, which would require a re-render client-side)