Hacker News
ROCA: Resource-oriented Client Architecture – an alternative to SPAs (roca-style.org)
97 points by pvorb on Feb 9, 2020 | hide | past | favorite | 21 comments


As a UI architect I've worked quite a bit on both SPAs and ROCA-style UIs. Like everything, there are pros and cons to both.

As a general rule of thumb I find SPAs deliver much faster, but ROCA-style solutions are much more predictable. That is, getting a modest MVP out the door with an SPA tends to be much faster, but SPAs almost always reach a critical mass of functionality at which point they become hard to maintain. ROCA-style apps don't suffer from this as much.

I don't think this experience is just down to the architecture however. I just think it asks a lot more of a software engineer to structure and maintain an SPA well. Ultimately an SPA can be a more optimal architecture for many applications because the client-side environment is an increasingly powerful VM in its own right. However, I rarely see the engineering discipline required to do it right.

I tend to prefer ROCA-style because it fits better with the web's distributed integration architecture, but in practice I find building an SPA with a resource-oriented mindset is usually a good compromise.


Reading this was lovely. I strive for the majority of these goals. Unless it's an incredibly simple "one-task" sorta deal I avoid the SPA approach like the plague.


This seems a very nostalgic take.

So long as websites pretend they are documents, they will remain bloated poorly performing synchronous applications.

Accept that websites are distributed applications and deliver great experiences.


> So long as websites pretend they are documents, they will remain bloated poorly performing synchronous applications.

I disagree.

Most websites (99%) are just barely interactive documents with buttons.

The bloat we suffer today is because we ship them and design them as full interactive apps when it's absolutely not needed.

Why did this happen? Because the current tooling makes it easy, not because it is the right design.

The "distributed application" thing is just the bullshit fashion of the moment. Very few applications require the current level of complexity compared to the features they deliver.

They are like they are because "create-react-app" and "npm install world" is easy.


I agree with you. However, making a great single-page app takes a lot of experience and work, a bit less so with current frameworks, but still. A lot of in-house SPAs are truly terrible, with bad performance and broken navigation, such that you wish the developers hadn't bothered. Multi-page apps might be a bit harder to screw up as badly.

So maybe the user experience hierarchy goes something like: great SPA > MPA > average SPA?

This would mean there’s still a place for MPAs, at least with certain budget constraints.


> “However, making a great single-page app takes a lot of experience and work, a bit less so with current frameworks, but still.”

So what? Get experience. The days of a plucky developer throwing together a simple page over a weekend while reading a programming book, and having that be good enough for millions of people around the world, are over.

Users will require richer and deeper experiences, and that will breed demand for developers who actually know what they are doing and possess vast experience to draw insights from.


It depends what the app is, surely? The problem is too many things that should just be displaying a single document instead produce a bloated SPA with worse usability. Medium is probably the archetype of this.


And I think all apps and websites should work like that.

Currently the web doesn't work without JavaScript, which I find very alarming.


I love to see this, and of course the early shoutout for pjax.

Another library worth checking out if you're interested in this type of site is intercooler.js (https://intercoolerjs.org/). It's all about using minimal AJAX to load server-rendered partials.


Adding AJAX (and usability) to ROCA-style applications is exactly what intercooler was designed to do:

http://intercoolerjs.org

No application-land JavaScript code, just HTML-extending attributes that fill in the functionality missing from vanilla HTML.



Reading through the site feels like a breath of fresh air in today's convoluted web development. This is how it should have been right from the beginning.


This _is_ how it was for quite a while. Check out Rails as of about Rails 3 or so. Unobtrusive JS, serving up partial HTML, etc.

There's a set of interactions for which this works incredibly well and a set where it falls short.

If your UX and data model support the notion of users interacting with a single resource at a time it's really effective. If you have lots of data that has effects on other data you might be displaying, or you need lots of fine grained interactions, a full client side app will probably provide the best experience.
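
A rough sketch of that partial-serving pattern, assuming a hypothetical `renderArticle` helper (the header check and markup are illustrative, not from any particular framework):

```javascript
// The server renders one resource; whether it wraps the partial in a full
// page depends on whether the request came from the unobtrusive-JS layer.
function renderArticle(article, headers) {
  const partial = `<article><h1>${article.title}</h1><p>${article.body}</p></article>`;
  if (headers["x-requested-with"] === "XMLHttpRequest") {
    // AJAX request: return the bare partial for the client to swap in.
    return partial;
  }
  // Plain navigation: return a complete document around the same partial.
  return `<html><body><nav>…</nav>${partial}</body></html>`;
}
```

Because both paths share the same partial, a user with JavaScript disabled still gets the identical content, just with full page loads.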


I don’t think a server should serve the same content in XML and JSON.

1. Why cater to this preference? You must ultimately force the consumer to do certain things to consume your data. Where do you stop? If some prefer the data in Excel 97, should you cater to that as well?

2. Do you want the liability of the two formats possibly drifting out of sync? Serve it one way and it will not happen.


Most of our REST endpoints can generate HTML, JSON and some of them text/csv. It's been working really well for us since 2016.
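
The dispatch can be as small as a map from media type to renderer. A minimal sketch (the names and the CSV/HTML renderers are illustrative, not our actual code):

```javascript
// One resource, several representations: each renderer works from the same
// rows, so adding a format never touches the data-fetching side.
const RENDERERS = {
  "application/json": rows => JSON.stringify(rows),
  "text/csv": rows =>
    [Object.keys(rows[0]).join(",")]
      .concat(rows.map(r => Object.values(r).join(",")))
      .join("\n"),
  "text/html": rows =>
    "<table>" +
    rows
      .map(r => "<tr>" + Object.values(r).map(v => `<td>${v}</td>`).join("") + "</tr>")
      .join("") +
    "</table>",
};

// Pick the renderer for the request's Accept header, defaulting to HTML.
function represent(accept, rows) {
  const renderer = RENDERERS[accept] || RENDERERS["text/html"];
  return renderer(rows);
}
```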


Virtually every editor, programming language, database system, etc., can deal with json and/or xml. So it's not much of a constraint. And there's nothing stopping you from providing additional streams if they're desired for any reason.
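
And when both representations are derived from the same in-memory object, keeping them synchronized is essentially free. A hand-rolled sketch (`toXml` here is an illustration, not a real library API):

```javascript
// Single source of truth: both serializers read the same object, so the
// JSON and XML outputs cannot drift apart.
function toJson(user) {
  return JSON.stringify(user);
}

function toXml(user) {
  const fields = Object.entries(user)
    .map(([key, value]) => `<${key}>${value}</${key}>`)
    .join("");
  return `<user>${fields}</user>`;
}
```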


I’ve had this discussion as well, and I fully agree. In my experience, this kind of thing exists purely to sneak in the more programmer friendly JSON in corporate environments that have otherwise “standardized on XML”. You wouldn’t get away with pure JSON, but you can argue for a hybrid approach.


Yes. And version 1 just provides JSON. After a while, the client agrees to forget about XML (in favour of other functionality) and the case is closed.


Is HTTP Basic over SSL a realistic solution in most cases?


No, it’s not. That’s why we wrote “because of the limits of browser-native authentication (e.g. no logout, no styling), form-based authentication in conjunction with cookies can be used”. In practice, that’s the only option for public applications.
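
For reference, HTTP Basic is just base64-encoded credentials in a header, which is part of why there's no logout: the browser keeps resending it for the rest of the session. A sketch of what goes on the wire:

```javascript
// HTTP Basic: "user:password" base64-encoded into a request header.
// Base64 is an encoding, not encryption, hence the need for SSL/TLS.
function basicAuthHeader(user, password) {
  const token = Buffer.from(`${user}:${password}`).toString("base64");
  return `Authorization: Basic ${token}`;
}
```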


I would question why that is first in the list, then, as it is confusing and at odds with the rest of the document, which otherwise aims for clarity.

Would something like this be better?

"Authenticated communication via a browser relies on form-based authentication, possibly in conjunction with cookies. If cookies are used, they should include all of the state needed for the server to process them. All other forms of authenticated communication should rely on HTTP Basic or Digest Authentication, typically combined with SSL, possibly with client certificates."

Unless HTTP Basic / Digest are also unsuited for public APIs, in which case should they not be removed and some other recommendation be made?



