10.7 Lion allows multi-user remote computing (9to5mac.com)
85 points by solipsist on Feb 27, 2011 | hide | past | favorite | 94 comments


It seems like Apple have discovered Terminal Services, or the basic networking functionality in X11.

Hopefully it'll be much shinier than X11 and less restrictive than Remote Desktop Protocol.


I imagine it uses Apple Remote Desktop, which is Apple's own version of RDP. I don't know much about it but it works pretty damn well for me.

https://secure.wikimedia.org/wikipedia/en/wiki/Apple_remote_...


It's actually using a VNC-based application, and uses the VNC protocol. I can "remote desktop" into my desktop using a simple VNC app on my iPad.

It seems that they've provided the option of creating a new VNC server/desktop process when connecting in, rather than just attaching to the existing and current desktop.

Very cool.


For the record, this has been a feature of VNC/X forever; in fact, it's harder to run VNC on an extant X session than it is to just start a new one specifically for VNC.

It's just a little annoying when Apple finally integrates tech that's 20+ years old and gets heralded as being "ahead of the curve" or "very cool" for doing it.


Get pissy much? I consider it "very cool" in the sense that this is a cool feature that I finally get to use in the OS I prefer to work in.

I don't care if it's been around for decades or done a thousand times before on other OS's, it's being done by them now, and I find it useful.

I'd rather see them do stuff like this and be praised for it than do stuff like this and get shit on by people with the tiresome "it's about time" or "Linux had this a zillion years ago and did it better" crowd.


Actually, I won't have the same attitude... I'll be indifferent, and not comment at all, as I don't really care what Windows does. Even though I use it almost daily for various client work.

When it comes right down to it, I don't buy into this whole OS religion thing. I've had more than a passing exposure to quite a few of them, and they all have/had their place.

Case in point, I started my career on IBM 3081's and PDP-11's. I even have a full rack of gear in my home office that includes a bunch of boxes with various flavours of Linux installed, a Tadpole laptop running Solaris, an operational NeXT Cube, SGI and HP boxes running their Unix variants, a cluster of Sun Netras, and on and on. I've also got a few different laptops with Windows and Linux installed. And a couple of MacBook Pro's.

They all have good features, and bad, but none of them have everything.

Do I care what someone else uses? Nope. Do I care if they gush over it and think it's the coolest thing ever? Nope. Do I feel some overwhelming urge to "educate" them on why their choices are "wrong"? Nope.

I just know what works for me, and am happy to discuss things with like-minded individuals. I'm not a fanboy, and I haven't drunk the Kool-Aid. I also don't think everyone who owns a Mac is like that either.

Call me crazy, I guess.


Tadpole laptop? Really? Which one? I did the port of Slowaris to several of the SPARCbooks back when I worked there.

(Started my career on a DEC-10, followed by a Vax 11/780, a couple 11/750s and a smattering of PDP-11s (11/44, 11/70))

Had Linux (and NeXTstep) running on an early Tadpole P1000 (100MHz Pentium laptop, it was the sh*t when it came out.)

Mostly all Macs now.


It's an UltraBookIIe. Needed it when I provisioned 6 racks of Sun gear in an off-shore co-lo for a gaming company, and it was the easiest way to auto install/configure all the OS's and software into bare metal gear. Wasn't the fastest thing on the planet, but it worked really well.


Oh. That's the "other" Tadpole (previously RDI, who acquired Tadpole Technology, Plc in order to trade its stock.)


You're damn straight I'm calling you crazy, running Solaris on a Tadpole Sparcbook - If you're running a Sparcbook or even a Sparcbook 2, Solaris is just too slow to reasonably run - choose life, choose SunOS 4.1.3.


I personally did the ports of 4.1.3 and 4.1.4 to both the SPARCbook and SPARCbook 2.


Dude, if you're ever in the UK I owe you a beer. Heck, many beers.

To be fair, the main problems I had with Solaris on SPARCbook 2 were more to do with Java being too slow and Sun's insistence on making everything Java-i-fied around the time I was using it.


I'm similarly OS-agnostic in terms of performance and usually recommend Apple gear for people with no preferences. But it still bugs me when people bray about 'revolutionary Apple products' that are really just Apple joining the party. This 'amazing new thing' gets shoved in my face that has been around for ages and I'm supposed to mollify them with vapid oohs and ahs.

Or if someone asks me why I've got an android phone and not an iphone and I casually respond that I don't like the way Apple devices get locked to iTunes, I get groans and eye-rolls. Fuck that shit, you ask me a question and then you socially punish me because I prefer to go a different way, damn right I get pissed off at the 'Apple Cult'.

Fanboys of all stripes are painful as hell, and the Apple Fanboy Cult is currently ascendant, that's all.


OK. As long as you have the same attitude when Microsoft implements the Dock for Windows users, and everyone says, "Microsoft is really cool, this is a great new feature! Way to go Microsoft", without any acknowledgment of the origin.

It's just the whole culture around Mac. Apple gets lavished with praise and rewarded, heralded as "clairvoyant" (the term used in a big thread yesterday about iPad) and similar silly pronouncements, seemingly no matter what they do -- whether the idea and existing implementations are older than the average age of Apple's workforce or not. People go around with Apple bumper stickers and define themselves by their association with the Apple brand. They hire people and choose friends based on whether the candidates use a Mac. And, when Apple finally implements an old idea, they are super awesome for getting around to it 20 years after almost everyone else.

It's just a bit annoying for those of us not infected, I guess.

(I have a MacBook Pro, for the record. I run Arch Linux on it. This has caused several Mac fanboys much distress.)


> People go around with Apple bumper stickers and define themselves by their association with the Apple brand.

Sometimes, it seems like just as many people define themselves by their lack of association with the Apple brand..


I'm sure there are. I'm not one of these, generally. I carry around a MBP and consequently display the Apple logo everywhere I use that. I just don't buy into the dogma or the reality distortion field that makes some hail an old style of VNC implementation as innovative and awesome.


I think it all boils down to levels of tech-savvyness. My mother wouldn't know where to start with a VNC what-you-ma-call-it, but clicking a button that says "See your desktop and files" makes Apple awesome.

Besides, I think there is something valuable in making tech simple. Generally, the applause for Apple is misplaced under "look at feature/concept X" when it should be placed under "feature/concept X 50% simpler".


Execution matters.


Well first, pretty much every tech is 20+ years old before it is packaged for the general public. Remember Douglas Engelbart?

Second, this has been Apple's post-NeXT philosophy, and it is much better than the Not Invented Here philosophy that preceded it. Many OS X features are simply a better user interface for some Unix feature or other.

Apple should be applauded for this. Too much great technology has failed to make a difference in the lives of Muggles because of the incomprehensible incantations required to invoke it.


Yes, but people are reluctant to give credit for feature ideas in implementations that they feel to be inadequate. The Linux GUI has traditionally been one of the biggest objections people have to the whole operating system (though it seems to be less so lately), so they don't get credit for most of the features of their GUI. Similarly, Apple rarely gets credit for introducing digital photography to the consumer space because few people knew or cared about their camera product.

To offer an alternate perspective: This will likely be a new thing to lots of people who aren't very technologically literate.


It's just a cop-out. The GUI on Linux has been good for a long time -- certainly comparable to Windows, which in Vista and 7 ripped off KDE4 much more directly than it did OS X. People just make excuses for things. Remmina and Vinagre both make it easy to connect to "Jeff on Box 1", "John on Box 1", "WALMART on WALMART Box". You just have to add the IP address and give it a name, and you have to have an IP address to connect on OS X too.


"You just have to add the IP address and give it a name, and you have to have an IP address to connect on OS X too."

I will be curious to see if you actually need to know your IP address. This seems like exactly the kind of thing Apple would abstract away for the user, which is one reason why many people are willing to pay a premium for OS X over Linux.


If you're on the same LAN or connecting to another computer using the same MobileMe account you don't need to find the IP. Both cases use Bonjour/zeroconf.


You generally have to. Another guy replied with a few circumstances where zeroconf will broadcast the availability, and vinagre at least supports zeroconf so it would work fine with that. Outside of a LAN or MobileMe tunnel, you have to have an IP address or DNS record. Most people aren't going to set up dyndns, so it'll be an IP address.


So I am not the only one who when they first used Windows 7 felt like it was a weird copy of their KDE 4 box at home?


But why? It may not be ahead of the curve but it is cool, all right.


It doesn't matter how old the tech is. Apple is bringing it to the masses and nobody has done that before. Why? As you said it's 20 years old.


> It doesn't matter how old the tech is. Apple is bringing it to the masses and nobody has done that before.

Nobody has done that before, except for a company with about 90% of the Desktop market share, first party remote desktop services, a third party market with services to 'go to your pc', implementations of some sort of VNC-type service and goodness knows what else to secure it all.


Puh-leese. Apple is "bringing it to the masses", where I guess masses qualifies as < 10% of the desktop market, only because Apple has larger marketshare than anyone who has done it before.

This stuff has been part of every Linux distribution for years. People have even written shiny interfaces for it, see remmina or vinagre or any other remote desktop management suite (and yes, these ship with many distributions by default).

That OS X has more marketshare than Linux shouldn't automatically make OS X more "cool" on its own merits; after all, Windows has 8-9x as much marketshare as OS X, but I don't see a bunch of threads pop up lavishing MS with praise every time they integrate a new feature into Windows.

I don't think Apple should be hailed as super cool and great by technical people just for finally getting around to doing something that has been done thousands of times before for decades. I can understand an ignorant person thinking a feature that's not found in Windows makes Apple "cool", but it's silly for anyone browsing Hacker News to ascribe a bunch of street cred to Apple for this.

It's a nice feature and I'm glad to see it go in, it's nice that Apple is adding nice features, but I'm not going to sit here and pretend like Apple is the coolest kid on the block just because they finally got around to writing a semi-normal implementation of VNC, just as my mind wasn't blown when Apple finally integrated workspaces after 15-20+ years of availability on all other respectable WMs. It's more of an "about time" than an "Apple is consummately cool".


I appreciate that the pieces have been around for a while but the fact that you think tools such as remmina[1] are appropriate for the masses indicates that you don't quite get it. One has to type in a "server" to connect to a remote machine in remmina, and at that point the average user is already lost.

If you think I'm giving Apple "street cred", whatever that means, you've misunderstood me. If you think I'm hailing Apple as "the coolest kid on the block", you may not be looking at this situation objectively. Please don't put words into my mouth.

By bringing something to the masses I mean making it accessible to less technical users. It's available and usable by anyone, just because OS X doesn't have 80% market doesn't make it any less available. I really feel that you have misunderstood me on many levels. It's apparent that you're tired of people hailing Apple for no good reason but I have not done so, so don't take it out on me. I'm not a blind Apple fanatic.

[1] http://remmina.sourceforge.net/screenshots.shtml


not "20+" years old, either.

The evidence shows that VNC isn't from much before 1997 or 1998.


X11 is much older.


IIRC, one of the negatives about Ubuntu choosing to move away from X11 towards Unity is that they will be losing the network-friendliness for remote screens for which X11 is specifically designed.

X Windows was designed around running multiple thin-clients as a primary purpose. Wiki says X started in 1984, with X11 being released in 1987 - it'll be 24 years old this year.


It's not "X Windows", either. It's either "X" or "The X Window System".

I ran X9 on a Decstation in 1985 or so.


heh, sorry, I always unintentionally do that, and it always gets up the nose of the old guard :)


I think you mean Wayland, not Unity, and I don't think it's mainly Ubuntu pushing it. There probably are some issues with Unity and network transparency due to dbus use for shell components (and I think the universal menu) but that's a separate issue.


My mistake, swap in Wayland. Most of the moaning I've heard on the web about it is from Ubuntu - probably because I use debian and watch those crowds more.


You know, you just gave me a brainwave. With the server edition being integrated with the client, Apple may be integrating ALL their server products - ARD included. Wouldn't that be cool.

Now we just need a nice 1U server again. Now what do they do with xsan and final cut server? I can see FCS being integrated with final cut studio. Maybe even in a distributed fashion (owww git for studios!) But I'm not sure what they could do with xsan.

Anyways, we shall see what happens, so far I'm liking the look of things.


it would be nice to get something equivalent to X11 forwarding that would work with all native Mac apps.


Hopefully it'll be much shinier than X11 and less restrictive than Remote Desktop Protocol.

Out of curiosity... In what ways do you find the Remote Desktop Protocol restrictive? Does the protocol itself limit what can be done? From an end-user point of view it seems very flexible and powerful, while lightweight and responsive.


Vacri hit the nail on the head - the locking out users on the desktop combined with the licensing restrictions on the server make it a tad too restrictive out of the box.

Having said that, when properly configured and licenced Terminal Services is freaking awesome.


Ah right. I thought you meant there were some technical/architectural limitations with the protocol itself and didn't even think about limitations added on top of it via licensing. Thanks for clearing it up.


I'm not an expert in remote protocols, but I also find it snappy. Perhaps the point that for non-server versions it locks out the local console while it's running is the restrictive part?


What do you mean? X11 obviously copied Apple's innovation.

EDIT: every time there's talk about Android versus iOS there are people complaining about how Google stole interface elements from iOS. This is in spite of the fact that no software is revolutionary and every idea is based on prior art; even the interface on iOS.

I am just making a joke here based on repeated evidence that even Apple (indeed an innovative software and hardware company) copies / gets inspiration from others and has done so repeatedly.

Sorry,


Snark is frowned upon on Hacker News. It's generally low in signal and often drags the tone of the conversation down, so it's part of HN culture to downvote snarky comments.


I'm fairly sure that current HN culture is to downvote snark that you disagree with.


I've noticed that with comments of my own - where I've made a sarcastic comment, then said 'kidding aside now, real comment' and been heavily downmodded. Normally this wouldn't rankle, but in the same threads are far more sarcastic comments with zero content happily getting modded up, sometimes quite strongly.

Another time I was arguing against a guy about language, and despite his position being offered in good faith and with a degree of detail in his argument, he was modded into the negs.

Comments do seem to do better here when appealing to emotion and popularity rather than internal merit.



I'd like to see a remote login implementation for OSX that allows you to login remotely, and also adjusts the screen size to match that of the actual physical device you're on.

The existing implementation of screen sharing is frustrating for me to use because I'm often remoting in from a 15" Macbook to a Macpro tower with two heads @ 1920x1200 each. So the options are to 1) scale down the display (slow, fuzzy) or 2) scroll all over the place.

An ideal solution would be to let me log into a virtual display that matches the size of my physical remote terminal. In other words, make the server think it has a head the same size as the client.


I'm glad to hear that they've caught up to the original UNIX. :)


I would rather say that it is more OS X returning to its roots, where NeXTStep/OpenStep used to have such capability when all the rendering was done with the built-in PostScript server. This feature disappeared from OS X, so it is more like a welcome back with a user-friendly UI. Back then you usually had to use some terminal command line to make it happen.


Amazing that this is the only comment so far that points out this wasn't even Not-Invented-Here Syndrome, because their own OS already had this feature (Display PostScript remoting) twenty years ago. At least until some architect marking their territory was allowed to wilfully break it and replace it with ... nothing.


It's a tough row to hoe, having NIH Syndrome.


My job wants to use a third party product to do the exact same thing.

http://www.aquaconnect.net/mac-terminal-server.php

This is a great addition to the OS. Imagine your house having one [desk|lap]top and a few tablets or phones. Hop on VNC on your docked tablet while someone is using the computer and you're as good as using the desktop. I see this as a move to push both OS X and the i[pad|phone] as viable business devices.

This is possible with Windows, right? I vaguely remember using remote desktop a few jobs ago, but I don't remember if it was one account logged in at a time.


The server versions of Windows allow that, it's called Remote Desktop Services (or Terminal Services pre-2008). Way better than using VNC, probably because it hooks all the window control drawing on the server and does it on the client instead, rather than sending over raw bitmaps for buttons/textboxes etc.


This is gonna make my Mac Pro way cooler as a central computing "mainframe" for my house...


This is going to be absolutely perfect for educational use. Imagine a university computer lab that lets you remotely login from home after the lab has closed to access all the expensive, specialized software the school has already purchased. I know my university was trying to set something like this up, but couldn't find any user-friendly way of doing this.


So, imagine a university computer lab that installed SSH and VNC?


Imagine explaining how to set this up and run it to an entire incoming class of freshmen?

Much of OS X's value comes from taking Unixy technologies and making them much easier to use.


Remote desktop sharing isn't that complicated of a concept. I use it on my Mac all the time - ssh to a server, connect VNC to localhost, you have a desktop. Is that supposed to be an enticement or a drawback? It seems straightforward to me. Apple has stuff for VNC built in, too.
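The "ssh to a server, connect VNC to localhost" recipe can be sketched as a small helper that builds the OpenSSH port-forwarding command. This is just an illustration: the user and host names are placeholders, and the 5900 + display-number port convention is the standard VNC one.

```python
# Sketch of the "ssh, then point VNC at localhost" recipe.
# "alice" and "work-mac.example.com" are made-up placeholders.

def vnc_tunnel_cmd(user: str, host: str, display: int = 1) -> list:
    """Build an ssh command that forwards VNC display :display over the tunnel."""
    port = 5900 + display  # conventional TCP port for VNC display :N
    return ["ssh", "-N", "-L", f"{port}:localhost:{port}", f"{user}@{host}"]

print(" ".join(vnc_tunnel_cmd("alice", "work-mac.example.com")))
# After the tunnel is up, the VNC client connects to localhost:5901.
```

Once the tunnel is running, the remote desktop traffic rides over SSH, so nothing but the SSH port needs to be exposed.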


Except only one user can control the computer at a time with VNC. The functionality described here means 20 students could use the same computer in theory.


>Except only one user can control the computer at a time with VNC.

Not true. You can have multiple VNC server/desktop sessions running on the same box, each listening on a different port, each with its own user controlling it.
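A rough illustration of that "one port per desktop" layout (the user names and display numbers here are made up; 5900 + N is the conventional port for display :N):

```python
# Each vncserver instance owns one display; by convention display :N
# listens on TCP port 5900 + N, so concurrent sessions never collide.

def vnc_port(display: int) -> int:
    """Map a VNC display number to its conventional TCP port."""
    return 5900 + display

# Hypothetical sessions on one box, one per user:
sessions = {"alice": 1, "bob": 2, "carol": 3}
for user, display in sorted(sessions.items()):
    print(f"{user}: display :{display} -> port {vnc_port(display)}")
```

Each of those ports serves an entirely separate desktop, which is the distinction the parent comment is drawing.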


Really? I don't know much about multi-user VNC, but my understanding was that multiple users could control a machine from different ports, but they would still be sharing a single screen. That's definitely useful to know that multiple screens are possible too. I'll have to look into it more.


Don't think screens, think desktops. Each vncserver that launches (and binds to its own unique port) has its own separate desktop. Each runs its own distinct window manager application, and is not tied to a physical monitor or screen. Usually the default is TWM (minimalist window manager that takes up minimal resources), but you can also run GNOME or a host of other window managers if you want.

In OS X's case, it's running the Finder.

If you log in as a user on Linux, and launch vncserver as that user, then that VNC session and desktop will be of that user.

One of the huge advantages is that the state is also maintained on the server. I've used it a LOT to install various Oracle software over the years. Fire up a VNC server session as an Oracle user, connect into it at the client site or office and start the installation. You can then disconnect, go home, do whatever, and connect in again remotely from another location (tunnelled over SSH usually), and the install will still be there running. It might have paused waiting for a dialogue to be answered, but it hasn't killed the install like logging out of the main desktop would have done.

You can also set the colour depth to be used by the client, thereby increasing performance over crappy SSH connections. 8-bit colour takes WAY less data than 32-bit, especially when installers have stupid meaningless animated gifs that run all the time.
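Back-of-the-envelope numbers for the colour-depth point above: the size of a single raw, uncompressed full-screen update. Real VNC encodings compress far better than this, so only the relative gap between depths is meaningful here.

```python
# Raw framebuffer size for one full-screen update at a given colour depth.
# Ignores VNC encodings/compression entirely; the 8-bit vs 32-bit ratio
# is the point, not the absolute numbers.

def raw_frame_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    return width * height * bits_per_pixel // 8

w, h = 1024, 768
for bpp in (8, 32):
    mib = raw_frame_bytes(w, h, bpp) / (1024 * 1024)
    print(f"{bpp:2d}-bit: {mib:.2f} MiB per full update")
# 32-bit pushes 4x the data of 8-bit for the same screen.
```

On a slow SSH link, that 4x difference is exactly why dropping the client to 8-bit colour makes animated installers bearable.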


That's the case on Windows... not sure about Mac OS. It's definitely not the case with X11 on Linux; I run multiple VNC displays all the time in addition to my normal desktop display. You can do several separate displays/desktops per user, even. The only limit is the host's resources.


As code_duck says, on MS Windows VNC just lets you see the existing single-user desktop. It's a different beast to the unix VNC.

On X Windows (X11) VNC runs 'more natively', creating a new desktop for you when you remote in - each new connection creates a new desktop instance for the user logging in. It's actually more configuration work to make it view a pre-existing desktop session :)


Remember that X was developed at MIT to run MIT's labs, and X was originally meant for thin clients that accessed programs in much the same way described. This stuff has been possible for a long time. VNC is actually much easier to use with its own X server in most instances than running on a live one.


I doubt there are a bunch of user accounts on a uni computer lab Mac, so this wouldn't be much different than regular screen sharing. It's also still a mess to choose which computer to try, doubly so if you're on an outside network.


Mac OS X Server had (has?) decent support for proper accounts, with the client machines essentially just diskless terminals that boot from the server, and with all accounts on the server - called NetBoot. So not like screen sharing at all. This worked pretty well in a lab setup, but didn't support remote logins - this sounds much better.


The OP was talking about doing this for expensive university software... I'm going to go out on a limb here and state that it's just not going to work well when 100 students try and spin up instances of AutoCAD or Mathematica on the same machine.


I doubt there are a bunch of user accounts on a uni computer lab Mac

But they could always just throw a beefy machine in a closet and have it connect to the same network directory service that the lab machines grab their user accounts from.

My old university did the same thing, but did so nearly a decade ago with a Windows terminal server.


Each computer on campus is connected to a user directory, so students can login to any computer with their individual student account. I'm assuming that Lion's multiple login feature will work with networked accounts.


Perhaps, but it's not going to work well to have 100 students logged into the same computer. I think the home example given in the article is a much more likely outcome than a university setting.


That's true, it may have to be limited to 10-20 simultaneous logins at a time (or whatever the machine can reasonably handle), but that is still better than 0. Considering my university has a relatively small student body, this would be definitely be a sufficient solution for us.


Anything has to be better than Microsoft's "Call Our Account Manager To Have Them Explain Appropriate Volume Licensing - Oh You Bought Something Slightly Weird Let Us Pass You Through To The Next Guy Up Who Might Still Fob You Off" licensing methods.


RDP on windows. If they can figure out how to use expensive specialized software, they can figure out Windows Remote Desktop.

(My university does this... pretty much so engineering students can use matlab at home... badly winces)


Er, ssh?

I don't see what the big deal is. I've been able to remotely ssh into my mac for years.


now imagine this at the gui level. imagine being on your mac, pick up your ipad, move to the next room, and swipe and have access to the same apps currently running. jut look at what apples been putting out the last few years. look at lions big features. full screen mode doesnt sound impressive. but that would make those apps much easier to use on the ipad.

im nut saying this is all new stuff, rather, its going to be packaged by apple, and that is going to mean something.


If you want to do this right now, check out Screens for iPad.


Yeah, I've seen that before. I'm not talking about just that. What I'm talking about is the full screen mode Apple is developing having support for turning an app into an iPad version with little effort. Basically, by following their API and implementing official full screen support, when the time comes to implement Mac-to-iPad support like I'm describing, instead of seeing the Mac's desktop, you're seeing the application on your iPad in an "iPadified" version of the app.


I just want to apologize for the awful spelling and punctuation. It was written on an iPad.

"im nut saying..." >_<


This is nothing special. Apple says Lion will include a server version of Mac OS X, and this is just a very basic function for any server OS. Although I haven't played with OS X Server before, I'm quite sure this feature has been there for some time.


The way I like to think of this feature is as a throwback to the days of "dumb" terminals that would connect to a server where you stored your files, account info, etc.

"Back to Unix".


Why not? With HTTP we went back to the future vis-a-vis 3270.


If I understood it correctly, this has been possible for years in OS X. You just need to be “fast user switched out” instead of logged out:

- Log in with user A
- Fast User Switch to user B
- Log in to user A with VNC

It works just fine.


a lot of the comments are fairly shortsighted. i called this some time ago, and it makes sense. consider how mobile our lives are right now. consider the wireless nature of everything. imagine having your computer be situated out of sight, and your screen is mobile. your ipad and iphone connect to the same central hub. the computer becomes an appliance at home. just look at lion and the elements of ios its pulling in. now imagine apple making it easy to do all of this.

airdrop, mobileme, app store, all of this are elements to a grand vision.


Yes, and especially with rumors of the iPhone 5 having NFC technology. Imagine being able to walk up to any Mac and just swipe your iPhone and all your settings, bookmarks, etc. will just transfer over and you will be able to use it, just like you were in front of your Mac at home.


Is it just me or are 99% of tech "news" and "innovation" just reinvention of stuff already found in UNIX, Lisp etc. decades ago?


It's not just you. Every new thing that's any good is usually a retake on something already thought about in the early days of computing. But some of the implementations back then were crude. With the reinvention there is usually a rethinking and polishing.


No need for the quotation marks. It’s news because you can’t do it with out-of-the-box OS X right now. It’s not innovation (the UI might be, I don’t know) but the article doesn’t claim it is.

Technology, however beautiful, often doesn’t matter if many people are unable to use it. It’s news when said technology might become more accessible in the future.


It's not just you, Henry Spencer reportedly said it (first?) in 1987.

"Those who don't understand UNIX are condemned to reinvent it, poorly."

http://en.wikipedia.org/wiki/Unix_philosophy#Quotes

http://en.wikipedia.org/wiki/Henry_Spencer#cite_note-2


He was just paraphrasing the bible:

"What has been will be again, what has been done will be done again; there is nothing new under the sun." -- Ecclesiastes 1:9


It more or less all started with Doug Engelbart giving the "mother of all Demos" in 1968. It's all been catch-up since that event.


It's not new. The way I see it is as in the film industry, it's a reboot. It's shinier, it may lack the depth and revelation of the original, but what it lacks in that area it makes up for in special effects and ease of consumption.

Consider Sendmail vs Exim, or Tron vs Tron:Legacy.

Both are essentially the same thing, but one is infinitely easier on the eye.



