'Li-fi 100 times faster than wi-fi' (bbc.co.uk)
74 points by grahamel on Nov 27, 2015 | 70 comments


That's nice, but my 5-year-old 802.11n WAP still has ~20x the bandwidth you can buy for any amount of money at my house. Faster WiFi is not particularly interesting for consumers unless they're also getting faster ISPs.


Has the entire world forgotten what a hard drive is? What LANs are? Good grief! You can transfer data on a network without sending it to Dropbox first! Faster local networks have tons of uses beyond just slaving them to someone's walled garden.


Funny story (missing the point I know), the Dropbox client has provisions to automatically detect other clients on your network and then sync your files to your other computers over LAN rather than through Dropbox servers (files are still synced to the servers of course, but they don't have to make a return trip to your other computers).


Dropbox sync over wifi has never worked for me. Perhaps it's because the two clients are running on different OS's (Win/OS X).


Except the UX of that is horrible. I'd have no idea how to set that up, and I'm sure it would take me at least a couple of hours...


Windows can automatically set up a "home group". OS X can use a network for Time Machine. In Linux as well, you can typically right click a folder and choose "share". You can use a Chromecast. You can use Plex or Kodi/XBMC. You can use AeroFS, PogoPlug, Owncloud, or any of a number of commercial NASes. You can live-stream games using Nvidia tools or Steam. You can use IP cameras. Etc. ad infinitum.

None of these have a UX that is appropriately described as "horrible".


- Time Machine is a background process so network bandwidth is not super important.

- Can you even Chromecast content that isn't from the internet? I have one and use it with Netflix all the time, but that's news to me.

- Typical living-room fare is only getting into XBMC or onto a hard drive/NAS if it's been BitTorrented. Streaming rules the day for me and for most people, it seems. (Not all of it is legal, of course, but people seem to vastly prefer illicit streaming sites to torrent trackers and clients.)

- OwnCloud is substantially more useful on a VPS so that you can access it from anywhere. Time Warner blocks incoming connections to its residential cable modems, last time I checked.

- IP cameras tend to be low resolution, low framerate, or both.

Of course local networks have their uses. What I contend is that very few people are actually constrained by the ~300Mbps of 802.11n.

Maybe the real application is business networking, so that remote desktop workstations could be practicable without wired networking.


> - Can you even Chromecast content that isn't from the internet? I have one and use it with Netflix all the time, but that's news to me.

You can Chromecast local content over LAN via Videostream[0] for Google Chromecast. You can even control the Videostream host via mobile app[1].

It's the next best thing to using Kodi[2] (formerly XBMC) over HDMI with a wireless controller.

> Typical living-room fare is only getting into XBMC or onto a hard drive/NAS if it's been BitTorrented.

On a weak ISP, breaking synchronicity lets you stream content around the house at higher quality. E.g., your ISP only supports ~720p streaming speeds, so you download 1080p+ content overnight / while at work, then stream around the house later.

[0] https://chrome.google.com/webstore/detail/videostream-for-go...

[1] https://play.google.com/store/apps/details?id=com.videostrea...

[2] http://kodi.tv/


It'd be a bit strange for tab streaming to go over the internet, and there certainly are apps that stream local videos. Maybe they do go over the internet, but I wouldn't just assume it.


>You can Chromecast local content via Videostream[0] for Google Chromecast. You can even control the Videostream host via mobile app[1].

The issue with Chromecast is that you still need an internet connection to get the app onto the Chromecast to stream the video to it... I was using this and it was awesome! Then things happened and I was stuck on a shaped 256kbps connection, and Videostream stopped working reliably. I switched to Kodi on the Remix Mini, which works well, though not as well as Videostream did...


Every time a home group came up in Windows I closed the window immediately, because I had no idea what it was about. I imagined it was for some kind of file sharing, but honestly it only had checkboxes for the default Windows documents folders. Neither I nor anyone I know uses those for storing files. Honestly, I don't know why, though.


Samba/CIFS is infinitely simpler than setting up Dropbox. But people have been trained on proprietary solutions, thanks to the iPod, so they find those simpler.


Unfortunately, when I tried to set up a local synchronization solution I couldn't find any simple guides without already knowing the names of the protocols. Call me a filthy plebeian, but by contrast all of my friends already knew what Dropbox was.


Well no, there's no way to sync files simply, but accessing files is quite simple.

Regardless, even Dropbox uses local network bandwidth when syncing on two computers on the same network.


Wi-Fi is not only about Internet. How about in-home use?

I just backed up my laptop's HDD to a desktop machine the other day. It took about 10 hours. It didn't bother me, since I slept through most of that time anyway, but I'd still enjoy it being faster. I think I've also hit bandwidth constraints when streaming high-definition media over the LAN.


Hard drives and protocol overhead are to blame. Gigabit-class Wi-Fi (802.11n/ac) is common, and Ethernet is always an option, but in either case the network won't be the bottleneck.


In the bigger cities in my country, there are gigabit fiber connections available to home consumers at affordable rates. And my 802.11n can't saturate that.

Coverage is steadily increasing, too, so WiFi or 'LiFi' technologies like this will become more and more important.


Every couple of years I hear about this. Then it gets forgotten and re-invented, and each time it never manifests into a real product. That reminds me, I think we're about due for another rehash of the "Voyager leaves the solar system" article, anyone? Or another segment on the local news where they say "Scientists find [wine|beer|cheese|coffee|nuts|fat|red meat] [good|bad|healthy] for you".


Don't forget that we're overdue for another miracle development in batteries, as well as the next iteration of sprayed-glass coating that turns everything waterproof and bacteria-free.


And solar cells, and some vaccine for dental cavities.


And the moon doing some "once in a millennium" thing.


Like spontaneously disintegrating?

(Seveneves is a good book, I highly recommend it.)


This is one of my biggest pet peeves. Every week on the news you hear about a miracle graphene battery that is super thin and has 500 times the capacity. The first time I was excited, the second time disappointed, and every time since I've just sighed. I'll believe it once it's hit the market. I don't understand why the news reports experimental research just because it's novel. Of course the researchers play along, because it makes their work seem important, stroking their egos and helping them get future funding.


All I can think now is "Will a new battery the size of a Rolodex be able to charge your iPhone 400 times and power your electric car from Folkestone to Stoke-on-Trent?"

And "New solar panel technology powers 30 middle Atlantic households or 400 French sheds."


> I don't understand why the news reports experimental research just because it's novel.

Ad impressions.


Any time I see the word 'study' I immediately skip to the next link. I don't believe anything from a 'study' anymore.


I have seen this before too, and I ask myself what impact it has on the lifetime of the lamps.

Others have made me think about other issues with this, like how the lamps will get their connection. Someone mentioned through the power grid (powerline networking), but then you're limited to the bandwidth of that. How much infrastructure needs to change to make this li-fi happen?

edit: spelling error; also, I'm thinking only about the impact on, for example, an office installing this. At home it's too expensive for what it gives.


There was an IR physical layer defined for 802.11 last century, but it didn't go anywhere. http://dl.acm.org/citation.cfm?id=2289301


Interesting. I just remembered that about 10-15 years ago I had a laptop with "infrared" support and never quite knew what it was for. But one day I noticed that the HP printer in the lab where I was working had an IR window, and after a quick search on the net, sure enough I could print a few pages from my laptop using IR. That's all I ever did with it, and I'd forgotten about it since.


Apple products used to (pre-2012) come with an infrared remote control. You could use the port for anything. I used mine to record the secret codes that the TV repair guy was punching into my set.


HP had a whole line of infrared networking products in the 1990s. Little reddish domes on the ceilings were the access points. They never really caught on, but worked OK.


I remember a lot of old phones had IR support


According to the article IR is slow.


While the usable bandwidth of an amplitude-modulated IR carrier is obviously smaller than that of visible light, it does not matter in any meaningful way, because it is still significantly larger than anything that could be processed by today's electronics.

There are different protocols specified for IR transmissions:

- IrDA - master-to-multiple-slaves, directly amplitude-modulated IR carrying RZ-coded 8N1 asynchronous serial port frames (which are directly supported by reasonably modern UARTs) with an HDLC-like protocol suite on top. This supports standard serial port speeds up to 115.2 kbps and has extensions with different modulation schemes that go up to 512 Mbps, with the 4 Mbps variant generally supported by most PC hardware. It serves mostly the same purpose as Bluetooth, the protocol stack is largely similar, and for a long time it was in practice significantly faster than Bluetooth. (See the small sketch after this list.)

- The above-mentioned 802.11 IR physical layer - 1 and 2 Mbps; the protocols above it are the same as for the rest of 802.11, and there were probably no commercially successful implementations.

- Various standard, "standard", and completely proprietary protocols for remote controls. Hundreds to thousands of bps, typically using a carrier wave of a few tens of kHz that gets modulated onto the IR.
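Since the SIR flavour of IrDA above is literally 8N1 serial framing, from software it looked like any other serial port. A minimal sketch of what that meant in practice, using pyserial and assuming a machine where the old Linux ircomm-tty driver exposes the link as /dev/ircomm0 (both the driver and the device path are my assumptions for illustration):

    import serial  # pyserial

    # Hypothetical IrDA SIR link exposed as a TTY; the device path is an
    # illustrative assumption, not something from the comment above.
    port = serial.Serial(
        port="/dev/ircomm0",
        baudrate=115200,                  # the top SIR speed mentioned above
        bytesize=serial.EIGHTBITS,
        parity=serial.PARITY_NONE,
        stopbits=serial.STOPBITS_ONE,     # i.e. plain 8N1 framing
        timeout=1.0,
    )
    port.write(b"hello over infrared\n")  # IrLAP/IrCOMM details live below this layer
    reply = port.read(64)
    port.close()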


That's more a protocol limitation than anything. The IR used day-to-day was designed for consumer products in the 70s and 80s, so by necessity it was a very simple protocol. There's no physical limitation preventing higher-bandwidth IR.


As I understand

https://en.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampli...

IR has a frequency of between 430 THz and 300 GHz, so the max bandwidth would be between 215 Tsymbols/s and 150 Gsymbols/s (I used symbols instead of bits because I forgot the right term...)

So there is an information-theory limit, as I understand it. If I'm wrong, this is the right crowd to find the person who will correct me!
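A quick back-of-the-envelope check (my own numbers, not from the article): the Nyquist rate actually allows up to two symbols per hertz of bandwidth, so if anything the ceiling is even higher than the halved figures above, and either way it dwarfs anything today's electronics can modulate:

    # Rough Nyquist-rate ceiling if the entire IR band were usable bandwidth.
    ir_low_hz = 300e9    # ~300 GHz, long-wavelength edge of IR
    ir_high_hz = 430e12  # ~430 THz, edge of the visible spectrum

    bandwidth_hz = ir_high_hz - ir_low_hz  # ~430 THz of bandwidth
    max_symbol_rate = 2 * bandwidth_hz     # Nyquist: up to 2 symbols per Hz

    print(f"bandwidth ~ {bandwidth_hz / 1e12:.0f} THz")
    print(f"symbol-rate ceiling ~ {max_symbol_rate / 1e12:.0f} Tsymbols/s")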


IR doesn't use the frequency of the light as a carrier; it uses pulsed IR (on=1, off=0, usually in the kHz range). The limitation is usually the rise time of the LEDs.


Yeah, but I wondered what the absolute upper limit would be. Good point about the rise time of the LEDs though.


Symbols/second = baud (https://en.m.wikipedia.org/wiki/Baud)

And, ignoring other physical or 'current technology' limits, you can put as many bits in each symbol as you like.


Thank you! Can't believe I couldn't remember that word. Although I never really understood why you can put as many bits in each symbol as you like - it seems like that would just mean stuffing an infinite number of bits into each one? (Obviously silly, but then why is it defined that way?) Great, now I have to read about that this morning. :)


Yeah, there is the Shannon limit, which gives the theoretical max; I was more trying to get at there being no reason why we'd be stuck at remote-control/IrDA speeds.
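To make the bits-per-symbol point concrete: the Shannon-Hartley formula C = B * log2(1 + SNR) is what caps how many bits each symbol can really carry, and it's noise, not the symbol alphabet, that sets the limit. A small sketch with made-up bandwidth and SNR figures, purely for illustration:

    import math

    def shannon_capacity_bps(bandwidth_hz, snr_linear):
        # Shannon-Hartley limit: C = B * log2(1 + SNR)
        return bandwidth_hz * math.log2(1 + snr_linear)

    bandwidth_hz = 20e6          # a 20 MHz channel, roughly a basic Wi-Fi channel
    for snr_db in (10, 20, 30):  # illustrative signal-to-noise ratios
        snr = 10 ** (snr_db / 10)
        c = shannon_capacity_bps(bandwidth_hz, snr)
        bits_per_symbol = c / (2 * bandwidth_hz)  # at the Nyquist symbol rate
        print(f"SNR {snr_db} dB -> ~{c / 1e6:.0f} Mbps, ~{bits_per_symbol:.1f} bits/symbol")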


To all the "lab tech never actually arrives" skeptics, it's worth noting that 12 years ago this was a thing:

http://news.nationalgeographic.com/news/2003/03/0318_030318_...

In my opinion, expect VR to be the first consumer application to pick this up - it's bandwidth-limited by today's wireless and sorely wants not to be.


Guys, guys, for the Internet to work, packets must be sent and received. How is the sending part going to work?


Why does VR need more bandwidth? Surely it's just a video feed in and some telemetry data out. Isn't latency more of a problem?


First of all, you need much higher refresh rates, because you tie in the vestibulo-ocular system, which is very sensitive compared to the visual feedback needed to sit comfortably in front of a monitor. It's still unclear where the upper limit is, but the lower bound is probably at least 120 Hz. The consumer devices being released soon run at 90 Hz, and that's the bare minimum. I've heard "1000 Hz" come out of Michael Abrash's mouth, but that was somewhat of a hypothetical number.

Secondly, higher resolution. You'll want at least 4K in the near future, because with a lower resolution display that close to your eyes, you'll discern individual pixels and the actual space between them (causing the infamous "screen door effect").
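Some rough arithmetic on why those two requirements together blow past current Wi-Fi once you can't compress much (illustrative numbers of my own, not from the parent comment):

    # Uncompressed video bandwidth for a hypothetical 4K, 120 Hz VR panel.
    width, height = 3840, 2160   # "4K"
    bits_per_pixel = 24          # 8-bit RGB, no compression
    refresh_hz = 120             # near the low end of the rates discussed above

    bits_per_frame = width * height * bits_per_pixel
    gbps = bits_per_frame * refresh_hz / 1e9
    print(f"~{gbps:.1f} Gbps uncompressed")  # ~23.9 Gbps, far beyond 802.11n/ac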


Even further, strict latency requirements on an already-busy system prevent you from doing any interesting compression.


My experience is with the Oculus. Full HD is not enough, as you can clearly see the pixels. I was so distracted by the "low" resolution that I couldn't immerse myself at all.

4K will probably be better, but I suspect it still won't be enough.


The codecs that compress video well have considerable latency, and raw video out has long been the single highest-bandwidth signal in almost every computer.


Since fiber optics work just dandy, clearly getting your source to switch on and off at the requisite rate is not the tricky part. But if you're not shining down a fiber-optic cable, you have the problem that intensity decreases with the square of distance from the emitter, and that your multipath problems become a lot more pronounced.
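The square is right for an idealized point emitter: received intensity falls off as 1/r^2, so small increases in distance cost most of the signal. A toy illustration (ignoring directivity, lenses, and reflections):

    # Inverse-square falloff relative to a 1 m reference distance.
    def relative_intensity(distance_m, reference_m=1.0):
        return (reference_m / distance_m) ** 2

    for d in (1.0, 2.0, 3.0, 5.0):
        print(f"{d:.0f} m: {relative_intensity(d) * 100:.0f}% of the 1 m intensity")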


These two things are built with different semiconductor technologies (materials) that have quite different properties. Most communications are either in GaAs for NIR or in InGaAs/InP for mid-IR. Visible-light stuff is generally a large GaN die pumping a phosphor material. As the size of the die increases, so does its capacitance, which ultimately limits either the rate at which the signal can be modulated or the brightness of the source. Additionally, in the visible range you use Si detectors, which are also slow due to their large capacitance. So I think it's unlikely you could ever realize 224Gb/s unless you were using WDM (wavelength-division multiplexing).
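To put a rough number on "capacitance limits the modulation rate": treating the LED (or detector) plus its drive resistance as a first-order RC low-pass, the 3 dB bandwidth is 1/(2*pi*R*C), so a larger die with more capacitance directly costs modulation bandwidth. The component values below are made up for illustration, not measurements of any real LiFi part:

    import math

    def rc_bandwidth_hz(r_ohms, c_farads):
        # 3 dB corner of a first-order RC low-pass: f = 1 / (2*pi*R*C)
        return 1.0 / (2 * math.pi * r_ohms * c_farads)

    r = 50.0                           # assumed drive/load resistance
    for c in (10e-12, 100e-12, 1e-9):  # assumed small vs. large die capacitance
        print(f"C = {c * 1e12:.0f} pF -> f3dB ~ {rc_bandwidth_hz(r, c) / 1e6:.0f} MHz")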


How do you request the data to be transferred? This magical system is starting to get an awful lot more complicated.


Beamforming. The other articles I've seen on using the visible spectrum for communications mostly discuss sending to a largely stationary object. It's not great for general-purpose use, but as a wiring replacement it can be rather useful.


Can we use a waveguide for beamforming? Maybe some kind of tube relying on total-internal reflection to guide our light around sharp corners?


I'm no computer scientist, but don't I have to request information in order to receive it? What I mean is, do I have to attach an LED light to my laptop to request the data from the LED in my roof?


Yes and no. One could just receive broadcast data -- a monitor on the other side of the room really doesn't need to send much info back. For a laptop, one wouldn't necessarily need to request data via an LED; the requests could be made over some other medium, like traditional Wi-Fi, with the responses coming from the LED.


1Gbps is not 100X faster than Wi-Fi, unless you're talking about 802.11g under marginal conditions.


If only the next line wasn't: "Laboratory tests have shown theoretical speeds of up to 224Gbps".

Which is still not 100x faster than all Wi-Fi (more like 10x), but it is indeed 100x faster than the Wi-Fi speeds most people see in practice.


So you're comparing the theoretical speeds in lab tests to actual speeds in the real world of older wifi tech? That's not really a fair nitpick.


I'm not. The article is. I just explained their probable reasoning -- and that the 100x doesn't refer to the 1Gbps speed since they go on to talk about higher possible speeds.


I've come to take the BBC's tech reporting with a pinch of salt.


Given that everyone else's reporting is worth the same or even less, I wonder where you still find salt for all of that? I ran out of mine long ago.


You can rely on the BBC to give you the right time and roughly what the weather was like yesterday. Otherwise forget it.


Great, but you can't use it in the dark! Perhaps they could fix this with infrared?


I forget where I saw this (I think it may actually have been a video), but they said the latest versions could work with the lights off, since the light is still flickering, just dimmer than what we can perceive. Of course this comes with the assumption that the source I heard it from was correct, and if so, it's probably only feasible with lab-grade sensors sensitive enough to pick up such faint signals.


or UVs..!


...then wireless networking really will cause cancer.


Maybe a deep violet right at the edge of the visible spectrum? All you would see is a faint glow.


I'm not sure, but I've heard that short-wavelength (blue/violet) visible light has the most impact on circadian rhythms.

If that's indeed the case, insomnia or any other kind of sleep disorder is not worth it.


Ok, but you'll be the one to explain to people who don't want to accept the ionizing/non-ionizing radiation distinction that this sort-of-UV is not the cancer-causing UV.


Imagine if someone used ultrasonic sound in conjunction? Ultra fast networking and witless charging. Oh wait, that won't work....



