> by preventing wall penetration, reducing jamming
This is the main selling point, I think, though not intentional jamming: beyond the inherently lower latency, lower jitter, and higher throughput, none of these aspects are hampered by proximity to adjacent signals from other networks or other EM sources, unlike Wi-Fi.
Even if people don't want to kit out their entire house a la PoE, it would be a nice benefit to have this work side by side with Wi-Fi. Bad signal? Just walk into the room with the router, auto-switch to Li-Fi, and it's effectively as good as wired. Also, more devices automatically using Li-Fi where Wi-Fi isn't necessary will alleviate interference for everyone else where it actually is necessary. It's a win-win technology.
It's also interesting to see something move (relatively) quickly from experiment to standards proposal. I suppose that's due to the practicality of this tech.
In high-density living (condo towers, downtown, etc.), it is common to have interference from neighbors' Wi-Fi networks on the same floor or adjacent floors.
Having a network that adds capacity on a non-interfering part of the spectrum would be a godsend, especially given the limited number of channels in the existing Wi-Fi standards.
Newb question: since they are both electromagnetic radiation, how does Li-Fi get up to 100x faster than Wi-Fi? Are the light transmitters and sensors that much faster? Or perhaps is it able to use a wider band of frequencies?
I'm also so curious how this ends up working in practice. Even using infrared, would it interfere with things like baby monitors in night mode? The Wikipedia article says it can be tuned to be less intense than humans can perceive, but I'm curious if that's true in practice. (Granted, babies don't typically need Internet access while they are sleeping, but maybe the monitor itself does.)
> the higher the frequency, the higher the theoretical maximum bandwidth.
This is not true and is unrelated to Shannon's theorem.
Shannon's theorem shows us that wider bandwidths allow for larger bit rates. At higher frequencies our bandwidths can be bigger. For example a band from 1 to 2 terahertz is 1 terahertz wide, which is 1000 times larger than a band from 1 to 2 gigahertz (1 gigahertz wide).
The total bandwidth available (including multiple channels) for 2.4 GHz Wi-Fi is about 100 MHz. The band this new standard uses is 800 to 1000 nm [0], roughly 300 to 375 THz, which is about 75 THz wide. That's around 750,000 times wider than Wi-Fi. That is why you get higher bit rates with this new standard, AKA more throughput, or more "bandwidth", when the term is used to mean data rate.
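To put rough numbers on this, here's a back-of-envelope sketch using the Shannon-Hartley formula. The 20 dB SNR is an illustrative assumption (real links differ wildly), and the point is only that the ratio of capacities tracks the ratio of channel widths, not carrier frequencies:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (20 / 10)  # assume 20 dB SNR on both links (illustrative)

wifi = shannon_capacity_bps(100e6, snr)  # ~100 MHz total at 2.4 GHz
lifi = shannon_capacity_bps(75e12, snr)  # ~75 THz across 800-1000 nm

print(f"2.4 GHz Wi-Fi limit: {wifi / 1e9:.2f} Gb/s")
print(f"800-1000 nm limit:   {lifi / 1e12:.0f} Tb/s")
print(f"ratio: {lifi / wifi:,.0f}x")  # the SNR term cancels; only bandwidth matters
```

With equal SNR the ratio comes out to the bandwidth ratio, 750,000, regardless of the SNR figure assumed.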
Just imagine what we'll be able to do with Gamma data transmissions ;-)
I like to explain to my kids that it's all colors - we are transparent to some (such as X-rays) the same way some fish are mostly transparent to the light we can see and walls are transparent to the radios we use in Wi-Fi, and also that both snakes and bees can see light in colors we can't (snakes see IR, bees see UV).
I think you're misreading Shannon's Law. It's not that the higher the frequency the greater the max bandwidth. It's the higher the bandwidth the more data you can put through.
I think Li-Fi is more comparable to Ethernet than to Wi-Fi, not because the fundamentals are very different but because of the link budgets available. If you think of an Ethernet line as a pipe carrying data, then a collimated laser can be thought of in a similar way: most of the energy you transmit is going to make it to the destination, which is not the case with Wi-Fi, even with very high-gain antennas. This allows for different modulation schemes and thus higher throughput. Ethernet is now specified up to 1.6 Tbps [1], and Li-Fi doesn't seem so fast compared to that; however, keep in mind this comparison covers only the physical layer. Demonstrations of optical laser links of hundreds of Gbps over hundreds of kilometers have taken place [2], using COTS optical Ethernet transceivers and special output stages providing precise collimation and pointing.
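The link-budget point can be sketched numerically. The Friis free-space equation below is standard, but the 10 m distance, unity antenna gains, and the 90% figure for a well-collimated beam landing inside the receiver aperture are all assumptions for illustration:

```python
import math

def friis_fraction(freq_hz, dist_m, gt=1.0, gr=1.0):
    """Friis free-space equation: fraction of transmit power received,
    Pr/Pt = Gt * Gr * (lambda / (4 * pi * d))^2."""
    lam = 3e8 / freq_hz  # wavelength in metres
    return gt * gr * (lam / (4 * math.pi * dist_m)) ** 2

# Wi-Fi at 2.4 GHz over 10 m with unity-gain antennas (illustrative numbers)
wifi = friis_fraction(2.4e9, 10)
print(f"Wi-Fi free-space loss: {10 * math.log10(wifi):.0f} dB")  # roughly -60 dB

# A collimated beam whose spot stays inside the receiver aperture might
# deliver most of the power; 0.9 is an assumed, optimistic figure.
laser = 0.9
print(f"collimated beam:       {10 * math.log10(laser):.1f} dB")
```

Roughly 60 dB more received power is what makes denser modulation schemes viable on the optical link.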
For your second question, I'm not sure how baby monitors work, but the proposed wavelengths are unlicensed and there are few if any rules for how to deal with interference. There are rules for laser eye safety, which limit the maximum optical power that can be emitted. Generally, as Li-Fi gets more common, we will have to learn to deal with interference as it arrives. For example, Lidar systems (older, noncoherent ones) interfere with one another and are even susceptible to interference from IR motion detectors and such, but these aspects have to be considered during design.
My understanding is they went with 800 to 1000 nm (infrared), or roughly 300 to 375 THz. I'm not sure as to the total combined bandwidth of all of WiFi 6 or 7, but a 75,000 GHz (75 THz) band gives them a lot to play with.
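The conversion is just f = c / λ; a quick sketch, assuming the 800-1000 nm band:

```python
C = 299_792_458  # speed of light in m/s

def nm_to_thz(wavelength_nm):
    """Convert a vacuum wavelength in nm to frequency in THz (f = c / lambda)."""
    return C / (wavelength_nm * 1e-9) / 1e12

low, high = nm_to_thz(1000), nm_to_thz(800)
print(f"800-1000 nm -> {low:.0f} to {high:.0f} THz, ~{high - low:.0f} THz wide")
```

Note the longer wavelength (1000 nm) is the lower frequency edge.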
> Newb question: since they are both electromagnetic radiation, how does Li-Fi get up to 100x faster than Wi-Fi?
They specified that they wanted to be 100x faster, and they added parallel channels, until they reached that goal. No, really, this is the real reason.
It is entirely nonsensical to ask for a physical reason, because different channels are just different.
Well, the physical reason is that the band available in the optical spectrum allows you to add that many channels in parallel. You can't do that at 2.4 GHz.
> Even using infrared, would it interfere with things like baby monitors in night mode?
If you're referring to the infrared LEDs that illuminate the baby, their light is not polarized, while light used for communication can be, so a polarizing filter in the receiver can suppress such background. (An ideal polarizer still passes about half of unpolarized light, so this attenuates the noise by roughly 3 dB rather than removing it entirely.)
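A quick numeric check of that, using Malus's law with an idealized polarizer and no other losses assumed:

```python
import math

# Malus's law: an ideal linear polarizer passes cos^2(theta) of linearly
# polarized light; unpolarized light averages over all angles to 1/2.
signal_pass = math.cos(0.0) ** 2  # transmitter polarization aligned with filter
noise_pass = sum(math.cos(math.radians(a)) ** 2 for a in range(360)) / 360

print(f"aligned signal passed:    {signal_pass:.2f}")
print(f"unpolarized light passed: {noise_pass:.2f}")  # ~0.5, i.e. ~3 dB rejection
```

So the filter favors the signal, but an unpolarized IR illuminator still leaks about half its power through.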
Baby monitors with night vision usually have a small set of tiny infrared lights so that they can see. Adding additional infrared to the room will help the monitor to see better.
Correct. The reason microwave ovens can cook food isn't anything special about the microwave band itself. The oven's electric field flips polarity back and forth billions of times per second, and polar water molecules keep trying to re-align with the ever-changing field (dielectric heating; contrary to popular belief, 2.45 GHz is not a resonant frequency of the water molecule). Movement is heat; thus, the water heats the food.
On one hand, I see where this could be helpful in certain scenarios like an office, where there is a consistent and planned layout specifically for the purpose of productivity. Or for military applications, where EMSEC is taken very seriously (to the point where Wi-Fi is generally not used at all in most classified facilities). Though I am also not convinced this would change the calculus much there in reality.
On the other hand, I don’t see the draws outweighing what seem to be clear setbacks. E.g. if I put my LiFi enabled phone in my pocket mid download, it will completely cease to work.
What is interesting is the idea of a much more comprehensively connected future. E.g. imagine a building both Wi-Fi and LiFi enabled, with automatic switching between the two based on which is less congested and provides the best speeds. As our daily bandwidth footprint grows, I can see the benefit in having multiple spectra for information transmission.
I’m not sure that this provides any benefit from an EMSEC perspective, as you’d basically be going from a position where you’re avoiding radios and even cables and similar things that aren’t TEMPEST shielded for fear of emissions leaking sensitive data, to a position where you’re broadcasting your sensitive data over the air, and a listening device simply needs to look at your lights somehow. I will agree that it’s easier to block light than radio, but I think that’s where the advantages end.
I think you have this backwards. You'd still encrypt your transmissions same as if you were using WiFi, so it's a wash in terms of security. But it should be much harder to jam your receiver/transmitter, because it's point-to-point.
> E.g. if I put my LiFi enabled phone in my pocket mid download, it will completely cease to work.
The light part, sure, but the regular radio Wi-Fi part will be fine; it'll be slower, but it won't go away. Ideally there's seamless handover between the networks: LiFi when your phone is out and in view of a transmitter, WiFi in other cases.
> Of course, Li-Fi isn’t going to sweep away Wi-Fi and 5G alternatives (nor wired networks). Radio waves still have a distinct advantage with regard to transmission through the atmosphere at great distance, and through opaque objects. Instead, work must concentrate on using horses for courses – with Li-Fi advantages being harvested where possible.
> *if I put my LiFi enabled phone in my pocket mid download, it will completely cease to work.*
As per your EMSEC use-case, this is also a privacy benefit, as your pocket becomes a de facto Faraday cage, guaranteeing that your devices can only transmit information when you want them to.
In your pocket, or just turn around so your body is in between the source and your device? Maybe reflections get the job done, but that has to harm data rates, right? I am not an optical expert.
I've researched LiFi before, but every time I research it I find expensive commercial equipment, or hobbyists playing with very low data rates on Arduino.
I have many questions, especially about how a LiFi receiver works. Wouldn't this essentially need to be a high-speed camera with very few pixels?
Does anyone have recommendations of a dev kit, or transceivers to play with this? Also, ones that don't cost several thousand dollars?
The technology is mostly in those two camps because it's new, expensive, and niche, like WiFi was 20 years ago. As production increases (starts?), components will drop in cost and become standardized and more ubiquitous. At the moment there is a lot of practical research you can read about from conferences like OFC [1] and SPIE PW, in the free-space laser or telecom tracks [2]. Transmitters and receivers are currently built from components highly specified for their use case and heavily parameterized; for example, there are hundreds of different DFB lasers that can be used as sources.
I'm a little surprised that the IEEE has already standardized this, especially given the wavelength they chose, but I imagine their members felt forced to adopt a promising technology: without a standard, they risk the technology moving ahead without them.
Basically, the only new principle involved is that instead of serialized, subcarrier-modulated data being modulated onto a 2.4 GHz sine wave and emitted from an antenna as electric-field changes, it is now sent as changes in light level on a subcarrier frequency.
Or more simply, maybe it could be done YouTuber-style, with a light-emitting diode on the Tx antenna port and a photosensitive diode on the Rx antenna port? The switching speed of the Tx-side LED could become the limiting factor in that case.
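That LED-and-photodiode idea can be sketched as a toy on-off-keying link. The attenuation and ambient-light figures below are made up, and a real 802.11bb system would use a far more sophisticated modulation scheme; this only shows the modulate/threshold round trip:

```python
import random

def modulate(bits):
    """On-off keying: bit 1 -> LED on (full brightness), bit 0 -> LED off."""
    return [1.0 if b else 0.0 for b in bits]

def demodulate(samples, threshold=0.5):
    """Threshold the photodiode readings back into bits."""
    return [1 if s > threshold else 0 for s in samples]

random.seed(0)
tx_bits = [random.randint(0, 1) for _ in range(16)]
light = modulate(tx_bits)

# Crude channel model: only 60% of the light arrives, plus a little
# ambient background; the threshold still separates the two levels.
rx_samples = [0.6 * s + random.uniform(0.0, 0.1) for s in light]

assert demodulate(rx_samples, threshold=0.3) == tx_bits
```

The Tx LED's switching speed caps the symbol rate here, exactly as the comment suggests.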
> wouldn't this essentially need to be a high speed camera with very few pixels?
A camera sensor 'pixel' is just a device that conducts in proportion to the number of photons that hit it. A typical digital color camera uses a CMOS chip to do this, with filters on top to isolate red, green, and blue. That part is pretty basic; the real trick is getting millions of them on a 1/4" sensor and having them relay the data properly with a reasonable amount of noise. So, yes.
We have other devices that are "just a device which conducts proportional to the amount of photons that hit it" but we do not call all of them "cameras".
A camera is a device which receives light signals and translates those into an image of some format.
A photovoltaic cell in a solar panel is a device which generates electricity proportional to the number of photons that hit it.
A photoresistor is a device which resists current in proportion to the number of photons which hit it.
See now, such a LiFi transceiver would not necessarily be termed a "camera" any more than the infrared sensor in urinals is a camera. I believe that's the way we want it to be, right? LiFi has no use for producing images, only translating light back into network and signaling data. That's not called a "camera" by any means.
Also, to a more trivial extent, perhaps the same idea behind the IR ports on the Game Boy Color. Which is a really interesting and probably underutilized feature to think back on now. And even more interesting in that this preceded the first mass consumer devices offering WiFi, which Wikipedia tells me only really took off with Apple's iBook in 1999.
Lego Mindstorms bricks could be programmed over an IR connection like this. The earliest ones shipped with an IR transmitter that had a 9-pin serial socket on the back.
IR was great, you could do things like transfer contacts and data between palmtops.
I mean I got a palmtop (a Palm V iirc) for cheap well after they were commonly used and I never used it for anything important, but still, it was a cool device. I think I have it somewhere still, wonder if it still works. I mainly used it to play Sudoku on though.
> My first thought on reading that headline: isn't that single/multimode fiber?
Kind of how radio being a wireless telegram system:
> You see, wire telegraph is a kind of a very, very long cat. You pull his tail in New York and his head is meowing in Los Angeles. Do you understand this? And radio operates exactly the same way: you send signals here, they receive them there. The only difference is that there is no cat.
This was tried with WiGig, and it didn’t solve any problem significant enough for it to catch on. People tried to use WiGig for things like wireless docking stations, and I suspect it just wasn’t needed because at such short ranges, most people would want/need to have a power cable connected (which naturally leads to the USB-C docking stations we have today, where both power and data go over the same cable).
WiGig wasn't fast enough to be useful for wireless HDMI, especially given that 4K displays arrived right at that moment, and 2nd-gen WiGig was probably way too expensive.
The question is whether it's possible to create a full-rate 224 Gbps transceiver that is cheap enough to appear in mid-range phones, notebooks and TVs, and whether it would work without line of sight, but in the same room, with reflected light.
The wireless adapter for the HTC Vive headset was based on WiGig. The range is limited to a few meters, but that's more than enough for room-scale VR. The need for massive bandwidth and low latency makes it a perfect use case for WiGig.
WiGig just came too early; the underlying tech was barely able to support it. It was the Windows 7 and Vista era, it was expensive, and the software really wasn't there either.
It might be different now with VR and the rise of docks.
> Of note is that the first commercially available Li-Fi system has been available since 2014, and it hasn't gained any traction?
Probably for the normal early reasons: too big, fiddly, and pricey. Someone must have finally shrunk and/or cheapened it enough for a large segment of enthusiasts or businesses for it to gain more attention again.
Reading that, it seems to be about consuming/receiving data rather than sending it. Or do you have light shining out of your laptop back at it?
There is an example of a school using it, and the students have a USB device on their laptops to read the light, but is the upload slow, or are they also connected to Wi-Fi?
https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theore...
Wi-Fi is 2.4 GHz or 5 GHz; visible red light is around 430 terahertz.
Or it sounds like Li-Fi is just pulsing the lights on and off, in which case the Nyquist rate, originally worked out for telegraph signalling, is a better analogy:
https://en.wikipedia.org/wiki/Nyquist_rate
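For the noiseless on/off-pulsing picture, Nyquist's limit of 2B symbols per second applies; the 1 GHz of modulation bandwidth below is an assumed figure, purely for illustration:

```python
import math

def nyquist_limit_bps(bandwidth_hz, levels):
    """Noiseless Nyquist limit: 2 * B * log2(M) bits/s for M signalling levels."""
    return 2 * bandwidth_hz * math.log2(levels)

# Plain on/off pulsing (2 levels) over an assumed 1 GHz of modulation bandwidth
print(nyquist_limit_bps(1e9, 2) / 1e9, "Gb/s")  # binary signalling
print(nyquist_limit_bps(1e9, 4) / 1e9, "Gb/s")  # 4 levels doubles the bit rate
```

With noise, Shannon-Hartley caps how many levels M are actually distinguishable.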
[0]: https://standards.ieee.org/ieee/802.11bb/10823/
[1] https://en.wikipedia.org/wiki/Ethernet_physical_layer#1.6_Tb...
[2] https://ntrs.nasa.gov/api/citations/20210026855/downloads/sp...
But each frequency has different things that are opaque to it, and travels a different distance before dropping off.
Wi-Fi is already “really low frequency, really low intensity infrared light”, so I suspect the article is correct when it says we won’t notice it.
No, infrared by definition starts around 300 GHz and goes up from there.
[1] https://www.ofcconference.org/en-us/home/about/archive/ [2] https://spie.org/Publications/Proceedings/Volume/12413?&orig...
But you forgot the most important aspect: Is side-fumbling effectively prevented?
Definitely has a niche application in areas where RFI is to be minimized.
My first thought on reading that headline: isn't that single/multimode fiber?
* https://quoteinvestigator.com/2012/02/24/telegraph-cat/
This time the tail is fibre rather than copper, but again, no cat.
For those like me confused about how this would even work.
Of note is that the first commercially available Li-Fi system has been available since 2014, and it hasn't gained any traction?