I do not have access to the original paper, but I would want to see how this compares to 802.11ah "WiFi HaLow".
(edit) OK, I got a copy from ResearchGate, and I misunderstood! I had failed to grok the part of the article where LoRa is now supported by the SX128x (as opposed to the SX126x) on 2.4 GHz.
https://www.researchgate.net/publication/383692369_WiLo_Long...
> In this article, we introduce a new algorithmic framework called WiLo, designed to enable directional communication from Wi-Fi to LoRa, which employs signal emulation techniques to enable off-the-shelf Wi-Fi hardware to produce a valid 2.4 GHz LoRa waveform.
So, critically (and as far as I can tell this isn't in the summary article), this is purely unidirectional. This isn't about being able to build a network that upgrades the range of WiFi with some tradeoffs: it's about being able to send data from existing WiFi hardware to existing LoRa hardware with a relatively minimal set of changes (though I still don't appreciate how this would practically be done on existing hardware, and they apparently only prototyped this with a software-defined radio).
> The core innovation of WiLo lies in the signal emulation technique used to generate a valid 2.4 GHz LoRa waveform. Through sophisticated signal processing algorithms, WiLo transforms the standard Wi-Fi signals into LoRa-like waveforms, while ensuring compliance with the LoRa modulation specifications. This enables the LoRa hardware to decode WiFi signals without requiring any modifications to the hardware itself. The emulation of LoRa waveforms is achieved by carefully manipulating the parameters of the Wi-Fi signals, such as the modulation index, spreading factor, and BW, to closely match the characteristics of LoRa modulation.
> We would like to emphasize that WiLo is directly supported among commodity devices, and the USRP-B210 devices are used only for evaluation purposes to measure low-level PHY information, which is inaccessible by commodity devices. For example, a commodity Wi-Fi card such as the Atheros AR2425 can replace USRP-B210 devices as the sender.
I want to highlight that this paper should be read in the same spirit as "guess what! we figured out how to cross-compile C into JavaScript using Emscripten" came across back in the day, before our modern viewpoint where WebAssembly is taken for granted.
It doesn't mean that this should be used, or should be the standard, but it absolutely does mean that this is possible to do within the terms of the 802.11g radio protocol spec, something no one had really realized or done the heavy lifting to demonstrate until now.
> it's about being able to send data from existing WiFi hardware to existing LoRa hardware with a relatively minimal set of changes
This leads me to believe you could flip a switch and turn entire swaths of access points into a broadcast fabric for LoRa? Wi-Fi networks meet software-defined radio, a bit.
OK, we have a Wi-Fi device that can talk to a LoRa device at a large distance. Now replace the LoRa device with another Wi-Fi device that talks the LoRa protocol. If the mission is not accomplished, what's missing?
> (though I still don't appreciate how this would practically be done on existing hardware, and they apparently only prototyped this with a software-defined radio)
It is my understanding that most modern baseband chips can effectively be considered "software defined radios", as most of the modulation/demodulation is performed by the firmware. While the researchers appear to have used a USRP (a dedicated SDR platform), it is conceivable their scheme could be accommodated in the firmware.
Is there a comparably-priced SDR that could be used for WiFi data transmit/receive with GNU Radio?
Hmm. LoRa uses up- and downchirps. That would be pretty difficult to do with a WiFi radio that's meant to stick to predefined channels. But the radio is probably some kind of SDR.
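For reference, this is the shape a Wi-Fi chip would have to approximate. A minimal numpy sketch of a plain baseband LoRa CSS symbol (the textbook waveform, not the paper's emulation algorithm; SF7 and the 812.5 kHz bandwidth are my assumptions, the latter being one of the SX128x's 2.4 GHz options):

```python
import numpy as np

def lora_symbol(s, sf=7, bw=812.5e3):
    """One baseband LoRa chirp-spread-spectrum symbol, sampled at fs = bw
    (one sample per chip). The symbol value s in [0, 2**sf) sets where the
    frequency sweep starts; it wraps from +bw/2 back down to -bw/2."""
    n = 2 ** sf                                    # chips per symbol
    k = np.arange(n)
    inst_freq = (((s + k) % n) / n - 0.5) * bw     # Hz, linear sweep with wraparound
    phase = 2 * np.pi * np.cumsum(inst_freq) / bw  # integrate frequency at fs = bw
    return np.exp(1j * phase)

upchirp = lora_symbol(0)              # symbol 0: one full up-sweep across the band
downchirp = np.conj(lora_symbol(0))   # conjugating gives the downchirps used for sync
```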
I just want speed to degrade gracefully down to 1 kbps or even 100 bps. I.e., I should be able to be 1 mile from my house but still be iMessaging over my home WiFi.
Physics lets me do that with no additional transmit power (Shannon's channel capacity formula, combined with signal power dropping off as a function of distance squared).
If suitable modulation were chosen (i.e. some kind of CDMA), then a setup which can give 100 Mbit/s at 100 yards should be able to do 322 kbit/s at 1 mile, and 3 kbit/s at 10 miles!
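Spelling out the arithmetic behind those numbers (a back-of-envelope sketch under idealized assumptions: free-space 1/d^2 path loss, and the low-SNR, power-limited regime where Shannon's C = B log2(1 + SNR) becomes roughly linear in received power):

```python
def ideal_rate_bps(d_yards, base_bps=100e6, base_yards=100):
    """Received power falls off as 1/d^2, and at low SNR the Shannon
    capacity B*log2(1 + SNR) ~ B*SNR/ln(2) is linear in SNR, so the
    achievable rate scales as (base_yards / d)**2."""
    return base_bps * (base_yards / d_yards) ** 2

print(ideal_rate_bps(1_760))   # 1 mile   -> ~322_830 bps: the "322 kbit" figure
print(ideal_rate_bps(17_600))  # 10 miles -> ~3_228 bps: the "3 kbps" figure
```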
> a setup which can give 100 Mbit/s at 100 yards should be able to do 322 kbit/s at 1 mile, and 3 kbit/s at 10 miles!
That's not how the noise floor works.
You have to be close to a router the first time to connect to it.
And after that you both have some long (i.e. 65536-bit) random key which is used as a modulation code for your transmissions.
The receiver demodulates with the key to get the data out. A different key can get different data out of the same airwaves at the same time.
Clocks need to be synced perfectly, which is in itself a big technical challenge.
As the article alludes to, low-power signal reception is also challenging.
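A toy illustration of that shared-key modulation idea (a minimal direct-sequence spread-spectrum sketch; the 1024-chip codes and two-link setup are made-up parameters, and it assumes the perfect chip synchronization the comment above calls out as the hard part):

```python
import numpy as np

rng = np.random.default_rng(0)
code_a = rng.choice([-1.0, 1.0], size=1024)   # link A's private spreading code
code_b = rng.choice([-1.0, 1.0], size=1024)   # link B's private spreading code

def spread(bit, code):
    return bit * code                 # one long chip sequence per data bit

def despread(rx, code):
    return int(np.sign(rx @ code))    # correlate against your own code and decide

# Both links transmit over the same airwaves at the same time.
airwaves = spread(+1, code_a) + spread(-1, code_b)

print(despread(airwaves, code_a))     # +1: link A recovers its bit
print(despread(airwaves, code_b))     # -1: link B recovers its bit
```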
WiFi 7 finally extends WiFi clients to be capable of bonding multiple radios together into a coherent link (multi-link operation). This is critical to being able to bond in longer-range, lower-bandwidth channels in the future, and I'm certain they are considering how they can bring all of the 802.11 wireless specifications under one umbrella.
However, this will also expose issues with encapsulation of "the connection", which can now vary exponentially in capacity. Operating systems and applications are coded to be 'carrier frequency agnostic', without any capability for a given connection to switch from "low data mode" (avoid unnecessary transmissions) to "high data mode" (do whatever you like), much less to "extremely low data mode" (i.e. push notifications and foreground app only) or to "extremely high data mode" (i.e. 4k / 240fps over 60GHz); all on a single dynamically-adjusting connection.
Cellular fakes this today by asking "is 5G high data permitted?" but not being able to downgrade OS and app behaviors gracefully when the 5G connection is hella weak or overloaded, i.e. not fetching mail, not autoplaying movies, etc.
Windows exposes connection-speed data to applications, and don't forget the "metered connection" setting, which plenty (albeit probably not most) of applications support; that goes a long way toward solving the problem you're describing.
The problem with that is you end up time-division multiplexing. A packet for the distant client occupies the channel for as many microseconds as a great many packets for the high-bandwidth nearby clients. The aggregate bandwidth for the system craters with more remote clients.
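A toy model of that effect (hypothetical numbers; it assumes a naive round-robin of one equal-sized frame per client and ignores contention, preambles, and retries):

```python
def aggregate_bps(client_rates_bps, frame_bytes=1500):
    """One frame per client per cycle: total bits sent divided by total
    airtime. The slow clients end up eating almost all of the airtime."""
    bits = frame_bytes * 8
    airtime_s = sum(bits / rate for rate in client_rates_bps)
    return len(client_rates_bps) * bits / airtime_s

print(aggregate_bps([100e6, 100e6]))       # two fast clients:      ~100 Mbit/s
print(aggregate_bps([100e6, 1e6]))         # add a 1 Mbit/s client: ~1.98 Mbit/s
print(aggregate_bps([100e6, 100e6, 1e4]))  # one 10 kbit/s client:  ~30 kbit/s
```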
Is that really true in practice though? Unless you're in the true middle of nowhere, by the time you get a mile out, there's going to be other people using the spectrum, and at low bandwidth they'll be using it for a long time.
Current stuff like LoRa works because there aren't many users and the protocols are optimized, but if everyone was iMessaging we'd probably need more spectrum.
We can already do WiFi for miles with narrow-beam antennas; we could make mesh network tech cheap if it were standardized and mass-produced.
The oldest WiFi standards, at 1 Mb/s or 2 Mb/s, could easily establish point-to-point links at 50 km or even 100 km, on the condition of using good directional antennas mounted on high enough masts.
This could be used to implement bridges between LANs located at great distances from one another. There were commercial products implementing such LAN bridges through WiFi.
When an access point must transmit and receive omnidirectionally, to reach any station placed randomly around it, that greatly diminishes the achievable range.
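For intuition on why directional antennas buy so much range, here's a standard free-space link-budget sketch (the transmit power, receiver sensitivity, and antenna gains are illustrative assumptions, and real links also need fade margin and line of sight over the Earth's curvature, hence the masts):

```python
import math

def max_range_km(tx_dbm, rx_sens_dbm, ant_gain_dbi, f_ghz=2.4):
    """Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_GHz) + 92.45.
    Solve for the distance at which received power hits the sensitivity floor."""
    budget_db = tx_dbm + 2 * ant_gain_dbi - rx_sens_dbm
    return 10 ** ((budget_db - 92.45 - 20 * math.log10(f_ghz)) / 20)

# Assume ~15 dBm transmit power and ~-94 dBm sensitivity at 1 Mb/s 802.11b.
print(max_range_km(15, -94, 0))    # omni antennas on both ends: ~2.8 km
print(max_range_km(15, -94, 24))   # 24 dBi dishes on both ends: ~700 km in theory
```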
That's my dream network: a long-range, low-bandwidth, decentralized network. Mesh would be cool, but even just being able to exchange with neighbors at the scale of 1-10 mi would be amazing.
This sounds like what RNode devices for Reticulum networks appear to be able to do. (I haven't tried it for myself yet.)
> RNodes can be made in many different configurations, and can use many different radio bands, but they will generally operate in the 433 MHz, 868 MHz, 915 MHz and 2.4 GHz bands. They will usually offer configurable on-air data speeds between just a few hundred bits per second, up to a couple of megabits per second.
> [...]
> While speeds are lower than WiFi, typical communication ranges are many times higher. Several kilometers can be achieved with usable bitrates, even in urban areas, and over 100 kilometers can be achieved in line-of-sight conditions.
( https://unsigned.io/rnode/ )
> Reticulum is the cryptography-based networking stack for building local and wide-area networks with readily available hardware. Reticulum can continue to operate even in adverse conditions with very high latency and extremely low bandwidth.
( https://reticulum.network/ )
It would be illegal (at least in the USA); you're not allowed to share your home internet, or to function as a public utility.
There are huge regulatory moats around everything that costs $20-500/mo recurring and is incurred by large percentages of the population. Internet access is a huge one.
Did I understand the article correctly in that they simply managed to reach 500 meters with WiFi? If so, I don't see what they have actually achieved. In the early days of 802.11b I regularly connected my WiFi-enabled (via dongle) Palm PDA to open networks that were sometimes hundreds of meters away, and I could use the airport's free WiFi from at least 1.5 km away (it could have been longer; it's just that the place I frequented was that far away).
The usable distance started to shrink drastically as the airwaves got more crowded, and as soon as you could see tens of networks at a time, suddenly the cafeteria network was only usable from inside, whereas before you could use it a couple hundred meters away, across the large square.
Of course, maybe it's about managing that in a crowded network space... but the article was extremely short on details.
Let's assume that this takes off and becomes a standard addition to our WiFi devices.
Given the big range of this technology, how does it handle air congestion when we have hundreds, maybe thousands, of devices in range?
I expect low throughput from this technology, and for IoT that's usually fine, but when we need to share this spectrum with a lot of devices we might quickly make it non-operational. And that's even assuming we don't have some devices that request much more bandwidth than others.
With WiFi on 2.4 GHz we already struggle with air congestion, and a quick Google shows that LoRa has 13 + 8 channels, and if I understand it correctly some of them are used explicitly for joining the network (?)
I think this technology is really cool, but only if it doesn't get much popularity.
People are responding to this with the mindset of watching 1080p TV, not realizing that 1 second of a 1080p Netflix stream will use 5x the total daily bandwidth of an IoT device reporting temperature once every 10 seconds for the whole day. These are entirely different use cases, and comparing the impact of congestion between the two is like talking about what matters to a garden on Mars vs. on Earth.
The big limitation I see here, and where Wi-Fi has historically failed even with 802.11ah specifically built for the IoT use case and standardized back in 2015, is the "uses extra power" bit. Other protocols like LoRa are designed around minimizing power at the end stations. At the end of the day that's often a bigger deal than bandwidth for long-range IoT.
> 1 second of a 1080p Netflix stream will use 5x the total daily bandwidth of an IoT device reporting temperature once every 10 seconds
I don't have a Netflix file to test with, but for YT, the video data is 73 KiB for the first second (tested by ffmpeg'ing the first second into a new file with codecs set to 'copy'). The page, however, seems to fetch about 9 megabytes in the first second of loading (with various features, such as comments, still loading).
Reporting temperature: let's say it's super secure, with IV and MAC headers and trailers and another nonce for replay protection (all in all 3x 32 bytes), plus the actual value and other metadata (like which device the report is from) of, say, another 32 bytes, plus IPv6 headers etc.; that's a couple hundred bytes, call it 256. There are a gross (144) of ten-minute intervals in a day, coming to 144 × 256 = 36 KiB.
Huh, that's pretty accurate when considering video data ("any second" rather than the first second specifically, which needs to include the page weight). I had expected the video data to be vastly bigger compared to that sensor reporting just one value per ten minutes. Keyframe-based video compression is really quite something; a raw 24-bit frame would be 6 MiB, rather than 73 KiB for a whole second.
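Redoing that arithmetic explicitly (same rough assumptions as the two paragraphs above; the byte counts are estimates, not measurements):

```python
KiB = 1024

video_second_bytes = 73 * KiB      # one second of compressed 1080p video (the YT sample)
report_bytes = 256                 # 3 x 32 B crypto overhead + value + metadata + headers
reports_per_day = 24 * 60 // 10    # one report every ten minutes = 144 ("a gross")

sensor_day_bytes = reports_per_day * report_bytes
print(sensor_day_bytes / KiB)                  # 36.0 KiB per day for the sensor
print(video_second_bytes / sensor_day_bytes)   # ~2.0: one video second ~ two sensor-days
```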
I have 2 x 5 GHz channels, 2 x 2.4 GHz channels, and then a repeater with another 2 and 2.
In the evening there is so much congestion on every available channel on either band that I can't watch 1080p TV.
This long-range thing sounds awful.
You don't need to watch 1080p TV to report the current temperature and humidity, or to receive a command to turn on a light bulb.
As for channel congestion, check whether your WiFi repeater is in mesh mode or not. If not, it literally halves the throughput on your WiFi network, which already seems to be over-congested by whatever is messing with your channel settings. Based on your description of the area, if your 5 GHz band somehow ran out of space, something seriously weird is going on. Maybe some non-WiFi device is using the 2.4/5.2 GHz band to transmit data? I know stories of cheapo baby monitors wiping out entire neighbourhoods, for instance.
This looks to be about running LoRa-like networks on WiFi hardware. Speed on LoRa is not something talked about much, as it is more like SMS message passing than IP networking.
Probably why it was talking about IoT use. 500 meters for a couple-hundred-baud connection doesn't seem too groundbreaking; off-the-shelf 900 MHz radios can easily achieve that.
The specs are open: https://resources.lora-alliance.org/technical-specifications
The LoRa Alliance publishes their patent policy here: https://lora-alliance.org/wp-content/uploads/2021/01/LA-IPR-...
I encourage reviewing "7.4 LoRaWAN 1.0 Copyright License to Implementers", which appears responsive to your question.
A patent license is a horse of a different color. 7.4 explicitly denies that it serves as a patent license:
> This license shall not be deemed to grant any right under any patent, patent applications or similar intellectual property right.
Just because the specs are open does not mean you are permitted to manufacture or sell devices without a patent license.