Sounds like they basically crafted some special messages such that it’s nonsense at 2.4 GHz but smooths out to a LoRa message on a much, much lower frequency band (sub-GHz).
It’s LoRa on 2.4 GHz.
It’s just that chirp signals are easy to decode out of a lot of noise (see the sketch below).
And they don’t really affect most other modulation techniques. I think you can even have multiple CSS-coded signals on the same frequency, as long as they are configured slightly differently (different spreading factors are close to orthogonal).
LoRa is incredibly resilient.
It’s just really, really slow.
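For intuition, here’s a minimal numpy sketch of why CSS pulls symbols out of noise. Everything in it (SF 7, the noise level, the symbol value) is illustrative rather than anything from the article, and real LoRa adds a preamble, sync word, and coding on top:

    import numpy as np

    SF = 7                     # spreading factor: 2**SF chips per symbol
    N = 2 ** SF
    symbol = 42                # value to send, 0 .. N-1

    n = np.arange(N)
    # Upchirp whose starting frequency encodes the symbol (it wraps mod N)
    tx = np.exp(2j * np.pi * (n * n / (2 * N) + symbol * n / N))

    # Bury it in complex Gaussian noise about 9 dB stronger than the signal
    rng = np.random.default_rng(0)
    rx = tx + 2.0 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

    # Dechirp: multiplying by a conjugate base chirp turns the received
    # chirp into a pure tone, so all of its energy lands in one FFT bin
    # while the noise stays spread across all of them
    downchirp = np.exp(-2j * np.pi * n * n / (2 * N))
    print("decoded:", np.argmax(np.abs(np.fft.fft(rx * downchirp))))  # 42

The argmax still lands on the right bin well below 0 dB SNR, and each step up in spreading factor buys more of that processing gain at the cost of airtime, which is exactly the resilient-but-slow trade-off.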
I don’t think it’s “just” LoRa on 2.4 GHz, because if it were, existing LoRa devices wouldn’t be able to decode the signals off the shelf the way the article claims they can. From the perspective of the receiver, the messages must “appear” to be in a LoRa band, right?
How do you make a device whose hardware operates in one frequency band emulate messages in a different band? I think that’s the nature of this research.
And like, we already know how to do that in the general sense. For all intents and purposes, that’s what AM radio does. Just hacking a specific piece of consumer hardware to do it entirely software-side becomes the research paper.
WiFi uses BPSK/QPSK/OFDM/OFDMA modulation.
LoRa uses CSS modulation.
This is about hacking WiFi hardware to make a WiFi-modulated signal intelligible to a receiver expecting CSS modulation, and to have the WiFi hardware demodulate a CSS signal.
Thus making WiFi chips work with LoRa chips. The sketch below gestures at the transmit half of that.
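To show how the transmit side can work, here’s a hypothetical numpy sketch of the general signal-emulation idea. This is my simplification, not the paper’s actual scheme (which has to act through real payload bits, scrambling, and pilots): pick the QPSK point on each OFDM subcarrier that best approximates a slice of a chirp, then measure how close the constrained waveform gets to what a CSS receiver expects.

    import numpy as np

    NFFT = 64                  # 802.11a/g-style OFDM symbol size
    n = np.arange(NFFT)
    target = np.exp(2j * np.pi * n * n / (2 * NFFT))  # slice of an upchirp

    # Subcarrier values that would reproduce the chirp slice exactly
    ideal = np.fft.fft(target) / NFFT

    # WiFi can't emit arbitrary subcarrier values: payload bits only
    # select constellation points. Snap each subcarrier to its nearest
    # QPSK point (phases at odd multiples of 45 degrees).
    quarter = np.pi / 2
    phase = np.round((np.angle(ideal) - np.pi / 4) / quarter) * quarter + np.pi / 4
    emulated = np.fft.ifft(np.exp(1j * phase))   # what the modulator emits

    # How well does it line up with the chirp a CSS receiver looks for?
    rho = np.abs(np.vdot(target, emulated)) / (
        np.linalg.norm(target) * np.linalg.norm(emulated))
    print(f"correlation with target chirp: {rho:.2f}")  # roughly 0.9

A correlation that high is plenty for a receiver whose whole job is matching chirps through noise, which is why frames like this can be heard by unmodified CSS hardware.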
LoRa doesn’t care about the carrier frequency.
So the fact that it’s LoRa at 2.4 GHz doesn’t matter. It’s still LoRa (see the toy example below).
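As a toy illustration of carrier independence (a complex-baseband model with made-up numbers; a real radio adds filtering and oscillator offsets): mix the same baseband chirp up to two different carriers and straight back down, and the CSS demodulator sees the identical signal either way.

    import numpy as np

    SF, fs = 7, 125e3          # 125 kHz LoRa bandwidth, one sample per chip
    N = 2 ** SF
    n = np.arange(N)
    t = n / fs
    symbol = 99
    chirp = np.exp(2j * np.pi * (n * n / (2 * N) + symbol * n / N))
    downchirp = np.exp(-2j * np.pi * n * n / (2 * N))

    for fc in (868e6, 2.4e9):  # EU sub-GHz LoRa band vs the 2.4 GHz band
        up = chirp * np.exp(2j * np.pi * fc * t)    # transmitter mixes up
        down = up * np.exp(-2j * np.pi * fc * t)    # receiver mixes down
        sym = np.argmax(np.abs(np.fft.fft(down * downchirp)))
        print(f"{fc/1e6:.0f} MHz carrier -> symbol {sym}")   # 99 both times

The chirp itself lives at baseband; the carrier is just where the radio parks it, so the demodulator never sees it.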
I’m sure there will be a use for this at some point.
Certainly useful for directly interfacing with LoRa devices from a laptop.
I feel that anyone actually deploying LoRa IoT would be working at a lower level than the “throw a laptop at it” kinda thing.
I didn’t realize that LoRa didn’t care about carrier frequency; that’s for sure the root of my faulty assumption! Thanks for taking the time to explain.