I like the ETH result and it is an impressive feat (I know some of the people involved in that work). That said, the article is a lot of rubbish. First there is the weird focus on lasers: all (relevant) optical communication uses lasers, including fibre comms. Then they make it sound like this is meant to replace fibre, which is again rubbish.
The amount of data going through fibres is absolutely staggering, replacing this with intersatellite links is just not going to happen. First you still need fibre to connect your ground stations (and you need quite a bit of redundancy due to weather) and there are still a lot of unsolved problems (tracking and pointing for example). However there are many interesting applications of optical satellite links and quite a few players are investing in it, the big one is actually data connections for scientific space missions.
Each submarine cable has 12+ fibre strands, each DWDM-multiplexed into 20-40 channels at 100-400 Gbps.
The entire _planned_ Starlink constellation has less aggregate switching capacity than a pair of current-generation core routers. The market for terrestrial Ethernet _routed_ 400 Gbps ports is 10k+ annually, and 800 Gbps is just getting warmed up.
Satellite is not going to make even the slightest dent in terrestrial networking. Starlink isn't even viable without most of its traffic doing a two-leg ground-satellite-ground hop to SP-network-connected ground stations. 100% of that traffic lands on the terrestrial internet.
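Multiplying out those cable figures (the ranges above are rough, not any specific cable's spec):

```python
# Back-of-envelope capacity of one submarine cable, using the rough
# ranges above (not any particular cable's datasheet).
strands = 12                          # fibre strands (low end)
channels_low, channels_high = 20, 40  # DWDM channels per strand
rate_low, rate_high = 100, 400        # Gbps per channel

low = strands * channels_low * rate_low / 1000     # Tbps
high = strands * channels_high * rate_high / 1000  # Tbps
print(f"{low:.0f} - {high:.0f} Tbps per cable")    # 24 - 192 Tbps
```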
That Starlink estimate must be low: they currently have 1.5 million customers, and a significant percentage of total bandwidth is over ocean and thus mostly wasted.
If you assume a pessimistic 100x oversubscription, 50% of capacity wasted over oceans and unserviced countries, and 50 Mbps (upload + download) per customer across 1.5 million customers, that's well over 1.5 Tbps of current customer-facing capacity, which then doubles when you include their base stations. And that's today, with the current 3,887 satellites being mostly v1.0 and v1.5, which have dramatically less individual bandwidth than v2.0.
PS: Also, your port estimate overstates total bandwidth, as each point-to-point link requires a port on each side, and packets generally make multiple hops through the internet.
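Spelling out my estimate above (these are my assumed figures, not measured Starlink numbers):

```python
# The capacity estimate above, multiplied out. All inputs are assumptions
# stated in the comment, not measured Starlink figures.
customers = 1.5e6
per_customer_mbps = 50        # combined upload + download
oversubscription = 100        # pessimistic contention ratio
usable_fraction = 0.5         # half of capacity wasted over oceans etc.

delivered_tbps = customers * per_customer_mbps / oversubscription / 1e6
provisioned_tbps = delivered_tbps / usable_fraction
total_tbps = provisioned_tbps * 2   # ground-station legs double the traffic
print(provisioned_tbps, total_tbps)  # 1.5 3.0
```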
Isn’t the latency of satellite internet always going to be significantly worse? Unless, of course, you could make larger hops, avoiding unnecessary switching delays. But I don’t think that will happen, except in some rare circumstances.
Since you seem to have some experience I want to ask:
How much latency is added by routing and switching on a typical transatlantic journey?
And a followup, are there any promising developments for moving the networking infrastructure fully to light as a carrier signal without any electron-based hardware in between?
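A rough back-of-envelope for the first question, assuming an illustrative fibre path length and per-hop forwarding delay (both made-up round numbers): propagation dominates, and routing/switching adds well under a millisecond end to end.

```python
# Transatlantic latency sketch. Path length, hop count and per-hop delay
# are illustrative assumptions, not measurements.
C = 299_792                 # km/s, speed of light in vacuum
fiber_km = 6_500            # assumed NY-London fibre path length
v_fiber = C * 2 / 3         # light in glass travels at roughly 2/3 c
prop_ms = fiber_km / v_fiber * 1000

hops = 15                   # assumed router hops end to end
per_hop_us = 50             # assumed forwarding + serialization per hop
switch_ms = hops * per_hop_us / 1000

print(f"propagation ~{prop_ms:.1f} ms, switching ~{switch_ms:.2f} ms")
```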
The Z on ETHZ is important, ETH on its own is just (loosely) "university", the Z makes it the Zürich one ;D. There is exactly one other: EPFL in Lausanne. (And you can technically call them EPFZ and ETHL if you really want to confuse people.)
(EPF [École Polytechnique Fédérale] is just the French version of German ETH [Eidgenössische Technische Hochschule]. Funnily enough both have pretty good international reputations, but due to the primary regional language being different are rarely recognized to be the same type of Swiss education entity.)
Continuing with the off-topic, the E in ETH is pretty significant if you're going to loosely translate, as it stands for "Swiss federal", which is a big part of where those institutions' standing comes from.
Sure, high-bandwidth inter-satellite links are essential for providing service to remote regions (or even just to make ground station placement more economical), but the question here is whether they could ever economically compete with fiberoptics outright.
For extremely latency-sensitive applications, I guess they will (e.g. trading and real-time communications), but for everything else that can wait a few more milliseconds, I'd guess that the operational overhead and reduced reliability will never be worth it, especially if fiber already exists on the same path.
Can't imagine how hard it is to align those in space (and keep them aligned) on a low-altitude satellite, but it sounds pretty cool. I'm guessing the range is something like 1000-2000 km?
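For a rough upper bound on range, you can compute the longest chord between two satellites at the same altitude that still stays out of the denser atmosphere; the altitudes here are illustrative assumptions:

```python
import math

# Geometric upper bound on inter-satellite link range: the chord between
# two satellites at altitude h that just grazes altitude h_min (so the
# beam stays above the denser atmosphere). Altitudes are assumed values.
R = 6371.0       # km, Earth radius
h = 550.0        # km, Starlink-like LEO altitude
h_min = 80.0     # km, assumed minimum grazing altitude for the beam

d = 2 * math.sqrt((R + h)**2 - (R + h_min)**2)
print(f"~{d:.0f} km geometric maximum")   # ~5014 km
```

Real links would be shorter than this geometric limit due to pointing and power budgets, but it suggests multi-thousand-kilometre hops are at least geometrically possible.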
Current fiber optic cables transmit signals at about 2/3rds the speed of light. Modeling has shown that Starlink's laser backbone would allow it to beat deep-sea cables in some situations; this article estimates a NYSE -> FTSE latency reduction from 76 ms to 43 ms. I wouldn't be surprised to learn that there is an HFT bidding war going on.
There is a newer version of fiber optic cable that uses a hollow core. Testing has shown it transmits signals at nearly the speed of light. It is very difficult to produce at scale and will require a generational shift to deploy.
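A back-of-envelope for the fibre-vs-laser latency claim above; the path lengths and altitude are illustrative assumptions, not the article's model:

```python
# Why a LEO laser path can beat subsea fibre, roughly. Distances and
# altitude are illustrative assumptions.
C = 299_792                   # km/s, speed of light in vacuum

great_circle = 5_570          # km, NY-London great-circle distance
fiber_route = 6_500           # km, cables don't follow the great circle
fiber_rtt = 2 * fiber_route / (C * 2 / 3) * 1000   # 2/3 c in glass

alt = 550                     # km, assumed LEO altitude
laser_path = great_circle + 2 * alt   # up, across at c in vacuum, down
laser_rtt = 2 * laser_path / C * 1000

print(f"fibre ~{fiber_rtt:.0f} ms RTT, laser ~{laser_rtt:.0f} ms RTT")
```

The numbers land in the same ballpark as the article's 76 ms vs 43 ms estimate: most of the gain is simply vacuum at c versus glass at 2/3 c over a shorter route.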
Am I the only one who thinks HFT should be forbidden? Stocks and investments are a nice thing to have: you can buy a share of a company you believe in, they get the money and do stuff, you get dividends, and in the end you can sell the stock after some time too.
HFT changes these "beliefs in company" to "beliefs that the stock will go up", and the timeframe of those "beliefs" is sometimes in milliseconds. This turns classic "investment" into a computer game with a lot of real money.
HFT isn't even that harmful. What's harmful is brokerages that hold large portions of financial instruments and take advantage of customers' live buying and selling with delayed swaps, intentional front-running, etc. This is how the stock market was bullied before HFT, options markets, trading bots, etc. enabled 'everyone' to do skeezy arbitrage, not just market makers.
The only meaningful way to limit the efficacy of these modern trading styles would be purposeful market friction through minimum holding times, circuit breakers, etc. You can't just legislate that people mustn't have low ping; it's impossible to enforce. And those market-friction mechanisms can create scary market conditions like backlogs, and guess what: they enable market-making brokerages to do internal swaps etc. in spite of the friction and essentially be the only ones able to bypass the restrictions.
I agree. HFT goes against the original ethos of a market open to all. Instead of trading on the merit of a given company, it simply reinforces how gamified and unfair the market has become. It's like futures: no one cares about the commodity being traded, or that trading it can really hurt the people and businesses that rely on that item; it's just another thing to be gambled on.
What would that look like? If you rate-limit stock exchanges to something like 1 update per minute, there will likely be the same amount of networking and computation going on to speculate on the next update and calculate optimal plays. It just moves behind closed doors, where it is harder to know if shenanigans are going on.
It would take a heavier hand to push against this problem. I'm all for it; I'm just not clever enough or knowledgeable enough to know what would be a good regulation that would fly in Congress.
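One concrete form of "rate limiting" that has been proposed is the frequent batch auction: collect all orders for an interval and clear them at a single price, so sub-millisecond speed inside the interval buys nothing. A minimal sketch with toy order books (not any exchange's actual mechanism):

```python
# Minimal frequent-batch-auction sketch: pick the single price that
# maximizes traded volume for all orders collected in one interval.
def clearing_price(bids, asks):
    """bids/asks: lists of (price, qty). Returns (volume, price)."""
    prices = sorted({p for p, _ in bids + asks})
    best = (0, None)
    for p in prices:
        demand = sum(q for bp, q in bids if bp >= p)   # buyers at >= p
        supply = sum(q for ap, q in asks if ap <= p)   # sellers at <= p
        traded = min(demand, supply)
        if traded > best[0]:
            best = (traded, p)
    return best

bids = [(10.02, 100), (10.01, 200), (10.00, 300)]
asks = [(10.00, 150), (10.01, 150), (10.03, 400)]
print(clearing_price(bids, asks))  # (300, 10.01)
```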
In a healthy market, HFTs serve as market makers, and allow normal traders to have faith that prices will be consistent at all exchanges. If there is enough competition, an HFT will have very low profit.
What's the actual harm of HFT for the average person? I can set a $ limit on my orders, so I don't pay more than I want. Why should I care if some HFT guy scrapes 0.01% value off each trade?
Where do you think all your liquidity comes from when you want to buy or sell some instrument? The difference in price isn't all that substantial if all you do is buy now and sell when cash is needed in retirement, but if you're that kind of general index investor, one could argue that your contribution to markets is actually negative (whereas HFT is helpful by making prices better and spreads tighter). Buying SPX makes no distinction between bad and good companies in the index and allocates money based on current prices (which are mainly calculated by HFTs who did the legwork to get to the current point).
I have a suspicion that at some point, an HFT group will obtain a brief but substantial lead in machine learning (a la RenTech, but on a shorter timeframe) that allows them to suck a decent amount of money out of the stock market before anyone else can respond or close the gap. We probably won’t get any sort of regulation until after that happens.
> since you can buy a share of a company you believe in
I don't think this is generally true. Stock trading isn't charity, it's done to make a profit. Almost all trading is done on the basis of second-order effects, and has been for as long as there's been a stock market. Stocks aren't given value by the company being profitable, they're given value by what other people are willing to pay for them (okay, yes, dividends also factor in, but many companies don't pay dividends and people still assign value to those stocks).
I am pretty sure HFT is the only reason the average Joe can trade in the first place and can say with confidence that X's stock is worth price Y. Otherwise you would have to trust the broker to give you a correct estimate.
Suppose I, a (hypothetical) very-low-frequency value-based trader, buy a million dollars' worth of Apple.
Apple should go up in price slightly, because someone (me) just made a big bet that it's going to do well. The market does this automatically.
TSMC should also go up in price very slightly, because Apple doing well makes it more likely that TSMC (a major vendor of theirs) will also do well, and the market just learned that Apple is slightly more likely to do well, so it also just learned that TSMC is more likely to do well. In other words, the companies' performance is correlated, so the stock prices should be as well.
By my understanding, HFT is the mechanism by which the market can make TSMC go up slightly as well. High-frequency traders can react to the information that Apple is going to do better (as per its increased stock price) and buy a bit of TSMC.
This seems like it should make the market more accurate, and if you buy that the market accurately assigning values to a company is a good thing, seems like a good thing.
No. It's garbage, trading is supposed to serve human needs and should take place on human timescales. If I had a magic wand I'd randomly fuzz the timing of trades to remove a lot of incentives for HFT and some other kinds of automated trading, which are not much more than banging on a known deficiency in a slot machine.
Proponents argue that HFT and other such innovations provide liquidity to markets; my skeptical take is that this is a nice technical-sounding term for traders with cash to lowball falling asset prices.
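A sketch of the timing-fuzz idea above: give every incoming order a random delay before it reaches matching, so shaving microseconds off a link stops paying. The jitter bound is an arbitrary illustrative value:

```python
import random

# Timing-fuzz sketch: each order's arrival at the matching engine gets a
# random delay, so two orders sent microseconds apart may match in either
# order. max_jitter_us is an arbitrary illustrative bound.
def fuzzed_arrival(send_time_us, max_jitter_us=100_000, rng=random):
    return send_time_us + rng.uniform(0, max_jitter_us)

# Two orders 50 us apart on the wire may now execute in either order:
a = fuzzed_arrival(1_000)
b = fuzzed_arrival(1_050)
print(a < b)  # roughly a coin flip
```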
> HFT changes these "beliefs in company" to "beliefs that the stock will go up"
No, this is the part almost everyone gets wrong about HFT. These systems make money off of volume, not price. They don't care if the price is going up or down. All they care about is that they can capture a small price delta by facilitating a trade faster than someone else, and they are willing to do so for smaller fractions of a cent, which makes them more attractive to all market participants, including you and your retirement funds. Their concerns are orthogonal to investors', and they compete with other HFT firms and market makers.
And HFT setups are also nearly colocated with the exchanges, I understand -- as close as they can physically get -- so adding a hop to space and back may add quite a bit of distance.
Photonic bandgap / Bragg fiber is being produced in quantity, and some subsea cables are using it.
I'm pretty sure it's the satellite lasers that require the generational shift. One subsea cable has no bearing on any other; once the first is installed, you're up and running. Not so with laser-connected swarms.
Photonic bandgap fibres are not used in submarine cables; their losses are too high. There are recent antiresonant fibres (NANF) which can achieve lower losses than even SMF. However, production capabilities are not yet ready to make the amount of fibre necessary for submarine use. The startup (out of Southampton) that pioneered these was recently acquired by Microsoft.
We will probably see these fibres first on links between exchanges and data centres (although there it is difficult to beat RF links, as they are direct line of sight).
Wouldn't they have to relay between many satellites due to line of sight? And wouldn't that eat up any decrease in time of flight for the signal itself?
I suspect that retransmission with amplification only, without much processing, can be really fast. Modern electronics can routinely do sub-nanosecond latency.
This sure sounds useful, but "might remove need for deep-sea cables" is quite silly.
Anything they do to cram bandwidth into their laser link can reasonably also be applied to fiber-optic cables. Except fiber-optic cables come in bundles of 12 to 144 with absolutely no separation issues. Replicating that over open space (air or vacuum), if it is reasonably possible at all, will chug significant amounts of power in signal processing at the receiving end.
There are major benefits to free-space optics -
- quicker to build
- lower latency
- in some cases, large coverage
But deep-sea cables compete on bandwidth, and that's not something free-space optics can beat them on. Why diminish this research achievement by conflating it with that? :(
On top of this, I really doubt the claim about working in bad weather. Some weather will be fine, sure, maybe at reduced speeds. But if there is a proper cloud in the way, the near-visible 1550 nm light will be scattered completely.
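To put numbers on that: with an assumed (illustrative) cloud attenuation figure, a Beer-Lambert-style link budget shows how little gets through:

```python
# Rough link-budget check for 1550 nm through cloud. The attenuation
# coefficients are assumed illustrative figures; real clouds vary a lot.
clear_air_db_per_km = 0.2     # assumed, clear conditions
cloud_db_per_km = 100.0       # assumed; dense cloud can exceed this

cloud_km = 1.0                # assumed cloud thickness along the path
loss_db = cloud_km * cloud_db_per_km
fraction = 10 ** (-loss_db / 10)   # fraction of optical power surviving
print(f"{loss_db:.0f} dB through cloud -> {fraction:.0e} of power survives")
```

Even if the exact coefficient is off by an order of magnitude, the surviving fraction is so tiny that no realistic link margin covers it, which is why ground links under cloud fall back to radio.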
I believe the idea is that the ground link is still radio but the interconnect between the satellites is a laser. While radio can be affected by weather, the "works in bad weather" is a long solved issue.
> Although the laser system was not directly tested with an orbiting satellite, they accomplished high-data transmission over a free-space distance of 53km (33 miles).
This is peanuts compared to undersea cables, and there are still the issues you will find penetrating the atmosphere and, increasingly, space.
We should keep undersea cables and work on additional connections to make our networks more resilient to natural phenomena, equipment failures and sabotage.
> more resilient, both to natural phenomena, equipment failures and sabotage.
I think you're understating the risks from geomagnetic storms. In comparison to satellites, fiber optic cables seem like they'd be relatively unaffected even if the equipment attached to them needs replacement.
Currently, you can cut internet connections to most countries with a small deep-water sub. China, for example, has been busy building one that can be used at 10k+ depth and has arms.
Perhaps the ocean is protective to some degree, but beyond very short lengths undersea cables have powered repeaters that draw considerable current, and which would plausibly be destroyed if the base stations are too.
Yeah, but having both means that if one is a victim, the other keeps chugging along. A bad anchor drag takes out cables? Or a sub cuts the wire? Lasers go pew pew. Weather takes out lasers? Cables go zoom. And in normal operating conditions, both run and increase bandwidth and speed.
It's not peanuts at all, given that it's an entirely different technology. Free space optics will have different applications, of course, but those applications where fiber is currently not feasible stand to profit quite a bit from this.
I can think of all of rural America, underserved with fast internet, forced to share puny fiber runs to towers retransmitting over low-speed radio.
Implementing the same infra but using lasers as the backbone could speed up a large area of the country, make rolling out dense radio networks like 5G way faster and cheaper, etc.
I could see both being in use, but replacement seems unlikely to me.
If nothing else, I could see governments insisting on both just for strategic reasons. Both techs are vulnerable to being fk'd with, so doubling up makes sense.
The key thing here is it allows for very high bandwidth to go almost anywhere on earth, not just spots where it's easy to land a cable. Starlink's already demonstrated how great that is for consumer bandwidth (50Mbps); a multi-gigabit link terminated at a satellite is fantastic.
The part that impresses me most is that they're talking about LEO satellites. Those move fast! Starlink handles this with a very impressive phased-array antenna design. Conceptually, tracking a moving satellite with a laser is as easy as rotating a mirror; not sure how hard it is in practice.
It also circumvents the biggest problem with fiber: politics. Starlink doesn’t need permission to run a backbone across (or rather over) a country. This will be revolutionary for people in Africa, South America, and huge swaths of Asia/Australia where a few telecom monopolies have artificially jacked up the price of transit.
IIUC, it only circumvents politics due to existing treaties. Those old treaties are likely ripe for revision with the extensive commercialization and the increasing number of countries capable of launching payloads into space.
It definitely helps mitigate the infrastructure buildout hurdles (which are not small), but they would still need to jump through any "I need to do business in this country" regulations, etc.
Sadly, in most of the rural US. My other options are 12 Mbps fixed wireless, 1-100 Mbps cellular, or 3 Mbps DSL, which AT&T stopped selling in violation of various government contracts.
Rural Canada.
On a lake shore.
30km from one of Canada's "Top 50" cities.
Where I used to live, I had 2 options for internet access.
A Wireless ISP that uses a parabolic antenna pointed at a water-tower about ~20km from my home; or a cell-phone based internet connection.
The WISP allowed me on average 300kb/s transmissions.
The Cell-Phone allowed me between 1.5Mb/s and 7Mb/s (to a max use of 5GB/month).
An actual 50 Mbps link is perfectly good for most use cases: you can stream anything, and it is not really the bottleneck in how quickly pages load. Large file transfers may still take appreciable time, but that is rarely a big issue.
An advertised "50 Mbps" mobile connection is dog food if you are used to 50 Mbps fiber. You are lucky if you get 20 Mbps through, and it can be much less. The worst part is all the packet loss, which causes inconsistent latency and speed.
Here, I've been on 1.5 Mbps for 15 years and there is literally no alternative. No cell coverage. No cable. A mountain blocks GEO sat. Copper is a 6-mile run to the nearest town, so DSL is out too. Also, I'm literally 12 miles from the Googleplex. Yeah, broadband is a mess outside of cities and suburbs.
In practice, considerable difficulty. Like you said, LEO moves fast; you are talking about 10 minutes of line of sight (LOS), depending on the orbit. So the handoff algorithms (and the necessary network) are very challenging and cost-prohibitive. I mentioned it in my earlier response, but if you have a LEO network you need pretty substantial arrays to accommodate such a power load, and during "night" for whoever is using the network you need transfer mechanisms to relay the data, using even more power during eclipse. This is a real challenge.
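That ~10 minute LOS window can be sanity-checked with basic orbital mechanics. This simplified horizon-to-horizon estimate ignores Earth rotation and any minimum-elevation mask, both of which shorten the real window:

```python
import math

# Simplified LEO visibility window: orbital period from Kepler's third
# law, then the fraction of the orbit above a ground point's horizon for
# a directly overhead pass. Altitude is an assumed value.
R, mu = 6371.0, 398_600.0    # km, km^3/s^2 (Earth radius, GM)
h = 550.0                    # km, assumed Starlink-like altitude
a = R + h

period_s = 2 * math.pi * math.sqrt(a**3 / mu)
arc = 2 * math.acos(R / a)               # Earth-central angle in view
pass_min = period_s * arc / (2 * math.pi) / 60
print(f"orbit ~{period_s/60:.0f} min, max pass ~{pass_min:.1f} min")
```

With a realistic elevation mask the usable window drops well below this geometric maximum, consistent with the ~10 minutes quoted above.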
What about solar storms? I would expect the damage to be more catastrophic if a major solar storm took down a decent percentage of the satellites than of terrestrial fiber. A satellite network seems more vulnerable and harder to fix. I'm sure deep-sea cables are no picnic to fix either, and they are susceptible to sabotage too.
https://www.space.com/solar-storms-destroy-satellites
IMHO this article is garbage, because moving the internet backbone into space isn't a sustainable choice in the short, medium, or especially the long term. Starlink knows that; they are operating at a loss, burning investors' money, hoping that some kind of new sci-fi tech will change the game, but honestly that won't happen. Imagine if the current cellphone network were in space: every time the tech takes a generational step forward, you would have to burn all the satellites and launch a new constellation.

Moreover, you need to reach consumers (the last mile) on Earth, and you can't do that with lasers. Customers will rely on radio communications, which in this case have low quality compared to the alternatives, when those are available. In fact, the point of satellite internet is to cover a niche market: you use that solution if there is no alternative and if you can afford the costs. So you accept lower connection speed, variable QoS, etc., hoping for a better alternative in the future; and when that alternative arrives in the form of a traditional wired connection, the customers instantly migrate there. So that market constantly shrinks, it doesn't grow.

If competitors realize that clusters of satellite customers exist in some rural zones and are willing to pay more for an internet connection, they will take over completely. And anyway, in times of remote working, someone should invest in providing fast internet in remote places (which offer the better quality of life remote workers want) instead of investing in satellite internet. Laser telecommunications are already a thing in optical fibre, and there they frankly make more sense than in a fragile and expensive satellite constellation.
Being able to pump 40 terabits per second up to a satellite from a single uplink, as they hope, is significant -- exceeding most subocean cables.
There aren't too many common natural disasters that could take out a huge fraction of undersea cables in one go.
https://www.nature.com/articles/s41377-023-01201-7
https://archmeregreenarch.org/1456/news/starlink-and-the-ris...
https://www.nature.com/articles/s41467-020-19910-7
None of this prevents you from investing using whatever long-term paradigm you prefer. The effects of HFT on that investment style are negligible.
I think that's only true for IPO shares which often aren't available to retail investors/gamblers like me.
https://sniperinmahwah.wordpress.com/2018/05/07/shortwave-tr...
https://en.wikipedia.org/wiki/Politics_of_outer_space
50 Mbps is great consumer bandwidth where?
Starlink in practice is 10-200Mbps. Here's their specifications: https://www.starlink.com/legal/documents/DOC-1138-34130-60
So 50Mb/s is an incredible upgrade.
Anything that allows video communication/streaming is "great" imho, it certainly is more than enough for most people.