That’s entirely speculative. There are diminishing returns. Unless you’re going to host your own YouTube, the use case for 50Gbps connections to the home is quite small. 4K video streaming at Ultra HD Blu-ray bitrates doesn’t even come close to saturating 1Gbps, and all streaming services compress 4K video significantly more than what Ultra HD Blu-ray offers. The server side is the limit, not home connections.
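(Rough numbers, for scale: UHD Blu-ray tops out around 128 Mbps, roughly an eighth of a gigabit link, while streaming 4K typically runs 15–25 Mbps. Even seven simultaneous streams at full disc bitrate would fit in 1 Gbps.)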
Now, if you want to talk about self-hosting stuff and returning the Internet to a more peer-to-peer architecture, then you need IPv6. Having any kind of NAT in the way is not going to work. Connection speed still isn’t that important.
How exactly does NAT prevent that? On good hardware it adds insignificant latency.
It has nothing to do with latency, and everything to do with not being able to directly address things behind NAT.
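To make that concrete, here’s a minimal sketch in Go (the addresses are documentation/private prefixes, purely illustrative): with a globally routable IPv6 address, a peer just listens and anyone can dial it directly; behind IPv4 NAT there’s nothing routable to dial.

    // Minimal sketch: why global addressing matters for peer-to-peer.
    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // With a globally routable IPv6 address, listening is all it takes:
        // any peer on the Internet can dial this address directly.
        // (2001:db8:: is the IPv6 documentation prefix, illustrative only.)
        ln, err := net.Listen("tcp", "[2001:db8::10]:9000")
        if err != nil {
            panic(err) // fails unless the host actually owns this address
        }
        defer ln.Close()
        fmt.Println("peers can reach me at", ln.Addr())

        // Behind IPv4 NAT there is no equivalent: the host only has a
        // private address like 192.168.1.10, which isn't routable from
        // outside. Inbound connections need port forwarding, UPnP, or
        // hole punching through a third party -- machinery that isn't
        // needed with end-to-end IPv6 addressing.
    }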
Unless you’re going to host your own YouTube…
This is exactly what PeerTube is struggling with. This kind of bandwidth would solve the video federation problem.
See, you get it!
Except we need IPv6 before that’s at all viable.
We are not even saturating the pipes we already have to the home. “If you build it, they will come” does not apply when there’s already something there that isn’t being fully utilized.
Oh, maybe. I’m not familiar with bandwidth utilization in China.
Take a look at devContainers as an idea that might be generalized. They’re just Docker containers, so big but not huge, but consider the use case…

devContainers are a complete, portable development environment, with support from major IDEs. Let’s say I want to work on a Java service. I open my IDE, it pulls the latest Java devContainer with my environment and all my tools, fetches the latest from git, and I’m ready to go. The problem with this use case is that I’m waiting the whole time. I don’t want to sit around for a minute or two every time I want to edit a program. The latest copy needs to be here, now, as I open my IDE.
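For anyone who hasn’t seen one, a minimal devcontainer.json for that Java scenario might look something like this (the image tag and postCreateCommand are illustrative, not prescriptive; the format is JSONC, so comments are allowed):

    {
        // .devcontainer/devcontainer.json -- hypothetical Java service setup
        "name": "java-service",
        // Prebuilt image, so the IDE pulls a toolchain instead of building one.
        "image": "mcr.microsoft.com/devcontainers/java:21",
        // Warm the dependency cache once, right after the container is created.
        "postCreateCommand": "./mvnw dependency:go-offline"
    }

And that’s exactly where the waiting comes from: on first open, every byte of that image comes over the wire, which is where fatter home pipes would actually show up.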
Maybe don’t rely on cloud garbage for basic development?
There could be some new thing that no one has even bothered to think about because of today’s limitations. Imagine pitching streaming back when downloading a few kilobytes over an hour was considered reasonable; people would have laughed at the very thought of it.