I just calculated an estimate of the data rate for the Crackdown demo shown at Build. Obviously there are a couple more variables involved, for example how the building breaks and the shape of the chunks. Would they be derived on the local box and then sent up to Azure? Presumably a server application holding the collision meshes of the map, so it can sync up with the local box, would first receive the variables around the explosion such as size, direction, radius etc.
Data Rate
UPDATED: Rather than calculating every chunk in real time, 32 times a second, /u/caffeinatedrob recommended drawing paths instead, which I've substituted into the calculations
32 bits * 6 - Float
9 bits * 2 - 9 Bit Integer
Compression Ratio: 85%
Chunks: 10,000
Total Bits per Chunk: 210 bits
Total Bits for Chunks: 2,100,000
Total Bits Compressed: 315,000
Typical Ethernet MTU = 1500 bytes = 12000 bits
Data Frames per Initial Explosion of 10,000 Chunks: 27
Typical UDP Overhead = 224 bits
Total Overhead per Explosion = 6048 bits
Total Bits Needing to Be Sent Per Explosion: 321,048
Throughput Needed Per Initial Explosion: 313Kbps
All Chunks Collide Within 4 Seconds: 2,500 chunks re-drawn every second
2500*210 = 525000
Compressed: 78750 bits
Data Frames per second needed for re-draw: 7
UDP Overhead = 1568 bits
Total Bits Needed per re-draw: 80318 bits
Throughput Needed per re-draw: 78kbps
Overall throughput needed in the first second: 391kbps
Every second after initial explosion would be: 78kbps
For the data, I've used float values for the X, Y, Z starting co-ordinates and the same for the finishing co-ordinates of the path on the map. I've assigned 9-bit integers for the rotation values on the path and the radius of the arc of the path.
The compression ratio is an assumption in this scenario. With the data being purely floats/ints, compression is very high, somewhere in the 80% range, which is what I've substituted in.
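As a sanity check, here's the arithmetic above as a short Python sketch. The per-chunk layout, 85% compression ratio, MTU and UDP/IP overhead are the assumptions stated above, not anything Microsoft has confirmed:

```python
# Back-of-envelope sketch of the data-rate figures above.
# Assumed wire format per chunk path: 6 x 32-bit floats (start/end X, Y, Z)
# plus 2 x 9-bit ints (rotation, arc radius), compressed to 15% of its size,
# sent over UDP/IPv4 in standard 1500-byte Ethernet frames.

BITS_PER_CHUNK = 32 * 6 + 9 * 2        # 210 bits
COMPRESSED_FRACTION = 0.15             # 85% compression ratio
MTU_BITS = 1500 * 8                    # 12,000 bits per frame
UDP_IP_HEADER_BITS = (8 + 20) * 8      # 224 bits of UDP + IPv4 headers

def burst_kbps(chunks: int) -> float:
    """Wire throughput (Kbps, 1 Kb = 1024 bits) to send one burst of chunk paths in a second."""
    payload = chunks * BITS_PER_CHUNK * COMPRESSED_FRACTION
    frames = -(-int(payload) // MTU_BITS)            # ceiling division
    return (payload + frames * UDP_IP_HEADER_BITS) / 1024

initial = burst_kbps(10_000)   # initial explosion -> ~313 Kbps
redraw = burst_kbps(2_500)     # per-second re-draws -> ~78 Kbps
# int() truncation matches the rounded-down figures quoted above
print(f"{int(initial)} + {int(redraw)} = ~{int(initial) + int(redraw)} Kbps in the first second")
```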
To compare this to services used daily: Netflix uses 7Mbps for a Super HD stream, which is pretty much standard these days and is supported by both next-gen and previous-gen consoles.
Latency
Average RTT (Round Trip Time) to Azure: 40ms
Calculation Time at Server: 32ms (For 32FPS)
Total RTT = 72ms
In Seconds = 0.072 Seconds
That means it takes 0.072 seconds from the beginning of the explosion for it to come back and start happening on your screen. Once the first load has occurred, you only have to receive data if the chunks collide with anything, which would result in the re-drawing of paths. The latency on that would be the calculation time, call it 16ms, which is a lot considering that only a few paths may have to be re-drawn. Then add the half-trip time of 20ms, which works out to waiting 36ms, or 0.036 seconds, before the re-drawn path gets updated on-screen.
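Expressed as a small sketch using the same assumed figures (the 16ms partial re-calculation is my estimate from above, not a measured value):

```python
# Latency budget using the assumed figures above.
RTT_MS = 40                 # average round trip to Azure
SERVER_TICK_MS = 32         # one full simulation step at 32 updates/second
REDRAW_CALC_MS = 16         # assumed partial re-calculation for a few chunks
HALF_TRIP_MS = RTT_MS / 2   # one-way trip back to the console

initial_delay_ms = RTT_MS + SERVER_TICK_MS        # 72 ms until the first chunks appear
redraw_delay_ms = REDRAW_CALC_MS + HALF_TRIP_MS   # 36 ms until a re-drawn path appears
print(initial_delay_ms, redraw_delay_ms)          # 72 36.0
```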
Packet Loss
In regards to packet loss, in 2014 you simply don't see much of it. ISPs these days tend to be Tier 2 or Tier 3, often peering directly with the large services which make up a lot of the bandwidth, such as Google, Facebook, Twitter, Netflix etc. Honestly, unless you have a poor wireless signal inside your house, which often causes some slight packet loss, you're not going to get any. Even if you drop a couple of packets, you'd lose a handful of chunks for that one frame, and in terms of gameplay it's not really going to be noticeable.
Conclusion
After taking suggestions on board and drawing paths rather than calculating chunks in real time, the data rates needed are significantly lower, and the requirements for the internet connection are perfectly acceptable, needing to transmit at only 391kbps.
If anyone's got any suggestions on how to increase accuracy, or anything else, let me know.
TL;DR: Cloud computing is definitely feasible on normal ISP connections. Would require 391kbps when the explosion starts.
Response:
Would the total data set need to be sent each frame though? Wouldn't they send snapshots of changes to the world objects and then when events occur (other player shoots stuff) sync up...
I'm sure they will optimise by, for example, sending a chunk with direction etc. and doing some calculations locally to draw its path until any changes arrive on the next sync. This would also allow for drops in connection, where you would just see the local objects updated from their last known positions.
Similar to how GGPO works in online fighters
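A minimal sketch of that idea (my own illustration of dead-reckoning-style extrapolation, not anything confirmed about the Crackdown implementation): the server sends each chunk a position and velocity once, and the console integrates that path locally every frame until the next snapshot arrives or the connection drops.

```python
import time
from dataclasses import dataclass

@dataclass
class ChunkSnapshot:
    """Last state received from the server for one debris chunk (hypothetical format)."""
    pos: tuple[float, float, float]   # position at snapshot time
    vel: tuple[float, float, float]   # direction * speed from the server
    stamp: float                      # local time the snapshot was applied

GRAVITY = (0.0, -9.81, 0.0)

def extrapolate(snap: ChunkSnapshot, now: float) -> tuple[float, float, float]:
    """Predict where the chunk is now by integrating its last known path locally.
    If packets stop arriving, we simply keep extrapolating from the last snapshot."""
    t = now - snap.stamp
    return tuple(
        p + v * t + 0.5 * g * t * t
        for p, v, g in zip(snap.pos, snap.vel, GRAVITY)
    )

# Example: a chunk launched up and sideways; one second later, no new packet yet.
snap = ChunkSnapshot(pos=(0.0, 10.0, 0.0), vel=(3.0, 5.0, 0.0), stamp=time.time())
print(extrapolate(snap, snap.stamp + 1.0))   # drawn locally until the next sync corrects it
```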
OP
Never even thought of that. Drawing the paths would be a good solution and would drop the throughput significantly. The only downside I can think of is if, for example, anyone gets in the way. There wouldn't be any form of collision, so if there needs to be, the box would have to calculate the collision detection locally, which may take up quite a lot of CPU on the box. In addition, any factors of the explosion that change after the paths are drawn would have to be re-sent and recalculated, for example chunks colliding.
The way I imagined it working is that the server has a mesh of the world and understands which buildings are destructible and the chunks they're made of.
With that, when it receives a request to go boom, it would work out how the building reacts based on the explosion variables it's sent, so the client doesn't need to send much information. This is important as upload speeds are significantly lower than download speeds.
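As a rough illustration of how small that upstream request could be (the field names and layout here are hypothetical, just to show the asymmetry): the console only sends the explosion parameters, while the server, which already holds the destructible meshes, derives all the chunk paths.

```python
import struct

# Hypothetical upstream "go boom" request: the console sends only the explosion
# parameters; the server already holds the world/collision meshes and derives
# the chunk paths from these few values.
def pack_explosion_request(building_id: int, x: float, y: float, z: float,
                           radius: float, force: float) -> bytes:
    # unsigned int id + 5 floats = 24 bytes on the wire (before UDP/IP headers)
    return struct.pack("<I5f", building_id, x, y, z, radius, force)

payload = pack_explosion_request(42, 120.5, 3.0, -77.25, 15.0, 9000.0)
print(len(payload))  # 24 bytes upstream vs ~40 KB of compressed paths downstream
```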
http://www.reddit.com/r/xboxone/comments/27yczf/i_just_calculated_an_estimate_of_the_internet/

Is there anyone out there who can provide some insight into his calculation? He is pretty much saying that the calculations would just tell your XBO where to put the stuff, removing the need for your XBO to do the calculations.