Can your code survive crappy 2G? This open-source traffic controller will test it

A two-year project inside Facebook has culminated in the release of software to test how well applications and servers work under degraded network conditions – all the way down to rickety 2G. The idea behind Augmented Traffic Control, open-sourced on GitHub, is to improve the delivery of material on under-performing networks …

  1. Anonymous Coward

    2G?

    The entire internet dies at 2G speeds. So no, you can't surf the web.

    But if you're writing an app that does minimal client-server traffic? Maybe...

    1. Anonymous Coward

      Re: 2G?

      I used to browse the web at 9600bps, a lot slower than 2G speeds. Still have the modem (Maestro Executive 144) and yes, I have used it this century (gotta love work-related projects with legacy hardware).

      The big difference back then was that Flash was pretty much unknown, and HTML4, CSS and the JavaScript DOM were new concepts. XMLHttpRequest didn't exist then either.

      A well written modern web app should still be able to cope with such a link. It won't be fast, but there are efficiencies in the use of re-usable cacheable components such as JavaScript libraries, CSS and XHR which would reduce the bandwidth if (hah!) used correctly.
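      As a back-of-the-envelope illustration of that caching point (the page and asset sizes below are made up, not measurements): at 9600bps, serving the re-usable JavaScript/CSS from cache is the difference between minutes and seconds on a repeat visit.

```python
# Rough page-load time over a 9600 bps modem, with and without cached
# assets. Sizes are hypothetical, and latency, compression and protocol
# overhead are ignored.

def transfer_seconds(size_bytes: int, bps: int = 9600) -> float:
    """Seconds to move size_bytes over a link of bps bits per second."""
    return size_bytes * 8 / bps

# A hypothetical page: 30 KB of HTML plus 200 KB of JS/CSS libraries.
html, assets = 30 * 1024, 200 * 1024

cold = transfer_seconds(html + assets)  # first visit: everything downloads
warm = transfer_seconds(html)           # repeat visit: JS/CSS from cache

print(f"cold cache: {cold:.0f}s, warm cache: {warm:.0f}s")
```

      Three minutes down to under half a minute — which is roughly the experience the post describes, if (hah!) the cache headers are set correctly.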

  2. Anonymous Coward

    Perhaps Facebook could test their own app and site at these speeds. Considering it should just be plain text and low-res pictures, Facebook can be painful on 2G or fail to work at all.

  3. skeptical i

    Back when I was a boy ....

    We assumed everyone would be on dialup and built for decent download/performance at that speed, and those on a fat pipe got our pages screaming fast. Yeah, yeah, the internet's growed up a bit since then but the same principle applies, dunnit? Build lean and mean for the slowest connexion and take the faster-connexion performance as bonus gravy.

    1. Herby

      Re: Back when I was a boy ....

      Some people use dialup to this day. Unfortunately web sites don't understand this slowness. Perhaps this will educate them, but I'm not holding my breath!

      As for the "back when..." I used an ASR33 teletype. Programming a "reasonable size" project at that speed REALLY humbled you! It is a lesson many people still haven't learned!

      1. This post has been deleted by its author

    2. Anonymous Coward

      Re: Back when I was a boy ....

      We assumed everyone would be on dialup and built for decent download/ performance at that speed, and those on a fat pipe got our pages screaming fast.

      The industry practice at the turn of the century was apparently to design for a screen resolution of 800×600 and a 28.8Kbps modem. Horizontal scrollbars or load times greater than 20 seconds were to be avoided.

      I don't know what they design for today.
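      That quoted rule of thumb pins down a concrete page-weight budget, which is easy to check (the 28.8Kbps/20-second figures are from the comment above; the arithmetic is mine):

```python
# What page weight does "28.8 kbps modem, under 20 seconds" imply?
# This is raw line rate; real-world throughput would be lower still.

LINE_RATE_BPS = 28_800  # modem line rate, bits per second
BUDGET_S = 20           # maximum acceptable load time, seconds

max_bytes = LINE_RATE_BPS * BUDGET_S / 8
print(f"max page weight: {max_bytes / 1024:.0f} KB")
```

      Around 70 KB for the whole page — markup, images and all. Many sites today ship more than that in cookies and tracking scripts alone.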

  4. Buzzword

    Speed vs security

    Unfortunately one of the biggest bottlenecks on pre-3G speeds is the HTTPS handshake setup time. By the time client and server have agreed on encryption protocol and keys, you've already used several seconds.

    It's not too bad for apps, but it is particularly problematic for secure websites which grab resources from many different sites: each host needs a new HTTPS handshake, and you can't have any HTTP or the browser complains about insecure content. Without 3G you really need a browser like Opera Mini which renders content remotely.
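    The per-host setup cost described above can be roughed out with a simple model (the round-trip time, host count and handshake round-trip figures below are illustrative assumptions, not measurements — a full TLS 1.2 handshake is modelled as two round trips on top of one for TCP):

```python
# Worst-case serial estimate of connection-setup time for a page that
# pulls resources from several HTTPS origins.

def setup_seconds(hosts: int, rtt_s: float, tls_rtts: int = 2) -> float:
    """Setup time for `hosts` separate origins: 1 RTT for the TCP
    handshake plus `tls_rtts` RTTs for the TLS handshake, per host."""
    return hosts * (1 + tls_rtts) * rtt_s

# A 2G-ish round trip of ~600 ms, page pulling from 6 origins:
print(f"{setup_seconds(6, 0.600):.1f}s spent on handshakes alone")
```

    In practice browsers open connections in parallel and TLS session resumption cuts the round trips, so treat this as a serial worst case — but on a 600ms RTT even one full handshake is nearly two seconds before any content arrives.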

  5. RAMChYLD

    2G?

    Correct me if I'm wrong, but the screenshot shows a minimum of 100kbps down. IIRC real 2G (GPRS) was 64kbps down, more like 48kbps in real-world scenarios due to overhead. Dial-up 2G was even worse: I remember my Mitsubishi Trium with its 14.4kbps connection before overhead, because it needed to dial an arbitrary WAP gateway number instead of doing things over GPRS — GPRS apparently didn't exist then, only GSM. Those were the days. I remember using it to check Hotmail and make a reply, and do a Google search several times through the period I owned that phone. Blew my mind then, and looking back it blows my mind all over again how usable 14.4kbps actually was on a monochrome device with a 96x64 resolution. Nowadays, 64kbps on GPRS means only Opera Mini works, and even then surfing the net involves waiting quite a bit.

    Yes, I still fall to 64kbps every now and then. Bloody ISP throttles me down to GPRS speed when I run out of quota...

  6. IglooDude

    Yeah, 2G (and sometimes 3G) latency is a big consideration in some corners of the machine2machine space, too. Mostly you get the pushing of tiny config or data files back and forth, and don't much care if it is measured in minutes vice seconds, but you sometimes run across folks trying to replace ethernet with 2G for their not-particularly-well-tuned client/DB app and then insisting that the cellular network is broken.

  7. The Wegie

    Somebody sell National Rail a copy

    So that they can work out how useless their app is round here, where GPRS is all there is...

  8. rob miller

    imho this is an industry wide problem

    So many apps and systems I use just hang with a poor connection or a slow VPN. In the UK I have reasonably good networking - at least at home, but I've spent many years in developing countries where it is just sh*t. Seems like everything is written in fibre-land, and the builders think network latency is outside of their responsibility.

    Shame this suite clearly does not work as advertised, given how poor Facebook Messenger seems to be.

  9. Colin Miller

    Already done?

    There are already programs that will simulate slow or unreliable connections. See http://stackoverflow.com/questions/1094760/network-tools-that-simulate-slow-network-connection for a discussion.

    Netem on a linux bridge can add latency, re-order, duplicate and drop packets. It can do this either at a constant rate or in burst mode, which may be useful to simulate dodgy WiFi or GPRS connections. It can also do it to a different extent depending on which way the traffic is going.
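    For reference, a minimal sketch of driving netem from Python (the interface name and impairment values are illustrative examples; actually applying the qdisc needs root on Linux):

```python
# Build the tc command line for a netem qdisc that adds latency with
# jitter, packet loss and duplication. Values here are examples only.

def netem_args(dev, delay_ms, jitter_ms, loss_pct, duplicate_pct):
    """Return the argv for `tc qdisc add ... netem ...` on `dev`."""
    return ["tc", "qdisc", "add", "dev", dev, "root", "netem",
            "delay", f"{delay_ms}ms", f"{jitter_ms}ms",
            "loss", f"{loss_pct}%",
            "duplicate", f"{duplicate_pct}%"]

# Simulate a flaky GPRS-ish link on eth0:
# 300ms +/- 100ms latency, 2% loss, 1% duplication.
cmd = netem_args("eth0", 300, 100, 2, 1)
print(" ".join(cmd))
# Apply it for real with e.g. subprocess.run(cmd, check=True) as root;
# remove it again with: tc qdisc del dev eth0 root
```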

  10. J.G.Harston

    Reminds me of when I was testing some buffered data transfer code: I progressively throttled it down all the way to a one-byte buffer and it worked fine.
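    That one-byte-buffer check is a good property test for any read loop. A minimal sketch of the idea in Python (function and variable names are my own): a correct copy loop must produce identical output whatever the buffer size.

```python
# A correct buffered copy loop should yield the same bytes for any
# buffer size, right down to one byte per read.
import io

def copy_stream(src, dst, bufsize: int) -> int:
    """Copy src to dst, bufsize bytes at a time; return bytes copied."""
    total = 0
    while True:
        chunk = src.read(bufsize)
        if not chunk:  # empty read means end of stream
            break
        dst.write(chunk)
        total += len(chunk)
    return total

payload = b"some buffered data transfer" * 100
for bufsize in (4096, 7, 1):  # including the one-byte case
    out = io.BytesIO()
    assert copy_stream(io.BytesIO(payload), out, bufsize) == len(payload)
    assert out.getvalue() == payload
print("all buffer sizes OK")
```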
