In early 2013, the Advanced Wireless Technology Group (AWTG) undertook indoor and outdoor testing of the quality of experience on the four major UK networks in Central London, including a recently launched LTE service operating at 1.8 GHz. Telecom TV reports that:
“The quality of experience with static web browsing showed rough parity between WiFi and LTE. But AWTG also undertook a static FTP uplink/downlink test to measure throughput performance. WiFi beat LTE on the critical downlink performance but lost out to LTE on uplink”.
(Read the full Telecom TV article and more technical info from AWTG)
Interesting data indeed, but how conclusive is it? I would say that such a test says more about our perception of the service, and how we adapt to it, than about the technical performance. It seems to matter more that the service works really well when it’s actually available (typical for WiFi) than that it works decently everywhere (typical for LTE). WiFi may have “spotty coverage”, but in indoor environments the indoor-deployed WiFi has a huge performance advantage over the outdoor-deployed LTE systems, whose signals have to penetrate the walls. Have we adapted so much that we accept moving to an area where the WiFi coverage is good?
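A rough link-budget sketch makes the wall-penetration point concrete. All the numbers below are illustrative assumptions (typical transmit powers, distances and a nominal wall loss), not values from the AWTG test, and free-space path loss is optimistic for the outdoor LTE link, so the real gap is likely even larger:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# Indoor WiFi AP: ~20 dBm transmit power, 2.4 GHz, 10 m to the user, no walls
# (assumed illustrative values)
wifi_rx_dbm = 20 - fspl_db(10, 2.4e9)

# Outdoor LTE macro: ~46 dBm transmit power, 1.8 GHz, 300 m to the user,
# plus an assumed ~12 dB exterior-wall penetration loss
lte_rx_dbm = 46 - fspl_db(300, 1.8e9) - 12

print(round(wifi_rx_dbm, 1), round(lte_rx_dbm, 1))
```

Even with the LTE base station transmitting at several hundred times the WiFi power, the short indoor WiFi link ends up with the stronger received signal in this sketch, which is what the “works really well when it’s available” observation reflects.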
Watch also the discussion after Lauri Oksanen’s talk “Your Gigabyte a Day” (27 min in) on these issues at the Johannesberg Summit.
It would be interesting to know how the testing was done, even at a high level. For instance, how many users were simultaneously active (i.e. uploading/downloading) in the WiFi access point and the LTE cell when the tests were done?
Did the test users have similar SINR or RSRP in both the WiFi and LTE networks? What features were enabled in the LTE network, and how much bandwidth was available (e.g. 1.4, 3, 5, 10, 15 or 20 MHz)?
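To see why the cell load and channel bandwidth questions matter, here is a deliberately simplified sketch. The resource-block counts per LTE channel bandwidth are the standard 3GPP values; the equal-share scheduler and the average spectral efficiency of 1.5 bit/s/Hz are hypothetical illustration values, not measurements:

```python
RB_BANDWIDTH_HZ = 180e3  # one LTE resource block spans 180 kHz
# Standard 3GPP mapping: channel bandwidth (MHz) -> number of resource blocks
RESOURCE_BLOCKS = {1.4: 6, 3: 15, 5: 25, 10: 50, 15: 75, 20: 100}

def per_user_throughput_bps(channel_mhz, spectral_eff_bps_hz, active_users):
    """Equal-share estimate of one user's throughput in a loaded cell.

    Assumes the scheduler splits the cell evenly among active users
    (a simplifying assumption, not how any specific scheduler behaves).
    """
    usable_hz = RESOURCE_BLOCKS[channel_mhz] * RB_BANDWIDTH_HZ
    return usable_hz * spectral_eff_bps_hz / active_users

# A lone tester on a 20 MHz carrier vs. the same tester sharing with 9 others
alone = per_user_throughput_bps(20, 1.5, 1)    # 27 Mbit/s
shared = per_user_throughput_bps(20, 1.5, 10)  # 2.7 Mbit/s
```

A factor-of-ten difference from load alone, and another factor of more than ten between the smallest and largest channel bandwidths, which is why throughput comparisons without this information are hard to interpret.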
Without any of this information, I do not really understand how the results of the above article can be treated as “scientific”…