Well, I say newsflash… The actual workshop took place on Dec 9th. Forgive the late-ish update, as the new year vacation came in between.
Some observations based on cellular networks today:
- The numbers of subscribers and devices are still increasing, and data consumption is growing exponentially.
- Reliability in wireless networks is becoming more important.
- If implemented correctly, small cells boost the network performance considerably.
- Self-optimizing networks make today's deployments far more effective than 3G-era ones.
Some predictions concerning future wireless research and 5G in particular:
- Overall network capacity rather than peak data rates will be crucial. In connection with this concern, network densification will be an important research item.
- An authorized/licensed spectrum usage model which will promote better utilization of spectrum needs to be developed.
- Spectrum agility, i.e. how many different bands the network can utilize simultaneously, will be important.
- Aggregation of many carriers in the handset will be challenging, potentially driving up device costs.
- Driving the cost of LTE radios down is necessary for retiring the GSM network and also for M2M to be widely used.
- New RAN design is necessary to help simplify core network operation.
- We may well abandon the cellular concept in 5G.
- Network infrastructure must be designed such that most modifications can be performed in software, giving operators more flexibility. So, network virtualization will be important.
- There are still gains to be had in energy efficiency and spectral efficiency fronts.
- Massive MIMO, cooperative communications and network information theory (full duplex transmission) will be important in 5G.
- mm-wave technologies seem promising for providing the high rates and low latencies envisioned in 5G.
- 5G must have a very strong value proposition for operators to consider upgrading to 5G after LTE-A.
- High reliability and very low latencies will be crucial or will not be crucial in 5G, depending on who you ask. If they will be important, some PHY advancements will be necessary.
- D2D is going to be important or not so important, depending on who you ask.
This was the 9th International Workshop on Broadband Wireless Access (BWA), and it was co-located with IEEE Globecom 2013. Overall, the workshop was a learning experience; the keynote addresses and the panel discussion were both informative and thought-provoking. The hot question raised many times during the keynotes and the panel discussion was “What is 5G going to entail, and when is it going to happen?”, which incidentally was one of the major themes in Globecom 2013 as well.
Disclaimer: The following is not a verbatim transcript of the presentations and discussions in the workshop. Although I tried to relate the discussions as faithfully as possible, I may not have represented the presenters’ views with perfect accuracy. So please don’t blame the speakers for what you read here O:-)
The first keynote in BWA was delivered by Hank Kafka, who is the VP of Radio Access & Devices at AT&T. Here are some highlights from his speech and some points that I found to be particularly interesting.
Mr. Kafka started off with some remarks on trends in cellular networks today:
- There are more subscribers, more devices, and more sophisticated devices, e.g. tablets. As the number of devices increases, total data usage naturally increases. Additionally, newer devices with higher-resolution screens make it more enjoyable to consume video, so data consumption per user is also increasing. As a result, total data consumption grows exponentially.
- For the last 5 years, AT&T have been making predictions on growth of data usage every 3 months. Looking back, they observed that their predictions had always been smaller than the actual growth of data. Currently, the volume of data usage is already 6x more than the volume of voice usage; the predictions they made 5 years ago would not be anywhere near this figure. Therefore, when the research community is making predictions about 2020, there is a good probability that we will be off by some margin.
- The total number of connected devices is increasing: Not only people, but devices are connected. We are more and more dependent on these devices, therefore reliability becomes an issue.
- There is tremendous need to handle the increasing capacity demand, therefore LTE deployments have ended up with a very complex band structure, which is going to get even more complicated in the future.
- Densification is one approach to the capacity problem. Many new macrosites are being deployed, but also several times as many sites with small-cell capabilities. AT&T call these sites “metrocells”: if they can’t get a permit to deploy a full-fledged site, they deploy a metrocell on e.g. a street light pole. One interesting observation regarding the operation of metrocells is that the small cells take over some of the traffic from the macro cell, yet the macro cell also starts to serve more traffic than before, which is a synergistic and desirable effect: when small cells serve the traffic in indoor/bad-SNR conditions, the macro cell is in a good position to serve other outdoor users with good SNR. So, when implemented correctly, small cells boost network performance considerably.
- AT&T find self-optimizing networks highly effective compared to their 3G deployments; they are using LTE-A (Releases 10-12) to handle the data demand.
Some current and future-looking problems mentioned by Mr. Kafka:
- He predicts that peak-rate capability will not be the most important thing in the future. This is similar to the MHz competition in the CPU market during the last decade: CPU clock speed lost importance once computers became fast enough. Similarly, peak data rates will reach high enough levels, and consequently how much capacity you get out of the network will become the more important issue.
- As capacity demand continues to increase, we cannot afford to go through the standardization process for each different spectrum band. Therefore, spectrum agility (how many different bands the network can use at the same time) becomes an important issue.
- On the issue of machine-to-machine communication, producing low-cost LTE radios is an important challenge. If we are going to put a wireless unit in every small device like a dog-collar, we need to produce these units very cheaply. Moreover, achieving this low production cost is important to be able to phase out the GSM network so that the spectrum can be used by LTE-A technologies.
- The 3GPP standard currently defines 5-way carrier aggregation. AT&T and other companies have recently implemented 2-way carrier aggregation, and some vendors are considering 3-way. But larger aggregation schemes pose a problem: how do you implement a device that can perform 5-way carrier aggregation? There are some tunable devices, but the RF part is going to be a challenge. SDR is a useful approach, but an SDR solution is always more expensive than a purpose-built design, which brings us back to the spectrum agility and low production cost questions.
- Because of the increase in traffic usage, we are approaching a point where we cannot design the RAN independently of the core network. We will need to re-define the RAN such that it simplifies the handling of the huge volume of data in the core network.
- What is the future of LTE? How long is the “long” in LTE? Since 2008, AT&T has invested [a staggering amount of billions of dollars] in its cellular network. In 2014 alone, AT&T expects to invest [tens of billions of dollars more]. This means that, by the time 5G comes round, a lot of networks in a lot of countries will have invested many billions of dollars in LTE technologies. So, the question is: what will make operators invest in 5G? Looking at past generation changes: the move from 2G to 3G was justified by a 60% improvement in voice cost and a 45-70% improvement in data cost; the move from Release 6 to LTE was justified by increased throughput, increased spectral efficiency, high spectrum flexibility and lower latencies. There has to be a similarly compelling advantage to justify the change from LTE-A to 5G. That is, we need to provide better arguments than a 20% improvement in spectral efficiency.
Dr. Bill Payne, who is VP of Small Cells and CDMA at NSN, also focused on 5G in his keynote. His vision was that 5G would combine revolutionary technologies with existing ones. Until 2020 or thereabouts, we may accommodate a 1000x capacity increase through the evolution of LTE technologies, assuming approximately 10% userbase growth and 25-50% growth in data consumption year-on-year. But we need to look into different, possibly disruptive solutions for beyond 2020, when 5G will be needed to accommodate 10000x more capacity, near-zero latencies and 10-100 times more devices. As an example, Dr. Payne mentioned mm-wave technologies, which promise:
- Abundance of spectrum: 70-90 GHz.
- 10 Gbps peak rate, 100 Mbps cell-edge rate.
- Low latency, simple air interface.
- 100-150 meter inter-site distance.
- Massive antenna arrays implemented at chip scale.
The focus of the third keynote, delivered by Prof. Gerhard Fettweis, the Vodafone Chair Professor at TU Dresden, was the extremely low latency requirements in 5G, that is, the “tactile” Internet. Examples of the tactile Internet he gave in his speech were virtual reality applications and mobile robotics, i.e. industrial robots that are controlled wirelessly, which will require sub-millisecond latencies. A back-of-the-envelope calculation shows that a 1 ms latency requirement from sensor to actuator allows about 100 microseconds in the air interface. Such a short transmission time cannot be realized with LTE technologies, because the OFDM symbol time is on the order of 70 μs. Therefore, a future 5G technology will have to be designed so that resource blocks have larger bandwidth and shorter time duration. These requirements in turn raise problems concerning coherence bandwidth and delay spread: we can no longer assume the wireless channel to be flat fading. That is, there will be frequency selectivity across the subcarriers of an OFDM symbol, which will create inter-symbol interference. One solution that Prof. Fettweis mentioned is GFDM, which requires a novel approach to applying the cyclic prefix, filtering and pulse shaping. So we may see this as yet another argument that the PHY is not dead, I suppose.
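The symbol-time argument can be sketched in a few lines of Python. The ~7% cyclic-prefix overhead and the 60 kHz spacing below are illustrative assumptions on my part, not numbers from the talk:

```python
def ofdm_symbol_us(subcarrier_spacing_khz, cp_overhead=0.07):
    """OFDM symbol duration in microseconds: the useful part is the inverse
    of the subcarrier spacing, plus a cyclic prefix (~7% here, roughly the
    overhead of LTE's normal CP)."""
    useful_us = 1e3 / subcarrier_spacing_khz
    return useful_us * (1 + cp_overhead)

air_interface_budget_us = 100       # what a 1 ms sensor-to-actuator loop leaves
lte_symbol = ofdm_symbol_us(15)     # ~71 us: one symbol eats most of the budget
wider_symbol = ofdm_symbol_us(60)   # ~18 us: 4x wider subcarriers, 4x shorter symbols
```

With 15 kHz subcarriers a single symbol already consumes most of the 100 μs budget, which is why shorter-symbol, wider-subcarrier designs (and waveforms like GFDM) enter the picture.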
The other important issue Prof. Fettweis discussed was outage in applications that require less than 1 ms latency, such as manufacturing lines. He argued that we need to provide reliable, resilient, carrier-grade access with outage on the order of 10^-10, a requirement usually overlooked in wireless networks. Some more back-of-the-envelope calculations show that carrier-grade wireless access in e.g. a manufacturing line may generate data on the order of 100 Mbps, while virtual-reality-type applications with sub-millisecond latency requirements may generate data on the order of 100 Gbps. The sub-millisecond latency requirement also implies that data processing and storage centers cannot be too far from the data source, considering that light travels only 300 km per millisecond. Accordingly, there are already research ideas on the “mobile edge cloud”.
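The speed-of-light constraint on server placement can be checked the same way; the 0.6 ms allowance for air interface plus processing below is my own illustrative assumption:

```python
C_KM_PER_MS = 300.0  # light travels roughly 300 km per millisecond

def max_server_distance_km(latency_budget_ms, non_propagation_ms):
    """One-way distance to the farthest data center that still fits the
    budget once air-interface and processing time are subtracted;
    halved because the signal must make a round trip."""
    propagation_ms = latency_budget_ms - non_propagation_ms
    return C_KM_PER_MS * propagation_ms / 2

edge_limit = max_server_distance_km(1.0, 0.6)  # ~60 km: the server must be nearby
```

Even with zero processing time, the server could sit at most 150 km away under a 1 ms budget, which is the intuition behind the mobile-edge-cloud proposals.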
The highlight of the workshop, in my opinion, was the panel discussion titled “What will be the key changes in future broadband wireless access, and when can we expect these?” Panelists were Dr. Gabor Fodor (Ericsson), Dr. Chih-Lin I (China Mobile), Hank Kafka (AT&T), Dr. Bill Payne (NSN), Prof. Gerhard Fettweis (TU Dresden). The panel was moderated by Dr. Patrick Marsch (NSN).
Some bits that caught my attention from the panel discussion:
- Dr. Fodor argued that the paradigm defining 5G will be the “Networked Society”, which encompasses mobile broadband and the internet of things, in general a world where everything is connected. Technological developments enabling this vision will include massive MIMO, cooperative communications and network information theory. He emphasized that applying concrete results from network information theory in practice will matter more than solving nice mathematical problems. This remark aligns nicely with the concerns raised in the “Is the PHY layer dead?” article. Other areas of interest will be the development of an authorized/licensed spectrum access model for better utilization of spectrum, and infrastructure densification.
- Dr. Fodor also considers the possibility that handsets 2-3 times more expensive will appear in the market, containing some features of a pico base station and meant to act as relays, as I understand it. We may therefore expect the distinction between a base station and a device to blur. So, the research community should move from the uplink/downlink view of the network toward full-duplex transmission and network coding theory.
The presentation by Dr. Chih-Lin I (pronounced /ee/) was also very interesting. Some highlights from her presentation:
- She mentioned that in the last 5 years, data traffic in China Mobile’s network has grown 81x. Extrapolating from this figure, in the next 10 years we can expect roughly a 6000x growth in traffic volume, which falls between the 1000x and 10000x estimates.
- Her vision is that the network infrastructure should go “soft” as in software; that is, much of the infrastructure should be reconfigurable through modifications in software. This would enable operators to make changes to their networks in week/month time spans rather than years, which is the current situation in the communications industry.
- On energy efficiency: she argues that we should rethink the Shannon limit, in the sense that we should aim for the optimum energy efficiency vs. spectral efficiency operating point rather than pushing for the highest spectral efficiency. Her argument was that if we plot EE against SE, we find that increasing energy efficiency requires sacrificing spectral efficiency. If we also factor in the other energy-consuming parts of a communication system (not the backhaul, but aspects like signal processing, the PA and so on), the resulting EE vs. SE curve has an optimum point. Plotting the typical operating points of 2G/3G/4G systems on this curve shows that there is still improvement to be had, in terms of both EE and SE, in the cellular networks in use today.
- She noted that the cellular concept has been at the heart of network design since its inception by Ring & Young in 1947, and argued that it is time we start thinking outside the cellular concept. Similarly, signalling and control have to be redesigned to make them more intelligent and application-aware, for both connection-oriented and connectionless communication.
- We should try to build base stations in a small form factor that blends easily into the urban environment, so that they can be installed in many places, e.g. on building facades.
- She adds that full-duplex communication is exciting, but argues that the challenge in realizing full-duplex communication is on the networking side, not on the physical interface side.
- She emphasizes that there needs to be a way to find a better inter-operation between IEEE and 3GPP technologies.
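Dr. Chih-Lin's 81x-to-roughly-6000x extrapolation checks out as a simple compound-growth calculation:

```python
observed_5yr_growth = 81.0                       # China Mobile: 81x traffic in 5 years
annual_growth = observed_5yr_growth ** (1 / 5)   # ~2.4x per year, if growth is steady
ten_year_growth = annual_growth ** 10            # 81 squared = 6561x, i.e. "~6000x"
```

Two back-to-back five-year periods at the same rate simply square the 81x figure, landing between the 1000x and 10000x estimates quoted elsewhere in the workshop.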
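Her EE-vs-SE argument can also be illustrated numerically. The sketch below inverts the Shannon capacity formula to get the transmit power needed for a given spectral efficiency (noise power normalized to 1) and adds a fixed circuit-power term; the circuit power value is an illustrative assumption of mine, not a figure from her talk:

```python
def energy_efficiency(se, circuit_power):
    """Normalized bits-per-joule at spectral efficiency se (bits/s/Hz).
    The transmit power 2^se - 1 comes from inverting the Shannon formula
    se = log2(1 + P) with noise power normalized to 1."""
    tx_power = 2 ** se - 1
    return se / (circuit_power + tx_power)

# With transmit power alone, EE only falls as SE rises: the pure EE-SE tradeoff.
# Adding circuit power (signal processing, PA, etc.) gives EE(SE) an interior peak.
ses = [i / 100 for i in range(1, 801)]
best_se = max(ses, key=lambda s: energy_efficiency(s, circuit_power=1.0))
```

With circuit_power = 1 the optimum lands near 1.44 bits/s/Hz; the point of her argument is that systems should aim for that kind of joint optimum rather than maximum SE alone.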
Some other interesting points from the panel discussion:
- Question to the panel: it is argued that among the features of 5G will be very low latency and very high reliability. Do we need some disruptive change to achieve these goals? Is there any other design goal in 5G that warrants a disruptive change? In response, Mr. Kafka argued that the change need not be disruptive; sub-millisecond latencies can be achieved through evolutionary improvement of existing technologies. Dr. Fodor added that accommodating device-to-device communication is an important design goal in 5G, and that this goal too can be achieved without a disruptive change; that is, if the cellular network can facilitate D2D operation, then it will not be disruptive. An example is the device discovery being standardized in 3GPP Release 12: when the cellular network is down, devices discover each other and communication somehow works; when the cellular network is available, communication works much better.
- Hank Kafka emphasized the impact of network virtualization; he argued that software-defined networks must be considered in network design so that operators can use as many software-defined components as possible, thus minimizing the amount of dedicated hardware. I think this view is similar to Dr. Chih-Lin’s arguments on how the infrastructure should become “soft”.
- On the D2D note, Dr. Chih-Lin argued that D2D has always been possible, for example over Bluetooth. She then questioned the point of involving the network operator in D2D: users have to see a value worth paying for in order for the operator to come into D2D, and D2D is not compelling for users at the moment. She is also sceptical about the Internet of things. She argues that IoT will happen, but that there is no sense in connecting this tremendous number of devices directly to the 5G network; instead, IoT devices should access the 5G network through a gateway. She is likewise sceptical about the latency/reliability goal in 5G: low latency and high reliability are a worthy effort, she says, but you can always find a niche market interested in such qualities, so latency and reliability are not going to be the focus of 5G. I think it was rather interesting to hear the world’s biggest mobile operator’s take on some of the design goals of 5G as advocated by academia and infrastructure vendors. I suppose it is acceptable for cutting-edge research to make relaxed assumptions and set ambitious goals; still, it was interesting to see that some of these assumptions may not be justified from the operators’ point of view. In a way, when 5G was promising the moon, Dr. Chih-Lin’s argument was that it would be nice to have the moon, but most people don’t need that much cheese anyway, and besides, we don’t have the kind of fridge to store all of it. It was a refreshing opinion to hear, I think.
Looking back, I think the workshop was well worth the trouble of crossing the pond, even though we conference-goers had our share of mishaps. On the day we traveled, the air traffic control system at Heathrow had a meltdown. Thankfully, air traffic control found a vacant slot in a different air corridor (something like 6000 ft instead of the usual 10k ft), so our flight was not delayed much and we could catch our connecting flight, making our way through a sea of distraught travelers.
When we finally arrived in Atlanta, we were still down on our luck, because the weather there was quite miserable. We learned that the day before our arrival it had been a balmy 22 degrees Celsius. That was a pity, because I would have loved to see the city in finer weather. Shopping mall food left much to be desired, but luckily we came across a nice restaurant called “Aviva by Kameel”, which prepared Mediterranean dishes. If you happen to be near the Peachtree Center in downtown Atlanta, do give their aubergine with Parmesan a try. They also make a mean shawarma.
All in all, the contents of the keynotes and the panel discussion were terrific. I extend my thanks and congratulations to the workshop organizers for their hard work. Looking forward to the next installment of BWA.