LTE-U: Watch Out, Wi-Fi
Recent developments confirm my long-held belief that mobile operators would tap unlicensed spectrum to handle explosive data volumes on cellular networks.
A technical battle over unlicensed spectrum that has been playing out in the background found its way into the public eye last week via The Wall Street Journal. As discussed in the piece, "Cell Carriers Battle for Wi-Fi Airwaves," at issue is LTE-Unlicensed, a new way of increasing the amount of radio spectrum for 4G LTE networks, and its potential impact on Wi-Fi.
As the name indicates, the plan for LTE-Unlicensed, or LTE-U for short, is to allow cellular operators to use unlicensed radio spectrum, specifically the 5-GHz band now used for 802.11a, n, and ac wireless networks, to carry their LTE transmissions. (LTE-U builds on 3GPP Rel-10/11/12, while Rel-13 defines a related technique called Licensed Assisted Access, or LAA.) The outcome has ramifications for private Wi-Fi networks and for companies like Republic Wireless, Google, and Cablevision, all of which are looking at "Wi-Fi first" mobile services.
Hey, What's Going On Here?
Now, you might be asking, "With all of the expensive licensed spectrum the carriers own, why do they need unlicensed spectrum?" And, "Can they just bully their way in?"
The answer to the first question is obvious. The volume of data traffic on cellular networks is exploding while the cost of licensed spectrum continues to rise. Plus, there's only so much licensed spectrum coming onto the market. According to wireless association CTIA, cellular data usage increased 177%, from 1.47 to 4.06 exabytes, between December 2012 and December 2014. Other estimates show cellular data traffic doubling every year for the next five years. In the meantime, roughly 500 MHz is available in the 5-GHz band (compared to 83.5 MHz in the 2.4-GHz band), the assigned bands are relatively similar worldwide, and it's free!
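Those growth figures are easy to sanity-check. Here is a quick back-of-the-envelope sketch -- plain arithmetic on the numbers cited above, not data from any carrier:

```python
# Back-of-the-envelope check on the figures cited in the text.
# 1.47 -> 4.06 (in the units CTIA reports) over two years:
growth_pct = (4.06 / 1.47 - 1) * 100
print(f"Two-year growth: {growth_pct:.0f}%")  # ~176%, roughly the 177% cited

# "Doubling every year for the next five years" compounds fast:
traffic = 4.06
for year in range(5):
    traffic *= 2
print(f"Traffic after five annual doublings: {traffic:.1f} (a 32x increase)")

# Spectrum comparison: unlicensed bandwidth at 5 GHz vs. 2.4 GHz
print(f"5 GHz / 2.4 GHz unlicensed bandwidth: {500 / 83.5:.1f}x")
```

Five doublings of an already-strained load, against a fixed spectrum inventory, is the whole story in three lines of arithmetic.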
With regard to carriers' ability to push their way up to the bar, the answer is, "Yes, as long as they play by the rules." These rules state that every application and service using unlicensed spectrum must grant others fair use of it. Which is to say, everyone has equal rights to use the unlicensed spectrum, as long as they don't abridge the rights of other users.
Oh, The Irony!
For those of us who have worked in wireless for a time, this whole thing is somewhat ironic. Cellular engineers have traditionally turned their noses up at the idea of using unlicensed spectrum, what with its potential for interference. Rather, they've extolled the virtues of licensed spectrum, where there could be only one legitimate user and an assurance of service quality. However, the explosive growth in cellular data traffic following the introduction of Apple's iPhone and subsequent Android devices has driven mobile operators to their knees.
With available radio spectrum nowhere near the amount needed to augment the cellular network, the mobile industry has pursued a number of stratagems to stay ahead of the tidal wave. Standards bodies like 3GPP have come up with increasingly efficient signal encoding techniques -- from Enhanced Data rates for GSM Evolution (EDGE) to High Speed Packet Access (HSPA) to LTE and LTE-Advanced -- that allow more bits per second to be carried on the same amount of radio spectrum. The tricks include Multiple Input Multiple Output (MIMO) antenna systems, carrier aggregation (combining spectrum from different bands like 700, 800, and 1,900 MHz to form a larger channel), and protocol tweaks like Hybrid Automatic Repeat reQuest (HARQ).
From High Horse to HetNet
Even with those developments, Shannon's Law imposes a hard ceiling: only so many pounds fit in a 5-lb. bag. So in something close to desperation, the cellular engineers got down off their licensed high horses and started talking about Heterogeneous Networks, or "HetNets." The idea was to augment licensed network capacity with unlicensed spectrum, specifically Wi-Fi, particularly in traffic-dense environments, using small cells -- essentially a requirement, given the limited transmission range at those frequencies.
While the initial HetNet idea centered on Wi-Fi, the mobile industry (with Qualcomm taking the lead) is pushing LTE-U into the conversation for a number of reasons. The big advantage, advocates say, is that LTE in the 5-GHz band can deliver two to four times the capacity of Wi-Fi, even 802.11ac. While I haven't read through all of the engineering studies to support that, at face value it makes a lot of sense.
LTE and Wi-Fi are very different protocols. While they incorporate many of the same Layer 1 characteristics, like MIMO and Orthogonal Frequency Division Multiplexing (OFDM), the Layer 2 elements are vastly different -- with LTE delivering significantly more efficiency. Wi-Fi's access mechanism is based loosely on Ethernet's Carrier Sense Multiple Access/Collision Detection (CSMA/CD) and even has a similar name, CSMA/CA, that latter "A" standing for "Avoidance." In Wi-Fi, the access point and the client stations vie for access to the shared radio channel on an equal basis and take turns using it; in radio, we refer to that as Time Division Duplex, or TDD.
Cellular networks have always worked more along the lines of a master-slave relationship, where the base station is the master and schedules when the various mobile devices get to send. That was why incorporating capabilities like quality of service was relatively easy -- everyone knows who the "boss" is.
In addition, cellular designers focused on squeezing idle time out of the channel. But in Wi-Fi, every station waits for the channel to be idle for a defined period, called an Interframe Space (IFS), before initiating a transmission. If the channel is busy when the station goes to send, it backs off for a random interval, but doesn't start counting that timer down until the channel is clear.
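The contention mechanics just described can be sketched as a toy simulation. This is an illustrative model only -- the slot, IFS, frame, and contention-window constants are made up, collisions are ignored, and nothing here reflects real 802.11 timing:

```python
import random

# Toy CSMA/CA model: once the channel goes idle, every station waits out
# an inter-frame space (IFS), then counts down a random backoff timer,
# freezing the countdown whenever the channel is busy. The station with
# the smallest remaining backoff "wins" the channel (ties and collisions
# are ignored for simplicity). All constants are illustrative.

FRAME = 30      # airtime of one data frame, in arbitrary slot units
IFS = 3         # mandatory idle wait before contending
CW = 16         # contention window: backoff drawn from [0, CW)
STATIONS = 4

def simulate(total_time=10_000):
    backoff = [random.randrange(CW) for _ in range(STATIONS)]
    t, sent = 0, 0
    while t < total_time:
        t += IFS                         # channel idle: all stations wait the IFS
        winner = min(range(STATIONS), key=lambda s: backoff[s])
        elapsed = backoff[winner]
        t += elapsed                     # idle slots burned counting down
        for s in range(STATIONS):
            backoff[s] -= elapsed        # losers freeze what's left for next round
        backoff[winner] = random.randrange(CW)  # winner draws a fresh backoff
        t += FRAME                       # winner transmits; everyone else holds
        sent += 1
    return sent * FRAME / t              # fraction of airtime carrying data

random.seed(1)
print(f"Toy channel utilization: {simulate():.0%}")
```

The overhead terms -- the IFS plus the winning backoff before every frame -- are exactly the idle time an LTE scheduler avoids: with the base station assigning transmit opportunities centrally, no device has to sit out a mandatory wait and a random countdown before each transmission.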
Continue to Page 2 for a look at LTE-U and Wi-Fi coexistence