What are 1G, 2G, 3G and 4G networks?
The “G” in wireless networks refers to the “generation” of the underlying wireless network technology. Technically generations are defined as follows:
1G networks (NMT, C-Nets, AMPS, TACS) are considered to be the first analog cellular systems, which started in the early 1980s. There were radio telephone systems even before that. 1G networks were conceived and designed purely for voice calls with almost no consideration of data services (with the possible exception of built-in modems in some handsets).
2G networks (GSM, CDMAOne, D-AMPS) are the first digital cellular systems, launched in the early 1990s, offering improved sound quality, better security and higher total capacity. GSM supports circuit-switched data (CSD), allowing users to place dial-up data calls digitally, so that the network’s switching station receives actual ones and zeroes rather than the screech of an analog modem.
2.5G networks (GPRS, CDMA2000 1x) are the enhanced versions of 2G networks with theoretical data rates up to about 144kbit/s. GPRS offered the first always-on data service.
3G networks (UMTS FDD and TDD, CDMA2000 1x EVDO, CDMA2000 3x, TD-SCDMA, Arib WCDMA, EDGE, IMT-2000 DECT) are newer cellular networks that have data rates of 384kbit/s and more.
The UN’s International Telecommunication Union IMT-2000 standard requires stationary speeds of 2Mbps and mobile speeds of 384kbps for a “true” 3G.
4G technology refers to the fourth generation of mobile phone communication standards. LTE and WiMAX are marketed as parts of this generation, even though they fall short of the actual standard.
The ITU has taken ownership of 4G, bundling it into a specification known as IMT-Advanced. The document calls for 4G technologies to deliver downlink speeds of 1Gbps when stationary and 100Mbps when mobile, roughly a 500-fold and 250-fold increase over IMT-2000 respectively. Unfortunately, those specs are so aggressive that no commercialized standard currently meets them.
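Those multiples follow directly from the peak-rate targets; a quick sanity check in Python (figures as quoted above, nothing more):

```python
# Peak-rate targets quoted above, in bits per second.
imt2000_stationary = 2e6      # 2 Mbps
imt2000_mobile = 384e3        # 384 kbps
imt_adv_stationary = 1e9      # 1 Gbps
imt_adv_mobile = 100e6        # 100 Mbps

print(imt_adv_stationary / imt2000_stationary)  # 500.0 -> the "500-fold"
print(imt_adv_mobile / imt2000_mobile)          # ~260  -> the rough "250-fold"
```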
Historically, WiMAX and Long-Term Evolution (LTE), the standard generally accepted to succeed both CDMA2000 and GSM, have been marketed and labeled as “4G technologies,” but that’s only partially true: they both make use of a newer, extremely efficient multiplexing scheme (OFDMA, as opposed to the older CDMA or TDMA); however, WiMAX tops out at around 40Mbps and LTE at around 100Mbps theoretical speed. Practical, real-world commercial networks using WiMAX and LTE range between 4Mbps and 30Mbps. Even though the speed of WiMAX and LTE is well short of IMT-Advanced’s standard, they’re very different from 3G networks and carriers around the world refer to them as “4G”. Updates to these standards — WiMAX 2 and LTE-Advanced, respectively — will increase throughput further, but neither has been finalized yet.
2G, 3G, 4G, 4G LTE – What are They?
Quite simply, the “G” stands for Generation, as in the next generation of wireless technologies. Each generation is supposedly faster, more secure and more reliable. The reliability factor is the hardest obstacle to overcome. 1G was not used to identify wireless technology until 2G, or the second generation, was released. That was a major jump in the technology when the wireless networks went from analog to digital. It’s all uphill from there. 3G came along and offered faster data transfer speeds, at least 200 kilobits per second, for multi-media use and is still the standard for wireless transmissions regardless of what you hear on all those commercials. It is still a challenge to get a true 4G connection, which promises upwards of 1Gbps (a gigabit per second) transfer rate if you are standing still and in the perfect spot. True 4G on a widespread basis may not be available until the next generation arrives. 5G?
What are the Standards of the G’s?
Each of the Generations has standards that must be met to officially use the G terminology. Those standards are set by, you know, those people that set standards – the International Telecommunication Union (ITU). The standards themselves are quite confusing but the advertisers sure know how to manipulate them. I will try to simplify the terms a bit.
1G – A term never widely used until 2G was available. This was the first generation of cell phone technology. Simple phone calls were all it was able to do.
2G – The second generation of cell phone transmission. A few more features were added to the menu such as simple text messaging.
3G – This generation set the standards for most of the wireless technology we have come to know and love. Web browsing, email, video downloading, picture sharing and other Smartphone technology were introduced in the third generation. 3G should be capable of handling around 2 Megabits per second.
4G – The speed and standards of this technology of wireless needs to be at least 100 Megabits per second and up to 1 Gigabit per second to pass as 4G. It also needs to share the network resources to support more simultaneous connections on the cell. As it develops, 4G could surpass the speed of the average wireless broadband home Internet connection. Few devices are capable of the full throttle yet. Coverage of true 4G is limited to large metropolitan areas. Outside of the covered areas, 4G phones regress to the 3G standards. We have a ways to go. For now, 4G is simply a little faster than 3G.
4G LTE – Long Term Evolution – LTE sounds better. This buzzword is a version of 4G that has become the latest advertised technology but is still not true 4G as the standards are set. When you start hearing about LTE-Advanced and WiMAX Release 2, then we will be talking about true fourth generation wireless technologies, because they are the only two formats recognized by the International Telecommunication Union as true 4G at this time.
WiMAX – Worldwide Interoperability for Microwave Access – should be capable of around 40 megabits per second with a range of 30 miles. It is one of the closest technologies to meeting the standards of true 4G and as it develops should surpass the 100Mbps which is the 4G standard. Mobile WiMAX allows the use of high speed data transfers and is the main competition for the 4G LTE services provided by cellular carriers.
The major wireless networks are not actually lying to anyone about 4G, they simply stretch the truth a bit. A 4G phone has to comply with the standards but finding the network resources to fulfill the true standard is difficult. You are buying 4G capable devices but the network is not yet capable of delivering true 4G to the device. Your brain knows that 4G is faster than 3G so you pay the price for the extra speed. Marketing 101.
Where does it go from here and why does this page exist? Not sure where this path will lead but the reason I wrote this page was to try to understand the lingo a bit better. I think I cleared it up for myself so I thought I would pass it along. Hope it helps!
2G, 3G, Next-G, 4G – What’s the difference?
In this guide we’re going to cover off the major differences between the different ‘generation’ networks and check out some of the technical aspects of the different technologies. You’ll need to have some understanding of basic technological terms like megahertz. To make it easier we will link some of the terms back to our Understand the Jargon page.
Before we start: What exactly is a ‘G’ or ‘Generation’?
In a nutshell, each Generation is defined as a set of telephone network standards, which detail the technological implementation of a particular mobile phone system.
1G – Analog
In 1987, Telecom (known today as Telstra) introduced Australia’s first cellular mobile phone network, utilising a 1G analog system. The analog network was responsible for those bulky handheld ‘bricks’ that you might have had the displeasure of using and your wallet the displeasure of buying (originally retailed at around $4250).
The technology behind 1G was the AMPS (Advanced Mobile Phone System) network. Permanently switched off at the end of 1999, AMPS was a voice-only network operating on the 800MHz band. Being a primitive radio technology, AMPS operated in the same manner as a regular radio transmission, much like your UHF radio where the 800MHz band was split up into a number of channels (395 voice, 21 control) via FDMA (Frequency Division Multiple Access). Each channel was 30kHz wide and could support only one user at any time, meaning that the maximum number of mobile phone users per cell tower was 395. The tower assessed the signal strength of each user and assigned channels dynamically, ensuring that channels could be reused by multiple towers without interference.
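The channel arithmetic above can be sketched in a few lines of Python (figures from this paragraph only; a toy calculation, not a model of real AMPS behaviour):

```python
# AMPS FDMA channel arithmetic, using the figures quoted above.
channel_width_khz = 30
voice_channels = 395
control_channels = 21

total_channels = voice_channels + control_channels
spectrum_khz = total_channels * channel_width_khz

print(total_channels)       # 416 channels in total
print(spectrum_khz / 1000)  # 12.48 MHz carved out of the 800MHz band
# One user per voice channel means at most 395 simultaneous calls per tower.
```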
Problematic? Yes, and not just the limited number of users.
Just like your UHF radio, anyone with a radio scanner capable of receiving/transmitting on the 800MHz band could drop in on your call. Being analog, the 800MHz band was also susceptible to background noise and static caused by nearby electronic devices. However the simplicity of the AMPS design meant it did have one advantage over later 2G networks – coverage. An AMPS user could connect to a cell tower as far as the signal could be transmitted (often >40km depending on terrain).
At its peak, the 1G network had around 2 million subscribers.
2G – Digital
Fast forward to 1993: Telecom, now known as Telstra, introduced the digital network. The introduction came about to overcome many of the issues with the AMPS network highlighted above, with network congestion and security being the two most important motivators. With this new technology came many of the services we now take for granted – text messaging, multimedia messaging, internet access, etc – and it also introduced us to the SIM card.
This fancy new digital network is called GSM – Global System for Mobile Communication, and its technological backbone of choice is TDMA (used in combination with FDMA). The radio frequency band utilised by GSM is the 900MHz spectrum, later joined by the 1800MHz band.
So how is this network any better than AMPS? The secret lies in TDMA – Time Division Multiple Access. The FDMA component splits the 900MHz (actually 890MHz to 915MHz) band into 124 channels that are 200kHz wide. The ‘time’ component then comes into play, in which each channel is split into eight 0.577ms bursts, significantly increasing the maximum number of users at any one time. We don’t hear a ‘stuttering’ of a person’s voice thanks to the wonders of digital compression codecs, which we’re not going to go into here.
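Putting the FDMA and TDMA figures together shows where the capacity gain over AMPS comes from (a back-of-the-envelope sketch in Python, using only the numbers above):

```python
# GSM capacity per the figures above: FDMA carves the band into 200kHz
# carriers, then TDMA splits each carrier into 8 timeslots.
band_khz = (915 - 890) * 1000   # the 25 MHz of the 890-915MHz band
carrier_khz = 200
slots_per_carrier = 8

raw_carriers = band_khz // carrier_khz   # 125, of which 124 are usable
usable_carriers = 124

print(usable_carriers * slots_per_carrier)  # 992 simultaneous users per
                                            # cell, versus 395 on AMPS
```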
Aside from more users per cell tower, the digital network offers many other important features:
– digital encryption (64-bit A5/1 stream cipher)
– packet data (used for MMS/Internet access)
– SMS text messaging
– caller ID and other similar network features.
Problems? You bet. Unlike its AMPS predecessor, GSM is severely limited in range. The TDMA technology behind the 2G network means that if a mobile phone cannot respond within its given timeslot (0.577ms bursts) the phone tower will drop you and begin handling another call. Aside from this, packet data transmission rates on GSM are extremely slow, and if you’re on Vodafone/3/Virgin/Optus you’ve probably had first-hand experience of this when you go outside your network’s defined ‘coverage zone’.
To overcome these two problems we’re going to introduce two new networks – CDMA and EDGE.
CDMA – Code Division Multiple Access. This branch of 2G was introduced by Telstra in September 1999 as a replacement for customers who could receive a good signal on AMPS, but were outside GSM’s limited range. The extended range is achieved by replacing the ‘time’ based multiplexing with code-based multiplexing. A lower frequency band (800MHz) also assisted in range through reduced path loss and attenuation.
Picture a room full of people having conversations – under TDMA each person takes their turn talking (ie time division); conversely, CDMA allows many people to talk at the same time but is the equivalent of each person speaking a different language, ie in a unique code. This of course isn’t exactly how it works; if you want to know more there are some resources at the bottom of the page.
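The ‘different languages’ analogy can be made concrete with a toy spreading-code sketch. The two orthogonal codes below are my own illustrative choice; real CDMA layers much longer codes, power control and carrier modulation on top of this idea:

```python
# Toy code-division example: two users transmit at the same time, each
# spreading their bit with an orthogonal code ("language").
code_a = [1, 1, 1, 1]      # user A's code
code_b = [1, -1, 1, -1]    # user B's code, orthogonal to A's

def spread(bit, code):
    # Map bit 1 -> +1 and bit 0 -> -1, then multiply by the spreading code.
    s = 1 if bit else -1
    return [s * c for c in code]

# Both users talk at once; the channel carries the element-wise sum.
channel = [a + b for a, b in zip(spread(1, code_a), spread(0, code_b))]

def despread(signal, code):
    # Correlate against one user's code: the other user's signal cancels.
    return sum(s * c for s, c in zip(signal, code)) > 0

print(despread(channel, code_a))  # True  -> user A sent a 1
print(despread(channel, code_b))  # False -> user B sent a 0
```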
EDGE – Enhanced Data Rates for GSM Evolution. A GPRS-based packet data network was introduced on GSM in 2001, with a max speed of around 60-80kbps (downlink), equating to a download speed of 10kB/s – slightly faster than dial-up.
EDGE was later introduced as a bolt-on protocol (no new technology was required) increasing the data rate of the 2G network to around 237kbps (29kB/s).
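The bracketed kB/s figures used throughout this guide are simply the bit rates divided by 8 (8 bits per byte):

```python
# Convert link speeds quoted in kilobits per second into kilobytes
# per second, as done throughout this guide.
def kbps_to_kBps(kbps):
    return kbps / 8

print(kbps_to_kBps(80))   # 10.0 kB/s -> the GPRS figure above
print(kbps_to_kBps(237))  # 29.625 kB/s -> the EDGE figure (rounded to 29)
```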
3G – The Mobile Broadband Revolution
Introducing the 2100MHz network. Three Mobile, in conjunction with Telstra, brought the 3G standard to life in 2005, servicing major metropolitan areas initially and expanding coverage over the following years to 50% of the Australian population. Leased out to Optus/Vodafone/Virgin, the 2100MHz network combined with a 900MHz network forms the basis of all non-Telstra mobile broadband services, servicing around 94% of Australian residences.
The 3G standard utilises a new technology called UMTS as its core network architecture – Universal Mobile Telecommunications System. This network combines aspects of the 2G network with some new technology and protocols to deliver a significantly faster data rate.
The base technology of UMTS is the WCDMA air interface which is technologically similar to CDMA introduced earlier, where multiple users can transmit on the same frequency by use of a code based multiplexing. Wideband CDMA (WCDMA) takes this concept and stretches the frequency band to 5MHz. The system also involves significant algorithmic and mathematical improvements in signal transmission, allowing more efficient transmissions at a lower wattage (250mW compared to 2W for 2G networks).
The new network also employs a much more secure encryption algorithm when transmitting over the air. 3G uses a 128-bit A5/3 stream cipher which, unlike A5/1 used in GSM (which can be cracked in near real-time using a ciphertext-only attack), has no known practical weaknesses.
So how is 3G faster than EDGE?
UMTS employs a protocol called HSPA – High Speed Packet Access, which is a combination of the HSDPA (downlink) and HSUPA (uplink) protocols. The Telstra HSDPA network supports category 10 devices (speeds up to 14.4Mbps down), however most devices are only capable of category 7/8 transmission (7.2Mbps down), and its HSUPA network supports category 6 (5.76Mbps up). These protocols improve the transport layer through a complex arrangement of physical layer channels (HS-SCCH, HS-DPCCH and HS-PDSCH). The technological implementation of HSPA will not be discussed here.
The only major limitation of the 3G network is, not surprisingly, coverage. As stated earlier, the 2100MHz network is available to around 50% of Australia’s population and, when combined with a 900MHz UMTS network, to about 94%. As expected, the higher 2100MHz component suffers far more attenuation and free-space path loss (FSPL), and is often considered a ‘short range’ mobile network, which is why a lower 900MHz network is required to service many regional and rural areas.
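The frequency penalty can be estimated with the standard free-space path loss formula (real-world loss is worse once terrain, buildings and antenna gains are involved, so treat this as a trend, not a prediction):

```python
import math

def fspl_db(distance_km, freq_mhz):
    # Standard free-space path loss formula, in dB, for distance in km
    # and frequency in MHz.
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Same 10km path, on the two UMTS bands discussed above:
loss_900 = fspl_db(10, 900)
loss_2100 = fspl_db(10, 2100)

print(round(loss_2100 - loss_900, 1))  # ~7.4 dB of extra loss at 2100MHz
# -- one reason the higher band is considered 'short range'.
```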
Next-G – 3G on Steroids
To overcome the coverage limitations of regular 3G, Telstra introduced its Next-G network (considered a ‘3.5G’ network) in late 2006, operating on the 850MHz spectrum. The lower radio frequency coupled with a far greater number of phone towers is responsible for Telstra’s Next-G network being over twice the geographical size (around 2.2 million square km) of any other network, and servicing 99% of Australian residences.
Aside from coverage, the other major selling point behind the Next-G network is its blisteringly fast network speed. Rated up to 42Mbps (up to 5.25MB/s) the network has the ability to operate faster than the theoretical maximum of most high speed cable internet services. This is the result of an enhanced packet data network – HSPA+ which was implemented in 2008 as an upgrade to large portions of the Telstra network.
HSPA+, also known as Evolved HSPA, utilises Dual Carrier technology and 64QAM modulation to deliver these high speeds. HSPA+ is responsible for the ‘Elite’ and ‘Ultimate’ series modems released in 2010, with the Elite capable of up to 21Mbps, and the Ultimate up to 42Mbps.
The Ultimate series modems theoretically double the speed of the Elite device through the use of Dual Carrier HSPA+. This big increase in speed is achieved by the use of dual antennas; you can think of an Ultimate modem as having two Elite modems in the one unit. Combining this technology with MIMO (“Multiple In Multiple Out”) architecture, we can hope to see speeds increased to 84Mbps (ie doubling the 42Mbps) on the Telstra Next-G network in the near future.
If you’d like a simple explanation of Next-G, I’d recommend this (somewhat corny) video series produced by Telstra http://www.telstra.com.au/mobile/nextg/index.html?vid=1
4G – LTE
Initially available in major cities, airports and selected regional areas in October 2011, Telstra’s 4G network offers significantly faster speeds, lower latency, and reduced network congestion.
The 4G network is based on LTE – 3GPP Long Term Evolution. LTE is a series of upgrades to existing UMTS technology and will be rolled out on Telstra’s existing 1800MHz frequency band. This new network boosts peak download speeds up to 100Mbps and upload speeds up to 50Mbps, reduces latency from around 300ms to less than 100ms, and significantly lowers congestion. For more technical details on peak 4G speeds check out our fastest 4G speed guide.
In most areas of Australia, 4G has a 15MHz bandwidth and operates on the following frequency ranges:
Tower Tx: 1805-1820MHz
Tower Rx: 1710-1725MHz
New South Wales and Victoria have a much smaller bandwidth of 10MHz and operate on the following frequencies:
Tower Tx: 1805-1815MHz
Tower Rx: 1710-1720MHz
4G bandwidth (ie the width of frequencies we can send and receive on) is critical in supporting high speeds and a high number of users. In order for your connection not to get confused with someone else’s, each user is allocated a small sliver of frequencies that they can transmit on and nobody else can. You’ll notice this most during peak usage hours: as more people start using the tower, the width of your (and everyone else’s) sliver of frequencies shrinks, resulting in each person getting a reduced download/upload speed.
Naturally this is a very simplified explanation (for more info read up on OFDMA and SC-FDMA) but for our purposes it will suffice.
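The sliver-shrinking effect can be sketched as a simple proportional share. (A real LTE scheduler allocates discrete resource blocks every millisecond, so this only shows the trend, not actual behaviour:)

```python
# As more users join the cell, each one's share of the carrier shrinks.
cell_bandwidth_mhz = 15   # the wider 15MHz deployment described above

for users in (1, 5, 15, 50):
    sliver = cell_bandwidth_mhz / users
    print(f"{users:3d} users -> {sliver:.2f} MHz each")
```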
When will I get 4G?
Telstra 4G is advertised as available within 5km of CBD areas and airports offering speeds between 2Mbps and 40Mbps. When launched the network was limited to major towns and cities, but since late 2011 the network has expanded to include most major regional towns, with plans to cover about 66% of the Australian population by mid-2013 by deploying 1000 new base stations.
The Telstra Next-G 850MHz network will no doubt remain the backbone of Australian mobile coverage, with LTE 1800MHz serving in high density residential and metro areas, effectively creating a ‘hybrid’ network. Multi-mode and multi-frequency 4G modems such as the Telstra 320U USB allow seamless transition between 4G and Next-G networks when on the move; often a slight pause or delay is the most you’ll notice when your modem switches over to the other network.
Given the big reduction in coverage you might be wondering why Telstra chose to deploy its 4G network on the 1800MHz band. Like most decisions the biggest factor governing the choice is money. Already licensed by Telstra, the underutilised 1800MHz network was previously used to provide 2G voice calling and text messaging services, and 2G EDGE data services (often indicated by the ‘E’ symbol on your phone). By converting this band from 2G over to 4G, the network can be deployed with drastically reduced cost and time to market. Instead of building new cell towers, the existing 1800MHz antennas could be swapped with antennas designed for MIMO LTE services and other hardware changes kept to a minimum.
The limited choice of available mobile spectrum means that for the next few years 1800MHz will remain the band of choice for 4G services. Around 2015 the 700MHz “digital dividend” band will become available and we can expect to see a much higher performing 4G network with far greater coverage, speed and signal penetration.
What about backhaul?
With a massive increase in speed, how can the cell tower transmit and retrieve all this extra data from the Internet? Your 4G connection is only as fast as what the phone tower can provide you. Older EDGE or HSPA networks can get away with E1 or optical fibre backhaul links (ie the link that connects the tower into the wider network), but LTE services require a far more advanced Ethernet-based backhaul link. The transition from circuit-switched to packet-switched (IP based) networks affords better QoS (through MPLS and other link/network layer protocols) and significant reductions in latency.
4G uses a technology called MIMO “Multiple In Multiple Out” where your modem uses two separate antennas at once to deliver super fast speeds.
Normal 3G and Next-G signals are broadcast vertically polarised, where the wave travels “up and down”. LTE MIMO waves are slant polarised where each wave is rotated 45 degrees from the horizontal, mirrored so the first is at 45 degrees and the other at 135 degrees. This smart little trick is called polarisation diversity and allows your modem to distinguish two independent streams of data over the same frequency allocated by the cell tower.
Because our modem has two internal antennas each responsible for receiving one stream of data, it is absolutely crucial we have two separate external antennas. We cannot use a ‘Y’ patch lead or some other trick to connect both ports of the modem into one antenna, nor can we connect both external antennas into one port.
It is important to know MIMO is switched on and off by the modem. The decision whether to use MIMO is negotiated with the cell tower, whereby the quality of the received and transmitted signals are assessed (a metric known as CQI). When signal strength or quality is low it’s difficult for the modem to distinguish between the two data streams, so when signal levels drop below a certain threshold level, MIMO is switched off and the modem operates with only one antenna (Port 1 on Sierra Wireless modems).
What are Bluetooth, WiFi and WiMAX?
Bluetooth, WiFi and WiMAX are wireless technologies which allow devices to inter-connect and communicate with each other. Radio waves are electromagnetic waves with different frequencies, and these technologies communicate over radio frequencies, similar to analogue AM or FM radio. Bluetooth works on the 2.45GHz frequency. WiFi works in two frequency bands, 2.4GHz and 5GHz. WiMAX works in two frequency bands, 2 – 11GHz and 10 – 66GHz. See chart below for a comparison of these technologies.
Named after the Danish king Harald Bluetooth, it was the first to emerge; several devices like mobile phones, PDAs, headsets, keyboards, mice, medical equipment and even cars now come with this feature. Due to its low cost, manufacturers are willing to implement this technology in most devices. It is designed for short range communications, with a range of about 10m. As a result, it consumes little power and is suited to very small battery powered devices and portable devices. Problems associated with devices communicating via infrared or cables are removed: infrared requires a line of sight, while Bluetooth only needs to be in reasonable vicinity. As cables are not required, it is less cumbersome to carry a personal Bluetooth device and space is less cluttered. As Bluetooth devices automatically communicate with each other, very little is required from the user. With its short range, Bluetooth allows for a wireless Personal Area Network (PAN). For more technical resources and information relating to Bluetooth see the official Bluetooth site.
WiFi, or Wireless Fidelity, has a range of about 100m and allows for faster data transfer rates of between 10 – 54Mbps. There are three different wireless standards under WiFi: 802.11a, 802.11b and 802.11g, 802.11 being the wireless standard set by the Institute of Electrical and Electronics Engineers (IEEE). WiFi is used to create wireless Local Area Networks (WLANs). The most widely used standard is 802.11b, and 802.11g is expected to grow rapidly. These two standards are relatively inexpensive and can be found providing wireless connectivity in airports, railway stations, cafes, bars, restaurants and other public areas. The main difference between the two is speed: 802.11b has a data transfer rate of up to 11Mbps and 802.11g a rate of up to 54Mbps. 802.11g is relatively new and has yet to be adopted widely. 802.11a is more expensive and as a result is not available for public access.
WiMAX is Worldwide Interoperability for Microwave Access. The IEEE standard for WiMAX is 802.16, and it falls under the category of wireless Metropolitan Area Network (WMAN). WiMAX operates on two frequency bands, 2 – 11GHz and 10 – 66GHz, and has a range of about 50km with speeds of up to 80Mbps. This enables smaller wireless LANs to be interconnected by WiMAX, creating a large wireless MAN. Networking between cities can be achieved without the need for expensive cabling. It is also able to provide high speed wireless broadband access to users. As it can operate in two frequency bands, WiMAX can work by line-of-sight and non-line-of-sight. At the 2 – 11GHz frequency range it works by non-line-of-sight, where a computer inside a building communicates with a tower/antenna outside the building; lower frequency transmissions are not easily disrupted by physical obstructions. Higher frequency transmissions are used for line-of-sight service, which enables the towers/antennae to communicate with each other over a greater distance. Due to the infrastructure and costs involved, it is more suited to providing backbone services for ISPs and large corporations providing wireless networking and internet access.
Wireless Technology Comparison Chart
| |Bluetooth|WiFi (a)|WiFi (b)|WiFi (g)|WiMAX|
|---|---|---|---|---|---|
|Frequency (GHz)|2.45|5|2.4|2.4|2 – 66|
|Advantages|Low Cost|Speed|Low Cost|Speed|Speed, Range|
The two leading laptop processor manufacturers are AMD and Intel. A wide range of processors is available from both manufacturers to suit various needs, from the very basic to high end graphics and multimedia usage. Intel’s latest mobile offering is the Intel Pentium M processor with Centrino technology. AMD has come up with the 64-bit architecture, Mobile AMD Athlon 64. Both companies are manufacturing smaller and lighter processors. The architecture is based on 90nm process technology, 130nm (0.13-micron) being the standard used so far in process technology.
Mobile processors have integrated wireless connectivity, allowing notebooks to connect to 802.11a, b and g based networks. Another key development is low voltage and ultra-low voltage operation for power efficiency. There are several other technologies offered by both AMD and Intel; some of the main ones are listed below. These enhance performance by running faster and loading applications sooner, thereby saving power and prolonging battery life.
Check the processor section of Tom’s Hardware Guide for more information about processors.
- Optimizes battery life
- Provides performance on demand when required by the application
- Allows the processor to dissipate less heat under normal operating conditions, providing a cooler and quieter running notebook
- Operates automatically in the background
- Combined with an AMD Mobile processor, this technology delivers outstanding 3D graphics and multimedia capabilities
- Reduces I/O bottlenecks
- Increases system bandwidth
- Reduces system latency
- Provides extra security at the platform level in conjunction with Microsoft Windows XP Service Pack 2
- Intelligent power distribution
- Power optimized logic design – optimizes consumption and dissipation levels for lower CPU average power
- Automatically adjusts and powers down to preserve battery life whenever possible. (When using less computing intensive tasks)
- Reduces the number of instructions per task
- Reduces the number of micro-ops per instruction
- Reduces the number of transistor switches per micro-op
- Reduces the amount of energy per transistor switch
Enhanced Intel SpeedStep Technology
- Minimizes system and processor unavailability
- Self-managed voltage and frequency stepping
- Optimizes power and performance according to demand
CDMA vs. GSM: What’s the Difference?
If you’re shopping for a mobile phone, you’re in for a lot of acronyms. Here’s what you need to know about two basic, yet important, terms.
Two basic technologies in mobile phones, CDMA and GSM represent a gap you can’t cross. They’re the reason you can’t use many AT&T phones on Verizon’s network and vice versa. But what does CDMA vs. GSM really mean for you?
CDMA (Code Division Multiple Access) and GSM (Global System for Mobiles) are shorthand for the two major radio systems used in cell phones. Both acronyms tend to group together a bunch of technologies run by the same entities. In this story, I’ll try to explain who uses which technology and what the real differences are.
Which Carriers are CDMA? Which are GSM?
In the U.S., Sprint, Verizon and U.S. Cellular use CDMA. AT&T and T-Mobile use GSM.
Most of the rest of the world uses GSM. The global spread of GSM came about because in 1987, Europe mandated the technology by law, and because GSM comes from an industry consortium rather than a single company, which made it less expensive for third parties to build GSM equipment. What we call CDMA, by and large, is owned by chipmaker Qualcomm.
There are several variants and options carriers can choose, like toppings on their technological ice cream. In this story we’ll focus on U.S. networks.
What CDMA vs. GSM Means to You
For call quality, the technology you use is much less important than the way your carrier has built its network. There are good and bad CDMA and GSM networks, but there are key differences between the technologies. Here’s what you, as a consumer, need to know.
It’s much easier to swap phones on GSM networks, because GSM carriers put customer information on a removable SIM card. Take the card out, put it in a different phone, and the new phone now has your number. What’s more, to be considered GSM, a carrier must accept any GSM-compliant phone. So the GSM carriers don’t have total control of the phone you’re using.
That’s not the case with CDMA. In the U.S., CDMA carriers use network-based white lists to verify their subscribers. That means you can only switch phones with your carrier’s permission, and a carrier doesn’t have to accept any particular phone onto its network. It could, but typically, U.S. carriers choose not to.
Many Sprint and Verizon phones now have SIM cards, but that isn’t because of CDMA. The SIM cards are generally there for Sprint’s and Verizon’s 4G LTE networks, because the LTE standard also uses SIM cards. The phones may also have SIM slots to support foreign GSM networks as “world phones.” But those carriers still use CDMA to authenticate their phones on their own home networks.
3G CDMA networks (known as “EV-DO” or “Evolution Data Optimized”) also, generally, can’t make voice calls and transmit data at the same time. Once more, that’s an available option (known as “SV-DO” for “Simultaneous Voice and Data Optimization”), but one that U.S. carriers haven’t adopted for their networks and phones.
On the other hand, all 3G GSM networks have simultaneous voice and data, because it’s a required part of the spec. (3G GSM is also actually a type of CDMA. I’ll explain that later.)
So why did so many U.S. carriers go with CDMA? Timing. When Verizon’s predecessors and Sprint switched from analog to digital in 1995 and 1996, CDMA was the newest, hottest, fastest technology. It offered more capacity, better call quality and more potential than the GSM of the day. GSM caught up, but by then those carriers’ paths were set.
It’s possible to switch from CDMA to GSM. Bell and Telus in Canada have done it, to get access to the wider variety of off-the-shelf GSM phones. But Verizon and Sprint are big enough that they can get custom phones built for them, so they don’t see the need to waste money switching 3G technologies when they could be building out their 4G networks.
The Technology Behind CDMA vs. GSM
CDMA and GSM are both multiple access technologies. They’re ways for people to cram multiple phone calls or Internet connections into one radio channel.
GSM came first. It’s a “time division” system. Calls take turns. Your voice is transformed into digital data, which is given a channel and a time slot, so three calls on one channel look like this: 123123123123. On the other end, the receiver listens only to the assigned time slot and pieces the call back together.
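The round-robin interleaving described above can be sketched in a few lines of Python. This is a toy illustration with made-up sample labels, not real GSM framing (actual GSM uses 8 slots per frame and far more elaborate channel coding):

```python
# Toy sketch of time-division multiplexing (TDMA), as in 2G GSM:
# three calls share one channel by taking turns, one sample per slot,
# so the transmitted stream reads 1-2-3-1-2-3-...

def tdma_mux(calls):
    """Interleave equal-length call streams round-robin into one channel."""
    return [sample for frame in zip(*calls) for sample in frame]

def tdma_demux(channel, n_calls, slot):
    """A receiver listens only to its assigned time slot."""
    return channel[slot::n_calls]

calls = [["1a", "1b", "1c"], ["2a", "2b", "2c"], ["3a", "3b", "3c"]]
channel = tdma_mux(calls)
# channel is ['1a', '2a', '3a', '1b', '2b', '3b', '1c', '2c', '3c']
call_two = tdma_demux(channel, n_calls=3, slot=1)
# call_two is ['2a', '2b', '2c'] -- call 2, pieced back together
```

The key property is that the receiver never needs to understand the other calls; it just counts slots.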
The pulsing of the time division signal created the notorious “GSM buzz,” a buzzing sound whenever you put a GSM phone near a speaker. That’s mostly gone now, because 3G GSM (as I explain later) isn’t a time division technology.
CDMA required a bit more processing power. It’s a “code division” system. Every call’s data is encoded with a unique key, then the calls are all transmitted at once; if you have calls 1, 2, and 3 in a channel, the channel would just say 66666666. The receivers each have the unique key to “divide” the combined signal into its individual calls.
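Code division is easier to see with numbers. The sketch below uses tiny 4-chip Walsh codes (real CDMA systems use much longer codes) and represents bits as +1/-1; the point is only to show that calls transmitted simultaneously can be separated again because the codes are mutually orthogonal:

```python
# Toy sketch of code division (CDMA): each call spreads its bits with a
# unique orthogonal code, everyone transmits at once, and the signals
# simply add together in the channel. Each receiver correlates the
# combined signal with its own code to "divide" out its call.

CODES = {                      # mutually orthogonal 4-chip spreading codes
    "call1": [1, 1, 1, 1],
    "call2": [1, -1, 1, -1],
    "call3": [1, 1, -1, -1],
}

def spread(bits, code):
    """Replace each +1/-1 bit with the code (or its negation)."""
    return [b * c for b in bits for c in code]

def despread(signal, code):
    """Correlate each code-length chunk with the code; the sign is the bit."""
    n = len(code)
    chunks = [signal[i:i + n] for i in range(0, len(signal), n)]
    return [1 if sum(x * c for x, c in zip(ch, code)) > 0 else -1
            for ch in chunks]

bits = {"call1": [1, -1], "call2": [-1, -1], "call3": [1, 1]}
tx = [spread(bits[k], CODES[k]) for k in CODES]
channel = [sum(chips) for chips in zip(*tx)]   # signals add in the air

for k in CODES:
    assert despread(channel, CODES[k]) == bits[k]  # every call recovered
```

Because the codes are orthogonal (their pairwise dot products are zero), the other calls cancel out during correlation, which is why the combined signal can look like meaningless sums yet still carry every call intact.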
Code division turned out to be a more powerful and flexible technology, so “3G GSM” is actually a CDMA technology, called WCDMA (wideband CDMA) or UMTS (Universal Mobile Telecommunications System). WCDMA requires wider channels than older CDMA systems, as the name implies, but it has more data capacity.
Since its inception, GSM has evolved faster than CDMA. As I mentioned above, WCDMA is considered the 3G version of GSM technology. To further speed things up, the 3GPP (the GSM governing body) released extensions called HSPA, which have sped GSM networks up to as fast as 42Mbps, at least in theory.
Our CDMA networks, meanwhile, are stuck at around 3.1Mbps (the peak downlink speed of EV-DO Rev. A). While faster CDMA technologies exist, U.S. carriers chose not to install them and have instead turned to 4G LTE to be more compatible with global standards.
The Future is LTE
The CDMA vs. GSM gap will close eventually as everyone moves to 4G LTE, but that doesn’t mean everyone’s phones will be compatible. LTE, or “Long Term Evolution,” is the new globally accepted 4G wireless standard. All of the U.S. carriers are turning it on. For more, see 3G vs. 4G: What’s the Difference?
The problem is, they’re turning it on in different frequency bands, with different 3G backup systems, and even, in the case of the new Sprint Spark network, using an LTE variant (TD-LTE) that doesn’t work with any other U.S. carrier’s phones. There are very few phones that support all of the carriers’ LTE bands.
Verizon has said it aims to start selling LTE-only phones in 2015, but for now, those will require special Verizon software to make voice calls, so that move won’t make it any easier to switch carriers with your phone. Even without CDMA, the CDMA philosophy of carrier control of your phone will remain intact.
A growing number of phones support all of these standards, but it can be hard to tell which ones. The iPhone 6, the iPhone 6 Plus and the Google Nexus 6 are the most flexible. iPhone 6 and 6 Plus units from AT&T, T-Mobile, and Verizon can all be used on all three carriers, but they lack Sprint’s special LTE bands. Sprint iPhones have all the bands, but Sprint has strict unlocking policies. Nexus 6 phones will technically work on all four carriers, but Sprint only allows phones purchased from Google or Sprint on its network.
HTC One (M8) and Samsung Galaxy S5 phones from Verizon will work, somewhat, on AT&T’s and T-Mobile’s networks, albeit with limited coverage: they have CDMA, GSM and LTE radios, but they lack some of the frequency bands AT&T and T-Mobile use. Variants of those same models sold by AT&T and T-Mobile won’t work on Verizon at all, because they lack the CDMA radio Verizon requires. It’s a mess.
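The band-matching problem described above boils down to set intersection: a phone gets LTE on a carrier only where their supported bands overlap. The sketch below is illustrative only; the band lists are a simplified snapshot of the major U.S. carriers in this era (Verizon’s band 13, Sprint’s TD-LTE band 41, and so on), not a complete or current database, and the example phone is hypothetical:

```python
# Illustrative sketch: LTE compatibility as band-set overlap.
# Simplified band lists for the era discussed above (not exhaustive).

CARRIER_LTE_BANDS = {
    "Verizon": {13, 4},
    "AT&T": {17, 2, 4},
    "T-Mobile": {4, 2},
    "Sprint": {25, 26, 41},   # band 41 is the TD-LTE "Spark" band
}

def lte_compatible(phone_bands, carrier):
    """True if the phone shares at least one LTE band with the carrier."""
    return bool(phone_bands & CARRIER_LTE_BANDS[carrier])

phone_bands = {13, 17, 4, 2}  # a hypothetical phone's supported bands
works_on = [c for c in CARRIER_LTE_BANDS if lte_compatible(phone_bands, c)]
# Sprint drops out: this phone lacks bands 25, 26 and 41
```

Note that overlap on a single band often means degraded coverage rather than full service, which is exactly the “works somewhat” situation described above.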
So what does all of this mean for you? If you want to switch phones often, use your phone in Europe, or use imported phones, just go with GSM. Otherwise, pick your carrier based on coverage and call quality in your area and assume you’ll probably need a new phone if you switch carriers. Our Readers’ Choice and Fastest Mobile Networks awards are a great place to start.