GSMA Report Reveals Licensed Spectrum For Mobile Offers Best Possible Economic Benefit

- Spectrum Sharing Agreements for Mobile Broadband Should Not Replace the Need for Exclusive Spectrum Licensing

LONDON, Feb. 11, 2014 /PRNewswire/ -- The GSMA today issued a new report indicating that shared spectrum can complement, but in no way replaces, the need for exclusive-access spectrum in the provision of mobile broadband. The report, "The Impacts of Licensed Shared Use of Spectrum", developed by Deloitte, highlights how the strict limitations associated with Licensed Shared Access (LSA)(1) spectrum agreements - such as shorter licence terms, build obligations, lack of certainty and small allocations - can significantly reduce mobile operators' willingness to invest. As a result, the potential economic benefits derived from spectrum sharing are ultimately lower than those achieved through exclusive-access spectrum.

"The GSMA commends efforts by regulators around the world to rapidly find a solution for the current spectrum crunch," said Tom Phillips, Chief Regulatory Officer, GSMA. "While sharing schemes could provide a complementary approach to ease rapidly growing demand for spectrum, exclusive access to spectrum for mobile use is the optimal regulatory approach, providing the necessary market certainty to stimulate investments in networks and services."

The report is based on a model that assesses the prospective value of two potential Licensed Shared Access scenarios: the release of 50MHz in the European Union in the 2.3GHz band from 2020, and of 100MHz in the 3.5GHz band in the United States from 2016. The many variables, together with the additional risks, complexities and uncertainties involved in spectrum sharing, mean that each sharing opportunity should ideally be evaluated on a case-by-case basis, making a generalised approach impossible. Findings from the report include:

European Union:

  • Exclusive licensed spectrum in the 2.3GHz band could add €86 billion (US $116 billion) to the EU's economy in the period 2016-2030.
  • Shared licensing could sharply reduce economic benefits to €70 billion (US $95 billion), or as low as €5 billion (US $6.7 billion), due to the lack of a common approach to spectrum allocation across the Member States, combined with significant geographic and timing exclusions and potential contracting limitations.

United States:

  • For the same time period, exclusive spectrum licensing in the 3.5GHz band would add US $260 billion (€192 billion) to the US economy.
  • If sharing terms strictly limit the use of spectrum by mobile operators, this value would fall sharply to US $210 billion (€155 billion), or to as little as US $7 billion (€5 billion).
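As a quick sanity check of the figures above (a sketch based only on the headline numbers quoted in this release, not on the report's underlying model), the reductions can be expressed as the share of the exclusive-licensing value lost under restrictive sharing terms:

```python
# Headline figures from the release, in billions (EU in EUR, US in USD).
eu_exclusive, eu_shared_high, eu_shared_low = 86, 70, 5      # 2.3GHz band, 2016-2030
us_exclusive, us_shared_high, us_shared_low = 260, 210, 7    # 3.5GHz band, 2016-2030

def reduction_pct(exclusive, shared):
    """Percentage of the exclusive-licensing value lost under sharing."""
    return round(100 * (exclusive - shared) / exclusive, 1)

print(reduction_pct(eu_exclusive, eu_shared_high))  # 18.6 (best sharing case, EU)
print(reduction_pct(eu_exclusive, eu_shared_low))   # 94.2 (worst case, EU)
print(reduction_pct(us_exclusive, us_shared_high))  # 19.2 (best sharing case, US)
print(reduction_pct(us_exclusive, us_shared_low))   # 97.3 (worst case, US)
```

Even in the more favourable sharing scenarios, roughly a fifth of the projected value is lost; in the worst cases, sharing restrictions erase over 90 per cent of it.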

The report is released amidst continued rapid growth in mobile traffic and consumer demand for smartphones, tablets and other devices that provide access to communications and information services. The study further finds that the release of exclusive-access spectrum for mobile broadband offers wider socio-economic benefits for the United States and European Union over the period 2016-2030, including future job creation. It is estimated that the deployment of mobile broadband would generate approximately 2.1 million jobs in the US and nearly 1.6 million jobs in the EU across this period.

"Spectrum is the lifeblood of the mobile industry. To attract investment and reap the full economic benefits of mobile broadband, regulators need to provide access to a critical mass of spectrum," continued Phillips. "For the EU and US, this can be achieved through the harmonisation of bands, on similar contractual terms and conditions, as well as limited geographic and timing exclusions. For these reasons, shared spectrum is not a substitute for exclusive-access spectrum, and governments and regulators should not fully rely on shared spectrum for the provision of mobile broadband in the future."

To access the report please visit: http://www.gsma.com/spectrum/the-impact-of-licensed-shared-use-of-spectrum/

Note to Editors
(1) The report defines Licensed Shared Access as an individual licensing regime for a limited number of licensees in a frequency band already allocated to one or more incumbent users. This type of spectrum sharing model involves a "vertical" sector, such as the military or broadcasters, selling, leasing or otherwise providing access to its underutilised licensed spectrum to a mobile operator in areas or at times when it is not being used.

About the GSMA
The GSMA represents the interests of mobile operators worldwide. Spanning more than 220 countries, the GSMA unites nearly 800 of the world's mobile operators with 250 companies in the broader mobile ecosystem, including handset and device makers, software companies, equipment providers and Internet companies, as well as organisations in industry sectors such as financial services, healthcare, media, transport and utilities. The GSMA also produces industry-leading events such as Mobile World Congress and Mobile Asia Expo.

For more information, please visit the GSMA corporate website at www.gsma.com. Follow the GSMA on Twitter: @GSMA.
