By PR Newswire
March 4, 2014 08:07 AM EST
SYRACUSE, N.Y., March 4, 2014 /PRNewswire-USNewswire/ -- Researchers at the School of Information Studies (iSchool) at Syracuse University have released an innovative proposal to resolve the 15-year controversy over the United States government's special relationship to the Internet Corporation for Assigned Names and Numbers (ICANN).
The proposal, which involves removing root zone management functions from ICANN and creating an independent, neutral private-sector consortium to take them over, will be presented at the ICANN meeting in Singapore on March 21, and then formally submitted to the "NETMundial" Global Multistakeholder Meeting on the Future of Internet Governance in São Paulo, Brazil, to be held April 23 and 24.
"We think this plan provides the roadmap for making ICANN into a truly global and multistakeholder institution," said Dr. Milton Mueller, professor at the iSchool and the proposal's co-author.
ICANN is a key institution in the global governance of the Internet. It develops policy for the domain name system and also plays an important role, with Verisign, Inc., in managing the root of the Internet's name and address spaces. Both Verisign and ICANN fulfill their respective operational and policy making roles under separate contracts with the U.S. government. ICANN's contract with the U.S. is known as the IANA Functions contract.
While the contracts are an understandable legacy of the Internet's origins in the U.S. Defense Department and National Science Foundation contracts, the U.S. has maintained control of ICANN long after it promised to let go. This has invited other governments, including authoritarian ones, to demand equal oversight authority over the DNS. "Unless we take a consistent and principled approach to non-governmental Internet governance," Dr. Mueller claimed, "it is only a matter of time before other governments succeed in bringing the coordination and management of the Internet under the control of intergovernmental treaty organizations."
Dissatisfaction with the exclusive U.S. role reached a turning point in October 2013, when the directors of all the major Internet technical organizations (ICANN, the Internet Engineering Task Force, the Internet Architecture Board, the World Wide Web Consortium, the Internet Society, and all five of the regional Internet address registries) issued a statement calling for "the globalization of ICANN and IANA functions, towards an environment in which all stakeholders, including all governments, participate on an equal footing."
Mueller's proposal is an attempt to develop a blueprint for globalization of the IANA functions. In summary, the plan would:
- Structurally separate the IANA functions from ICANN's policy process, and ensure that the IANA functions are never used for political or regulatory purposes
- Integrate the DNS-related IANA functions with the Root Zone Maintainer functions performed by Verisign, and put them into a new, independent "DNS Authority" (DNSA)
- Create a nonprofit controlled by a consortium of TLD registries and root server operators to run the DNSA
- Complete the transition by September 2015, when the current IANA contract expires
"It's essential not to conflate policy with the operation of the root zone," said Dr. Brenden Kuerbis, the co-author of the study and a postdoctoral researcher at the iSchool. "It makes sense to put operational authority in the hands of an entity comprised of the registries and root server operators, as they are directly impacted by operation of the root, and have strong incentives to ensure its stability and security."
"Contractually binding the DNSA to ICANN ensures adherence to the policy development process, and provides an important accountability function," Kuerbis added. "It's an institutional design that is consistent with the multistakeholder model and achievable in the near term."
A more detailed paper outlining the proposal is available on the Internet Governance Project website. The proposal was also formally submitted to the NETMundial (Brazil) meeting on March 2.
Housed at the iSchool, the Internet Governance Project is an alliance of academics in the fields of global governance, Internet policy, and information and communication technology. The Project conducts research on and publishes analysis of global Internet policy issues. Its goals are to:
- Inform and shape Internet public policy choices by providing independent analysis and timely recommendations
- Identify and analyze new possibilities for improving Internet governance institutions
- Develop policy positions guided by the values of globalism, democratic governance, and individual rights
SOURCE School of Information Studies (iSchool) at Syracuse University