Is the Internet Fragmenting, Part 2: The Technical Lens
On Wednesday, June 15, Microsoft and the Greater Washington DC Chapter of the Internet Society (ISOC-DC) held Part 2 of our series on Fragmentation, “Is the Internet Fragmenting? The Technical Lens” at the Microsoft Innovation and Policy Center. Stakeholders from government, industry, the technical community, civil society, and other organizations examined how technology choices are fragmenting the Internet and the role of technology in business and policy decisions.
This event is Part 2 of a four-part series of dialogues organized in response to recent developments that have prompted alarming questions about whether the Internet is fragmenting. These developments span a diverse set of technical, economic, and policy decisions taken in response to the continued growth and globalization of the Internet and its evolving role as critical infrastructure for the digital economy. Taken together, they raise an overarching concern about whether the global Internet may be fragmenting from a universal system due to the intended or unintended consequences of technical, commercial, and/or political decisions taken without full consideration of their potential impact.
Dr. David Farber
Adjunct Professor of Internet Studies and Distinguished Career Professor of Computer Science and Public Policy, School of Computer Science, Carnegie Mellon University
A PANEL DISCUSSION FEATURING:
Dr. Eric Burger
Research Professor of Computer Science and Director, Security and Software Engineering Research Center, Georgetown University
Internet Policy Advisor, U.S. Department of State
Mr. Elliot Lear
Principal Engineer, Cisco Systems
Dr. Milton Mueller
Author & Professor, Georgia Institute of Technology School of Public Policy
Ms. Suzanne Woolf
Internet infrastructure consultant, Member of Internet Architecture Board, Liaison to ICANN Board of Directors for the Root Server System Advisory Committee
Dr. M-H. Carolyn Nguyen – Moderator
Technology Policy Strategist, Microsoft
The keynote speaker, Dr. David Farber, opened the dialogue by stating that fragmentation is a relatively recent phenomenon. The Internet was initially designed to connect researchers around the world so they could share information and results; as such, issues like security and privacy were not addressed. Researchers were focused on how to make the Internet work – this is still the primary focus of the technical community, and thus fragmentation is more political than technical. Differing views on IPR, privacy, and security drive government policy and regulations that fragment the Internet. As a result, Dr. Farber predicts even more fragmentation in the future. First, the protocol structure, the machines we use, and the software that runs on them are not robust enough, which makes changes hard to implement. Second, security will be one of the biggest causes of fragmentation: security concerns will make countries hesitant to connect to networks and resources in other countries unless there is assurance that their information and systems won’t be compromised. Consequently, more and more governments will try to impose their own encryption standards and require that their citizens’ data be stored either in a secure place or, more commonly, within their sovereign borders. These issues are not technically unsolvable, but they are very challenging at the policy level.
Dr. Milton Mueller did not think that the Internet is in danger of any major technical fragmentation. The economic and technical benefits of compatibility are what make the system valuable. Fragmentation is caused by misalignment between political goals and technology, e.g., when governments try to bring Internet communications under their territorial control. Data localization is an example of this misalignment between political control and technical capabilities: it fragments the services of cloud providers and destroys efficiencies, but it doesn’t break the Internet. The big danger is that governments want to make the global Internet match their maps of territorial sovereignty.
Mr. Elliot Lear discussed some of the challenges intrinsic to maintaining one stable technical layer. A stable layer 3 allows anyone to connect with anyone else, with only a few exceptions. The shortage of IPv4 addresses was the primary reason for the development of IPv6, but the transition has been very slow and difficult – without some very stark regulations, it could cause fragmentation. Higher-layer fragmentation, however, can be solved by the market. Instant messaging, for example, has numerous standards and applications, yet market needs have driven the development of programs that provide interoperability between them.
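The scale gap that motivated the IPv6 transition can be made concrete with a short calculation. The sketch below uses Python's standard `ipaddress` module; the specific example addresses are drawn from the documentation-reserved ranges, not from the discussion itself:

```python
import ipaddress

# IPv4 addresses are 32 bits; IPv6 addresses are 128 bits.
ipv4_total = 2 ** 32    # ~4.3 billion addresses, now effectively exhausted
ipv6_total = 2 ** 128   # ~3.4 x 10^38 addresses

print(f"IPv4 space: {ipv4_total:,}")
print(f"IPv6 space is {ipv6_total // ipv4_total:,} times larger")

# The two address families do not interoperate directly; during the
# transition a host must run both ("dual stack") to reach everyone,
# which is one reason the migration has been slow.
v4 = ipaddress.ip_address("192.0.2.1")    # documentation range (RFC 5737)
v6 = ipaddress.ip_address("2001:db8::1")  # documentation range (RFC 3849)
print(v4.version, v6.version)  # 4 6
```

The key point is that IPv6 is not a superset of IPv4 on the wire: a v4-only host and a v6-only host cannot talk without a translation layer, which is exactly the kind of partial incompatibility the panel flagged as a fragmentation risk.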
Mr. Lear mentioned that governments can cause fragmentation. For example, if one government mandates a certain type of encryption and another mandates a different type, people in the two jurisdictions will not be able to connect. The same holds if one government mandates a particular routing paradigm and another mandates a different one. We saw a little of this friction between the ITU and the IETF between 2010 and 2012.
Ms. Suzanne Woolf stated that even asking these questions assumes a certain level of interoperability in the infrastructure, and there are quite a few challenges in maintaining its stability as the Internet continues to grow and evolve. Just keeping up is a big challenge, and it can be very difficult to deploy new technologies – witness the difficulty of updating the DNS protocol standards. There can be good reasons within networks for deploying new technologies that inadvertently create challenges for the infrastructure, such as networks that create fast lanes for select content. Ms. Woolf is a big proponent of open standards and open source as keys to maintaining interoperability and interconnection.
Dr. Eric Burger framed fragmentation as what happens when a country deliberately cuts itself off from the Internet. Technology is neutral – neither good nor evil. The same spam filters that service providers use to protect consumers from harmful code can be used by governments to block dissent. The Great Firewall of China was deployed by the government to protect its citizens from malware as well as to control dissent. A deeper problem occurs when the bad guys figure out ways around spam filters and use encryption; the good guys then have to look deeper to identify sources of disruption. Again, the same tools that are used to disrupt criminal activity can be used to silence dissent. Good policy and regulation are necessary.
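The dual-use point can be illustrated with a toy example. This is a hypothetical sketch, not any provider's or government's actual filter: the matching logic is identical in both cases, and only the blocklist it is configured with reflects a policy choice.

```python
def make_filter(blocklist):
    """Return a function that flags messages containing any blocked term."""
    terms = [t.lower() for t in blocklist]

    def is_blocked(message):
        text = message.lower()
        return any(t in text for t in terms)

    return is_blocked

# The identical code, configured two ways (both term lists are invented
# for illustration):
malware_filter = make_filter(["claim your prize", "invoice.exe"])
dissent_filter = make_filter(["protest", "opposition rally"])

print(malware_filter("Click here to claim your prize!"))  # True
print(dissent_filter("Join the protest on Saturday"))     # True
```

Nothing in the code distinguishes consumer protection from censorship; that distinction lives entirely in the configuration, which is why the panel kept returning to policy rather than technology as the decisive factor.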
According to Ms. Micaela Klein, much of the misalignment noted earlier in the discussion is related to geopolitics rather than technology. Governments are trying to retain power in a more decentralized world, especially in countries that feel that development has favored Western nations, and so they gravitate toward centralized inter-governmental institutions such as the UN. Data localization requirements are the result of protectionist policies.
Technology standardization is a very relevant part of this discussion. Governments are leveraging inter-governmental institutions such as the ITU to drive standardization of technologies that will give them competitive advantages, instead of participating in traditional voluntary technical standards organizations such as the IETF. Instead of technical discussion on interoperability and the development of voluntary standards, specific standards are mandated. This process can choke innovation, and it will affect underserved countries more than the U.S.
Some takeaways from the discussion include the following:
- A big part of the challenge stems from policy that is misaligned with technology.
- Much of the tension is the result of governments trying to impose sovereignty over elements of the Internet.
- At the same time, technology is value neutral, and only informed policy can create solutions to many of the issues driving fragmentary pressures.
- The Internet is growing and evolving rapidly. Implementing new technologies to keep up is challenging and can itself create fragmentary pressures; the shortage of IPv4 addresses and the transition to IPv6 is an example.
- Internet innovations require technology standardization processes that are voluntary, and not mandated by inter-governmental organizations.