Are You Happy With Your Internet? A Discussion about Broadband Competition with Blair Levin
What is the state of broadband competition in the US? Ask four experts and you will get about six different answers. How does the US stack up against other advanced economies? What can and should be done to improve broadband competition in the US? ISOC-DC recently hosted a special discussion with Blair Levin, currently a Brookings Non-Resident Fellow, about policies for intensifying competition in the broadband era. Drawing on lessons learned as a senior government official in the development and aftermath of the 1996 Telecom Act and the 2010 National Broadband Plan, as well as his work with efforts such as Gig.U and Republic Wireless, Levin outlined a framework for what society should want competition to deliver, where competition comes from, and the current policy levers most likely to intensify competition.
Suggested Reading
Does the FCC’s Open Internet Order Survive a Cost-Benefit Test? These 13 Economists Don’t Think So. By Hal Singer
The FCC’s Competition Agenda By Hal Singer
Does The Tumble In Broadband Investment Spell Doom For The FCC’s Open Internet Order? By Hal Singer
Broadband Opportunity Council, Report and Recommendations Pursuant to the Presidential Memorandum on Expanding Broadband Deployment and Adoption by Addressing Regulatory Barriers and Encouraging Investment and Training (August 20, 2015). Co-Chairs: Secretary Penny Pritzker, U.S. Department of Commerce; Secretary Tom Vilsack, U.S. Department of Agriculture
The Next Generation Network Connectivity Handbook: A Guide for Community Leaders Seeking Affordable, Abundant Bandwidth, Vol. 10, July 2015, Gig.U
Achieving Bandwidth Abundance: The Three Policy Levers for Intensifying Broadband Competition Version 1.1
Blair Levin – Internet Policy Forum, Fall ‘15
Today I want to talk about competition and broadband. Much has been written on the topic. In the last year there have been three speeches, by Chairman Wheeler[1], FCC General Counsel Sallet[2], and Antitrust Division Chief Baer[3], which have been both important and wise. My comments do not conflict with theirs, but I will address questions outside the scope of their remarks. Consistent with their official positions, they made policy pronouncements on regulatory approaches and merger analysis. My comments are more of a progress report, a work in progress from the field, deriving from game theory and the lessons I learned in government with both the 1996 Act and the National Broadband Plan, as well as from working with broadband competition initiatives such as Gig.U and Republic Wireless. My comments do, however, conflict with a great deal of what has been written about competition and broadband. I could cite many examples, but let me offer two illustrations:
- Techdirt blogger Karl Bode’s piece arguing that Google Fiber proved the worthlessness of the National Broadband Plan, ignoring that the Plan was the stimulus for Google’s fiber effort, that Google and the Plan made similar recommendations for policy changes, and, most of all, that his own proposal, unbundling, would have killed Google Fiber; and
- Chairman Genachowski’s speech articulating the need for gigabit networks, which offered no analysis of why we don’t have them and no strategy for getting them deployed, other than to “challenge” cities and states to cause them to be built, as if the only thing holding us back was his failure to act, or the only power the FCC had was to say “pretty please.”
I have critiqued the substance of these pieces elsewhere[4], but for purposes of today’s talk, what Mr. Bode and Chairman Genachowski had in common was a belief in the magic of words, as if broadband existed in some Harry Potter-like universe in which the incantation of the word “competition” or “gigabit,” said often enough or loudly enough, is a substitute for a realistic plan followed by concrete steps to achieve it. Sadly, much of the commentary on the topic suffers from a similar flaw. This fundamental aspiration error[5], the belief that policy thought leadership is the mere statement of aspiration, affects much of the debate about broadband. Too many only wish to own a narrative, instead of owning a problem.[6]

Owning a problem requires starting with a framework, but then engaging in action, allowing for experimentation, and course-correcting in light of evidence. The trial and many errors of my own work have led me to the following bottom line: the highest priority for government broadband competition policy ought to be to lower input costs for adjacent market competition and network upgrades.[7]

Today I will make the case for that bottom line and illustrate where I think the greatest opportunity is: to create a virtuous cycle in which upgraded mobile stimulates low-end wired broadband to upgrade, which in turn causes high-end wired broadband to upgrade, which, by using its assets to enter mobile, pushes mobile to accelerate its own upgrade further. My purpose today is not so much to convince you that I am right as to move the broadband competition discussion away from the emptiness of much of what is written and toward the reality of how enterprises gain incentives to invest in the faster, cheaper, better delivery of bits. And if someone has a better bottom line, great. But let me start the discussion by telling you how I got to mine. I start with three questions:
- What do we want broadband competition to accomplish?
- Where does broadband competition come from?
- Given the current market, what are the appropriate government levers to intensify competition at this part of the cycle?
First, what do we want broadband competition to accomplish? Competition is generally thought of as the means, not the end, of improving consumer welfare. That is, we believe competition is the most likely means to deliver the optimal goods and services. In the debate leading up to the 1996 Act and in its implementation, the vision was increased competition in all communications markets, but most of the debate focused on the voice market. The outcome sought was clear: lower prices.[8]

Broadband is different. There are a number of variables we wish competition to deliver. The two most prominent are lower prices and improved performance[9], but ubiquity, security, privacy protection, and providing a platform for free and diverse speech, among others, are also desired outcomes. Optimizing for multiple factors makes policy calls more complex than aiming for a single goal.[10] Different policies can deliver better outcomes on some metrics and worse outcomes on others, requiring decisions about priorities and trade-offs for which there may be no “right” answer. This makes competition more important, as competition can optimize for multiple factors according to what customers want more adroitly than a policy process can.

My own view of what we want competition to deliver at this point in the cycle is the elimination of bandwidth as a constraint on innovation, economic growth, and social progress.[11] As the global economy moves from being primarily about the manipulation and transportation of atoms to being primarily about knowledge exchange, bandwidth becomes our commons of collaboration, and bandwidth constraints would present a major obstacle to economic and social progress.[12] Further, I believe that goal is likely to be achieved when there are at least two next generation networks capable of delivering all foreseeable needs for the next decade, each with a viable upgrade path. With only one such network, economic forces will likely price the marginal use of bandwidth at a level that constrains growth and progress. Thus, we need multiple networks to upgrade to next generation networks. In short, we want competition to help move us from today’s world, where the dominant business model is allocating bandwidth scarcity, to the world we need, where the competition is over who can best deploy bandwidth abundance.[13]

Second, where does broadband competition come from? In my experience, there are two related answers.[14] The first goes to the nature of the competitive enterprise and the second involves an economic equation. There are only four kinds of enterprises capable of intensifying competition: existing players and new entrants, with new entrants coming in three varieties:
- Greenfield new entrants, constituting new ventures;
- Adjacent market entrants who bring asymmetric assets and interests into the market; and
- Entrants who depend on inputs sold on a wholesale basis, a strategy that can include regulated access to unbundled elements.
All four follow three similar economic patterns. First, intensified competition always requires a new capital allocation decision by one of those four kinds of enterprises. Perhaps now my conservative friends are nodding and my progressive friends are getting nervous. But as a factual matter we should all agree that every time we have seen intensified competition, it follows a company shifting its capital allocation from one purpose to the purpose of providing or upgrading a communications service.

A second pattern is that the new capital allocation decision follows a change in the same formula. Ask yourself, why don’t we have more intense competition now? The answer is this inequality:

C + O > (1 − r)R + SB + CL

That is, the reason we don’t have greater competition is that the new or incremental capital and operating expenses (C and O) of a network capable of intensifying competition are greater than the risk-adjusted new or incremental revenues ((1 − r)R), plus the benefits to the system (SB)[15], plus the revenue the enterprise risks losing to competition if it does not make the investment (CL).[16] If we want to intensify competition, we have to change that math, causing, where possible, cap ex, op ex, and risk to go down and revenues, system benefits, and the competitive stakes to go up.
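To make that inequality concrete, here is a minimal numerical sketch in Python. The function name and every dollar figure are hypothetical assumptions for illustration only; the point is simply that policy changes which lower cap ex, op ex, and risk, or which raise the revenue at stake from competition, can flip the build/no-build decision.

```python
# Toy sketch of the capital-allocation inequality C + O > (1 - r)R + SB + CL.
# All names and numbers are hypothetical, chosen only to illustrate the logic.

def upgrade_is_rational(capex, opex, revenue, risk_discount,
                        system_benefits, revenue_at_risk):
    """Return True when expected gains exceed costs, i.e. the firm builds."""
    gains = (1 - risk_discount) * revenue + system_benefits + revenue_at_risk
    costs = capex + opex
    return gains > costs

# Status quo: high build costs, heavily risk-discounted revenue, no competitive
# threat -> the rational choice is to harvest, not build.
print(upgrade_is_rational(capex=700, opex=200, revenue=900,
                          risk_discount=0.4, system_benefits=50,
                          revenue_at_risk=0))    # False

# Policy levers lower cap ex, op ex, and risk; an adjacent-market entrant puts
# existing revenue at risk -> the same firm now chooses to build.
print(upgrade_is_rational(capex=500, opex=150, revenue=900,
                          risk_discount=0.2, system_benefits=50,
                          revenue_at_risk=100))  # True
```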
Third, historically, the biggest changes in the competitive landscape in communications have resulted from changes in the formula, which themselves resulted directly from changes in government policy.[17] This is where progressives get interested and conservatives get nervous. I won’t do the full history here, but a few examples of companies reallocating capital to intensify competition should suffice:

- Cable intensified competition with broadcast when government rules lowered its cap ex and op ex through pole attachment rules and copyright rules that gave it access to programming.
- DBS intensified MVPD competition when the government lowered its op ex by granting non-discriminatory access to programming; the telcos did so as well when local franchising monopolies were prohibited and then state franchising was adopted, lowering costs for the telcos.
- Wireless began competing with wireline voice when the government both enabled more wireless competition through the PCS spectrum auctions and lowered wireless op ex by reducing the terminating access charges wireless carriers had been paying wired providers.
- Cable began competing with the telcos’ dial-up Internet service when faced with the loss of revenue due to intensified video competition from DBS.
- Google devoted more capital to its fiber project when cities expressed a willingness to reform construction-related and other regulations in ways that reduced cap ex, op ex and risk, and increased potential revenues. In turn, the telcos facing Google Fiber competition were able to take advantage of these same streamlined regulations and have devoted more capital to fiber deployment, causing cable to accelerate deployment of its next generation product.
These examples demonstrate how policy affects capital allocation and competition. They also suggest that not all four categories are equal in producing long-term competitive effects. Baer cites online video distribution as “disruptive innovation.” He explains, “some innovation comes from incumbents smart and nimble enough to take advantage of these new opportunities. But new entrants deserve a lot of credit, too. Companies like Netflix and Amazon offer consumers flexibility and control; established players like CBS and HBO have been forced to respond.” I agree about the value of disruptive, as opposed to traditional, competition. Indeed, after some period of time, markets tend to stabilize, and it is difficult to affect the incentives of existing players without introducing a new competitor or a better and/or cheaper technology substitute.[18] To bring improvements in price and quality to such mature markets, disruptive competition has proven key. Indeed, the decision on wireless-to-wired terminating access that I noted above, and a similar decision for data that enabled inexpensive VoIP, are the reasons the discussion of pennies-per-minute long-distance charges is now an anachronism.[19]

But I disagree with Baer on nomenclature. Wireless, VoIP, Netflix, Amazon, and other disruptors are not really new entrants. Rather, they are adjacent market entrants. They had assets and motives different from those of existing players. The experience of the last 20 years suggests that the asymmetry of those assets and motives, if unleashed in an adjacent market, leads to far greater disruptions than existing competitors or new entrants in a mature market are likely to cause. Similarly, Google Fiber could be seen as a new entrant, but it had both existing network assets to lower its cost structure and a motive to improve its search business revenues through better broadband performance.[20] With Gig.U, we worked with some true new entrants, but those efforts failed, and as we discuss in the handbook, efforts involving true new entrants have a higher likelihood of failure.[21] Reflecting those experiences, I would argue that regulators should be cautious about betting on a true new entrant and should instead look to strategies that enable asymmetric, adjacent market entry.

Unbundling can work to reduce prices, but it discourages broad network upgrades. I think unbundling can be appropriate when the government finances the facility, as it did in the BTOP program, or when the economics of providing an essential facility cannot otherwise be made to work.[22] Some argued that we had reached that point in 2009 and bitterly criticized the National Broadband Plan for not recommending unbundling.[23] As of today, I think Google Fiber and other fiber efforts prove them wrong, but we are still in the early innings. If those fiber efforts end before we reach bandwidth abundance in a critical mass of the country, then perhaps the critics were right.[24]

In short, if we want intensified competition to deliver abundant bandwidth, we should be looking at how government affects that equation today, with particular attention to how it can incent adjacent market entry.[25] How exactly do we do that? That leads to our third question: given the current market, what are the appropriate government levers to intensify competition at this part of the cycle? I say “appropriate” because I think all agree that some government actions are not appropriate even if, in the short term, they would improve bandwidth abundance.
To understand the levers, you first have to understand the environment. In 2009, it looked like three broadband markets:
- First, a high-speed wired market, generally characterized by a single cable provider. The first government acknowledgement of this was a slide we presented to the Commission in September 2009,[26] recently resurrected by government officials.[27]
- Second, a low-speed wired market, generally characterized by a single telephone company; and
- Third, a mobile market, generally characterized by at least four providers.
Some would argue that it is a single market. Certainly AT&T’s DSL service provides some competition to Comcast’s DOCSIS 3.0 service. Reasonable minds can differ, but as all the previously noted speeches by government officials concluded, that competition is not much, particularly as we move to streaming video, and it will be even less as we move to 4K and virtual reality. Some might also argue that wireless competes with both wires. Baer directly addressed that in noting that “today wireless is too capacity-constrained and costly to provide a meaningful alternative for consumers.” I don’t have the expertise to provide an economist’s answer to the current state of competition. But I do have the expertise to tell you how we analyzed it with the Plan.

That brings us to the game theory. In the summer of 2009, the National Broadband Plan team looked at the data and realized that, for the first time since the beginning of the commercial Internet, there was no national carrier with plans to deploy a better network than the current best available network. The data suggested, and subsequent experience confirmed, that current market forces would not drive deployment of world-leading wireline networks in the U.S. As we noted in that slide, for 75% of the country, cable had the faster network and the cheapest upgrade path. The future looked like a cable versus copper competition premised on allocating scarce bandwidth rather than building on technological advances to deploy abundant bandwidth.

In thinking about moving from scarcity to abundance, we started thinking about the prisoners’ dilemma as a way to understand the challenge. In that classic bit of game theory, the prisoners are both better off if neither talks, but that requires that they trust each other not to talk. The cop wants them to talk and, to get there, must cause a defection. Now substitute investing for talking. Economic logic suggests that if cable and telco trusted each other not to invest in next generation networks, they would both be better off simply harvesting from past investments. But if society wants to remove bandwidth constraints on innovation, economic growth, and social progress, we have to cause a defection. So this brings us to the core dynamic: how do we intensify competition among the three adjacent markets so that each invests in more abundant bandwidth?
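To illustrate that dynamic, here is a minimal, hypothetical payoff sketch in Python. The numbers are invented for illustration, not drawn from any market data; the only point is that once an adjacent-market entrant puts a harvesting firm’s revenue at risk, defecting, that is, investing in an upgrade, becomes the better response.

```python
# Toy payoff table for the "harvest vs. invest" version of the prisoners'
# dilemma described above. All payoffs are invented, purely for illustration.
# Each cell is (cable_payoff, telco_payoff) for (cable_move, telco_move).

def payoffs(entry_threat=False):
    # Revenue lost to an adjacent-market entrant by a firm that keeps harvesting.
    penalty = 4 if entry_threat else 0
    return {
        ("harvest", "harvest"): (10 - penalty, 10 - penalty),
        ("harvest", "invest"):  (6 - penalty, 8),
        ("invest",  "harvest"): (8, 6 - penalty),
        ("invest",  "invest"):  (7, 7),
    }

def best_response(table, player, other_move):
    """Pick the move that maximizes one player's payoff, holding the other fixed."""
    idx = 0 if player == "cable" else 1
    moves = ["harvest", "invest"]
    key = lambda m: table[(m, other_move)][idx] if player == "cable" else table[(other_move, m)][idx]
    return max(moves, key=key)

# Without an outside threat, harvesting is each firm's best response to harvesting.
print(best_response(payoffs(False), "cable", "harvest"))  # harvest
# With an adjacent-market entrant in play, the same firm prefers to defect and invest.
print(best_response(payoffs(True), "cable", "harvest"))   # invest
```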
Our first thought was consistent with Baer’s observation: remove capacity constraints by providing the wireless sector more spectrum. Not only is that a good in and of itself, it would also negate the telcos’ harvest strategy. It would change the capital allocation decisions for both the wireless and telco sectors, improving the economics of the upgrade for wireless and, by increasing competition, increasing the telcos’ motive to upgrade. The Plan has many recommendations for improving the spectrum position of mobile providers. While there have been some problems, the government has made significant progress[28] in replenishing the empty spectrum cupboard we saw in 2009 and creating new supplies.[29] But there are three problems with spectrum as our single strategy. First, it takes a very long time. Second, the two largest wireless providers are also the two largest fixed-line telcos, which changes the incentives from what they would be if these were different companies. Third, the next generation of mobility, sometimes referred to as 5G, will rely on small cells, an architecture that will require greater fiber connectivity.[30] These problems don’t mean we shouldn’t proceed, only that we should be realistic about the timing and impact of this first leverage point, more spectrum.

Now let’s look at the second leverage point: improving the economics of a telco upgrade. We made a number of proposals at the national level, but frankly, cities have greater leverage than the federal government to improve the math. This has become clear through the Google Fiber effort. Google has turned out to be the cop that has caused the greatest level of defection. The project, which came out of discussions around the Plan[31], has been the principal driver of the “game of gigs.”[32] Everywhere Google Fiber announces, the telco announces a matching upgrade.[33] Further, everywhere Google Fiber announces, the prices of others drop to match Google’s.[34] Google is highly unlikely to cover the entire country,[35] but the project inspired other activities, such as the Gig.U project. As discussed in the handbook we put out this summer, more than 25 of our communities have moved in ways that have accelerated the deployment of next generation networks. Further, even some rural communities, which face more difficult economics, have used smart dig-once and dark fiber policies to stimulate public-private partnerships that bring new choices to their residents.[36] Some are now Google Fiber communities, but most have gotten there through other means. The lessons are the same in terms of generating a positive competitive response. Indeed, there are a variety of adjacent market entrants beyond Google, including electric utilities, municipalities, small ISPs, and non-profits, all of which have had the same positive effect. The lessons are also the same as to how cities have changed the capital allocation equation, through three key strategies:
- Asset utilization and improvement. [37]
- Regulatory flexibility to accommodate new business models.[38]
- Demand Aggregation.[39]
The handbook provides details on all the tactics communities have used to support these strategies. But this is not to let the federal government off the hook. It was interesting to see Google’s and AT&T’s filings to the Broadband Opportunity Council. Many of their proposals mirrored proposals we made in the Plan that had not yet been implemented. Further, certain legislative efforts, such as the Dig Once bill[40] just introduced by Representatives Walden and Eshoo, are consistent with and improve on the recommendations in the Plan.[41] This week’s hearing on broadband deployment, widely praised on all sides, included many ideas I recall with great fondness from the Plan.[42] But there were also new proposals on topics such as pole attachments.

The Plan made several proposals on pole attachments,[43] but I have learned it is an even bigger issue than I thought then. Indeed, if there is one thing that I think would accelerate competition more than anything else, it would be cities updating their as-builts.[44] From a federal perspective, the most helpful change would be a rule amending the pole attachment rules to reduce delays associated with pole attachments and conduit occupancy.[45] In the category of good problems to have, but ones that must still be solved, we are actually seeing delays caused by cities having multiple parties upgrading at the same time. The more successful federal, state, and local governments are in creating the conditions for investment in new networks, the more we are going to see multiple competitive network builds, which under the current regime are handled by a queuing system that basically blocks simultaneous construction. We are already seeing that in some markets, and it should focus attention on reform of make-ready policies.

Another area of interest is access to video programming. Google Fiber wanted to offer a pure broadband service but found the economics didn’t make sense without a video offering. At the same time, they have found that the difficulty of obtaining programming has limited the pace and reach of the Fiber effort. They have proposed a number of adjustments to the current rules to enable smaller broadband players to obtain the programming they need to invest and compete.[46] Another deployment cost relates to access to multiple-dwelling units and inside-wiring rules.[47]

These policy adjustments to our current pole attachment, programming, and other regimes are, to most folks, boring. They are not nearly as much fun as blaming incumbent providers for limited bandwidth.[48] But based on the experience of Google Fiber and Gig.U, a serious discussion about intensifying competition requires acting to lower cap ex by, for example, improving the economics of make-ready work for poles. These first two levers address the issue noted in the Broadband Plan’s Exhibit 4.G and give telcos two incentives to upgrade: better economics for deploying upgraded networks and the threat of new competition. Both levers also put greater competitive pressure on cable to upgrade.

My understanding of a third lever to intensify competition stems from discussions with my friend David Morken, the CEO of Bandwidth.com. In the summer of 2011, he suggested his company could use its existing assets to launch a Wi-Fi-based mobile service. I first thought he was crazy, but I became a convert.
A few months later, the company launched Republic Wireless, which is today the largest Wi-Fi-first mobile company in the United States. One reason I thought he was crazy was that everything Bandwidth could do, cable could do, and with superior economics for all of the inputs. David suggested that while cable would eventually enter the market, there would always be a niche that would be profitable for Bandwidth.com, and that cable would take a long time to enter and would price its product differently. As to his first assertion, time will tell, but so far, so good. As to his second, he is certainly correct. While Republic and others have launched and operated Wi-Fi-based mobile competition, cable, other than a frankly weak effort by Cablevision, has yet to do so. To be fair to cable, a company like Comcast cannot afford to make the kinds of mistakes that Bandwidth could. Its product launch has to be close to perfect.

But David was right for another reason, which goes back to the prisoners’ dilemma, with a bit of the classic innovator’s dilemma thrown in. It is not logical that a company with a couple hundred employees in North Carolina can develop and deliver a product that a company with tens of thousands of employees has not, until you consider motive. Why would Comcast attack a market if doing so might cause a counterattack and potentially reduce prices throughout all broadband markets? In this light, the logical path is not to attack but to focus on harvesting until one is forced to attack.

And that brings us to the third lever. If there are sufficient forces threatening cable’s existing revenue streams of multichannel video and broadband, it will, as it did when DBS threatened its revenue, attack new markets. Alternatively, if enough players like Republic enter the space and the wireless providers seek new revenue streams by aggressively pursuing cord cutting in the broadband market, cable will enter the mobile market. The game is already afoot: with Verizon and AT&T ramping up competition in the video market and OTT threatening as well, Comcast is now more aggressively pursuing its mobile strategy.[49] That in turn will intensify competition in all three broadband markets.

Making sure this lever can continually drive competition, I think, requires two elements. First, the government should assure that the current unlicensed spectrum regime continues to have sufficient spectrum and does not suffer degradation.[50] Second, the cellular market structure should be sufficiently robust that market forces drive a wholesale market.[51] In short, government policy ought to assure that all three submarkets have the means, motive, and opportunity to enter adjacent markets, as that will create a competitive virtuous cycle that drives toward bandwidth abundance.

Summarizing my answers to all three questions: if we understand that what we want is to remove bandwidth constraints on innovation, growth, and social progress, we will want policy to create incentives for competitive upgrades. For policy to play that role, it must drive changes in capital allocations and the economics of deployment. To do that, policy should look at where it can lower input costs for all potential competitors, but particularly for adjacent market entrants. In such a market, all the major enterprises will have incentives to upgrade their networks for defensive reasons and the opportunity to play offense by attacking the offerings and market share of others in currently well-entrenched positions.
While policy should not and cannot pick the winners in the market, it can and should assure that all the existing networks have incentives, mostly from competitive threats, to accelerate their upgrades to networks offering abundant bandwidth.

Let me close with this. Last month, Jeff Greenfield, writing in the New York Times, sought to explain the explosion of great television and wrote this: “when technology replaced scarcity with abundance, every core assumption about TV began to crumble. Everything about the medium — how we receive it, how we consume it, how we pay for it, how we interact with it — has been altered, and TV is infinitely better for it.” The purpose of broadband competition is to cause that same explosion of bandwidth. We are much better off than we were five years ago, thanks in no small part to the three government officials whose speeches I have cited and their willingness to act in accordance with their analysis. If we continue to have such leadership, if we can avoid empty words and stay focused on the key leverage points, we can create bandwidth abundance, and five years from now our broadband offerings, our country, and I think the world, will be abundantly better for it.

[1] https://apps.fcc.gov/edocs_public/attachmatch/DOC-329161A1.pdf
[2] https://www.fcc.gov/document/speech-general-counsel-jon-sallet-lessons-recent-merger-reviews
[3] http://www.justice.gov/opa/speech/assistant-attorney-general-bill-baer-delivers-keynote-address-future-video-competition
[4] See http://www.cnet.com/news/what-have-we-learned-from-google-fiber/ and https://gigaom.com/2014/01/17/why-its-time-for-the-u-s-to-get-serious-about-its-broadband-problem/.
[5] This is different from the fundamental attribution error (correlation does not imply causation), which we are never supposed to commit after taking any statistics or economics class but which is nonetheless a hallmark of modern political rhetoric, as in candidate Ronald Reagan’s classic debate line, “Are you better off than you were four years ago?”
[6] This is true about policy debates generally, but I will leave that for others to address.
[7] Some might argue that closing the adoption gap, sometimes referred to as the digital divide, should be a higher priority for broadband policy. While I agree that it ought to be a high priority for policy, I am focused here on competition. While bringing more customers to the market will help with the competition issues, it will not, in and of itself, drive the network upgrades that I believe are necessary.
[8] In both vision and specifics, it succeeded, but not necessarily in a way that reflected the most heavily debated provision, the 14-point checklist for local exchange carrier entry into long distance. Wireless and VoIP entry, as discussed below, was the bigger factor.
[9] This is generally expressed in terms of greater bandwidth, and often illustrated by the time it would take to download an HD movie. History will probably regard this as the least important use of next generation networks, recalling Henry Ford’s comment that before he produced his cars, his customers, if asked, would have said they wanted “faster horses.”
[10] I personally encountered this when I was involved in cable rate regulation, as called for in the 1992 Cable Act. To the extent the law sought to lower prices, that was relatively easy, and the February 1994 decision did so initially. But the law also, correctly in my view, wanted the cable industry to be able to continue to invest in more and better programming. The initial price cuts were then reversed by the “going forward” rules, which allowed such investments. Optimizing for both proved difficult, if not impossible, for regulation. The difficulties of that effort are well described in Reed Hundt’s memoir of his FCC Chairmanship, “You Say You Want a Revolution: A Story of Information Age Politics” (2000).
[11] Expressed this way, the vision captures a number of different variables, including affordability, ubiquity, performance, and others.
[12] There are a number of important government initiatives, including the reform of the E-Rate and Lifeline programs and ConnectHome, which are also part of the effort to remove bandwidth constraints. As they are not directed toward changing the current mass-market competitive market structure, they are beyond the scope of this paper. Nonetheless, issues of adoption and anchor institution connectivity are critical to the vision that animates the framework I present here.
[13] For a more complete discussion of the transition from bandwidth scarcity to bandwidth abundance, see “The North Star of Bandwidth Abundance,” at http://www.gig-u.org/the-north-star-of-bandwidth-abundance/. I should note that the goal of bandwidth abundance might strike an economist as encouraging an overproduction of bandwidth not justified by actual consumer demand, and that that goal could lead to stranded investment. This is unlikely, in my view, for a variety of reasons, some of which are discussed in the speech. The principal point is that given the transition to the information economy, abundance is a good in and of itself, which drives new use and consumer surplus. I would also add that unlike cyclical industries, where demand goes up and down, the use of bandwidth seems to go up and up. Of course the timing of such investments can lead to financial losses, as occurred in the early years of this century, but those assets were not stranded; rather, they were picked up by a number of enterprises, like Google, to accelerate their own network operations.
[14] I am consciously relying on my own experience rather than the uber-text of competition, “How Competitive Forces Shape Strategy,” by Michael Porter, Harvard Business Review, March 1979, which lays out five forces that determine competition in a market. I would note that Porter’s work was brilliantly updated for the digital era in “Unleashing the Killer App” (1998) by Chunka Mui and Larry Downes, which lays out how digitalization, globalization, and regulation/deregulation are overshadowing Porter’s five forces. My purpose here is not to fit what I have seen into either framework but to try to describe how policy has been used, and could in the future be used, to intensify competition.
[15] Benefits to the system refers to the benefits a service provider may obtain in markets outside the area of the investment. For example, AT&T, by building out fiber in Raleigh, North Carolina, may derive some benefit in another market, such as Wilmington, North Carolina. In the experience of Gig.U, this is significant for Google but not for incumbent ISPs. Further, we could not see examples of where government policy could affect this factor. Nonetheless, it is a factor relevant to the formula for upgrades.
[16] There are certainly other factors that affect the equation. For example, as the investments we seek are long-term, there is significant sensitivity to interest rates. Two factors that are not reflected in the equation but were significant in the Gig.U experience were entrepreneurial talent in network services and local leadership that could organize local resources to improve the economic opportunity. As to the first, it appears that the generation of entrepreneurial network talent that grew up at MCI and went on to start a number of CLECs and DLECs in the late 1990s has largely left the sector, though a new generation is starting to emerge. As to the second, there has been a significant increase in local government interest and talent related to broadband networks, owing to a number of factors, including the sharing of lessons learned from the dozens of cities that have now successfully accelerated the deployment of next generation networks.
[17] This is not always true. One counter-example would be Netflix, which transformed from a postal-delivered service to a streaming service and an original programming service, thereby creating competition to MVPDs. The critical change was the increase in broadband capacity and customers, making the streaming service viable. However, Netflix would not have made that transition were it not for earlier government policies requiring interconnection, banning terminating access charges for data, and looking unfavorably upon blocking or throttling traffic. Government policy played a critical role, but the timing was different from the examples cited. Going back even further, Netflix would probably not exist but for 17 USC 109, which codifies the first sale doctrine. If Netflix had had to ask Hollywood’s permission before buying and then lending out DVDs (or at least if first sale were not there as a backstop should negotiations fall through), the original business plan would have been unlikely to get off the ground.
[18] For example, government policy did successfully enable new entrants into wireless through the 1994-95 PCS auctions. In that case, the existing market penetration was low enough and the potential high enough to induce new entrants. Despite many efforts, subsequent auctions have not done so, as it is too difficult to dislodge existing players. T-Mobile has recently intensified competition, but only after it got a boost from the spectrum and financial payment it received from AT&T as a result of the rejected merger. Adjacent market entry, through Wi-Fi, discussed later, is the most likely source of the next disruptive competition.
[19] The one exception is prison phone service, where the FCC recently acted to lower rates. Without commenting on that decision or the unique market structure for prison phone services, it is worth noting that bandwidth abundance in prisons could also do a lot to increase communications, security, education, and job training, while reducing the cost of prison operations and bringing the cost of voice services to where it is in the non-prison market. But that is a subject for another time.
[20] In Porter’s model, this would be described as competition from both a buyer and a supplier, as Google is both a supplier to ISPs and a buyer from ISPs.
[21] See the Gig.U handbook, http://www.gig-u.org/cms/assets/uploads/2015/07/Val-NexGen_design_7.9_v2.pdf, at page 25.
[22] This is the heart of the economic inquiry in the FCC’s current review of the special access market. In that inquiry, the FCC has to assess, among other issues, under what circumstances it is economically feasible for a CLEC to build its own last-mile fiber loops to a location; to what extent lower wholesale rates provide negative incentives for a CLEC to construct its own fiber loops; and, given that the ILEC, as the historical monopolist, likely has a first-mover advantage and thus a larger market share than the CLEC, how that larger market share affects comparative costs between the ILEC and the later entrant. Those issues are far beyond the scope of this speech but are the subject of extensive economic analysis in the FCC docket.
[23] http://www.nytimes.com/2010/03/21/opinion/21Benkler.html?_r=0.
[24] The equity research firm Bernstein, in its October 7, 2015 report on Google Fiber, suggested that an “aggressive expansion” of the project would reach 15-20 million homes in 6-8 years. If that were to occur, I believe it would drive a number of developments, including competitive responses and new products, that would improve the economics of deployment throughout most of the rest of the country. But again, I could be wrong.
[25] This is not the occasion for a full discussion of the FCC’s decision to pre-empt state laws restricting local broadband efforts, except to note that the threat of competitive losses is, as demonstrated by the competitive response to Google and by our experiences with Gig.U, the single biggest driver of incumbents accelerating their deployment of next generation networks. Whether it is wise for cities to build their own networks is subject to reasonable debate. (For such a debate, listen at http://muninetworks.org/content/transcript-community-broadband-bits-episode-132.) On the other hand, there shouldn’t really be a debate about whether a city having the ability to build its own network increases the probability that the incumbent will act to make it unnecessary for the city to do so. That is a factual question for which all the evidence is on the side arguing that, just as in any negotiation, more leverage increases the odds of a successful outcome. Which is why the National Broadband Plan favored pre-emption of such laws. See National Broadband Plan, recommendation 8.19.
[26] It was also Exhibit 4.G of the Plan, where the text noted that “in areas that include 75% of the population, consumers will have only one service provider (cable companies with DOCSIS 3.0 enabled infrastructure) that can offer very high peak download speeds.” National Broadband Plan, page 42.
[27] Chairman Wheeler presented a similar slide in his competition speech, and as Mr. Baer noted, “One characteristic stands out most of all – today most consumers do not enjoy competition for high-speed Internet access. As Chairman Wheeler put it, ‘as bandwidth increases, competitive choices decrease.’” The Broadband Opportunity Council similarly wrote, “Three out of four Americans do not have a choice of providers for broadband at 25 Mbps, the speed increasingly recognized as a baseline for broadband access.” See Broadband Opportunity Council Report at page 6, http://www.ntia.doc.gov/files/ntia/publications/broadband_opportunity_council_report_final.pdf
[28] See the section on spectrum in http://scholarship.law.edu/cgi/viewcontent.cgi?article=1556&context=commlaw, at pp. 294-296.
[29] I have financial affiliations with several enterprises seeking to bring more spectrum into the marketplace. Each has an idiosyncratic issue that prevents the spectrum from being utilized. Now is not the appropriate place to discuss these issues, except to note that while there is a political consensus that our country needs to put more spectrum to work, when it comes to specific cases, the consensus breaks down.
[30] It has always been true that most of the distance a “mobile” communication travels is over a wired network. It will be even more true in the future.
[31] http://www.cnet.com/news/google-exec-sees-google-fiber-as-a-moneymaker/
[32] For a discussion of the early rounds of the “game of gigs,” see http://www.gig-u.org/cms/assets/uploads/2012/12/81714-Gig.U-Final-Report-Draft-1.pdf
[33] For more on the game theory, somewhat akin to a game of chicken, between Google and incumbent ISPs, see https://www.washingtonpost.com/news/the-switch/wp/2014/10/28/google-fibers-playing-a-multibillion-dollar-game-of-chicken-with-traditional-isps/
[34] See, for example, http://www.tennessean.com/story/money/2015/09/29/t-drops-fiber-prices-google-fiber-levels/73023434/. But the reverse is also true: prices stay higher in non-Google areas. See http://consumerist.com/2015/09/30/att-touts-lower-prices-for-gigabit-internet-still-charges-40-more-if-google-fiber-isnt-around/
[35] As noted in footnote 24, Bernstein estimates a maximum coverage of 20 million homes in 6-8 years.
[36] See, for example, http://www.carrollcountytimes.com/news/local/ph-cc-fiber-lighting-ceremony-20150626-story.html and http://www.newsobserver.com/news/local/community/southwest-wake-news/article40803345.html.
[37] The key inquiry is what assets the city has that can be provided at no or little incremental cost to improve the economics of deployment and operations. These can include: physical assets, like rights-of-way (ROWs), utility poles, conduit, buildings, etc.; information assets, like information regarding conduit, ducts, and other ROWs; and process improvements, such as ensuring that make-ready work is done expeditiously, coordinating with new providers to save costs, or allowing them to perform work themselves through approved contractors.
[38] The key inquiry here is what rules the city has that may have made sense at a different time and with a different market structure but that in today’s market create a barrier to an upgrade or new deployment. For example, all the projects with national ISPs, including Google Fiber, have allowed neighborhood-by-neighborhood builds, which significantly reduce capital expenditures and risk through a pre-commitment strategy.
[39] The key inquiry here is how to aggregate demand to demonstrate to existing players the value of an upgrade and to potential new entrants the opportunity in the community. This can be done at both the institutional and residential level.
[40] The current draft of the Broadband Conduit Deployment Act of 2015 can be found at http://eshoo.house.gov/wp-content/uploads/2015/10/10.22.15-Dig-Once-Bill-Text.pdf
[41] See National Broadband Plan recommendation 6.8.
[42] The success of the hearing raises the question of why these bipartisan ideas did not get aired in Congress immediately after the release of the Plan. Indeed, Congresswoman Eshoo commented, correctly, that “It is so common sense that I wonder why we didn’t come up with this a decade ago.” http://www.rollcall.com/news/lawmakers_push_dig_once_and_other_bipartisan_policies_to_expand_high_speed-244530-1.html. There were a variety of factors, but one of them was that the broadband political capital of that moment was focused on how the FCC should respond to its loss in the Comcast net neutrality case, Comcast Corp. v. FCC, 600 F.3d 642 (D.C. Cir. 2010). Another was a focus on specific issues of the moment, such as the West Virginia mine disaster. http://voices.washingtonpost.com/posttech/2010/04/for_senator_jay_rockefeller_d-.html.
[43] See National Broadband Plan, recommendations 6.1, 6.2, 6.3, 6.4, 6.5, 6.6.
[44] Not only would this make those cities more attractive for new fiber investment, it would minimize the risk to their infrastructure from fiber construction, and it would also improve their own plant maintenance capabilities.
[45] Among other things, such a rule should introduce shorter timeframes and establish higher pole-count thresholds before additional time allowances are triggered, accelerating deployments. Infrastructure owners should be required to negotiate access agreements in good faith with a broadband provider as soon as the provider has begun the process of obtaining necessary regulatory approvals. The rule should allow the use of utility-approved contractors to perform all pole attachment and conduit make-ready work. Further, broadband providers should be permitted to use independent contractors if, in their estimation, utility-approved contractors alone cannot meet the deployment timetables.
[46] See Google’s filing to the Broadband Opportunity Council at http://www.ntia.doc.gov/files/ntia/google_inc_boc.pdf, page 8. In the long run, I am certain such measures will not be necessary, but as Keynes said, “In the long run we are all dead.”
[47] Id., at pp. 9-10. See National Broadband Plan at recommendation 4.6.
[48] As I hope is clear, I don’t regard our need for more abundant bandwidth as representing any kind of moral failure by incumbent providers. Rather, I see it as reflecting economic incentives. I am somewhat perplexed by arguments that go after the character of companies, as if they should read David Brooks’ book “The Road to Character” and reform themselves. But then, the Supreme Court appears to think they are people, so maybe I am wrong.
[49] http://www.bloomberg.com/news/articles/2015-10-21/comcast-said-to-be-planning-wireless-push-with-verizon-s-network.
[50] This raises the issue of whether LTE-U threatens Wi-Fi. For an explanation, see http://www.wetmachine.com/tales-of-the-sausage-factory/my-insanely-long-field-guide-to-the-lte-u-dust-up-part-i-spectrum-game-of-thrones/ and http://www.wetmachine.com/tales-of-the-sausage-factory/my-insanely-long-field-guide-to-the-lteu-dust-up-part-ii-a-storm-of-spectrum-swords/.
[51] In this regard, the speeches by Wheeler, Sallet, and Baer were all correct in taking a victory lap for several government efforts to assure that the market structure continues to have four national players. This was a mixed blessing for Republic, as the rejection of the AT&T/T-Mobile deal led to T-Mobile becoming more aggressive on pricing, thereby reducing the attractiveness of Republic’s pricing plan. Nonetheless, without a wholesale option, Republic would not exist.