After blogging about ICANN’s new gTLD policy or lack thereof, I’ve had several people ask me why I care so much about ICANN and new top-level domains. Domain names barely matter in a world of search and hyperlinks, I’m told, and new domains would amount to little more than a cash transfer to new registries from those trying to protect their names and brands. While I agree that type-in site-location is less and less relevant, and we haven’t yet seen much end-user-focused innovation in the use of domain names, I’m not ready to throw in the towel. I think ICANN is still in a position to do affirmative harm to Internet innovation.
You see, I don’t concede that we know all the things the Internet will be used for, or all the things that could be done on top of and through its domain name system. I certainly don’t claim that I do, and I don’t believe that the intelligence gathered in ICANN would make that claim either.
Yet that’s what it’s doing by bureaucratizing the addition of new domain names: asserting that no further experiments are possible; that the “show me the code” mode that built the Internet can no longer build enhancements to it. ICANN is unnecessarily ossifying the Internet’s DNS at version 1.0, setting in stone a cumbersome model of registries and registrars, pay-per-database-listing pricing, semantic attachments to character strings, and limited competition for the lot. This structure is fixed in place by the GNSO constituency listing: those with interests in the existing setup are unlikely to welcome a new set of competitors bearing disruptions to their established business models. The “PDP” in the headline, ICANN’s over-complex “Policy Development Process” (not the early DEC computer), gives holdouts too easy a veto.
Meanwhile, we lose the chance to see what else could be done: whether it’s making domain names so abundant that every blogger could have a meaningful set on a business card and every school child one for each different face of youthful experimentation, using the DNS hierarchy to store simple data or different kinds of pointers, spawning new services with new naming conventions, or something else entirely.
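To make the “simple data in the DNS” idea concrete: TXT records already let a zone publish arbitrary strings under any name in the hierarchy, which is how SPF and DKIM later piggybacked on the DNS. A minimal sketch of the wire encoding TXT record data uses, per RFC 1035 §3.3.14 (the function names and the key-value payload here are my own hypothetical illustration, not anyone’s deployed service):

```python
# Sketch: TXT RDATA is a sequence of "character-strings", each a
# one-byte length prefix followed by up to 255 bytes of data
# (RFC 1035, section 3.3.14). Anything that fits this framing can
# ride in the DNS under any name in the hierarchy.

def encode_txt_rdata(strings):
    """Pack a list of strings into TXT RDATA wire format."""
    out = bytearray()
    for s in strings:
        data = s.encode("ascii")
        if len(data) > 255:
            raise ValueError("character-string limited to 255 bytes")
        out.append(len(data))      # one-byte length prefix
        out.extend(data)
    return bytes(out)

def decode_txt_rdata(rdata):
    """Unpack TXT RDATA back into a list of strings."""
    strings, i = [], 0
    while i < len(rdata):
        n = rdata[i]
        strings.append(rdata[i + 1:i + 1 + n].decode("ascii"))
        i += 1 + n
    return strings

# Hypothetical example: a blogger publishing pointer metadata
# under a name of their own.
wire = encode_txt_rdata(["v=ex1", "feed=https://blog.example/atom"])
assert decode_txt_rdata(wire) == ["v=ex1", "feed=https://blog.example/atom"]
```

The point isn’t this particular framing; it’s that the plumbing for such experiments already exists, and abundant names would give far more people a place to try them.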
I don’t know if any of these individually will “add value.” Historically, however, we leave that question to the market wherever someone is willing to give it a shot. Amazingly, after years of delay, there are still plenty of people waiting in ICANN queues to give new gTLDs a try. The collective value of letting them experiment and letting new services develop is indisputably greater than anything the top-down imaginings of the few on the ICANN board and councils could deliver, as evidenced by their inability to pronounce “.iii.”
“How do you get an answer from the web?” the joke goes: “Put your guess into Wikipedia, then wait for the edits.” While Wikipedians might prefer you at least source your guess, the joke isn’t far from the mark. The lesson of Web 2.0 has been one of user-driven innovation, of launching services in beta and improving them by public experimentation. When your users know more than you or the regulators, the best you can do is often to give them a platform and support their efforts. Plan for the first try to break, and be ready to learn from the experience.
To trust the market, ICANN must be willing to let new TLDs fail. Instead of insisting that every new business have a 100-year plan, we should prepare the businesses and their stakeholders for contingency. Ensuring the “stable and secure operation of the Internet’s unique identifier systems” should mean developing predictable responses to failure, not demanding impracticable guarantees of perpetual success. Escrow, clear consumer information, streamlined processes, and flexible responses to the expected unanticipated can all protect end-users better than the dubious foresight of ICANN’s central regulators. These same regulators, bear in mind, didn’t foresee that a five-day add-grace period would swell the ranks of domains with “tasters” gaming the loophole with ad-based parking pages.
At ten years old, we don’t think of our mistakes as precedent, but as experience. Kids learn by doing; the ten-year-old ICANN needs to do the same. Instead of believing it can stabilize the Internet against change, ICANN needs to streamline for unpredictability. Expect the unexpected and be able to act quickly in response. Prepare to get some things wrong, at first, and so be ready to acknowledge mistakes and change course.
I anticipate the counter-argument that I’m focused on the wrong level: that stasis in the core DNS enhances innovative development on top of it. But I don’t think I’m suggesting anything that would destabilize established resources. Verisign is contractually bound to keep .com open for registrations and resolving as it has in the past, even if .foo comes along with a different model. Until Verisign has real competition for .com, though, stability on its terms thwarts rather than fosters development. I think we can still accommodate change on both levels.
The Internet is too young to be turned into a utility, settled against further innovation. Even for mature layers, ICANN doesn’t have the regulatory competence to protect the end-user in the absence of market competition, while preventing change locks out potential competitive models. Instead, we should focus on protecting principles such as interoperability that have already proved their worth, to enhance user-focused innovation at all levels. A thin ICANN should merely coordinate, not regulate.