The NFL just doesn't know when to stop. The Washington Post reports on a new NFL policy limiting journalists' use of video online:
In a move designed to protect the Internet operations of its 32 teams, the pro football league has told news organizations that it will no longer permit them to carry unlimited online video clips of players, coaches or other officials, including video that the news organizations gather themselves on a team's premises. News organizations can post no more than 45 seconds per day of video shot at a team's facilities, including news conferences, interviews and practice-field reports.
Now this policy isn't copyright-based -- the NFL doesn't have copyright in the unfixed statements of its players and coaches -- but good old real property law. The NFL teams own their facilities, and with them the right to exclude people physically, as trespassers. So the NFL is telling sportswriters, who depend on physical access to gather the background for their stories, that they'll be barred at the gates if they post more than 45 seconds of video online.
Houston Chronicle columnists John McClain and Anna-Megan Raley show the absurdity of this policy by trying to complete interviews in 45 seconds, stopwatch in hand. Even stopping at 45 seconds, they apparently violate the policy if the video is not removed after 24 hours and doesn't link to nfl.com!
While the football league may be within its legal rights on this one, its policy still reflects a fundamental misunderstanding of the medium. The league depends on independent journalists to do the research that keeps people following the sport between games, and journalists have turned to the Internet to dig deeper than they could in print or time-constrained TV. Readers go to sportswriters' websites and blogs precisely for perspectives they don't get from the official NFL.com website. Limiting the richness of media available on these sites is more likely to alienate fans and journalists than to drive traffic to NFL.com. Just look at where the Olympics are.
Sometimes rights to exclude are best left un-exercised. By contrast, the National Hockey League has taken a better course, striking deals with YouTube, Sling Media, and Joost to permit people to see hockey when and where they want. "We're not content fascists," Keith Ritter, president of NHL Interactive Cyber Enterprises, which represents the league's interests in new media, tells the LA Times. Perhaps it's time for the Houston Chronicle team to battle global warming and pick up hockey sticks!
After blogging about ICANN's new gTLD policy or lack thereof, I've had several people ask me why I care so much about ICANN and new top-level domains. Domain names barely matter in a world of search and hyperlinks, I'm told, and new domains would amount to little more than a cash transfer to new registries from those trying to protect their names and brands. While I agree that type-in site-location is less and less relevant, and we haven't yet seen much end-user focused innovation in the use of domain names, I'm not ready to throw in the towel. I think ICANN is still in a position to do affirmative harm to Internet innovation.
You see, I don't concede that we know all the things the Internet will be used for, or all the things that could be done on top of and through its domain name system. I certainly don't claim that I do, and I don't believe that the intelligence gathered in ICANN would make that claim either.
Yet that's what it's doing by bureaucratizing the addition of new domain names: asserting that no further experiments are possible; that the "show me the code" mode that built the Internet can no longer build enhancements to it. ICANN is unnecessarily ossifying the Internet's DNS at version 1.0, setting in stone a cumbersome model of registries and registrars, pay-per-database-listing pricing, semantic attachments to character strings, and limited competition for the lot. This structure is fixed in place by the GNSO constituency listing: those who have interests in the existing setup are unlikely to welcome a new set of competitors bearing disruptions to their established business models. The "PDP" in the headline, ICANN's over-complex "Policy Development Process" (not the early DEC computer), gives too easy a holdout veto.
Meanwhile, we lose the chance to see what else could be done: whether it's making domain names so abundant that every blogger could have a meaningful set on a business card and every school child one for each different face of youthful experimentation, using the DNS hierarchy to store simple data or different kinds of pointers, spawning new services with new naming conventions, or something else entirely.
I don't know if any of these individually will "add value." Historically, however, we leave that question to the market wherever there's someone willing to give it a shot. Amazingly, after years of delay, there are still plenty of people waiting in ICANN queues to give new gTLDs a try. The collective value in letting them experiment and new services develop is indisputably greater than that constrained by the top-down imaginings of the few on the ICANN board and councils, limited as they are by their inability to pronounce .iii.
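One of the experiments mentioned above, using the DNS hierarchy to store simple data or different kinds of pointers, is easy to sketch in miniature. The toy resolver below is purely illustrative (it is not real DNS code, and all the names and record values are hypothetical); it just shows how DNS-style hierarchical labels could map to arbitrary records, a pointer to a blog feed, say, rather than only IP addresses:

```python
class Zone:
    """A node in a DNS-like tree: each label holds records and child labels."""

    def __init__(self):
        self.records = {}   # record type -> value, e.g. "PTR" -> a URL
        self.children = {}  # label -> Zone

    def add(self, name, rtype, value):
        # DNS names are read right to left: "alice.blogs" -> blogs, then alice
        node = self
        for label in reversed(name.split(".")):
            node = node.children.setdefault(label, Zone())
        node.records[rtype] = value

    def lookup(self, name, rtype):
        node = self
        for label in reversed(name.split(".")):
            node = node.children.get(label)
            if node is None:
                return None  # name doesn't exist in the tree
        return node.records.get(rtype)


root = Zone()
# Hypothetical records: a pointer to a blogger's feed, and free-form text data
root.add("alice.blogs", "PTR", "https://alice.example/feed")
root.add("alice.blogs", "TXT", "key=abc123")

print(root.lookup("alice.blogs", "PTR"))  # resolves to the stored pointer
```

Real DNS already has the building blocks for this kind of thing (TXT, PTR, SRV, and NAPTR records); the open question in the post is whether the namespace will ever be cheap and plentiful enough for such uses to spread.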
"How do you get an answer from the web?" the joke goes: "Put your guess into Wikipedia, then wait for the edits." While Wikipedians might prefer you at least source your guess, the joke isn't far from the mark. The lesson of Web 2.0 has been one of user-driven innovation, of launching services in beta and improving them by public experimentation. When your users know more than you or the regulators, the best you can do is often to give them a platform and support their efforts. Plan for the first try to break, and be ready to learn from the experience.
To trust the market, ICANN must be willing to let new TLDs fail. Instead of insisting that every new business have a 100-year plan, we should prepare the businesses and their stakeholders for contingency. Ensuring the "stable and secure operation of the Internet's unique identifier systems" should mean developing predictable responses to failure, not demanding impracticable guarantees of perpetual success. Escrow, clear consumer information, streamlined processes, and flexible responses to the expected unanticipated can all protect end-users better than the dubious foresight of ICANN's central regulators. These same regulators, bear in mind, didn't foresee that a five-day add-grace period would swell the ranks of domains with "tasters" gaming the loophole with ad-based parking pages.
At ten years old, we don't think of our mistakes as precedent, but as experience. Kids learn by doing; the ten-year-old ICANN needs to do the same. Instead of believing it can stabilize the Internet against change, ICANN needs to streamline for unpredictability. Expect the unexpected and be able to act quickly in response. Prepare to get some things wrong, at first, and so be ready to acknowledge mistakes and change course.
I anticipate the counter-argument here that I'm focused on the wrong level, that stasis in the core DNS enhances innovative development on top, but I don't think I'm suggesting anything that would destabilize established resources. Verisign is contractually bound to keep .com open for registrations and resolving as it has in the past, even if .foo comes along with a different model. But until Verisign has real competition for .com, stability on its terms thwarts rather than fosters development. I think we can still accommodate change on both levels.
The Internet is too young to be turned into a utility, settled against further innovation. Even for mature layers, ICANN doesn't have the regulatory competence to protect the end-user in the absence of market competition, while preventing change locks out potential competitive models. Instead, we should focus on protecting principles such as interoperability that have already proved their worth, to enhance user-focused innovation at all levels. A thin ICANN should merely coordinate, not regulate.