Ray, I hear what you're saying. I agree that if .biz and .info are mostly a failure compared to .com, whether in total registrations or in used registrations, that in itself does nothing to lessen internet stability. That assumes, of course, that one or both of them (or others) don't fail entirely for lack of funds, which conceivably could lead to problems without proper data escrow and/or a competent successor to take over.
Ditto for .ca and other similar ccTLDs, though I think directly comparing those to .tv, .ws, and .cc is a mistake; the latter have been marketed and used a la open gTLDs for some time now, which I think is one of Ben Edelman's points. As he also points out, they aren't entirely like gTLDs, since they aren't governed by the same ICANN agreements. ccTLDs like .ca (a non-exhaustive list would include .uk, .jp, .de, and .au) are, I think, becoming more successful. I suspect their registration numbers aren't that different from .tv et al, and I suspect their actual use is considerably higher. I also think that while registrations in the so-called open gTLDs are probably dropping, in ccTLDs like .ca, and an ever-increasing number of others, both registration and use are growing. Perhaps Ben could turn his attention there next. :) So I don't expect .ca to go away any time soon, at least not on account of its number of registrations or its use.
What I am pointing out is that Ben Edelman is doing this research on his own. If he weren't doing it, what data would ICANN have to go by in evaluating their new gTLDs? They would have some lesser amount of data, but they seem in no rush to do the work necessary to come by more data on their own. Indeed, M. Stuart Lynn stated many months ago that any future gTLDs would likely be restricted; not only did he not cite any hard data, there is SFAIK also no record of the committee ever coming to that decision. So I am pointing to the reality that ICANN could and probably will use this data, not arguing that they should. In general I think it should be left to market forces to decide.
I happen to think that 100 new TLDs a year, or whatever, would not likely lessen internet stability, though there might be some vanishing point where the numbers become unsustainable through non-use. In that case one could either leave them static (as long as someone competent was keeping an eye on them) or do the equivalent of an rmgroup message on Usenet and delete the TLD; one doesn't see Usenet crashing despite a massive database and numerous rmgroups. As well, the decision to take such an action is handled differently depending on where one is in the hierarchy, and the same holds true for which groups are created: there is no central ICANN-like authority, and yet Usenet is generally stable. Of course, that is not to say it is particularly useful.
So I do question why we should add TLDs if they are essentially useless. I don't think ccTLDs like .ca are useless; they are based on geopolitical realities. Nor do I think restricted gTLDs like .aero are useless; they are based on subject. While .tv is ostensibly based on subject (with .ws and .cc to a lesser extent), it has not been used in that fashion. I don't think it is particularly useful to have comnetorgbizinfotvwscc as essentially indistinguishable; one might as well have TLDs numbered one to a hundred, and while we could do that, I question whether we should. Comnetorg were originally intended to differentiate based on subject; if we are going to create new gTLDs, we should get back to that model. However, I don't think we should compel ccTLDs back to a geopolitical model; that is their business. The market seems to be saying that that business model largely depends on speculation and defensive registrations, which I don't think is particularly useful to internet users in general, nor is it likely to be a sustainable business model.
Lou Kerner, then CEO of dotTV, said a couple of years ago that .tv would be bigger than .com. This was a corporation that had fancy digs, over 75 employees, and was supposed to pay $50 million to Tuvalu. Even their bid model fell short: they finally got about three-quarters of a million dollars for sex.tv after a supposed minimum bid of one million; the registration wasn't successful, wasn't re-registered, and the name is now available again for $1 million, no bid necessary (SFAIK its previous registration is also tied up in court). Part of the reason some advocated hundreds or thousands of new open gTLDs dates back to the time when many believed the hype. If one takes the hype out of the equation, what rationale is there for creating new open gTLDs? So the speculators and defensive registrants can fight it out again and again? Ben's research seems to show that even that business model isn't sustainable. Both speculators and defensive registrants seem finally to be turning their backs on the hype. Where then will the demand come from for more open, essentially undifferentiated TLDs? -g