>I can't help but feel though that if DMOZ streamlined it's inclusion process and made it more transparent ...
I'm not sure how the inclusion process could be streamlined. The UI mechanics are polished and optimized, especially for suggested sites. A browsing editor takes:
-- 1 click to get into "edit" mode, if indeed he's not already there (you can browse the directory just fine in edit mode, page views are just slightly slower that way)
-- 1 click to see a list of the suggested sites.
-- 1 click to choose a suggestion for review.
-- 1 click to bring up the website (in the same window or another window).
-- whatever typing is needed to modify the user-suggested URL, title, or description.
----------------
then 1 click to add that suggestion to the live directory, or 2 clicks (and either browsing a dropdown or typing/pasting another category) to move that suggestion to any other category.
As you can see, the vast majority of the time involved is spent in irreducible activity: actually reviewing the website in question. The actual typing involves only 2-3 dozen words, even if the listing is built from scratch, and the UI overhead is negligible.
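To put a rough number on that, here's a trivial tally of the add flow above. The click counts are copied straight from the list, and the word count is just the "2-3 dozen words" figure; nothing here is measured or official:

    # Back-of-envelope tally of the UI overhead described above.
    # Click counts come straight from the list; nothing is measured.

    ADD_FLOW = {
        "enter edit mode": 1,        # often already there, so often 0
        "list suggested sites": 1,
        "choose a suggestion": 1,
        "open the website": 1,
        "add to the live directory": 1,  # or 2 to move categories first
    }

    clicks = sum(ADD_FLOW.values())
    typing_words = 36  # "2-3 dozen words", taking the high end
    print(f"~{clicks} clicks + ~{typing_words} words of typing per listing")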
The overhead of rejecting a site is comparable: typing a few words of explanation, and one click to reject rather than add. (No explanation is needed for an "add" operation: presumably the unique content mentioned in the site description is adequate justification.)
That's efficient.
The guidelines that editors use when reviewing a website are public; see the links at
http://editors.dmoz.org/about.html .
Nobody on the outside of the ODP has EVER made ANY specific concrete suggestion to streamline THAT part of the process; and, based on that description, I think you can see why nobody ever will. There's simply no overhead to cut--no supererogatory repetition, no irrelevant communication, no unnecessary nano-management (although you'd be surprised how many people think adding any or all of those things would SPEED UP the process).
But of course that's only a tiny part of the editors' work. By most editors' estimates, half or more of all listings were found by editors on their own--and surely the better half. Any ideas that can be _demonstrated_ to make ANY of those processes more efficient would be welcomed eagerly.
And, of course, it would always be good to streamline the process of rejecting inappropriate submissions: automating the handling of repeated suggestions (up to and including mass spam), clever detection of the standard doorway/affiliate spamscams or plagiarized content, and so on.
And anyone with practical experience in those areas is welcome to discuss improvements (and there are always discussions in the internal forums!).
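Just to make the "repeated suggestions" idea concrete, here's a minimal sketch of what collapsing duplicate URL suggestions might look like. Everything here is hypothetical: the normalization rules are illustrative assumptions, not ODP policy or code.

    # A minimal sketch of one automation idea from above: collapsing
    # repeated suggestions of the same URL so an editor only ever sees
    # the site once. Normalization rules here are assumptions, not
    # actual ODP behavior.

    from urllib.parse import urlsplit

    def normalize(url: str) -> str:
        """Reduce trivially-different URLs to one canonical key."""
        parts = urlsplit(url.strip().lower())
        host = parts.netloc.removeprefix("www.")
        path = parts.path.rstrip("/") or "/"
        return host + path

    seen: dict[str, int] = {}  # canonical URL -> times suggested

    def should_queue_for_review(url: str) -> bool:
        key = normalize(url)
        seen[key] = seen.get(key, 0) + 1
        # The first suggestion goes to an editor; repeats (including a
        # spammer resubmitting the same site hundreds of times) are
        # dropped without consuming any editor time.
        return seen[key] == 1

    assert should_queue_for_review("http://www.example.com/Page/")
    assert not should_queue_for_review("http://EXAMPLE.com/page")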
-------------------
The other half, "transparency", is also pretty close to optimal. All listed sites are immediately broadcast to the world. The suggestion pools aren't public, but there's basically never any information in them useful to honest site suggesters. True, editors make occasional 'negative' mistakes (1% or so of the time, in my experience), but addressing that issue from the suggestion side would be HORRIBLY inefficient -- effectively doubling the workload for a 1% increase in listed sites.
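To spell out that arithmetic (the 1% figure is my own estimate above; the suggestion volume is invented purely for illustration):

    # Back-of-envelope for the claim above. The 1% error rate is the
    # post's own figure; the suggestion volume is made up.

    suggestions = 10_000
    error_rate = 0.01  # 'negative' mistakes, per the estimate above

    one_pass = suggestions                 # the current workload
    two_pass = suggestions * 2             # re-checking every suggestion
    recovered = suggestions * error_rate   # listings a second pass saves

    extra = two_pass - one_pass
    print(f"{extra} extra reviews to recover ~{recovered:.0f} listings")
    print(f"that's ~{extra / recovered:.0f} extra reviews per recovered site")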
There has to be a better way. There probably cannot be a worse way. The proper approach is to improve site-finding techniques to the point that editors don't NEED suggestions -- because every good site is found effectively. Well, that's an ideal that'll never be reached, but one can see how there MIGHT be room for improvement -- unlike, of course, the process for handling site suggestions, which is clinging pretty close to the asymptote.