Bobrat speaks well to the point that what you propose is impossible.
You have to remember, though, that what editors are doing is not SUPPOSED to be a traversal (random or not) of the unreviewed queue. It's SUPPOSED to be a traversal of the whole internet; and it should focus on whatever tools are most efficient at getting to as-yet-unreviewed sites.
What's most efficient? Well, that may depend on who's working, and where. I created a new category yesterday, based on some links in a niche-subject mailing list. (A centenary of historical importance in the field is coming up.) Obviously there were NO submittals; and it turned out half the links in the message were non-responding, and I went 300 deep in Google search results without finding anything. _Nothing_ all that efficient, but of course ... and this is the point ... I DIDN'T KNOW THAT BEFOREHAND.
So, how do you pick the most efficient approach, when you don't know ahead of time how efficient ANY approach would be?
Simple. You get ten thousand editors, each with some theory as to which approach is most efficient. Let each one try his own approach. Let them talk to each other. Some people who aren't having much success are likely to try something someone else claims works better; and so more people work on what seems most efficient at the moment. Other unsuccessful site searchers just dig in and try harder, and so some people keep testing alternative approaches, covering the possibility that the most efficient approach is ineffective at finding some things, or in preparation for the day when spammers figure out how to spam the living daylights out of the more efficient tool.
Very non-deterministic, very dynamic, very very effective at making the best possible use of informed human judgment.
Replacing all that with a single tool (submittals) and a single dictum about how it must be used in a robotic assembly line would be an unmitigated disaster.
Actually, it would be MUCH worse than that.
It is not just editors who are people and can adjust their methods to achieve their goals most efficiently. Spammers are also, in a way, human (or partially human, or, as C.S. Lewis might say, in the process of ceasing to be human). And they are also capable of adjusting their machinations based on information they have.
Ignore bobrat's reality for a moment -- the totally insoluble problem of centrally allocating editors with specific skills to submittals, mapping 600,000 category queues onto 60,000 editor queues with unknown service times. Suppose the ODP announced that henceforth all submittals would be reviewed in order, and that there was currently a 140-day, 12-hour, 47-minute wait.
What would the spammers do?
Hmm. I know people pay $300 for a quick review at Yahoo. A domain name costs $6.95, or I could set up a name server and create subdomains virtually free.
All I have to do is submit a zillion dummy URLs today. I could probably hire third-world keyboardists for five cents a submittal, but even if it cost a dollar, no biggie. Because in only 130 days, I'm going to be selling those URLs at $300 a pop to webmasters who want immediate reviews. I'll be rich, Rich, RICH!
Of course, after submitting a year's supply (say half-a-million URLs), I'd mention this scam to my closest friends, who would all think, "That's a good idea, but I could just as well use deeplinks on a Geocities site, cut the price by $6.95, and take over the market." But it would be too late: those third-world keyboardists would have figured out the stunt and started moonlighting with their own quick-ODP-reviews-for-sale business.
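The incentive here is easy to check. A minimal back-of-the-envelope sketch, using only the figures from this hypothetical (the $300 quick-review price, the dollar-per-submittal upper bound on labor, and the half-million dummy URLs -- all satirical assumptions, not real data):

```python
# Hypothetical economics of the queue-position scam described above.
# Every figure is taken from the post's own made-up scenario.

SUBMISSION_COST = 1.00    # upper bound per dummy submittal ("even if it cost a dollar")
RESALE_PRICE = 300.00     # what webmasters reportedly pay for a quick review
URLS_SUBMITTED = 500_000  # "a year's supply (say half-a-million URLs)"

cost = URLS_SUBMITTED * SUBMISSION_COST
revenue = URLS_SUBMITTED * RESALE_PRICE
profit = revenue - cost

print(f"Cost:    ${cost:,.0f}")     # $500,000
print(f"Revenue: ${revenue:,.0f}")  # $150,000,000
print(f"Profit:  ${profit:,.0f}")   # $149,500,000
```

Even at the five-cent-per-submittal labor rate, the upfront cost drops to $25,000 while the revenue is untouched -- which is exactly why a strict first-in-first-out queue would be worth gaming.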
And the ODP would have handed complete control of its editing priorities to the most evil people on the internet.
Sorry, but that's not a first step towards anywhere I ever want to go.