We have a lot of fairly sharp techies -- people with C.S. degrees and extensive experience with complex software. My experience has been that once editors tell them what patterns of problems are actually occurring, they are very good at devising automated ways of addressing them within the ODP environment. Bobrat is asleep at the switch here, I think, or he'd remember -- I'm sure he knows -- that we're seeing hijacked websites leave the original home page up to delude editors: so checksumming the whole site is impractical, and checksumming the home page alone is inefficacious.
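To make that failure mode concrete, here's a rough sketch -- my own illustration, not anything the ODP actually runs; the URL and function names are invented -- of why a home-page checksum keeps reporting "no change" on exactly this kind of hijack:

import hashlib
import urllib.request

def page_checksum(url):
    # Fetch a single page and hash its raw bytes.
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

def home_page_unchanged(url, known_checksum):
    # Passes as long as that ONE page is byte-identical to last time.
    return page_checksum(url) == known_checksum

# A hijacker deliberately leaves http://example.com/ untouched, so
# home_page_unchanged() keeps returning True -- while every interior
# page now serves spam. Checksumming the whole site would catch it,
# but crawling and hashing every page of every listed site is
# impractical at scale.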
And spammers really, really, really want us to go further toward automated systems, because once you find the weakness of an automated system, you can exploit it repeatedly with impunity: whereas an editor may say "Hey! wait a minute!" the third or fifth time you try to sneak the same dumb exploit past him -- and then hunt you down with a pitchfork. So it's important for the integrity of the system that people keep looking at sites -- and not let automated systems run rampant (or, more realistically, let spammers run riot through the cracks in the automated system).
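A toy illustration of that asymmetry -- everything here is hypothetical; the banned-word rule and the spam title are invented, not any real filter: a fixed automated rule, once probed, fails the same way forever, while even a crude memory of past submissions gives you the editor's third-time reaction.

from collections import Counter

BANNED_WORDS = {"casino", "viagra"}  # the "automated system"

def fixed_filter(title):
    # Rejects only exact banned words -- probe it once, beat it forever.
    return not (BANNED_WORDS & set(title.lower().split()))

seen = Counter()

def wary_editor(title):
    # Tolerates novelty, but balks at the same trick resubmitted.
    key = title.lower().replace("0", "o")  # normalize the obvious dodge
    seen[key] += 1
    return seen[key] < 3  # "Hey! wait a minute!" on the third try

spam = "Best casin0 site"                     # evades the fixed rule
print(fixed_filter(spam))                     # True -- passes every time
print([wary_editor(spam) for _ in range(5)])  # [True, True, False, False, False]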
And even if human review is slower than automation, it does no harm to have both the ODP and Google; something that slips past one may get caught by the other, and the surfer has a choice of either or both. I've worked on several other large indexing projects of one sort or another (besides the ODP), and I never believed that one index could encapsulate all the reality of the indexed corpus. So -- if you have a better idea for an index, go implement it! Most heavy surfers, including many ODP editors, would love to see it!