SE engineers and CEOs alike are probably asking themselves the same question.
The directory frenzy began when Google announced its PR mechanism; it was slow at first but has accelerated dramatically of late.
If the trend continues, it is safe to say that within 6 months, with ever-improving automated submission tools, the quantity of links crawlers must consider will grow artificially and enormously.
Meanwhile the content itself will remain almost the same, and the services directories provide to visitors (other than the submitting webmasters) are practically nil.
There is one axiom, one question, and one estimation:
The axiom:
Search engines will degrade and penalize some, but not all, directories.
The question:
Will this happen on an automated, algorithmic basis, or will human rating be involved?
This question is linked to the definitions of "bad neighborhood" and "link farm". Will the algorithm decide that a directory links to "bad sites" above a certain percentage, and penalize it on that basis?
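To make the idea concrete, here is a minimal sketch of what such a purely algorithmic check might look like. Everything here is an assumption for illustration: the 25% threshold, the existence of a known list of "bad" domains, and the function name are all hypothetical, since no search engine publishes its actual rules.

```python
def over_bad_link_threshold(outbound_domains, bad_domains, threshold=0.25):
    """Hypothetical check: flag a directory if the share of its outbound
    links pointing to known "bad" domains exceeds the threshold.
    The 0.25 default is an arbitrary guess, not a real engine's value."""
    if not outbound_domains:
        return False  # nothing to judge
    bad_count = sum(1 for domain in outbound_domains if domain in bad_domains)
    return bad_count / len(outbound_domains) > threshold

# Example: 3 of 4 outbound links go to flagged domains (75% > 25%).
links = ["good-site.com", "spam-a.net", "spam-b.net", "spam-c.net"]
flagged = {"spam-a.net", "spam-b.net", "spam-c.net"}
print(over_bad_link_threshold(links, flagged))  # True
```

The sketch also shows why the question matters: a crude percentage rule would sweep up large, legitimate directories that inevitably contain some stale or bad links, which is exactly the DMOZ problem raised below.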
Let’s take a look at DMOZ: by this algorithm, Google would need to penalize...