I don't know if anyone has read the following Q&A with Matt Cutts (http://searchengineland.com/matt-cut...ow-tool-137664), but let me quote what he says:
Quote:
Question:
What prevents, and I can’t believe I’m saying this, but seemingly inevitable concerns about “negative negative SEO?” In other words, someone decides to disavow links from good sites as perhaps an attempt to send signals to Google these are bad? More to the point, are you mining this data to better understand what are bad sites?
Answer:
Right now, we’re using this data in the normal straightforward way, e.g. for reconsideration requests. We haven’t decided whether we’ll look at this data more broadly. Even if we did, we have plenty of other ways of determining bad sites, and we have plenty of other ways of assessing that sites are actually good.

We may do spot checks, but we’re not planning anything more broadly with this data right now. If a webmaster wants to shoot themselves in the foot and disavow high-quality links, that’s sort of like an IQ test and indicates that we wouldn’t want to give that webmaster’s disavowed links much weight anyway. It’s certainly not a scalable way to hurt another site, since you’d have to build a good site, then build up good links, then disavow those good links. Blackhats are normally lazy and don’t even get to the “build a good site” stage. :)
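For anyone who hasn't used the tool: the disavow file is just a plain-text file uploaded via Webmaster Tools, one entry per line, with `#` for comments and a `domain:` prefix to disavow an entire domain. The domains below are placeholders, not real examples:

```text
# Disavow every link from this domain
domain:spammy-directory.example.com

# Disavow a single linking page
http://blog.example.org/paid-links-page.html
```

The "IQ test" scenario Matt describes would be someone putting their own high-quality linking domains into a file like this.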
Agree, disagree?