Paul Graham advocates that the next generation of spam filters should use automated bots to follow the URLs advertised in spam. If a large number of users did this, the combined traffic would amount to a massive distributed denial-of-service attack against the spammers' sites. Clearly, the software would need to be written carefully, and large public websites would need to be protected from being caught in the crossfire.
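A minimal sketch of the idea in Python, assuming the message has already been classified as spam by the filter; the names `punish_spam` and `URL_PATTERN` are hypothetical, and a real implementation would also need a whitelist of large public sites, rate limiting, and robots.txt checks:

```python
import re
import urllib.request

# Crude pattern for URLs embedded in a message body.
URL_PATTERN = re.compile(r"https?://[^\s\"'<>]+")

def punish_spam(message_body, timeout=5):
    """Fetch every URL advertised in a message already classified as spam.

    If many filters do this, every spam sent costs the advertised site
    bandwidth. Hypothetical sketch only: omits whitelisting of large
    public sites, rate limiting, and robots.txt compliance.
    """
    results = []
    for url in URL_PATTERN.findall(message_body):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                results.append((url, resp.status))
        except Exception:
            results.append((url, None))  # unreachable or refused
    return results
```

The distributed effect comes from every recipient's filter doing this independently, so no single machine has to generate significant traffic.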
This idea has merit, though: it takes the battle to the spammers. Get on the offensive.
What do you think?