During the last two weeks, I spent much of my time in focus groups on search. While I can’t talk about the concepts we were testing, one thing that struck me was the degree to which people viewed every concept through the lens of a fear of being spammed and scammed.
- Will people be able to use this technology to take over my machine?
- Will I get a virus?
- Will I get spammed?
- Will this cause popups?
- How will advertisers use this to hijack me from what I really want to do?
Although reaction to the concepts varied by demographic, the fears were universal.
Of course, we’ve designed the new concepts to proactively address spams and scams. But many of the industry-standard practices for eliminating spams and scams have the unfortunate effect of increasing the barriers to adoption.
Take digg as an example. The people with the greatest economic incentive to post stories on digg are spammers. To thwart them, digg has implemented security mechanisms such as CAPTCHAs. As the spammers and scammers get better at defeating CAPTCHAs, the CAPTCHAs get harder. (On the CAPTCHA at right, my success rate is 1 in 3.)
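The basic CAPTCHA flow is simple: the server keeps the expected answer, shows the user a distorted rendering of it, and accepts the submission only if the transcription matches. Here’s a minimal sketch of that challenge/verify loop — a hypothetical illustration, not digg’s actual implementation (the function names and the tolerant matching rule are my own assumptions):

```python
import random
import string

def new_challenge(length=5):
    """Generate a random challenge string that stays server-side.
    In a real system, a distorted image of this string is shown to the user."""
    return "".join(random.choices(string.ascii_lowercase, k=length))

def verify(expected, submitted):
    """Accept the submission only on an exact match, ignoring case
    and surrounding whitespace (an assumed leniency, common in practice)."""
    return expected.lower() == submitted.strip().lower()

challenge = new_challenge()
print(verify(challenge, challenge))        # → True: correct transcription passes
print(verify(challenge, challenge + "x"))  # → False: one misread character fails
```

The arms race lives in the rendering step: the harder the distortion is for OCR bots to read, the harder it is for a human with a 1-in-3 success rate, too.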
These types of security measures increase the burden of submitting to digg to the degree that only the most motivated people bother to do it. On a given day, I will read 6-12 stories that I consider digg-worthy, but it’s just too much trouble to digg them.