Today marks the launch of a groundbreaking new app called Color.
The app, available for iPhone and Android, has users automatically share pictures with those around them. Take a picture and people in close proximity can see it. No logins, no passwords, no need to build a social network — it’s automatically defined by your proximity to people. If you’re around people regularly, those people and their pictures will become sticky.
It’s a whole new dynamic in photo sharing. Not only is everything public, there is virtually zero latency.
While I’m still trying to get my arms around what it will mean for sharing in general, there is one clear application for Color: breaking news.
Cell phone networks light up when news happens. If CalTrain hits a pedestrian, many people get on their cell phones to let their friends and family know that they will be late. That’s before any news outlet has even heard of the accident. By detecting unusual spikes, you can predict that something has happened — even if you don’t know what it is.
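The spike idea is easy to sketch. Here is a minimal illustration in Python, assuming per-interval message counts for a single cell-tower area — the data, threshold, and function names are all hypothetical:

```python
from statistics import mean, stdev

def is_spike(history, current, threshold=3.0):
    """Flag a spike when the current interval's count exceeds the
    historical mean by `threshold` standard deviations.
    `history` is a list of per-interval message counts for one
    cell-tower area (illustrative data, not a real feed)."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return current > mu
    return (current - mu) / sigma > threshold

# Typical quiet-hour counts near a hypothetical CalTrain stop:
baseline = [12, 15, 11, 14, 13, 12, 16, 14]
print(is_spike(baseline, 15))   # normal chatter
print(is_spike(baseline, 90))   # sudden surge -- something happened
```

A real system would need seasonality-aware baselines (rush hour looks like a spike against 3 a.m. traffic), but the core signal is just "activity far above normal for this place."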
A few years ago, while I was stuck on a CalTrain that hit a pedestrian, I wrote about how Twitter would be used for breaking news:
There were lots of questions from the people on the train: What happened? Did we kill someone? How long are we going to be delayed? There was also a key question for others who use CalTrain: should I get on the train or find another way home?
Given the small number of people affected, this isn’t the type of thing that makes the local TV news. The Bay Area, being what it is, has a new answer: Twitter. An unofficial CalTrain account allows citizen journalists to share information about what’s going on. Readers can get the news on the Web or by text message.
Twitter could become the police scanner of our times. As Twitter becomes location aware, it would be possible to detect where something happened by looking for unusual spikes in activity around a location.
Although Twitter is often used for breaking news today, it doesn’t do a great job with geodata. It’s hard to tell tweets from people talking about the Japanese earthquake from tweets by people who are actually in Japan living through it. Undoubtedly, Color will be used to take pictures of breaking news. If the system is instrumented to process and normalize all of the geodata that it gets, it could not only show you where news was breaking but also exactly what was happening there.
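Color hasn’t published how it handles geodata, but one simple way to “normalize” geotags is to bucket them into a coarse grid and rank cells by activity. A sketch, with entirely hypothetical coordinates:

```python
from collections import Counter

def hot_cells(photos, cell_deg=0.1):
    """Bucket photo geotags into a coarse lat/lon grid (cells of
    `cell_deg` degrees) and rank cells by photo count.
    Purely illustrative -- Color's actual pipeline is not public."""
    counts = Counter(
        (int(lat // cell_deg), int(lon // cell_deg))
        for lat, lon in photos
    )
    return counts.most_common()

# Hypothetical geotags: a cluster near Palo Alto plus one stray photo
photos = [(37.44, -122.16), (37.45, -122.17), (37.44, -122.16),
          (37.78, -122.42)]
print(hot_cells(photos)[0])  # the Palo Alto cell leads with 3 photos
```

Once photos are grouped by cell, the same spike logic used for call volume applies: a cell suddenly producing far more photos than usual is probably where the news is.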
Networks like CNN have had features like iReport for a few years, but those require editors to manually process a lot of information and only work for really large events.
Color could take it to a new level and make it much more scalable by algorithmically determining what’s important based on where you are. Color would also allow an elastic view of “news”. A CalTrain wreck is news to a few hundred people. A giant earthquake and tsunami is news to billions. If you’re in Palo Alto standing at the CalTrain station, you’d see pictures from CalTrain. No matter where you were in the world, you’d see the Japan earthquake pictures.
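One toy way to get that elastic behavior is to score each event by its magnitude divided by the square of your distance from it, so big events carry farther while small ones still win nearby. The scoring function and every number below are illustrative assumptions, not anything Color has described:

```python
def news_score(magnitude, distance_km):
    """Toy relevance score: magnitude (roughly, people affected)
    with inverse-square distance falloff. Illustrative only."""
    d = max(distance_km, 1.0)  # clamp so nearby events don't divide by ~0
    return magnitude / d ** 2

# Hypothetical magnitudes: a wreck affecting hundreds vs. a quake
# affecting billions.
caltrain_wreck = 300
japan_quake = 2_000_000_000

# Standing at the Palo Alto station, the local wreck outranks the quake:
print(news_score(caltrain_wreck, 0.1) > news_score(japan_quake, 8_000))

# From New York, the quake dominates:
print(news_score(caltrain_wreck, 4_100) < news_score(japan_quake, 10_000))
```

The exact falloff curve is a tuning decision; the point is that relevance becomes a function of both event size and viewer location, with no editor in the loop.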
Color comes from serial entrepreneur Bill Nguyen. It’s really incredible to see folks like Bill and Mike McCue creating tools that will revolutionize news and publishing.
While many have carped about Color having raised $41 million in financing, I’ll point out that The New York Times is reported to have spent $40–$50 million building a paywalled digital subscription system that won’t work.