One of the common complaints we get is about how we calculate our numbers and what they mean. Well, it's not rocket science at the moment. We take the comments you leave on Amplicate and on the public Twitter and Facebook streams about the topics you're interested in, and classify each one on the fly as either a rave or a rant. We are binary: we are only interested in people who feel strongly about a subject, whether they love it or hate it, and we don't attempt to classify the lukewarm or balanced review. Classification is very easy when someone uses tags (like #Obama #Fail), relatively simple when the language is plain (like "i couldn't help but love Inglourious Basterds"), but nearly impossible with sarcasm.
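To make the binary idea concrete, here is a minimal sketch of a rave/rant rule in Python. The marker lists and the abstain-on-ambiguity rule are our illustration, not Amplicate's actual classifier:

```python
import re

# Hypothetical marker lists -- illustrative only, not Amplicate's real rules.
RAVE_MARKERS = {"love", "awesome", "amazing", "best"}
RANT_MARKERS = {"hate", "fail", "sucks", "worst"}

def classify(message):
    """Return 'rave', 'rant', or None when the message is lukewarm/ambiguous."""
    # Normalize: lowercase, strip '#' so a tag like '#Fail' matches 'fail'.
    words = {w.lstrip("#") for w in re.findall(r"#?\w+", message.lower())}
    rave = bool(words & RAVE_MARKERS)
    rant = bool(words & RANT_MARKERS)
    if rave == rant:  # no signal, or mixed signals: skip the message entirely
        return None
    return "rave" if rave else "rant"
```

Note that the sketch deliberately returns None for mixed or neutral messages rather than guessing, which mirrors the love-it-or-hate-it approach described above.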
The challenge for us is balancing false positives against false negatives to produce the best statistics. Consequently we will never be perfect, and it's doubly difficult when you're dealing with only 140 characters and a high volume of streaming data. However, we are:
- Open - you can download every opinion we have on a topic and draw your own conclusions. (We provide RSS feeds on our pages.)
- Fair - every opinion has to be attributed, either to a Facebook or Twitter account or, for opinions left directly on Amplicate, to an email address
- Consistent - every message is treated the same way, so in most cases the numbers paint a highly accurate relative picture.
And we will keep improving. Soon we will:
- Look at reach (how many followers or readers each person giving an opinion has at the time)
- Do a better job of picking up other public streams (next up: Facebook and blogs)
- Measure relative sentiment (if the average bank has only 20% approval, a bank with 40% is doing remarkably well)
- Provide time series data (how these numbers change over time)
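The relative-sentiment item above boils down to comparing a topic's approval with its category average. A tiny sketch, using the hypothetical bank numbers from the list (the function name and figures are ours, for illustration only):

```python
# Hypothetical example: approval only means much relative to a category's norm.
def relative_sentiment(topic_love_pct, category_avg_pct):
    """How many times more loved is this topic than its category average?"""
    return topic_love_pct / category_avg_pct

# A bank at 40% approval in a category averaging 20% scores 2.0:
# twice as popular as a typical bank, even though 40% looks low in isolation.
ratio = relative_sentiment(40.0, 20.0)
```

The point of the ratio is that a raw 40% approval sounds mediocre on its own, but against a 20% category average it marks a clear standout.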
Hang in there…