While doing these reports, collecting and analysing comments, threads, tweets and discussion boards on Boostcruising (yeah, I know, a forum dedicated entirely to cars = zero interest), I realised that not only do people talk a lot, they talk a lot of crap. This is part of my job. That's right: all those tweets and comments where you felt your one-cent input would settle things once and for all. I read them. Yes, there are times I want to kill someone because the comments seem endless. But amongst the crap I find the most honest opinions that no field survey could ever extract. It puts a spring in my step, it does.
The tool I use is a piece of shit, and I'm not going to hold back on this one. I end up doing a lot of waiting, manual crawling and manual graphing, which literally defeats the purpose of buying it in the first place. I've trialled most of the monitoring tools available in the Australian market. There is no perfect tool, although I have a good idea of what one should encompass (that's for another blog post). I do, however, have opinions on 'social media buzz reports'. Here are a few tips:
- Structure your keyword query carefully: my tip is that you should always be able to express it as a single sentence with one purpose
- Always think from the consumer's perspective: how do they talk? What is their mentality? How might they refer to a brand's service, name or product? That is what your query should consist of. Also target common misspellings with high search volume (you can pull these from Google AdWords)
- The online audience IS NOT everyone: people on forums, blogs etc. are savvy, usually early adopters and opinion leaders, but they don't represent the majority. What you need to be aware of is how 'searchable' their opinion is and how much influence it could have. For example, is their comment visible on the first page of a Google search?
- Be thorough: check the source, content and author, and make sure they're not spammers or bots
- Never rely on automated sentiment: no matter which tool (Radian 6, Buzz Metrics, Sentiment Metrics, Buzz Numbers etc.), none of them has human language down to a tee. It's best to find a tool that gives you an approximation but also lets you manually categorise comments (there's a rough sketch of what I mean after this list), so the graphs you end up presenting are a good 70-85% accurate
- Sample size matters: 5 comments is not enough, but 20-50 gives a pretty good indication. For my own brand reports, I source and manually analyse anywhere between 2,000 and 8,000 comments per report, give or take.
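To make the query and manual-categorisation points a bit more concrete, here's a toy sketch in Python. It has nothing to do with how Radian 6 or any of the other tools work internally, and every brand name, keyword and comment in it is made up; it just shows the shape of the workflow: one tight query (including a couple of misspellings), a rough automated sentiment guess, and a human pass that overrides the guess where it's wrong.

```python
import re

# Toy sketch only: a hand-rolled stand-in for the "keyword query + manual
# sentiment override" workflow described above. All data and names are made up.

# One query, one purpose: mentions of the hypothetical "Acme" delivery service,
# including a couple of high-volume misspellings you'd pull from keyword research.
QUERY_TERMS = {"acme delivery", "acme delivry", "acme deliverey"}

POSITIVE_WORDS = {"love", "great", "fast"}
NEGATIVE_WORDS = {"hate", "slow", "crap", "late"}


def matches_query(comment: str) -> bool:
    """Return True if the comment contains any of the query terms."""
    text = comment.lower()
    return any(term in text for term in QUERY_TERMS)


def auto_sentiment(comment: str) -> str:
    """Very naive automated sentiment: count positive vs negative words."""
    words = re.findall(r"[a-z']+", comment.lower())
    score = sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"


def categorise(comments, manual_overrides):
    """Keep only comments matching the query; prefer a human label over the auto guess."""
    results = []
    for i, comment in enumerate(comments):
        if not matches_query(comment):
            continue
        label = manual_overrides.get(i, auto_sentiment(comment))
        results.append((comment, label))
    return results


if __name__ == "__main__":
    comments = [
        "oh yeah, acme delivery is great, only took three weeks",  # sarcasm
        "love how fast acme delivery got here",
        "acme delivry was late again, crap service",
        "my cat is great",  # no query match, dropped
    ]
    # Sarcasm is exactly where automated sentiment falls over, so a human
    # re-labels comment 0 as negative despite the word "great".
    overrides = {0: "negative"}
    for comment, label in categorise(comments, overrides):
        print(f"[{label}] {comment}")
```

The point isn't the scoring, which is deliberately dumb; it's the workflow: a query you can say in one sentence, the misspellings people actually type, and a human pass over whatever the machine guesses.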
There are more, but my brain is fried today.