Posts Tagged ‘evolisten’

Correlating Gas Prices with Social Media Sentiment

Friday, July 20th, 2012

Sentiment analysis isn’t perfect, and anyone who has tried it on social media data will confirm that. The nuances of language, including sarcasm, emoticons, slang, spelling errors, creative grammar, and more, mean that 100% accuracy is simply unattainable. But in market research we aren’t looking for 100% accuracy, or even 90% accuracy. We know those kinds of numbers are unrealistic. What we do expect is to see that social media data has some relationship with real-world data. And that is what we investigated here.

This project began by finding a third-party source of fuel prices, and we turned to GasBuddy for average monthly US gas prices. Because we estimated data points by carefully eyeballing a chart on the screen, the GasBuddy numbers aren’t accurate to the last decimal place. But if you compare our GasBuddy chart with the official chart, you’ll see that the trend is accurate. This is our criterion dataset.

The second dataset came from Conversition’s Evolisten database. We collected hundreds of thousands of verbatims from thousands of websites, all of which referenced fuel or gas prices or costs in some way. Twitter, Facebook, YouTube, Flickr, and any other type of website where people felt like sharing their opinions about gas prices was our target. After cleaning out the spam, we measured the sentiment of the remaining opinions. Then we calculated the inverse of each sentiment score. For example, a score of 5 (very positive) was changed to 1 (very negative), and a score of 1 was changed to a 5.

What you see in this chart is a correlation of 0.65. In other words, as the price of gas increases, sentiment decreases.
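For the curious, here is a minimal sketch of that calculation, assuming sentiment is scored on a 1-5 scale and averaged by month; the monthly figures below are hypothetical placeholders, not the actual Evolisten or GasBuddy numbers.

    # Illustrative sketch only (not Conversition's production code): invert 1-5
    # sentiment scores so that higher values mean more negative feeling, average
    # them by month, and correlate the monthly averages with monthly gas prices.
    from statistics import mean, correlation  # correlation() is Pearson's r (Python 3.10+)

    def invert(score, low=1, high=5):
        """Mirror a 1-5 sentiment score (5 -> 1, 1 -> 5)."""
        return high + low - score

    # Hypothetical monthly data: sentiment scores per month, average gas price per month
    monthly_sentiment = {"2012-01": [4, 5, 3], "2012-02": [2, 3, 3], "2012-03": [1, 2, 2]}
    monthly_price = {"2012-01": 3.38, "2012-02": 3.57, "2012-03": 3.85}

    months = sorted(monthly_price)
    inverted = [mean(invert(s) for s in monthly_sentiment[m]) for m in months]
    prices = [monthly_price[m] for m in months]

    print(f"Pearson r = {correlation(prices, inverted):.2f}")

A positive Pearson r on the inverted scores says the same thing the chart does: as prices rise, sentiment falls.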

It just makes me think… what if everyone tweeted and messaged that the price of gas was really low. Could we turn this correlation into causation? It’s worth a try!


MR Web: e-Rewards Buys Conversition Strategies

Monday, May 16th, 2011

e-Rewards Buys Conversition Strategies
May 11 2011

Jean Davis and Tessie Ting

e-Rewards, the parent of online panel firm Research Now, has acquired social media market research agency Conversition Strategies, just two years after its launch by former Ipsos and NPD executives Jean Davis and Tessie Ting. Terms of the deal were not disclosed.

Conversition, which has bases in the US and Canada, applies scientific principles to the collection and analysis of social media data. Its flagship product EvoListen collects data sourced from online social media outlets; cleans, filters and weights it; and then formats it into quantitative data sets.

Read the rest of the MrWeb announcement here.

Conversition Strategies launches EvoPlay, a consumer friendly social media exploration tool

Friday, July 23rd, 2010

For Immediate Release

New York, NY, July 23, 2010 – Conversition Strategies, the developer of social media research product evolisten™ and www.tweetfeel.com, has launched EvoPlay, a social media exploration tool for consumers.

Unlike evolisten™, which has been specifically designed for researchers, and tweetfeel, which has been designed for instant Twitter analysis, EvoPlay is a free tool that consumers and brands alike can use to visualize sentiment across the entire internet space. Based on the most topical brands, the tool presents sentiment and conversation topics and provides a taste of social media research in a fun way. Sentiment can be displayed over a limited time period, allowing consumers to see brands rise and fall as marketing campaigns succeed and fail.

“The scientific principles of social media research can be intimidating, but this tool helps people see it in a more basic form by creating an engaging experience,” states Tessie Ting, co-founder of Conversition.  Jean Davis says, “This new tool will allow people to get inside the data and really experience what social media research is all about. Not just Twitter, not just Facebook, but the entire internet space.”


About Conversition Strategies:

Conversition Strategies is a boutique online market research firm based in the US (Conversition Strategies Limited) and Canada (Conversition Strategies Incorporated).  Conversition listens to consumers by applying scientific principles to the collection and analysis of social media data. Its strength lies in combining the expertise of globally respected market researchers with social media mavens.

For more information, please contact:

Jean Davis
Conversition | By researchers, For researchers

10 things you need to know about social media research

Monday, July 19th, 2010

The Good

  1. Anyone can benefit from social media research even if you have no social media presence. You can research your own brand, your competitors’ brand, the category, or the industry.
  2. You can measure far more types of information than the longest survey can. When your survey must be cut off at 60 questions or 60 minutes, social media research can answer questions that might otherwise require a 10-hour survey.
  3. You can listen to the voice of the consumer in their own, real, unfiltered words. Unlike surveys and focus groups where consumers may clean up their voice, or try to conceal hatred or indifference, genuineness is clear and strong in social media research.
  4. You can measure data using any scale imaginable: 2 points, 5 points, 10 points, 100 points. Your wish is our command (see the sketch after this list).
  5. You can impress your boss with the statement that you are using data fusion technologies to combine the insights of survey research with those of social media research.
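
Point 4 is just arithmetic. Here is a minimal sketch of the kind of linear rescaling it implies; the function name and defaults are illustrative, not part of any Conversition product.

    def rescale(score, old_min=1, old_max=5, new_min=1, new_max=10):
        """Linearly map a score from [old_min, old_max] onto [new_min, new_max]."""
        fraction = (score - old_min) / (old_max - old_min)
        return new_min + fraction * (new_max - new_min)

    print(rescale(4))               # a 4 on a 5-point scale becomes 7.75 on a 10-point scale
    print(rescale(4, new_max=100))  # and 75.25 on a 100-point scale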



Photo credit: snowbear from morguefile.com

The Bad

  1. You need to sample your data sources properly, or you won’t be able to generalize to the broader population of internet users.
  2. You can’t measure incidence. Just because people don’t say they are using your brand, doesn’t mean they aren’t using it. They just haven’t said so.
  3. You can’t measure awareness. Just because people aren’t talking about your brand online, doesn’t mean they haven’t heard of it. They just don’t talk about it.
  4. Because most people don’t share personally identifying information when they contribute online, demographic and geographic data is less precise than what you are used to with surveys or focus groups.
  5. The validity of sentiment and text analysis differs by vendor. Users of social media research need to ask their provider how they validate their results.


Buyer beware. Buyer be smart.

How to choose between human coders and automated coders

Thursday, June 3rd, 2010


Photo credit: cohdra from morguefile.com

In the right hands, text analytics can turn a nightmare into a dream come true. With the increasing popularity of social media research, companies are regularly collecting thousands, and even millions, of verbatims that require analysis. On the other hand, human coders have been carrying out text analysis for decades, so why use automated systems when humans are doing the job so well?

Here are some guidelines to help you decide which method is right for you.

  1. Sample sizes – Sample size will likely be the most prominent variable in choosing a method. If you’re working with thousands or millions of verbatims, automated systems are your best friend. On the other hand, datasets of several hundred verbatims are best coded by hand. Remember, even if an automated system is used on a small dataset, you would still end up reading every verbatim to get a human flavor for the data. If you’re going to read every verbatim, you might as well do the analysis by hand.
  2. Number of constructs – If you normally use only a small number of predefined constructs, the human method works great. Coders can easily remember all the intricacies of the coding scheme if it is strict and well-defined. And of course, it’s fun and interesting to get your hands right in there. But if the research plan uses coding systems with hundreds or thousands of constructs, it is simply impossible for coders to remember all of them with sufficient within- or between-rater reliability. Automated systems can really ease this process.
  3. New constructs – Are you open to discovering and implementing any number of new constructs? If you’re open to adding a handful of new constructs, then automated systems won’t make it much easier for you and you will be happy with your standard manual processes. But, if you want to be surprised and see where the data takes you, automated systems can provide that.
  4. Timing – This is the business world, after all. Are you in a rush? Are the results required yesterday? Well, if the data is already in a clean, computerized format, an automated system will work nicely. But, if your data consists of 20 sets of handwritten notes, most of which are barely legible, you might prefer the brain power of human coders who can turn scribbles into codes without any intervening translations.
  5. Coder reliability – Are you able to train and retain enough reliable coders? If you have a good team of trusty, reliable coders, then keep them happy. They are valuable people who should be treated with kid gloves! But if you’re having trouble finding those gems, an automated system will ensure that a high level of within-rater and between-rater reliability is maintained (a sketch of one common reliability check follows this list). It will even eliminate within-pair compromise.
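
For readers who want to check reliability themselves, here is a minimal sketch of Cohen’s kappa for two coders. This is one standard between-rater reliability statistic, not a Conversition tool, and the codes below are hypothetical.

    # Cohen's kappa: agreement between two coders, corrected for chance agreement.
    from collections import Counter

    def cohens_kappa(coder_a, coder_b):
        """Chance-corrected agreement for two coders who coded the same verbatims."""
        n = len(coder_a)
        observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
        freq_a, freq_b = Counter(coder_a), Counter(coder_b)
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Hypothetical codes assigned to ten verbatims by two human coders
    a = ["pos", "pos", "neg", "neu", "pos", "neg", "neg", "pos", "neu", "pos"]
    b = ["pos", "neu", "neg", "neu", "pos", "neg", "pos", "pos", "neu", "pos"]
    print(f"kappa = {cohens_kappa(a, b):.2f}")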

In the end, you must choose the system that works best for you. Whether automated or human, one method will have the pros and cons that suit your specific needs. Choose well!

Social Media Research: Forget the Buzz, Focus on Biz

Tuesday, June 1st, 2010

On May 30, 2010, Annie Pettit (@LoveStats) presented the topic of social media research to a packed house at the MRIA conference in Toronto (#MRIA_AC). Enjoy the slides and leave a comment below. We’d love to hear your reactions. Even better, come see us in Boston next week at the MRA conference (#MRIA_AC). We’d love to chat in person!

Why Lee Won

Thursday, May 27th, 2010

After intense competition between Lee DeWyze and Crystal Bowersox, Lee took home the title of 2010 American Idol. It was a battle between reflective, general competence and quirky, stylistic competence, and general competence is what the American people wanted this time. Though an obvious majority of Idol fans are delighted that Lee won, another group of people are wondering how this could possibly happen. Well, here is the final answer on Why Lee Won.

First, let’s have a look at how Lee and Crystal have been doing on a weekly basis. The chart shows the percentage of opinions that were positive towards Lee and towards Crystal. Since the beginning of March, they have shared the top spot, but Lee has taken that spot on a more consistent basis.

Lee: Emotions towards the two contestants were fairly similar. About 48% of people had positive feelings towards Lee (as opposed to neutral or negative) while about 47% of people had positive emotions towards Crystal.

Lee: Recommendations for the two contestants were slightly higher for Lee. 43% of recommendations for Lee were positive as opposed to 39% for Crystal.

Lee: Physical appearance played a role as Lee generated 43% positive opinions about his looks while Crystal only generated 37% positive opinions about her looks.

Crystal: It could have been Crystal’s wacky locks that pulled her down, but they didn’t. Opinions about her hair generated 39% approval while opinions about Lee’s hair, as ordinary as it is, generated 36% positive opinions.

Lee: Crystal clearly had an issue with her discolored teeth and this also contributed to her physical appearance scores. But, many people were able to get past that as 24% of opinions about her teeth were positive while 29% of opinions about Lee’s teeth were positive.

Lee: Fair or not, Crystal’s disheveled hair and discolored teeth contributed to an overall opinion about cleanliness. Crystal generated only 34% positive scores in this regard while Lee generated 43% positive scores.

Lee: Crystal’s unique personality did not go unnoticed, as 20% of people approved of her quirkiness. But the calm, even personality of Lee was more desired, as 39% of opinions about Lee’s emotional stability were positive.

Lee: Lee’s quietness was very appealing to people as he generated a 47% positive score compared to Crystal’s 30% positive score.

Crystal: Not surprisingly, Crystal generated much higher new and different scores, with 50% positive opinions compared to Lee’s 42% positive opinions.

Crystal: And, Crystal definitely owned the ‘cool’ factor as she generated 55% approval compared to Lee’s 47% approval in this area.

Crystal: Crystal was also felt to be far better at selecting songs with 69% approval compared to 51% approval for Lee.

Crystal: These unique Crystal characteristics led to her generating slightly higher anticipation scores. Crystal generated 50% positive opinions while Lee generated 47% positive opinions.

Given that these are the areas generating the most differences between Lee and Crystal, one thing becomes clear. You might think appearances don’t matter, but that is Why Lee Won.

MRWeb Announces New Tool By Conversition and Peanut Labs to Make Social Media Research-Friendly

Thursday, May 13th, 2010

MRWeb today announced a new tool by Conversition Strategies and Peanut Labs to make social media research-friendly.

Social Media Sentiment: Bells, Cups, or Shark Fins?

Friday, April 30th, 2010

When do consumers choose to talk about brands in social media?

1) When they’re really, really annoyed and need to vent.
2) When they’re surprisingly pleased and want to share the love.

In fact, those are the messages people love to read. The hate, the disgust, the love, and the adoration make for interesting and provocative reading. These types of messages are also really fun to share with other people because the extreme opinions strike a very deep chord.

As with all things though, there are more than two options. There must always be the neutral, disinterested, or uninvolved vote. Some people have wondered whether this third option has any meaningful presence in SM data. Is the neutral vote simply overwhelmed by all of the extreme opinions, or does it actually show up in significant counts? Do we have a bell curve of emotions ranging from negative to positive, or a cup curve of emotions that includes only very negative and very positive?

We evaluated the sentiment of three different brands to determine the distribution of 5000 randomly selected scores for each brand. We chose one brand that should skew negative, one that should skew positive, and one that should sit nicely in the middle.
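
As a rough illustration of what “determining the distribution” means here, the sketch below tallies 5,000 hypothetical 1-5 sentiment scores into a frequency distribution; the weights are made up and no real brand data is used, it only shows the shape-checking idea.

    # A minimal, hypothetical sketch: tally 5,000 sentiment scores (1 = very
    # negative, 5 = very positive) to see whether they pile up only at the
    # extremes (a "cup") or fill in the middle as well (a "bell").
    import random
    from collections import Counter

    random.seed(0)
    scores = random.choices([1, 2, 3, 4, 5], weights=[15, 20, 30, 20, 15], k=5000)

    distribution = Counter(scores)
    for score in sorted(distribution):
        share = distribution[score] / len(scores)
        print(f"{score}: {'#' * int(share * 50)} {share:.0%}")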

#1 Toyota – We selected Toyota for obvious reasons. The recalls they’ve experienced recently haven’t generated a lot of positive sentiment towards the brand. However, you can clearly see that though the negative side of the curve is well filled in, so are the positive side and the middle, neutral section. There is no denying a well-rounded bell curve for this set of data.

#2 Taylor Lautner - Again, Taylor was selected for obvious reasons. He has received nothing but positive reviews as of late. It is clear that his sentiment lies more closely on the positive side of the curve, but once again, it does not pile up on the most extreme position. Even among the positives, there is a wide distribution of sentiment.

#3 Walmart – Finally, we selected Walmart as a solid overall brand that hasn’t received any particularly positive or particularly negative reviews as of late. The distribution of scores is fairly even, focusing neither solely on the extreme positive nor the extreme negative.

It’s clear that SM does more than elicit extreme positive and extreme negative opinions. In fact, in all three cases, the distribution of opinions ranged nicely from extreme positive all the way to extreme negative. Though the mid-point of the distribution shuffled a bit from left to right, the overall distributions were similar. They may not be perfect bell curves, but there is no other curve that better describes the resulting data (except perhaps a shark fin curve for the Walmart frequency!).

In the offline world, people are more than extremes. They express a wide range of emotions from positive to negative with all points in between. We have just seen that this range of emotions transfers to the online world as well. We’ve definitely got a bell curve of emotions.

Social media research for beginners: A presentation by Conversition

Monday, April 26th, 2010

At the recent ASTM conference held in St. Louis, Conversition Strategies presented a primer on social media research. Please enjoy the presentation here, albeit without the personalized commentary and sense of humour of our presenter, Annie Pettit. Our next conference stop will be MRA in Boston. See you there!