Archive for February, 2011

Is Groupon Still on a Time-out? Social Media Research Reveals All #MRX

Friday, February 25th, 2011

Alright, so Groupon screwed up with their Super Bowl commercial. Tweeps were furious, many of them indicating they planned to unsubscribe from the group coupon service. But, as most of us know, actions speak louder than words and it made us wonder – did the commercial REALLY have any negative impact on Groupon?

The first chart below illustrates the volume of online conversations related to Groupon starting on January 1st and running until today. You can clearly see the spike when the commercial aired and people tweeted their thumbs off with fury. Then, you can see how the volume of conversations slowly returned to a normal level. In terms of sheer volume of conversations, it seems we got over the offense pretty darn quickly.


Obviously, though, the volume of conversations about a brand has pretty much nothing to do with how people feel towards the brand. Videos and commercials and memes can become viral because we hate them as well as because we love them.

With that in mind, we measured the sentiment towards Groupon over the same time frame. The following chart illustrates the percentage of comments about Groupon that were positive (the top black line) and the percentage of comments that were negative (the bottom blue line). Again, you can instantly see when the Super Bowl commercial aired. Up until that day, about 40% to 50% of conversations about Groupon were positive and about 3% to 7% were negative. Then, suddenly, only about 25% of conversations were positive and about 15% were negative.

But wait, we’re still angry with Groupon, aren’t we? So why, pray tell, has the percentage of positive messages returned to 40% and the percentage of negative messages to 5%? Sounds to me like the tweeters were just shooting off at the mouth. Were you one of them?
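For anyone curious about the arithmetic behind charts like these, here is a minimal sketch, in Python and with invented data, of how the daily positive and negative percentages can be tallied from sentiment-scored posts. It is only an illustration of the calculation, not the tool that produced the charts above.

    from collections import Counter, defaultdict

    # Invented example data: (date, sentiment) records for individual posts,
    # where sentiment is one of "positive", "neutral", or "negative".
    posts = [
        ("2011-02-05", "positive"), ("2011-02-06", "negative"),
        ("2011-02-06", "negative"), ("2011-02-06", "positive"),
    ]

    # Tally sentiment counts per day, then report each day's positive and
    # negative share as a percentage of that day's total conversations.
    daily = defaultdict(Counter)
    for date, sentiment in posts:
        daily[date][sentiment] += 1

    for date in sorted(daily):
        total = sum(daily[date].values())
        pos = 100.0 * daily[date]["positive"] / total
        neg = 100.0 * daily[date]["negative"] / total
        print(f"{date}: {pos:.0f}% positive, {neg:.0f}% negative")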



Join Us at the NewMR Virtual Festival – Listening is the New Asking #MRX

Thursday, February 24th, 2011

We are pleased to once again be presenters at a NewMR virtual festival! This virtual event looks at different aspects of buzz listening, blog monitoring, and computer assisted qualitative data analysis. You can read more about the event here.

The event will be held as three two-hour sessions, each targeted at a different time zone. Annie Pettit will be speaking at the end of the third session, which runs from 3pm to 5pm New York time (EST), or 8pm to 10pm GMT.

Annie Pettit, PhD, is the Chief Research Officer of Conversition Strategies. She has more than 15 years of experience as an online market researcher and specializes in data quality and social media research. Annie is a member of the CASRO, MRA, and ESOMAR social media research committees, and was previously the VP of Online Panel Analytics at Ipsos. Her expertise in research methods and data quality has been highlighted through numerous conference presentations, including ARF, CASRO, MRA, MRIA, NetGain, and IIR. She has also published numerous articles in both professional and refereed magazines and journals.


Presentation Summary

Quick and Dirty or Slow and Careful: Social Media Data Speaks

Social media research seems like a quick and easy alternative to surveys. The data is instantly available to anyone, whether or not you are a researcher and whether or not you have access to specialized tools. But this paper will demonstrate, with real data, how your research results can be negatively affected if you don’t take care at various stages of the research process. We will evaluate how results are affected when care isn’t taken to develop the initial search set, to create the variables being measured, and to select the websites being reviewed.

RW Connect: Privacy and Ethics in Social Media Research #MRX

Wednesday, February 23rd, 2011

Privacy and ethics in social media research continue to attract the attention of established research organizations. Read about ESOMAR’s efforts to contribute to the discussion via a social media research committee that includes Annie Pettit, quoted below.

Privacy and ethics in social media research

Research Challenges & Issues — By Manfred Mareck on February 22, 2011 4:10 pm

“Most consumers are aware that their online conversations could be monitored but a small percentage may not understand that this is the case. A small percent of two billion online users is a lot of people who would be surprised, and possibly embarrassed or offended, that their information is being shared in an arena outside of what they originally intended”, says Annie Pettit, chief research officer at Toronto-based Conversition Strategies and member of the guideline project team. “Whilst ultimately consumers should always protect themselves this is not the ethical standard that market researchers can align themselves with and we must always work diligently on the contributors’ behalf.”

ESOMAR Launches Consultation on Social Media Research Guidelines

Tuesday, February 22nd, 2011

Social media research has grown extremely quickly over the last couple of years, prompting various market research organizations to review the method and provide guidance. After creating a team that includes Annie Pettit of Conversition Strategies, ESOMAR has developed a draft guideline on social media research. This draft guideline is now publicly available for comment.

Excerpt from ESOMAR website:

“Social media research is a hot new technique for gaining insights that has also attracted significant media attention because of consumer concerns that they are being observed or tracked without their knowledge.

Given that it is critical that researchers are aware of and respect international, national and local laws and regulations as well as cultural dispositions, an ESOMAR team of experts has developed a new guideline to help researchers understand the key fundamentals of transparency and professionalism in respecting consumers’ concerns.

It is ESOMAR’s aim to cooperate with associations around the world to reach consensus on internationally agreed best practice and ESOMAR is working closely with CASRO in developing this guideline.

We welcome your feedback on this new guideline.

Please send your comments to professional.standards@esomar.org no later than Monday, 21 March so they can be taken into account in finalizing the guideline. Read the draft guideline.”

Related links
Social media monitoring vs social media research: Can you see the difference?
The Conversition Hierarchy of Social Media Insight
Coke it is! Or not. I’m not sure. I can’t tell.
Apple pie, Apple orchard, Apple cider, or Apple iPad
Battle of the Brands: Blackberry vs iPhone

#MRX MRA IMRO SMR Guidelines #16: Demos and Geos

Thursday, February 17th, 2011

MRA recently released version 1 of the MRA/IMRO Guide to the Top 16 Social Media Research Questions, a tool to help newcomers and vendors communicate with each other about this new datasource and method. Conversition was a key contributor to this document, which is now available on the MRA website.

This blog is #16 in a series of 16, each one addressing Conversition’s viewpoint on one of the items in the guidelines. We welcome your questions and comments, and look forward to further discussions on this exciting new trend in the market research industry.


What, if any, methods are used for determining the geography associated with the data?

Geography and demographics are tricky for social media research. Unlike survey research, where people provide their age, gender, and other personal information, very few people do so in the social media space. (When was the last time you saw someone tweet something like, “I’m a 28-year-old female who has a Marketing Diploma and I love Pop-Tarts”?)

There certainly are methods of determining the geographical and demographic characteristics of people contributing to social media, but they aren’t perfect.

  • How about noticing whether the website is .ca or .uk or .au? We wish that method worked, but the vast majority of social media is generated on .com websites, which are visited by people from all around the world. And people don’t refrain from posting on a website just because its domain reflects a different country. The internet is a wonderfully global community. (A toy sketch of this domain heuristic follows the list.)
  • How about IP addresses? This is another option, but again, a less than perfect one. According to my IP address, I’m sitting in the USA right now. But I’m pretty sure I’m not!
  • What about the demos that people do share? Yes, some people do share their demographic information. But only a very, very tiny percentage of people share one or two pieces of information, certainly nowhere near what researchers are used to, and certainly nowhere near what is required to make valid generalizations beyond those individual people.
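As a toy illustration of why the domain-suffix heuristic is so weak, here is a short Python sketch; the country mapping and example URLs are invented, and this is not Conversition’s actual approach. Because most social media lives on .com sites, the guess usually comes back unknown.

    from urllib.parse import urlparse

    # Invented illustration: guess a country from a URL's top-level domain.
    COUNTRY_TLDS = {".ca": "Canada", ".uk": "United Kingdom", ".au": "Australia"}

    def guess_country(url):
        host = urlparse(url).netloc.lower()
        for tld, country in COUNTRY_TLDS.items():
            if host.endswith(tld):
                return country
        # Most social media URLs end in .com, so this is the usual answer.
        return "unknown"

    print(guess_country("http://example.co.uk/forum/post/123"))   # United Kingdom
    print(guess_country("http://twitter.com/someuser/status/1"))  # unknown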

For these reasons, Conversition uses a variety of methods for understanding the demographics of social media contributors.

Related links

MRA IMRO Guide #1: Advantages and Disadvantages of SMR
MRA IMRO Guide #2: Datasources of SMR
MRA IMRO Guide #3: Data Fusion and SMR
MRA IMRO Guide #4: Reliability of SMR
MRA IMRO Guide #5: Responsibilities of Social Media Data Users
MRA IMRO Guide #6: Social Media Research Skills
MRA IMRO Guide #7: Research Contributor Awareness
MRA IMRO Guide #8: Citing Reference
MRA IMRO Guide #9: Legal Issues

#MRX MRA IMRO SMR Guidelines #15: Validation

Tuesday, February 15th, 2011

MRA recently released version 1 of the MRA/IMRO Guide to the Top 16 Social Media Research Questions, a tool to help newcomers and vendors communicate with each other about this new datasource and method. Conversition was a key contributor to this document, which is now available on the MRA website.

This blog is #15 in a series of 16, each one addressing Conversition’s viewpoint on one of the items in the guidelines. We welcome your questions and comments, and look forward to further discussions on this exciting new trend in the market research industry.



If sentiment scoring is provided, what is the process for validating results?

We regularly validate both our sentiment analysis and our content analysis results.

Sentiment analysis is validated both at a project level (for example, just YOUR data) and at an overall level (a random sample across many different brands). Though there are more scientifically rigorous methods (i.e., have fun reading Krippendorff!), we have chosen a simpler method because it is easily understood by both novice and experienced research users.

The two important features of our process are 1) the large sample size, which ensures we do not receive spuriously high (or low) validation scores associated with a skewed selection of data points, and 2) the blinded nature of the manual scoring, which ensures the researcher does not unconsciously create high validation scores.

Our process is as follows (a rough sketch of the matching arithmetic appears after the list):

  • Randomly select 1000 verbatims
  • Manually score each verbatim as negative, neutral, or positive
  • Align the automated scores to the manual scores
  • Calculate the percentage of automated and manual scores that match
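Here is a minimal Python sketch of the matching step described above, assuming each verbatim already carries an automated label and a blinded human coder supplies the manual label; the function name and structure are invented for illustration.

    import random

    # Illustrative only: estimate how often automated sentiment labels match
    # blinded manual labels on a random sample of verbatims.
    def validation_match_rate(scored_verbatims, manual_label, sample_size=1000):
        # scored_verbatims: list of (text, automated_label) pairs, where each
        # label is "negative", "neutral", or "positive".
        # manual_label: a function returning the blinded human coder's label.
        sample = random.sample(scored_verbatims, min(sample_size, len(scored_verbatims)))
        matches = sum(1 for text, auto in sample if manual_label(text) == auto)
        return 100.0 * matches / len(sample)

The returned value is simply the percentage of matching scores described in the final step above.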

The process for validating constructs is similar.

  • Randomly select 1000 verbatims from a specific construct
  • Manually determine whether each verbatim does or does not reflect the intended construct
  • Calculate the percentage of verbatims that were correctly identified as reflecting the intended construct

At the end of each process, we then identify the weak points and adjust our systems to account for them. For instance, if we discover that “Charlie Brown” has been erroneously coded under the color construct, that anomaly is corrected.

It takes a long time to validate data but if you’re going to do it, you might as well do it right.


Related links

MRA IMRO Guide #1: Advantages and Disadvantages of SMR
MRA IMRO Guide #2: Datasources of SMR
MRA IMRO Guide #3: Data Fusion and SMR
MRA IMRO Guide #4: Reliability of SMR
MRA IMRO Guide #5: Responsibilities of Social Media Data Users
MRA IMRO Guide #6: Social Media Research Skills
MRA IMRO Guide #7: Research Contributor Awareness
MRA IMRO Guide #8: Citing Reference
MRA IMRO Guide #9: Legal Issues

Battle of the Burgers: Big Mac vs Whopper

Tuesday, February 15th, 2011

Sure, you can measure overall online sentiment towards a brand, but that’s not particularly useful on its own. You need to measure very specific aspects of your brand, such as the fizziness of the soda pop, the dust on the shelves, or the pickle in your burger. This case study demonstrates the level of detailed analysis that can be achieved by combining content analysis and social media research.

Even if you’ve never tried the competition, you have a favorite hamburger. Check out our newest case study pitting the McDonald’s Big Mac against the Burger King Whopper. Tomatoes, sauce, and onions, oh my! Who will win, and do you agree?

Related Posts

Social media monitoring vs social media research: Can you see the difference?
The Conversition Hierarchy of Social Media Insight
Coke it is! Or not. I’m not sure. I can’t tell.
Apple pie, Apple orchard, Apple cider, or Apple iPad

#MRX MRA IMRO SMR Guidelines #14: Sentiment Scoring

Saturday, February 12th, 2011

MRA recently released version 1 of the MRA/IMRO Guide to the Top 16 Social Media Research Questions, a tool to help newcomers and vendors communicate with each other about this new datasource and method. Conversition was a key contributor to this document, which is now available on the MRA website.

This blog is #14 in a series of 16, each one addressing Conversition’s viewpoint on one of the items in the guidelines. We welcome your questions and comments, and look forward to further discussions on this exciting new trend in the market research industry.

Does the company provide sentiment scoring?

Why yes we do!

Conversition has built a proprietary sentiment scoring system that features many advantages. We’re very proud of its accuracy, particularly as it applies to marketing research. (A rough, simplified illustration of this style of scoring follows the list below.)

  • We score more than 95% of data.
  • We do not throw out verbatims that can’t be scored as positive or negative, because it turns out they are true neutrals.
  • We score verbatims on a continuous scale from extremely negative to extremely positive.
  • We score grammatically correct and grammatically incorrect conversations.
  • We score emoticons and slang.
  • We use an automated system that lets us score millions of conversations every day.
  • Our system is carefully designed to meet the specific needs of market researchers.
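To make the continuous-scale, slang, and emoticon points above concrete, here is a deliberately tiny lexicon-based scorer in Python. The phrase list and weights are invented, and this is emphatically not Conversition’s proprietary system; it is only a sketch of the general idea.

    # Toy lexicon-based sentiment scorer. Scores fall on a continuous scale
    # from -1 (extremely negative) to +1 (extremely positive), and the
    # lexicon includes slang and emoticons alongside dictionary words.
    LEXICON = {
        "love": 1.0, "great": 0.8, "the bomb": 0.9, ":)": 0.7,
        "hate": -1.0, "awful": -0.8, "ftl": -0.9, ":(": -0.7,
    }

    def score_verbatim(text):
        text = text.lower()
        hits = [weight for phrase, weight in LEXICON.items() if phrase in text]
        if not hits:
            return 0.0  # verbatims with no matches are kept as true neutrals
        return sum(hits) / len(hits)

    print(score_verbatim("This burger is the bomb :)"))          # positive score
    print(score_verbatim("ugh, that commercial was awful ftl"))  # negative score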


Related links

MRA IMRO Guide #1: Advantages and Disadvantages of SMR
MRA IMRO Guide #2: Datasources of SMR
MRA IMRO Guide #3: Data Fusion and SMR
MRA IMRO Guide #4: Reliability of SMR
MRA IMRO Guide #5: Responsibilities of Social Media Data Users
MRA IMRO Guide #6: Social Media Research Skills
MRA IMRO Guide #7: Research Contributor Awareness
MRA IMRO Guide #8: Citing Reference
MRA IMRO Guide #9: Legal Issues

Justin Bieber vs. Katy Perry vs. Metallica: A social media research case study

Friday, February 11th, 2011

With Justin Bieber’s movie opening to the masses today, you might be under the impression that he is a big winner all around. Are you sure?

Check out this short case study showing weekly tracking results for Bieber, Katy Perry, and Metallica. There’s something in here for teen idol fans, pop fans, and rock fans alike.

Enjoy!

#MRX MRA IMRO SMR Guidelines #13: Data Quality

Thursday, February 10th, 2011

MRA recently released version 1 of the MRA/IMRO Guide to the Top 16 Social Media Research Questions, a tool to help newcomers and vendors communicate with each other about this new datasource and method. Conversition was a key contributor to this document, which is now available on the MRA website.

This blog is #13 in a series of 16, each one addressing Conversition’s viewpoint on one of the items in the guidelines. We welcome your questions and comments, and look forward to further discussions on this exciting new trend in the market research industry.


What data quality processes are implemented in each stage of the SMR?

Ah, we love data quality! You need a sharp eye to see all of the details that we focus on in our regular data quality processes.

Once data has been collected, we apply rigorous techniques to identify and remove many different types of spam. Keyword loading and unmoderated boards are the worst of the offenders, but we vigorously seek them out. That’s how we know that British Petroleum data isn’t Blood Pressure or Basis Point data.
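As one invented example of what a spam signal can look like (this is not Conversition’s actual process), a keyword-loaded post can be flagged with a simple density check like the Python sketch below.

    # Toy keyword-loading check: posts that repeat the tracked keyword at an
    # implausibly high rate are flagged for review and removal.
    def is_keyword_loaded(text, keyword, max_density=0.2):
        words = text.lower().split()
        if not words:
            return False
        density = words.count(keyword.lower()) / len(words)
        return density > max_density

    print(is_keyword_loaded("groupon groupon groupon best groupon deals groupon", "groupon"))  # True
    print(is_keyword_loaded("I finally tried that groupon deal and loved it", "groupon"))      # False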

We also apply rigorous processes to both our sentiment scoring and content analysis. That’s how we know that Charlie Brown isn’t a color and that ‘what a crock’ isn’t about dishes. That’s how we know that “the bomb” is a good thing to say but “ftl” isn’t.

It’s not sufficient for us to use a one-method-fits-all system. We know that doesn’t work. So we pair a standard base process with hundreds of individualized methods, a unique one for every client. Because we know that an average level of data quality isn’t good enough. You need data quality specific to your job.

Quality is job 1. We believe it and practice it.

Related links

MRA IMRO Guide #1: Advantages and Disadvantages of SMR
MRA IMRO Guide #2: Datasources of SMR
MRA IMRO Guide #3: Data Fusion and SMR
MRA IMRO Guide #4: Reliability of SMR