A Twitterbot-sized chip on our shoulders

Human nature dictates that we often ask questions we believe we already know the answer to. And, much as they may deny it, market researchers often believe they know the outcome of their survey before they send it. Sometimes responses reveal that their assumptions about the audience are off the mark, or that their beliefs are not, in fact, as dominant as they had thought. However, it is rare that a failed survey actually produces more interesting insights than a successful one.

This is the scenario that Cheong, Aleti and Turner found themselves in when their survey of alcohol consumption habits among Australian Twitter users returned only five legitimate Australian responses out of a pool of 250. Rather than crawl away with their tails between their legs, however, these researchers chose to ask why Twitter proved such an unfruitful platform for the survey.

One of the discussion questions in Cheong, Aleti and Turner’s discussion paper asks whether, based on the observations in this case study, it is possible to conduct a survey through a social media site such as Twitter. Of course it is possible. Websites such as twtpoll and Survey Monkey provide user-friendly platforms to create surveys, disseminate them via social media and record the results. Twitter itself recently introduced Twitter Polls, which let users ask one question of their follower network and collect the results over the following 24 hours. The challenge that Cheong, Aleti and Turner encountered was not in conducting a survey, but in conducting a targeted survey based solely on user posts.


Twitter users, and I believe most regular users of social media and email, have grown distrustful of messages from accounts they do not recognise and desensitised to the spam-like language attached to these forms of communication. A major driver of this distrust is the prevalence of ‘bots’, which generate huge amounts of online content. A Twitterbot, in particular, is a program used to create automated posts on Twitter. These bots take a wide variety of forms, including bots that automatically follow users, generate spam, entice clicks, and even post @replies or retweet posts that contain specific words or phrases. In fact, an IULM University survey recently found that up to 46% of the followers of surveyed brands on Twitter are Twitterbots.
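To make that last behaviour concrete, the keyword-triggered retweeting described above boils down to a simple filtering rule. The sketch below is purely illustrative, with hypothetical names and sample data; a real bot would apply this logic to a live stream of posts via the Twitter API rather than to a local list.

```python
# Hypothetical sketch of the keyword-matching logic a retweet bot might use.
# All names and sample posts are invented for illustration; no Twitter API
# calls are made here.

def matches_keywords(text, keywords):
    """Return True if the post text contains any of the target phrases."""
    lowered = text.lower()
    return any(kw.lower() in lowered for kw in keywords)

def select_for_retweet(posts, keywords):
    """Filter a stream of posts down to the ones the bot would retweet."""
    return [post for post in posts if matches_keywords(post, keywords)]

sample_posts = [
    "Heading to the pub for a cold beer tonight!",
    "Beautiful sunrise over the harbour this morning.",
    "Wine o'clock at last #friyay",
]

# Selects the first and third posts, which mention the target phrases.
print(select_for_retweet(sample_posts, ["beer", "wine"]))
```

Notably, this is essentially the same post-content filtering the researchers used to select survey recipients, which is part of why their approach could look bot-like to the people on the receiving end.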

In this context, it is understandable that Twitter users who received Cheong, Aleti and Turner’s survey request disregarded it or even reacted with anger to the approach. An unsolicited survey request is an annoyance: it asks the user to donate their time for minimal reward. Respondents need motivation to complete a survey, perhaps to support an individual or organisation they approve of, to give insight into a significant social issue, or to play their part in answering a question they would like answered.

In the case of Cheong, Aleti and Turner’s survey, the user is given no clear reason why they have been chosen: they are not members of a drinking-related group, or of a particular industry or institution, that would explain the approach. For instance, if the survey had looked at drinking among nurses, then individuals would understand why they had been approached and might respond with the intention of helping others in the industry. Conversely, selecting users based only on the content of their posts does sound exactly like something a Twitterbot could be programmed to do… the survey would likely have achieved similar results had it been disseminated by a Twitterbot, and at least that would have saved the researchers some time.


6 thoughts on “A Twitterbot-sized chip on our shoulders”

  1. Great reading, Kelly. I thought the introduction was interesting; I hadn’t considered that we tend to ask questions we believe we already know the answers to. Anyway, it seems logical that the reason for so few respondents is that people mistook the survey for spam. No wonder there is such a high rate of mistrust of Twitter research if up to 46% of the followers of surveyed brands on Twitter are Twitterbots. I assume the number of respondents would have been higher if the survey had given a clearer reason why users should take the time to answer it, just as you say. I think you highlighted some good aspects of why Twitter polls might and might not work, and also came up with a reasonable solution.


    1. Hi Johanna,
      Thank you for reading and taking the time to comment on my blog post this week, I appreciate it 🙂
      Yes, I agree that if the users had been given a reason why they were chosen they may have been more likely to respond (depending on how comfortable they are with their drinking habits). This is yet another limitation of the Twitter platform: there are literally not enough characters available for the researchers to explain themselves to the users.


  2. Hi Kelly,

    Thanks for the on-topic read regarding spam and all things wrong with it, including the Twitter survey! I totally agree that receiving an unsolicited invitation is annoying, and will add to your point by noting that a tweet only contains 140 or so characters, meaning most users spend only a short amount of time actually writing a post. The survey was approximately 30 minutes long, with open-ended responses of up to five hundred words in the first draft. It seems a bit odd to send that to Twitter users who are so accustomed to the 140-character response and probably aren’t going to allow the time via such an immediate platform.

    I do believe the researchers had a good basis for planning, in that it was possible to target consumers (in the same way spam does) by filtering hashtags and narrowing down their prospective audience, which is important to marketing; it’s just that the execution was unfortunately not so successful, possibly due to people’s understandable fear of spam, as you’ve also mentioned. I hadn’t considered that one of the reasons people may have rejected or ignored the survey is Twitterbots, and am fairly astounded that up to 46% of brand followers on Twitter are bots! Interesting and scary all at the same time!


    1. Hi Melanie,
      Thank you for taking the time to read and comment on my blog, I appreciate it 🙂
      That’s a good point you make that Twitter users are operating on a platform known for short, fast posts and responses. I can imagine that the prospect of a 20-minute survey with a possible total of 500 words, such as the one Tor and his colleagues posted, would be very daunting.
      I was interested to see the screenshot of the survey invitation tonight, and with its mention of Amazon, the $ sign and the long weblink to the survey, it did look pretty spam-like.
      Although I suppose it is always easier to make these assumptions in hindsight, and we should be thankful that our kind-hearted tutor has proven why we should never attempt to run a survey on Twitter 😉


  3. Hi Kelly,

    Thanks for the post; it is very interesting to see how researchers approach their target audiences through surveys. As we can see, the majority tend to do much of their data collection online. However, I consider that it is not always the preferred mode of data collection, especially if respondents are in hard-to-reach areas. I do believe that the mode in which surveys are conducted should depend on the type of study and the demographics of the respondents.

    Online surveys and mobile surveys tend to be the most cost-effective methods of survey research, but they may not reach those respondents who can only respond through alternative modes. Besides, the results of online and mobile surveys may suffer and differ greatly if important respondents are left out of the research. I think that respondents who are hard to reach might be easier to capture using more traditional methods such as paper surveys or face-to-face interviews; even though these require more time and money, the results can be more reliable.


    1. Hi Alejandra,

      Thank you for your comment on my blog this week! I agree that some respondents may fall through the cracks with online and mobile surveys, but I suppose this is part and parcel of the survey process; even with hard-copy surveys we will get respondents whose surveys will not qualify for whatever reason: maybe they do not fit within the survey parameters, or have misinterpreted the question or answered it incorrectly (selected too many responses, for example).
      Online and mobile surveys have the advantage of being a very cost-effective means of getting a larger sample size; however, I think their biggest drawback is their susceptibility to fraudulent behaviour or misrepresentation, such as the example Tor and his colleagues encountered with the bots on Twitter bombarding them with responses in an attempt to gain more Amazon vouchers.

