The tech business is undermining itself with phony likes and manipulative sales practices that “game” customers. It could backfire.
I stumbled across two items on the web that pose an interesting problem for tech businesses, whether they are ad-supported, like Facebook or Google, or mobile games supported by in-app purchases. I think they show real problems with the idea of manipulating customers’ behaviour by “gamifying” it – turning everything into a carrot and stick that rewards the customer when they spend money.
One was this YouTube video about phony Facebook likes:
It’s a good video and makes some important points: too many people are being paid to deliver fake likes, and as a result buying ads on Facebook can be a dodgy experience.
I think many of his criticisms are spot on, and they present real problems for Facebook and Google, both of which are supported by advertising.
However, two of his premises make his experiment turn out particularly poorly.
1) He is starting from absolute zero, with no customers, community or real people to link to.
2) He isn’t targeting his advertising in any way.
From a marketing and advertising perspective, no targeting makes no sense. A brick-and-mortar store wants to attract people in its own area, and even an online store should be able to target people geographically, by keyword, or by demographics.
That being said, the experience of spending money on Facebook or Google advertising and getting absolutely nothing for it in terms of sales is not rare. The fact that Google sends out hundreds of thousands of $100 coupons to get people to buy their advertising should be some indication that there is a problem.
The positive side of Likes on Facebook is that if you do it right, you can gather together a community who are actually interested in your business or service. If you have phony likes, that’s not the case.
The “gamification” part of the Facebook ads is that the advertiser pays for likes, and gets the satisfaction of seeing them go up, like points in a video game. The problem for the customer is that those likes are about as useful as points in a video game, and they cost real money.
The problem for Facebook is that the system itself is being gamed. Facebook has made it harder for organizations to reach all the people who like them on Facebook. To ensure that every user sees a post, you may have to pay extra for it.
Phony Facebook likes, fake Twitter followers and phantom YouTube views have all been used by celebrities to make themselves appear more popular than they really are. YouTube wiped out billions of recorded views of Rihanna, Nicki Minaj and Justin Bieber because the views had been paid for. The purge was so complete that Universal was left with just five videos, and Sony with only three.
For celebrities, buying fake followers is a way to fake status. When he ran for the GOP presidential nomination, 92% of Newt Gingrich’s Twitter followers were found to be fake.
But businesses that are actually trying to find and reach real customers risk paying Facebook for mostly fake followers, then paying Facebook again to ensure their message reaches everyone – when only a fraction of the people following the page are actually interested.
Google faces exactly the same problem with “click fraud.” I have used Google ad coupons in small ad buys. Someone, somewhere clicked them all, but it never translated into a sale; looking at Google Analytics, it rarely even translated into Facebook likes.
The other article I came across related to the use of in-app purchases on “free” mobile games, “How In-app Purchases have Destroyed the Industry” by Thomas Baekdal.
The article gave the example of a game, Dungeon Keeper, that used to sell for $5.99 and is now free, but whose in-app purchases slow the game down so much that it is unplayable without paying – and, what is more, incredibly expensive.
In these games, players are given a choice – they can either wait for the clock to run out for some feature to be unlocked (or to get another life) – which may take a few minutes, an hour, or a day, or they can spend money to get what they want right now.
Candy Crush, an app given away for free, uses this method. Just as players are on the verge of finishing a level, they die and are offered the opportunity to finish it for just 99 cents. Its revenue was reaching $850,000 a day.
99 cents may sound reasonable enough. But it is the same price for a couple of lives that may extend a game by 20 seconds as it is for a song on iTunes. And there are in-game purchases at much higher amounts – $5, $10, $25, $50 and $100, enough to buy a major game for a console.
Most surprising, these games are targeted at children, using globally famous films and cartoon strips like Peanuts, The Smurfs, and Jurassic Park. These are screenshots from my iPad:
The “Trunk” of money in the Snoopy game costs $100, and so does the money in the Jurassic Park game. This money is used to buy upgrades in the game – but not to finish it. Both are open-ended games, and the Jurassic Park game recently added a combat mode that is not only boring but impossible to progress in without spending more, starting at a minimum of $1.99, which is soon exhausted.
These games are nothing short of exploitative. Baekdal also gave the example of a game, Asphalt 7, that requires $3,500 in actual money to unlock every feature. There are many examples of children playing these games and racking up purchases of hundreds of dollars or more.
I have seen Facebook threads where friends with children complained that because they hadn’t set a password requirement for purchases on their iPad, their children had racked up in-game purchases of $400, $800, or more than $1,000 – enough to buy a full game console, or a new computer! They were able to get refunds the first time, but others may not be so lucky, especially if it happens again.
These games are much more like slot machines or gambling than anyone might realize, except that with gambling there is at least a chance of a payout. The “free” game is almost unplayable unless you spend real money to speed things along. Waiting for new game elements may take minutes or hours, and useful in-game currency (there are often two kinds) is almost impossible to build up.
There have been TED talks delivered about “gamification” and encouraging people to behave by manipulating them with rewards, as if they are playing a game. When it comes to marketing, whether it is Facebook likes, or in-app purchases, there are problems with this approach.
One is that it sets out not just to persuade people, but to manipulate them. This might seem like a fine line, but the distinction is between honest communication about a product’s benefits – a deal that genuinely benefits both parties – and a deal where the customer is deceived or manipulated, and feels ripped off on finding out. In their book “Yes! 50 Scientifically Proven Ways to Be Persuasive,” the authors make exactly this point: customers who feel manipulated feel betrayed.
Part of running a business is a relationship of trust and confidence. The issue of phony likes and click fraud is a huge problem for Facebook and Google; it is not an exaggeration to say that their business depends on solving it. As for in-app purchases, Apple, gaming companies and the people licensing their children’s characters for addictive games (which aren’t particularly good) that exploit the player are fleecing their customers in a way that cries out for regulation – whether by the industry itself or government.
Companies should be willing to act, not just because it is the right thing to do (which it would be) nor because of fear of government regulation, but because it is in their own self-interest. This is the kind of business practice that is unsustainable, because it ruins the experience for the customer. Seller beware.
The irony is that this is the flip side of the ease of digital advertising and targeting that is the basis of Google and Facebook’s business models. It is being met by the ease of digital fraud: bots pretending to be people, or people being paid to pretend to be someone else. Twitter has a button that allows individuals to report that a follower may be a spambot, and regularly purges phony accounts, but Facebook does not.
There is a bigger lesson here: Fakes and spambots created with computers don’t matter. Real people do.
– Dougald Lamont