"Facebook had a really boring deactivation page when you close your account... [One of the designers created a page] which has all your friends waving good-bye to you as you deactivate. Give you that final tug of the heart before you leave. This reduced the deactivation rate by 7%."
While that's a nice reduction in deactivations, it also strikes me as somewhat manipulative. It feels like many of the "social" companies see users solely as metrics to optimize, not as human beings.
Is it manipulative to optimize the tagline on your landing page? You're trying to persuade a visitor to buy your product. Or are you trying to manipulate them into buying your product? The same principle applies to keeping customers.
I'd call this manipulative only when it becomes disingenuous. For example, if they had put actual speech bubbles above my friends' heads and led me to believe that my friends actually were asking me to stay, that would cross the line. Until then, this is roughly akin to saying "Hey man, don't leave the party yet. We haven't had cake yet!" To which I'd reply "OK, well twist my arm why don't you" and then eat some cake.
Well, the deactivation page did automatically say that the user's "friends" would "miss" them. Did Facebook actually check with those friends to see whether they'd want their image and identity used in this way? If not, then it seems pretty disingenuous to me. How does Facebook know those people would actually miss me? Why assume my friends wouldn't respect my decision to leave Facebook, if that's what I chose?
>>"Did they actually check with these users to see if they would want their image and identity to be used in this way? "
I think we're way beyond the point where Facebook checks with its users on how, where, and when it displays their content. Sure, you can go through their convoluted privacy controls, but they'll find a way to do what they want.
If I'm told by a machine that "X will miss you", I do consider that disingenuous. There's a programmer using a program, trying to get an emotional reaction from me by the entirely speculative claim that another human being - probably emotionally close to me, but more probably not emotionally close to them - will miss me, suggesting an emotional connection human->machine->human where none exists, trying to exploit my natural tendency to anthropomorphize. And that kinda rubs me the wrong way.
I say false dichotomy. You may have persuaded your friend to stay at the party, but you didn't do her any favors if she has to get up early tomorrow. A user at that final screen has clearly made up their mind.
Also, saying "* will miss you" is not too far from putting "I'll miss you" in a speech bubble above them. I wouldn't miss more than 50% of the people on my Facebook list, and it misrepresents me to say I would miss them.
It's interesting that the only time Facebook seems to want to offer you any kind of emotional experience is when you're trying to leave the site, and the experience is designed to be a negative one, rooted in guilt.
Yeah. I'm not denying those kinds of experiences can happen on Facebook, but Facebook the company doesn't seem to care about making its users happier so much as it cares about keeping them on the site. Like network television or any other ad-supported medium, it doesn't need to be really good, it just needs to be minimally diverting. It's about generating addiction and not fulfillment.
"Facebook had a really boring deactivation page when you close your account... [One of the designers created a page] which has all your friends waving good-bye to you as you deactivate. Give you that final tug of the heart before you leave. This reduced the deactivation rate by 7%."
While that's a nice reduction in deactivations, it also strikes me as somewhat manipulative. It feels like many of the "social" companies see users solely as metrics to optimize, not as human beings.