Someone might think that Black Friday, Small Business Saturday, and Cyber Monday would be enough pseudo-holidays for the week following Thanksgiving. That person would be wrong. Another not-quite-holiday, Giving Tuesday, must be observed before we can stop marking the arrival of winter with open wallets.
I know about Giving Tuesday because Facebook told me about it. “It’s Giving Tuesday!” the social network said in an image depicting cartoon people holding up a giant, heart-clutching hand modeled after its Like button. “Today is a global day of giving back to our communities and the charitable causes we care about.”
The banner is innocuous enough. It drew attention to a day I wouldn’t have known about while prompting me to “see how others are giving back.” I might have ignored the whole thing if Facebook’s choice of words — “our communities” and “we care about” — didn’t make me pause to consider their implications.
Facebook has been a little creepier than usual lately. It keeps telling me that it cares about me, or that it thinks I’ll like a photo it plucked from social oblivion with its On This Day feature, which is supposed to compel me to share things I’ve already shared out of some prompted feeling of technological nostalgia.
Some have said these prompts are meant to get people to share more on Facebook. The Wall Street Journal reported in November that this was the case, citing GlobalWebIndex statistics showing that users are posting to Facebook less even though they visit the social network several times a day.
Jackdaw Research’s Jan Dawson agrees. “Ironically, Facebook needs real human beings to create content to make it a useful and meaningful service to use, and even though these are machine-based prompts, they still require humans to actually post something for their friends to see,” he said in an emailed response.
But the language used in these prompts has me considering another possibility: Facebook is using these images to humanize its service and seem more relatable, instead of like a cold, algorithm-driven social network that sucks content into its service in much the same way a mosquito draws blood from a person’s flesh.
That humanization could make it easier to share things with Facebook. People want to feel like they have an audience — being told that Facebook cares about “you and the memories you share here” could offer all the audience they need. Creepy? Yes. Effective? If it weren’t, Facebook would have already nixed the feature.
It is kind of funny that Facebook is testing these features now, while it’s also working to dehumanize the workers powering its M utility, which can be used for everything from finding a restaurant to sending parrots to a colleague’s office. The tool can do everything! But it relies on human workers for many functions.
Given that people are handing over their information to M and conversing with it the same way human beings always converse with digital facsimiles — with a humor and candor they don’t share with actual people — it makes sense that Facebook wouldn’t call attention to the fact that, unlike Siri or Google Now, M leans on human workers behind the scenes.
This means Facebook is trying to humanize algorithms while it hides actual humans behind a veneer of artificial intelligence. (If only “algorithmize” were a word.) One convinces people to share more with the public; the other makes them more candid than they might be if they knew they were talking to a person.
How people react to these efforts depends on their disposition. Some people are more comfortable when a service tries to humanize itself; others, like me, are just kind of weirded out that a digital behemoth wants me to think it’s human. Either way it seems like Facebook is trying to blur the lines between machine and person in ways that allow it to get the most out of its billion-plus users.
Facebook didn’t respond to a request for comment. (And here all these images keep telling me that the company cares about me.) I’ll update this post if someone at the company emails me back and offers an on-the-record statement.