Yesterday Facebook CEO Mark Zuckerberg published a lengthy (3,000-word) blog post entitled A Privacy-Focused Vision for Social Networking. The post outlined a pivot in Facebook's public position on consumer data, expressing a new commitment to preserving privacy and taking measures including ensuring end-to-end encryption of messages across all Facebook-owned platforms (something that at present is available only on WhatsApp and, on Facebook Messenger, only if you specifically enable it).
If fully realised, this would represent a head-spinning reversal of Facebook's approach up until this point, which has constituted hoovering up as much of our data as possible and unscrupulously proffering it to a range of developers and third parties. That behaviour has been in service of a very different vision: when the company went public in 2012, it was under the banner of a 'more open and connected' world.
Zuckerberg meaningfully addressing privacy is of course long overdue, following years of mounting pressure and ever-ramping public concern. Facebook has made tentative forays in this direction before, with the announcement last May of the as-yet unreleased 'delete history' feature. However, following the detonation of scandal after scandal - the Cambridge Analytica events of last year being the biggest to date - it seems that Zuckerberg has finally accepted that some kind of response must be made.
"Frankly we don't currently have a strong reputation for building privacy protective services," Zuckerberg writes in the post.
But is Zuckerberg really performing a swift 180 in how his business operates - from developing one of the most intensive and wide-sweeping data-harvesting machines ever seen to the business of building 'privacy protective services'? Or is he simply paying lip service? The internet is - predictably - awash with scepticism.
However, it has to be recognised that, based on the statements made in the blog post, Zuckerberg has not committed to anything approaching a complete overhaul of Facebook's business model, as some coverage has seemed to suggest. While it's true that at present third parties can skim Facebook messages for personal information to better curate ads, messages are but a drop in the ocean of metadata Facebook is currently guzzling up. Zuckerberg's practical proposals don't yet extend far beyond encrypting messages and ensuring that messages and Facebook Stories will be deleted after a certain time.
Because he didn't commit to any more tangible measures, it's likely that most of the current data-leeching practices will remain in place. He could, for example, have promised to limit the number or variety of third-party apps with access to Facebook or Instagram data. Instead, the commitment to encryption across messaging channels is something that will grab headlines and appeal to privacy-conscious users, but doesn't necessarily imply a more radical change to the business model.
For a company whose central business is data, Zuckerberg did not meaningfully address how he would make up for any loss of revenue caused by a diminished flow of data to third parties. But with plans to branch into payments, and the ongoing development of the commerce arms of both Instagram and Facebook, this perhaps isn't such a concern. Zuckerberg may well have made the calculation that the good press is worth some lost revenue in the short to medium term.
And there could be another boon for Facebook tied up in this decision. Encrypting all messages removes the prospect of Facebook acting as a moderator of that content. WhatsApp's existing encryption has already left the company unable to regulate messages on that platform, leading some to blame it for disinformation campaigns around Brazil's presidential election and for helping spread the hoaxes tied to lynchings in India.
There is ongoing debate about to what extent Facebook itself is responsible for the content published using its services. There is even appetite for laws that would make platform companies such as Facebook responsible for everything that is published on their sites, in order to jolt them into a greater regulatory role. Of course, this would be highly undesirable for such firms, and Facebook going down the total encryption route throws more weight behind the counterposition: that these platforms should not be culpable for the content hosted on them.
In fact, the post reads: "People should expect that we will do everything we can to keep them safe on our services within the limits of what's possible in an encrypted service." The statement deftly downgrades Facebook's responsibilities with regard to regulating communications.
This issue should prompt a debate within wider society about what role we want platform companies to have. Do we desire greater oversight and regulatory powers, necessitating more access to our content and less privacy, or do we instead wish for increased privacy and therefore a necessarily diminished regulatory role for the likes of Facebook?
In amongst the ephemeral mesh of good intentions, another tangible announcement from the post is that Facebook will apparently not be setting up data centres in countries "with weak records on human rights like privacy and freedom of expression". Requirements to store data locally in such countries would make it easier for governments to extract information to wield against their citizens.
If the company sticks to it, this effectively kills any further attempts to infiltrate China in any capacity. The question is, what motivated the decision? Did the company proactively take the moral high ground? Or were the difficulty of accessing this market (something Facebook has been attempting for a number of years) and the calculated payoff of the PR-friendly claim the greater drivers?
Some construed this move as a direct dig at Apple, which runs a data centre in China in compliance with local regulations, meaning that data is stored within the country. Apple has previously faced accusations that this makes data more accessible to the Chinese government.
Another tech giant that has suffered similar bad press recently is Google, thanks to its seemingly unsquashable Chinese ambitions.
It's becoming a trend among tech giants to enter into controversial partnerships that provoke outcry from both employees and the press. In Google's case, this pattern has played out over both the company's drone contract with the US military and its plans for a search engine that would comply with Chinese censorship and data-gathering laws. In both cases, the pushback has precipitated the end of the projects (at least publicly).
Amazon has also been criticised for some of its military and government partnerships, and recently Microsoft too faced employee pushback over a contract with the US army that involved using its augmented reality device HoloLens to train soldiers. The company chose to forge ahead anyway.
It could be that against this backdrop, Mark Zuckerberg is intent on Facebook emerging as the good guy. Last year, he said the birth of his children had prompted him to reflect on his legacy and become more determined to create a platform that has had a net positive effect on humanity.
But aside from his desire to preserve his own image, Facebook's public perception is important in gaining the trust the service needs if it is to fulfil its goal of reaching into every aspect of our lives. The drive for continued public approval was illustrated by Facebook's largest-ever ad campaign, an infantilising effort following the Cambridge Analytica scandal that declared a range of negative things Facebook had become associated with (fake news, spam, data misuse) 'Not our friend'. Zuckerberg's blog post could be his latest pitch for the public's hearts, with his assertion that although Facebook is a giant, it's a gentle giant.