In early 2018, Cambridge Analytica proved that not all press is good press when it was revealed the firm had mined data from tens of millions of Facebook accounts to create targeted political ads. The Facebook community immediately lost its collective mind, a warranted reaction to being compromised and targeted on a platform that had previously championed granular control over your personal profile.
Facebook tried to defend their actions by pointing to clauses in their Terms (beginning with a 2009 version) that allowed them to share data with third-party applications. However, Facebook's Terms were found wanting, and they were unable to prove that individual users had actually accepted them.
A concept that is at the crux of contracting, particularly digital contracting, is affirmative assent. Essentially, in order for a contract to be valid, users must actively accept terms and understand that they are entering into the outlined agreement.
Cambridge Analytica got their information by leveraging a quiz users were asked to take for “academic use.” While a case could be made that the 270,000 users who elected to take the quiz provided affirmative assent, there is virtually no case to be made for exposing those users’ networks (totaling over 80 million accounts) to the same mining. Those indirect users were never given a chance to offer their assent; their data was simply harvested without their direct knowledge or agreement.
While affirmative assent matters, so do the terms to which you are assenting.
Facebook clearly hyperlinks their Terms and Privacy policies beneath the signup screen when users create their accounts. It is clear that by signing up and activating an account, users are agreeing to the terms outlined in the hyperlinked documents.
However, Facebook did not expressly call out data sharing with third-party apps until a 2009 version of their terms. Even though the data mining occurred between 2013 and 2015, well after these new terms were in place, Facebook struggled to prove that their users, whether they signed up before or after 2009, had assented to the updated terms and clearly understood the ramifications.
For starters, keeping data private required changes at both the application level and in the privacy settings. These changes, though not "intentionally" misleading, were not easily understood by the average user. Additionally, Facebook argued that anyone who had created an account prior to 2009 had automatically assented to all future updates. Under California law, however, that kind of clause applies only to changes made “in good faith," a standard that is hard to square with a policy shift massive enough to enable the Cambridge Analytica harvesting.
Another factor that could end up impacting the overall enforceability of Facebook’s terms is the use of dark patterns on their site.
Dark patterns, the practice of building your application or site to nudge users into behaving the way designers prefer, are deceptive web practices. Studies of the Facebook user flow showed that users were often forced to opt out of data-sharing settings rather than affirmatively opt in, and that achieving maximum user privacy on Facebook required 13 clicks, versus the 4 needed to accept “factory settings.”
Practices like these generally weigh against the enforceability of digital agreements, because they can easily fall outside any “good faith” requirements and cloud the nature of users’ affirmative assent.
The courts have indicated that, had Facebook’s policy been enforceable, the Cambridge Analytica claims would have been a moot point. Facebook’s inability to prove that their terms were accepted in an enforceable manner, however, has left the court no choice but to allow the case to continue.
Apart from being a case study in how not to treat your users, Facebook’s ongoing litigation is a great example of why it is important to structure Terms and Conditions properly and actively track acceptance. Immutable records of each user’s initial acceptance, plus records of acceptance for every subsequent update, could have gotten Facebook’s legal problems dismissed, leaving only the PR-related ones.
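To make "immutable acceptance records" concrete, here is a minimal sketch of one way to store them: a hash-chained, append-only log where each entry records who accepted which version of the terms and when, and includes the hash of the previous entry so any later edit or deletion is detectable. Everything here (function names, field names) is a hypothetical illustration, not the design of any particular product:

```python
import hashlib
import json
from datetime import datetime, timezone

GENESIS_HASH = "0" * 64  # placeholder hash for the first entry's predecessor


def record_acceptance(log, user_id, terms_version, terms_hash):
    """Append a tamper-evident acceptance record to the log.

    Each entry embeds the hash of the previous entry, so modifying
    or removing any earlier record breaks the chain.
    """
    prev_hash = log[-1]["entry_hash"] if log else GENESIS_HASH
    entry = {
        "user_id": user_id,
        "terms_version": terms_version,  # e.g. a version label or date
        "terms_hash": terms_hash,        # hash of the exact text the user saw
        "accepted_at": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_hash,
    }
    # Hash a canonical (sorted-key) serialization of the entry itself.
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry


def verify_chain(log):
    """Return True only if every entry is intact and correctly linked."""
    prev = GENESIS_HASH
    for entry in log:
        if entry["prev_hash"] != prev:
            return False
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["entry_hash"] != expected:
            return False
        prev = entry["entry_hash"]
    return True
```

The point of a structure like this is evidentiary: if a court asks whether a specific user accepted a specific version of the terms, the record ties the user, the timestamp, and the exact text together, and `verify_chain` demonstrates the history has not been rewritten after the fact.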
If asked to do so today, would your organization be ready to prove user acceptance and Terms enforceability?
Complete our clickwrap self-assessment to see your risk level, then reach out to learn more about how to better protect your business without compromising the acceptance experience.