A few days ago I saw, on a famous Italian TV show, how easy it is to spy on locked Facebook profiles, obviously without being a friend of the profile owner. The procedure looked so simple that I was induced to try it myself.

It took me less than an hour to find the right tutorial (directly on YouTube) and a very useful but also very disquieting paper, in which the (not so) hidden search engine integrated into FB was revealed to my eyes in all its disarming and alarming power.

Trying to remain a decent data protection expert, I ran the test with consent: I used one friend's account to look into another friend's account (they are not FB friends with each other and both profiles are locked), so that I was only simulating access to content I was not allowed to see; obviously both friends had agreed.
Just a couple of simple commands, typed directly into the browser's address bar, and the private photos stored in my friend's account became visible to a user who doesn't even know who the person in the pictures is!
And pictures are not the only thing you can spy on. The PDF guide I found on the web, honestly with very little effort, is pretty alarming: it contains every sort of query command, including filters for religious, political and sexual preferences. Please don't ask me to publish the link to this guide; if you don't trust me, simply search for yourself: I can assure you that, unfortunately, you'll find it very quickly.
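For context, the "commands" in question were not code but plain URLs. What follows is a purely illustrative sketch of the Graph Search URL format that was widely reported at the time; the numeric identifier and the endpoint names are generic placeholders, not details taken from the guide:

```
https://www.facebook.com/search/<NUMERIC_USER_ID>/photos-of
https://www.facebook.com/search/<NUMERIC_USER_ID>/places-visited
https://www.facebook.com/search/<NUMERIC_USER_ID>/pages-liked
```

Typed into the address bar while logged in, URLs of this shape queried the integrated search engine directly, bypassing the normal interface.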

According to the expert in the TV show, these query results, coming out of Facebook's integrated search engine, are a consequence of the default settings Facebook applies to any content shared among two or more users.
It should work like this: when someone publishes, say, a photo in which other FB users besides the uploader are depicted and tagged, that photo is stored as personal data for each of the tagged persons. This means that the photo will carry different privacy settings, corresponding to the privacy settings of each tagged person, and this is absolutely normal.

Therefore there is a high probability that, among the tagged persons, there will be someone who lets everyone look at their content without any restriction (the so-called "open profile").
Well, it seems that the default setting for shared content in FB is always the "weakest" one among the privacy settings of all the users sharing that specific content. It may sound a little complex, but the following example should make it clear:

User no. 1 – private photos: only friends can see them.
User no. 2 – public photos: anyone can see them.
User no. 3 – partly public photos: friends and friends of friends can see them.
User no. 4 – private photos: only friends can see them.

Default shared setting: user no. 2's setting; the photo is public.

This implies that a very high percentage of the photos on any FB account will be available to anyone, since any photo stored (also) on an open profile will inherit the "public" setting from it.
In other words, it is like keeping your personal data in a safe you share with lots of people: there is a high chance that someone leaves it open, or simply forgets to close it.
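The "weakest setting wins" behaviour described above can be sketched in a few lines. This is a hypothetical model, not Facebook's actual code: the setting names and the ordering are assumptions made only for illustration.

```python
# Hypothetical model of the behaviour described above: the effective audience
# of a shared photo is the MOST permissive setting among all tagged users.

# Settings ordered from most restrictive (0) to most permissive (2).
PERMISSIVENESS = {"friends": 0, "friends_of_friends": 1, "public": 2}

def effective_audience(tagged_users_settings):
    """Return the audience the shared photo ends up with, assuming the
    most permissive setting among the tagged users always wins."""
    return max(tagged_users_settings, key=PERMISSIVENESS.__getitem__)

# The four users from the example above: user no. 2's open profile
# makes the photo public for everyone.
print(effective_audience(["friends", "public", "friends_of_friends", "friends"]))
```

Run on the four users of the example, the function returns "public": one open profile is enough to expose the photo to anyone.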

I am not totally sure that this is exactly why all this happens. For instance, the previous explanation does not fit some other perfectly working options of the FB integrated search engine, such as a single user's liked or visited places. It is true that geo-location can be triggered either by a user's spontaneous action or by the sharing of content that someone else geo-located, but this does not happen with "likes": "likes" belong strictly to the single user. Therefore, for this kind of data, saying that they inherit the weakest set of permissions among the settings of all the people sharing the content is either unlikely or simply impossible.

Anyway, whether this is the explanation or not, there are a couple of points worth making.

First of all, the default privacy settings for shared content in FB should immediately be changed, so that the common setting of any shared content becomes the most restrictive one among all the settings chosen by the owners of that content. Implementing such a change does not appear technically difficult at all.
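The proposed fix is essentially a one-line inversion of the current rule: pick the most restrictive setting instead of the most permissive one. Again a hypothetical sketch, with assumed setting names:

```python
# Hypothetical model of the fix proposed above: shared content takes the
# MOST restrictive audience among its owners' settings, not the weakest.

# Settings ordered from most public (0) to most restrictive (2).
RESTRICTIVENESS = {"public": 0, "friends_of_friends": 1, "friends": 2}

def proposed_shared_audience(owner_settings):
    """Return the most restrictive setting among all owners of the content."""
    return max(owner_settings, key=RESTRICTIVENESS.__getitem__)

# The four users from the earlier example: one open profile no longer
# makes the photo public; it stays visible to friends only.
print(proposed_shared_audience(["friends", "public", "friends_of_friends", "friends"]))
```

Under this rule the example photo stays at "friends": the open profile no longer drags everyone else's content into the open.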

Furthermore, this change is not only a matter of common sense; it is also explicitly prescribed in the GDPR. But, even though we have been talking about default settings, the relevant provision is not article 25, which introduces the privacy by default principle: the default settings discussed here are merely programming choices made by FB. We must refer instead to article 6 of the GDPR, the one setting out the lawfulness of personal data processing.

Indeed, here we have a piece of personal data (say, a photo) referring simultaneously to several individuals, i.e. several "data subjects". On public FB profiles all content, photos included, can be freely seen because the data subject gave consent (art. 6, par. 1, letter a) of the GDPR). Nevertheless, when content is shared among several data subjects, it should be obvious that if even one of the data subjects sharing that specific content has not given consent, that content must be locked, i.e. set to "private", on the profiles of all the data subjects who share it.
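That consent rule reduces to a single check: a shared item may be publicly visible only if every data subject involved has consented. A minimal sketch, with hypothetical naming (this is not Facebook's API):

```python
# Hypothetical check of the art. 6(1)(a) logic described above: one missing
# consent locks the shared content on every profile that carries it.

def may_be_public(consents):
    """consents maps each tagged data subject to True (consented to public
    visibility) or False; the item may be public only if all consented."""
    return all(consents.values())

print(may_be_public({"user1": True, "user2": True}))                  # True
print(may_be_public({"user1": True, "user2": True, "user3": False}))  # False
```

A single False is enough to make the whole item private, exactly the "one dissent locks it for everyone" rule the article argues for.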

It is worth remembering that, when subscribing to Facebook, you allow all of your "friends" to see all of your content; this falls under article 6, par. 1, letter b) of the GDPR, under which the processing (showing your content to your friends) is legitimate as part of the contract between you and Facebook.

On the other hand, this particular criterion of lawfulness is not at all applicable to the previous case, since showing someone's content outside the circle of their FB friends is clearly regulated through a declaration of consent. Indeed, it should be noted that until May 2014 Facebook's defaults set all content and posts to "public", while now the default is that content and posts can be seen only by each user's FB friends.

A second question that deserves some clarity is the actual search engine integrated into Facebook and its profiling options. As you may know, most of Facebook's profit comes from advertising (see, for example, this official report https://s21.q4cdn.com/399680738/files/doc_financials/2017/Q4/Q4-2017-Earnings-Presentation.pdf) and, simply by using the social network, it is clear that users' tastes and preferences are used to offer targeted advertising.

This is a very thorny point. First of all, whilst users are able to choose whether they want to receive targeted advertising, experience shows that the choice is opt-out rather than opt-in: by default you receive targeted advertising, and only afterwards can you choose whether to keep receiving it or to refuse it. Moreover, even if it is self-evident that there is a kind of exchange between the service Facebook provides and users' personal data, it must always be kept in mind that Facebook, as data controller, is obliged by law to protect and store all these data in compliance with the GDPR. A search engine integrated into Facebook's web interface, offering filtering options that include political views, religious beliefs and sexual preferences, is unacceptable under the GDPR.

In fact, everything we have just analysed conflicts with the basic principles relating to the processing of personal data under Regulation (EU) 2016/679, listed in article 5. Article 5, paragraph 1, letter a) says: “[Personal data shall be] processed lawfully, fairly and in a transparent manner in relation to the data subject (‘lawfulness, fairness and transparency’);”, and letter f) says: “[Personal data shall be] processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (‘integrity and confidentiality’).”

Regarding letter a), we need to understand how clearly Facebook's privacy policy states that targeted advertising is active by default, at least until the user changes the settings. Even if this detail is specified somewhere, the Menlo Park people probably fail at making their policy easy to search (for what it is worth, I have not been able to find this particular detail).

Conversely, regarding letter f), we can detect an outright violation of the GDPR's instructions. Simple user experience shows that, with little effort and very few IT skills, anyone can see personal data that are only nominally private, not to mention the availability of filters pointing directly to special categories of personal data of millions of Facebook users. This is not a glance; it is a thorough overview of data that Facebook should guard painstakingly, whereas they are actually easily available for everyone to see, be they simple nosy individuals or profiling companies which, in the most optimistic and innocuous hypothesis, can gain big profits from this information.

To these remarks we can add article 21, paragraph 1, of the GDPR, which enshrines the right to object. Here we learn that, when the data subject exercises this right, the controller, i.e. Facebook, must stop processing the personal data (unless the controller can demonstrate compelling legitimate grounds for the processing which override the interests, fundamental rights and freedoms of the data subject); the consequences of a scenario in which Facebook users object en masse are easy to imagine. And, with such an easy-to-use search engine available in Facebook, it looks very difficult to reconcile all this with the contract terms of the most used social network in the world.

Moreover, this whole matter dwarfs the well-known "Cambridge Analytica" scandal. As you may remember, in that case Facebook was questioned for its lack of attention in managing and safeguarding its users' data, letting CA, through a very permissive set of settings, access the personal data of "friends of friends" of people using the single sign-on feature to log in to CA applications. In short, the "bad guy" was the British company, while Facebook simply played the role of the imprudent one. But what we have seen to be possible through Facebook's integrated search engine is a real reversal of the scenario: now CA's people almost look like amateurs, having paid to make use of the "Facebook login" feature, whereas, thanks to this astonishing search engine, anyone could have reached roughly the same results without paying a single penny.

In conclusion, whether for common sense, for regulatory compliance, or to keep the company from sinking, Facebook's management board would do well to carry out a rigorous review of permissions and, more generally, of its privacy management criteria, so as to finally take the road to GDPR compliance. Beyond that, there remains the big question mark over the hole opened by the integrated search engine in the Menlo Park social network, whose mere existence could undermine Facebook's very survival.