Tumblelog by Soup.io

November 18 2019


Repubblica.it: A massive scandal: how Assange, his doctors, lawyers and visitors were all spied on for the U.S.

«The videos and audio recordings accessed by the Repubblica reveal the extreme violations of privacy that Julian Assange, the WikiLeaks journalists, lawyers, doctors and reporters were subjected to inside the embassy, and represent a shocking case study of the impossibility of protecting journalistic sources and materials in such a hostile environment. This espionage operation is particularly shocking if we consider that Assange was protected by asylum, and if we consider that the information gathered will be used by the United States to support his extradition and put him in prison for the crimes for which he is currently charged and for which he risks 175 years in prison: the publication of secret US government documents revealing war crimes and torture, from Afghanistan to Iraq to Guantanamo.»

«Sometimes the espionage operations were truly off the wall: at one point spies even planned to steal the diaper of a baby brought to visit Assange inside the embassy. The purpose? To gather the baby's feces and perform a DNA test to establish whether the newborn was a secret son of Julian Assange.»

July 26 2018

  • wisegeek.com: What is a Web Bug?

    “The original Web bug is a transparent image, just a few pixels or less in size, commonly embedded in webpages or email to perform clandestine services for third parties. Web bugs allow the background on the page to show through, making them invisible. They are called ‘bugs’ after the discreet, remote listening devices of the same name. The modern Web bug need not take the form of a tiny transparent image. Scripts, iFrames, style tags and other implementations within a page can serve the same purpose.”

  • knowprivacy.org: Web Bugs

    No Accountability for Third Party Trackers

    “In our analysis of privacy policies, 36 of the websites affirmatively acknowledged the presence of third-party tracking. However, each of these policies also stated that the data collection practices of these third parties were outside the coverage of the privacy policy. This appears to be a critical loophole in privacy protection on the Internet.”

  • youtube.com: Web bug Meaning

    “Video shows what web bug means. A small, usually transparent image added by an advertiser to a webpage (to track its popularity) or e-mail message (to track when it is read). Web bug synonyms: web beacon.”
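The mechanism these definitions describe is tiny in practice. As a hedged sketch (the tracker host, path, and parameter names below are invented for illustration), a few lines of Python can build the kind of invisible 1x1 `<img>` tag a tracker embeds in a page or email; the clandestine part is simply that fetching the image tells the tracker's server who loaded the content, and when:

```python
# Minimal sketch of a web bug: a 1x1 transparent image whose URL
# carries a per-recipient token, so the image fetch reveals exactly
# who opened the page or email. Host and parameter names are made up.
import secrets

def make_web_bug(tracker_host: str, recipient_id: str) -> str:
    """Return an HTML <img> tag for an invisible tracking pixel."""
    token = secrets.token_hex(8)  # unique per message, so each open is distinguishable
    return (
        f'<img src="https://{tracker_host}/pixel.gif'
        f'?uid={recipient_id}&t={token}" '
        'width="1" height="1" style="display:none" alt="">'
    )

tag = make_web_bug("track.example.com", "user-42")
print(tag)
```

The server behind `pixel.gif` needs no special logic at all: a standard access log already records the requesting IP, the `uid` parameter, and the timestamp.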

June 20 2018


The GDPR and Browser Fingerprinting: How It Changes the Game for the Sneakiest Web Trackers

from EFF’s Deeplinks Blog:

Browser fingerprinting is on a collision course with privacy regulations. For almost a decade, EFF has been raising awareness about this tracking technique with projects like Panopticlick. Compared to more well-known tracking “cookies,” browser fingerprinting is trickier for users and browser extensions to combat: websites can do it without detection, and it’s very difficult to modify browsers so that they are less vulnerable to it. As cookies have become more visible and easier to block, companies have been increasingly tempted to turn to sneakier fingerprinting techniques.

But companies also have to obey the law. And for residents of the European Union, the General Data Protection Regulation (GDPR), which entered into force on May 25th, is intended to cover exactly this kind of covert data collection. The EU has also begun the process of updating its ePrivacy Directive, best known for its mandate that websites must warn you about any cookies they are using. If you’ve ever seen a message asking you to approve a site’s cookie use, that’s likely based on this earlier Europe-wide law.

This leads to a key question: Will the GDPR require companies to make fingerprinting as visible to users as the original ePrivacy Directive required them to make cookies?

The answer, in short, is yes. Where the purpose of fingerprinting is tracking people, it will constitute “personal data processing” and will be covered by the GDPR.

What is browser fingerprinting and how does it work?

When a site you visit uses browser fingerprinting, it can learn enough information about your browser to uniquely distinguish you from all the other visitors to that site. Browser fingerprinting can be used to track users just as cookies do, but using much more subtle and hard-to-control techniques. In a paper EFF released in 2010, we found that the majority of users’ browsers were uniquely identifiable given existing fingerprinting techniques. Those techniques have only gotten more complex and obscure in the intervening years.

By using browser fingerprinting to piece together information about your browser and your actions online, trackers can covertly identify users over time, track them across websites, and build an advertising profile of them. The information that browser fingerprinting reveals typically includes a mixture of HTTP headers (which are delivered as a normal part of every web request) and properties that can be learned about the browser using JavaScript code: your time zone, system fonts, screen resolution, which plugins you have installed, and what platform your browser is running on. Sites can even use techniques such as canvas or WebGL fingerprinting to gain insight into your hardware configuration.

When stitched together, these individual properties tell a unique story about your browser and the details of your browsing interactions. For instance, yours is likely the only browser on central European time with cookies enabled that has exactly your set of system fonts, screen resolution, plugins, and graphics card.
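The identifying power of such a combination can be put in numbers. The toy calculation below uses invented attribute frequencies (real figures vary by population; projects like Panopticlick measured them empirically): each attribute shared by a fraction p of browsers contributes -log2(p) bits of identifying information, and for independent attributes the bits simply add.

```python
# Rough illustration of why combined browser properties identify a user:
# an attribute with population frequency p contributes -log2(p) bits of
# identifying information; independent attributes add up.
# The frequencies below are invented for the example.
import math

attribute_freq = {
    "timezone (Central European)": 0.15,
    "screen resolution":           0.05,
    "system font set":             0.001,
    "plugin list":                 0.002,
}

total_bits = sum(-math.log2(p) for p in attribute_freq.values())
# Roughly 2**total_bits browsers would be needed before this
# particular combination of properties repeats by chance.
print(f"combined: {total_bits:.1f} bits "
      f"(~1 in {2 ** total_bits:,.0f} browsers)")
```

Four individually unremarkable properties already yield on the order of 26 bits here, i.e. a combination rarer than one in tens of millions of browsers.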

By gathering that information together and storing it on its own servers, a site can track your browsing habits without the use of persistent identifiers stored on your computer, like cookies. Fingerprinting can also be used to recreate a tracking cookie for a user after the user has deleted it. Users that are aware of cookies can remove them within their browser settings, but fingerprinting subverts the built-in browser mechanisms that allow users to avoid being tracked.
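On the server side the bookkeeping is almost trivial, which is part of what makes the technique attractive to trackers. A minimal sketch (all property names and pages below are invented): hash the reported browser properties into a key and keep the profile under that key, so clearing cookies changes nothing, since the next visit recomputes the same key.

```python
# Sketch of cookie-less tracking: the identifier lives on the
# tracker's server, derived from browser properties, so clearing
# cookies does not reset it. All names here are illustrative.
import hashlib
import json

def fingerprint(properties: dict) -> str:
    """Hash the browser's reported properties into a stable key."""
    canonical = json.dumps(properties, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

profiles: dict[str, list] = {}  # server-side store: fingerprint -> visit log

def record_visit(properties: dict, page: str) -> str:
    fp = fingerprint(properties)
    profiles.setdefault(fp, []).append(page)
    return fp

browser = {"tz": "CET", "fonts": ["Arial", "Corbel"], "screen": "1920x1080"}
first = record_visit(browser, "/news")
# ...the user deletes all cookies; the browser properties are unchanged...
second = record_visit(browser, "/shop")
print(first == second, profiles[first])  # same profile keeps growing
```

The same lookup also enables the cookie "respawning" described above: when a request arrives with no cookie but a known fingerprint, the tracker can simply reissue the old identifier.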

And this doesn’t just apply to the sites you visit directly. The pervasive inclusion of remote resources, like fonts, analytics scripts, or social media widgets on websites means that the third parties behind them can track your browsing habits across the web, rather than just on their own websites.

Aside from the limited case of fraud detection (which needs transparency and opt-in consent for any further processing), browser fingerprinting offers no functionality to users. When the popular social media widget provider AddThis started using canvas fingerprinting in 2014, the negative reaction from their users was so overwhelming that they were forced to stop the practice.

Some fingerprinting tricks are potentially detectable by end-users or their software: for instance, a site changing some text into multiple fonts extremely quickly is probably scanning to see which fonts a user has installed. Privacy Badger, a browser extension that we develop at EFF, detects canvas fingerprinting to determine when a site looks like a tracker. And a W3C guidance document draft for web specification authors advises them to develop their specs with fingerprinting detectability in mind. Unfortunately, however, new and more covert techniques to fingerprint users are being discovered all the time.
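Detection heuristics like the ones described above can be approximated crudely: a script that measures text in many distinct fonts within a short window is probably enumerating the user's font set. The threshold, window, and event format below are invented for illustration; real detectors (Privacy Badger among them) rely on more careful signals.

```python
# Toy heuristic in the spirit of fingerprinting detectors: a script
# that measures text in many distinct fonts within a short time window
# is probably enumerating installed fonts. Threshold and event format
# are invented for this sketch.

def looks_like_font_scan(events: list[tuple[float, str]],
                         window_s: float = 1.0,
                         threshold: int = 20) -> bool:
    """events: (timestamp_seconds, font_name) pairs from text-measurement calls."""
    events = sorted(events)
    for i, (t0, _) in enumerate(events):
        fonts_in_window = {f for t, f in events[i:] if t - t0 <= window_s}
        if len(fonts_in_window) >= threshold:
            return True
    return False

# A fingerprinting script rattles through 50 fonts in half a second:
scan = [(0.01 * i, f"Font{i}") for i in range(50)]
print(looks_like_font_scan(scan))        # True
# A normal page using three fonts over several seconds:
normal = [(0.0, "Georgia"), (2.0, "Arial"), (5.0, "Menlo")]
print(looks_like_font_scan(normal))      # False
```

Like any threshold-based heuristic, this trades false positives against false negatives, which is one reason new, slower fingerprinting techniques keep evading detection.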

Fingerprinting After the GDPR

You’ll struggle to find fingerprinting explicitly mentioned in the GDPR—but that’s because the EU has learned from earlier data protection laws and the current ePrivacy Directive to remain technologically neutral.

Apart from non-binding recitals (like Recital 30, discussing cookies), the GDPR avoids calling out specific technologies or giving exhaustive lists and examples. Instead, it provides general rules that the drafters felt should be neutral, flexible, and keep up with technological development beyond fingerprinting and cookies. Below we explain how those general rules apply to tracking Internet users, no matter what technique is used.

Browser Characteristics as Personal Data

The cornerstone of the GDPR is its broad definition of personal data.[1] Personal data is any information that might be linked to an identifiable individual. This definition not only covers all sorts of online identifiers (such as your computer’s MAC address, your network’s IP address, or an advertising user ID in a cookie) but also less specific features — including the combination of browser characteristics that fingerprinting relies upon. The key condition is that a given element of information relates to an individual who can be directly or indirectly identified.

It is also worth noting that under the GDPR “identification” does not require establishing a user’s identity. It is enough that an entity processing data can indirectly identify a user, based on pseudonymous data, in order to perform certain actions based on such identification (for instance, to present different ads to different users, based on their profiles). This is what EU authorities refer to as singling-out[2], linkability[3], or inference.[4]

The whole point of fingerprinting is the ability of the tracking company (the data controller) to indirectly identify unique users among the sea of Internet users in order to track them, create behavioural profiles and, finally, present them with targeted advertising. If the fingerprinting company has identification as its purpose, the Article 29 Working Party (an advisory board comprised of European data protection authorities) decided over ten years ago, regulators should assume that the controller “will have the means ‘likely reasonably to be used’” to identify the people, because “the processing of that information only makes sense if it allows identification of specific individuals.” As the Article 29 Working Party noted, “In fact, to argue that individuals are not identifiable, where the purpose of the processing is precisely to identify them, would be a sheer contradiction in terms.”[5]

Thus, when several information elements are combined (especially unique identifiers such as your set of system fonts) across websites (e.g. for the purposes of behavioral advertising), fingerprinting constitutes the processing of personal data and must comply with GDPR.[6]

Can Fingerprinting Be Legal Under The GDPR?

According to the GDPR, every entity processing personal data (including tracking user behavior online, matching ads with user profiles, or presenting targeted ads on their website) must be able to prove that they have a legitimate reason (by the definitions of the law) to do so.[7] The GDPR gives six possible legal grounds that enable processing data, with two of them being most relevant in the tracking/advertising context: user consent and the “legitimate interest” of whoever is doing the tracking.

How should this work in practice? User consent means an informed, unambiguous action (such as a change of settings from “no” to “yes”).[8] In order to rely on this legal ground, companies that use fingerprinting would have to reveal the fingerprinting before it is executed and then wait for the user to give their freely-given, informed consent. Since the very purpose of fingerprinting is to escape the user’s control, it is hardly surprising that trackers refuse to apply this standard.

It is more common for companies that use fingerprinting to claim a “legitimate interest” in doing so, whether their own or that of whoever is paying them to fingerprint users.

The concept of legitimate interest in the GDPR was constructed as a compromise between privacy advocates and business interests.[9] It is much vaguer and more ambiguous than the other legal grounds for processing data. In the coming months, you will see many companies that operate in Europe attempt to justify tracking and collecting data on their users on the basis of their “legitimate interest.”

But that path won’t be easy for covert web fingerprinters. To be able to rely on this specific legal ground, every company that considers fingerprinting has to, first, go through a balancing test[10] (that is, verify for itself whether its interest in obscure tracking is not overridden by “the fundamental rights and freedoms of the data subject, including privacy” and whether it is in line with “reasonable expectations of data subjects”[11]) and openly lay out its legitimate interest argument for end-users. Second, and more importantly, the site has to share detailed information with the person that is subjected to fingerprinting, including the scope, purposes, and legal basis of such data processing.[12] Finally, if fingerprinting is done for marketing purposes, all it takes for end-users to stop it (provided they do not agree with the legitimate interest argument that has been made by the fingerprinter) is to say “no.”[13] The GDPR requires no further justification.

Running Afoul of the ePrivacy Rules

Fingerprinting also runs afoul of the ePrivacy Directive, which sets additional conditions on the use of device and browser identifiers. The ePrivacy Directive is a companion law, applying data protection rules more specifically in the area of communications. The Article 29 Working Party emphasised that fingerprinting—even if it does not involve processing personal data—is covered by Article 5(3) of the ePrivacy Directive (the section commonly referred to as the cookie clause) and thus requires user consent:

Parties who wish to process device fingerprints[14] which are generated through the gaining of access to, or the storing of, information on the user’s terminal device must first obtain the valid consent of the user (unless an exemption applies).[15]

While this opinion focused on device fingerprints, the logic still applies to browser fingerprints. Interpretations vary according to national implementation, which has resulted in inconsistent and ineffective application of the ePrivacy Directive, but key elements, such as the definition of consent, are controlled by the GDPR, which will update its interpretation and operation. The EU aims to pass an updated ePrivacy Regulation in 2019, and current drafts target fingerprinting explicitly.

Looking at how web fingerprinting techniques have been used so far, it is very difficult to imagine companies moving from deliberate obscurity to full transparency and open communication with users. Fingerprinting companies will have to do what their predecessors in the cookie world did before them: face greater detection and exposure by coming clean about their practices, or slink even further behind the curtain and hope to dodge European law.


When EFF first built Panopticlick in 2010, fingerprinting was largely a theoretical threat, in a world that was just beginning to wake up to the more obvious use of tracking cookies. Since then, we’ve seen more and more sites adopt the surreptitious methods we highlighted then, to disguise their behaviour from anti-tracking tools, or to avoid the increasing visibility and legal obligations of using tracking cookies within Europe.

With the GDPR in place, operating below the radar of European authorities and escaping rules that apply to commercial fingerprinting will be very difficult and—potentially—very expensive. To avoid severe penalties fingerprinting companies should, at least, be more upfront about their practices.

But that’s just in theory. In practice, we don’t expect the GDPR to make fingerprinting disappear any time soon, just as the ePrivacy Directive did not end the use of tracking cookies. The GDPR applies to any company as long as they process the personal data of individuals living within the European Economic Area for commercial purposes, or for any purpose when the behavior is within the EEA. However, many non-EU sites who track individuals in Europe using fingerprinting may decide to ignore European law in the belief that they can escape the consequences. European companies will inevitably claim a “legitimate interest” in tracking, and may be prepared to defend this argument. Consumers may be worn down by requests for consent, or ignore artfully crafted confessions by the tracking companies.

The rationale behind fingerprinting, as it is used today, is to evade transparency and accountability and make tracking impossible to control. If this rationale holds, fingerprinters won’t be able to convince the EU’s courts and regulators that, indeed, it is their legitimate interest to do so. In fact, there’s nothing legitimate about this method of tracking: that’s what privacy laws like the GDPR recognize, and that’s what regulators will act upon. Before we see results of their actions, browser companies, standards organizations, privacy advocates, and technologists will still need to work together to minimize how much third-parties can identify about individual users just from their browsers.

[1] Article 29 Data Protection Working Party, Opinion 4/2007 on the concept of personal data; GDPR Rec. 26 and 30; Art 4 (1)

[2] Article 29 Data Protection Working Party, Opinion 05/2014 on Anonymisation Techniques, pp 11-12. Singling-out: “the possibility to isolate some or all records which identify an individual in the dataset.”

[3] Article 29 Working Party, Opinion 05/2014 on Anonymisation Techniques, pp 11-12. Linkability: “the ability to link, at least, two records concerning the same data subject or a group of data subjects (either in the same database or in two different databases). If an attacker can establish (e.g. by means of correlation analysis) that two records are assigned to a same group of individuals but cannot single out individuals in this group, the technique provides resistance against ‘singling out’ but not against linkability.”

[4] Article 29 Data Protection Working Party, Opinion 05/2014 on Anonymisation Techniques, pp 11-12. Inference: “the possibility to deduce, with significant probability, the value of an attribute from the values of a set of other attributes.”

[5] Article 29 Data Protection Working Party, Opinion 4/2007 on the concept of personal data; see also Article 29 Data Protection Working Party, Opinion 9/2014 on the application of Directive 2002/58/EC to device fingerprinting.

[6] It is possible to collect information on a browser’s fingerprint without allowing for indirect identification of a user, and therefore without implicating “personal data” under the GDPR: for example, when no further operations take place, such as tracking user behaviour across the web or collecting data that would allow non-unique browser characteristics to be linked to other data about the user. This would be unusual outside of rare cases like a fingerprinting research project. In any event, the ePrivacy Directive also applies to non-personal data. See Article 29 Data Protection Working Party, Opinion 9/2014 on the application of Directive 2002/58/EC to device fingerprinting; ePrivacy Directive Art 5(3).

[7] GDPR Rec 40 and Art. 5(1)(a)

[8] GDPR Rec 42 and Art. 4(11); Article 29 Data Protection Working Party, Guidelines on consent under Regulation 2016/679

[9] Article 29 Data Protection Working Party, Opinion 6/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC; GDPR Rec 47 and Art 6(1)(f)

[10] See Recital 47 EU GDPR, "The legitimate interests of a controller, including those of a controller to which the personal data may be disclosed, or of a third party, may provide a legal basis for processing, provided that the interests or the fundamental rights and freedoms of the data subject are not overriding, taking into consideration the reasonable expectations of data subjects based on their relationship with the controller."

[11] Article 29 Data Protection Working Party, Opinion 6/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC; GDPR Rec 47 and Art 6(1)(f)

[12] GDPR Art 13

[13] GDPR Art 21(2)

[14] See Article 29 Data Protection Working Party, Opinion 9/2014 on the application of Directive 2002/58/EC to device fingerprinting "The technology of device fingerprinting is not limited to the configuration parameters of a traditional web browser on a desktop PC. Device fingerprinting is not tied to a particular protocol either, but can be used to fingerprint a broad range of internet connected devices..." (p.4)

[15] Article 29 Data Protection Working Party, Opinion 9/2014 on the application of Directive 2002/58/EC to device fingerprinting

May 20 2017


UN sets up privacy rapporteur role in wake of Snowden leaks

“Although the right to privacy is enshrined in international law – it is set out in article 12 of the Universal Declaration of Human Rights and article 17 of the International Covenant on Civil and Political Rights – it has largely been ignored and is low on the list of UN priorities.

The Snowden revelations changed this dynamic, producing a strong reaction in Germany, given its history of surveillance by the Stasi secret police and the disclosure that the NSA had hacked the mobile phone of the chancellor, Angela Merkel. In Brazil, the president, Dilma Rousseff, cancelled a visit to Washington in protest over spying on her country.

The resolution, ‘The right to privacy in the digital age’, notes that ‘the rapid pace of technological development enables individuals all over the world to use new information and communications technology and at the same time enhances the capacity of governments, companies and individuals to undertake surveillance, interception and data collection, which may violate or abuse human rights’ and describes the issue as one of increasing concern.

It expresses deep concern ‘at the negative impact that surveillance and/or interception of communications, including extraterritorial surveillance and/or interception of communications, as well as the collection of personal data, in particular when carried out on a mass scale, may have on the exercise and enjoyment of human rights’.

The campaign group Privacy International has been lobbying for such a resolution since 2013. Tomaso Falchetta, legal officer for PI, said: ‘Now, perhaps more than ever, we need a dedicated individual to hold those accountable who wish to violate privacy, whether it is through surveillance, indiscriminate data collection, or other techniques that infringe on this important right.’”


OHCHR | Special Rapporteur on Privacy

“In July 2015, the Human Rights Council appointed Prof. Joseph Cannataci of Malta as the first-ever Special Rapporteur on the right to privacy. The appointment is for three years.


The Special Rapporteur is mandated by Human Rights Council Resolution 28/16:

(a) To gather relevant information, including on international and national frameworks, national practices and experience, to study trends, developments and challenges in relation to the right to privacy and to make recommendations to ensure its promotion and protection, including in connection with the challenges arising from new technologies;

(b) To seek, receive and respond to information, while avoiding duplication, from States, the United Nations and its agencies, programmes and funds, regional human rights mechanisms, national human rights institutions, civil society organizations, the private sector, including business enterprises, and any other relevant stakeholders or parties;

(c) To identify possible obstacles to the promotion and protection of the right to privacy, to identify, exchange and promote principles and best practices at the national, regional and international levels, and to submit proposals and recommendations to the Human Rights Council in that regard, including with a view to particular challenges arising in the digital age;

(d) To participate in and contribute to relevant international conferences and events with the aim of promoting a systematic and coherent approach on issues pertaining to the mandate;

(e) To raise awareness concerning the importance of promoting and protecting the right to privacy, including with a view to particular challenges arising in the digital age, as well as concerning the importance of providing individuals whose right to privacy has been violated with access to effective remedy, consistent with international human rights obligations;

(f) To integrate a gender perspective throughout the work of the mandate;

(g) To report on alleged violations, wherever they may occur, of the right to privacy, as set out in article 12 of the Universal Declaration of Human Rights and article 17 of the International Covenant on Civil and Political Rights, including in connection with the challenges arising from new technologies, and to draw the attention of the Council and the United Nations High Commissioner for Human Rights to situations of particularly serious concern;

(h) To submit an annual report to the Human Rights Council and to the General Assembly, starting at the thirty-first session and the seventy-first session respectively.”


March 03 2017

Spyware in Windows: Windows Update snoops on the user. Windows 8.1 snoops on local searches. And there's a secret NSA key in Windows, whose functions we don't know.
Microsoft's Software Is Malware - GNU Project - Free Software Foundation

[posted: Sep. 7 2015. source]


Microsoft’s new small print – how your personal data is (ab)used

Windows 10 - Windows Surveillance Edition.

“A French tech news website Numerama analysed the new privacy policy and found a number of conditions users should be aware of:

By default, when signing into Windows with a Microsoft account, Windows syncs some of your settings and data with Microsoft servers, for example ‘web browser history, favorites, and websites you have open’ as well as ‘saved app, website, mobile hotspot, and Wi-Fi network names and passwords’. Users can however deactivate this transfer to the Microsoft servers by changing their settings.

More problematic from a data protection perspective is however the fact that Windows generates a unique advertising ID for each user on a device. This advertising ID can be used by third parties, such as app developers and advertising networks for profiling purposes.

Also, when device encryption is on, Windows automatically encrypts the drive Windows is installed on and generates a recovery key. The BitLocker recovery key for the user’s device is automatically backed up online in the Microsoft OneDrive account.

Microsoft’s updated terms also state that they collect basic information from you and your devices, including for example ‘app use data for apps that run on Windows’ and ‘data about the networks you connect to.’

Users who choose to enable Microsoft’s personal assistant software ‘Cortana’ have to live with the following invasion of their privacy: ‘To enable Cortana to provide personalized experiences and relevant suggestions, Microsoft collects and uses various types of data, such as your device location, data from your calendar, the apps you use, data from your emails and text messages, who you call, your contacts and how often you interact with them on your device. Cortana also learns about you by collecting data about how you use your device and other Microsoft services, such as your music, alarm settings, whether the lock screen is on, what you view and purchase, your browse and Bing search history, and more.’ But this is not all, as this piece of software also analyses undefined ‘speech data’: ‘we collect your voice input, as well your name and nickname, your recent calendar events and the names of the people in your appointments, and information about your contacts including names and nicknames.’

But Microsoft’s updated privacy policy is not only bad news for privacy. Your free speech rights can also be violated on an ad hoc basis as the company warns:

‘We will access, disclose and preserve personal data, including your content (such as the content of your emails, other private communications or files in private folders), when we have a good faith belief that doing so is necessary to’, for example, ‘protect their customers’ or ‘enforce the terms governing the use of the services’.”

[posted: Sep. 7 2015. source]


May 24 2015


My Number law amendment passes the House of Representatives; it will also apply to bank deposit accounts


  • Dangerous? Convenient? The pros and cons of the “My Number” system (Yahoo! News, May 24, 2013)

    “Because the assigned number can never be changed, once the information leaks, so-called ‘identity theft’ (narisumashi) damage is liable to occur on a large scale, and individuals risk suffering serious harm.”

  • Why the My Number system cannot possibly succeed as things stand (BLOGOS, July 31, 2012)

February 04 2015


One Year Later, Obama Failing on Promise to Rein in NSA | Electronic Frontier Foundation

“President Obama still has time in office to make this right, and he’s got ample power to rein in NSA overreach without Congress lifting a finger. But if he continues to offer these weak reforms, then he should be prepared for a major Congressional battle when sections of the Patriot Act come up for reauthorization in June.”


October 12 2014


‘Hostile to privacy’: Snowden urges internet users to get rid of Dropbox — RT News

“The whistleblower believes one fallacy in how authorities view individual rights has to do with making the individual forsake those rights by default. Snowden’s point is that the moment you are compelled to reveal that you have nothing to hide is when the right to privacy stops being a right – because you are effectively waiving that right.

‘When you say, “I have nothing to hide,” you’re saying, “I don’t care about this right.” You’re saying, “I don’t have this right, because I’ve got to the point where I have to justify it.” The way rights work is, the government has to justify its intrusion into your rights – you don’t have to justify why you need freedom of speech.’

In that situation, it becomes OK to live in a world where one is no longer interested in privacy as such – a world where Facebook, Google and Dropbox have become ubiquitous, and where there are virtually no safeguards against the wrongful use of the information one puts there.

In particular, Snowden advised web users to ‘get rid’ of Dropbox. Such services only insist on encrypting user data during transfer and when being stored on the servers. Other services he recommends instead, such as SpiderOak, encrypt information while it’s on your computer as well.

‘We're talking about dropping programs that are hostile to privacy,’ Snowden said.

The same goes for social networks such as Facebook and Google, too. Snowden says they are ‘dangerous’ and proposes that people use other services that allow for encrypted messages to be sent, such as RedPhone or SilentCircle.

The argument that encryption harms security efforts to capture terrorists is flawed, even from a purely legalistic point of view, Snowden said, explaining that you can still retain encryption and have the relevant authorities requesting private information from phone carriers and internet providers on a need-to-know basis.”
