
December 11 2019

Genetic Genealogy Company GEDmatch Acquired by Company With Ties to FBI & Law Enforcement—Why You Should Be Worried

This week, GEDmatch, a genetic genealogy company that gained notoriety for giving law enforcement access to its customers’ DNA data, quietly informed its users it is now operated by Verogen, Inc., a company expressly formed two years ago to market “next-generation [DNA] sequencing” technology to crime labs.  

What this means for GEDmatch’s 1.3 million users—and for the 60% of white Americans who share DNA with those users—remains to be seen. 

GEDmatch allows users to upload an electronic file containing their raw genotyped DNA data so that they can compare it to other users’ data to find biological family relationships. It estimates how close or distant those relationships may be (e.g., a direct connection, like a parent, or a distant connection, like a third cousin), and it enables users to determine where, along each chromosome, their DNA may be similar to another user. It also predicts characteristics like ethnicity. 
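The relationship estimates described above follow from basic inheritance arithmetic: on average, the fraction of autosomal DNA two relatives share roughly halves with each additional meiosis separating them. A minimal sketch of that logic (our illustration, not GEDmatch’s actual algorithm):

```python
# Expected autosomal DNA sharing for a few relationships, using the
# textbook coefficient-of-relationship averages (real sharing varies
# widely around these figures).
EXPECTED_SHARE = {
    "parent/child": 0.5,
    "grandparent/grandchild": 0.25,
    "first cousin": 0.125,
    "second cousin": 0.03125,
    "third cousin": 0.0078125,
}

def closest_relationship(shared_fraction):
    """Return the relationship whose expected sharing is nearest to
    the observed fraction of DNA two users share."""
    return min(EXPECTED_SHARE,
               key=lambda rel: abs(EXPECTED_SHARE[rel] - shared_fraction))

print(closest_relationship(0.48))   # a direct connection, like a parent
print(closest_relationship(0.009))  # a distant connection, like a third cousin
```

Real services work from shared segment lengths (measured in centimorgans) rather than raw fractions, but the principle is the same: the less DNA shared, the more distant the inferred relative.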

An estimated 30 million people have used genetic genealogy databases like GEDmatch to identify biological relatives and build a family tree, and law enforcement officers have been capitalizing on all that freely available data in criminal investigations. Estimates are that genetic genealogy sites were used in around 200 cases just last year. For many of those cases, officers never sought a warrant or any legal process at all. 

Earlier this year, after public outcry, GEDmatch changed its previous position allowing for warrantless law enforcement searches, opted out all its users from those searches, and required all users to expressly opt in if they wanted to allow access to their genetic data. Only a small percentage did. But opting out has not prevented law enforcement from accessing consumers’ genetic data, as long as they can get a warrant, which one Orlando, Florida officer did last summer.  

Law enforcement has argued that people using genetic genealogy services have no expectation of privacy in their genetic data because users have willingly shared their data with the genetics company and with other users and have “consented” to a company’s terms of service. But the Supreme Court rejected a similar argument in Carpenter v. United States. 

In Carpenter, the Court ruled that even though our cell phone location data is shared with or stored by a phone company, we still have a reasonable expectation of privacy in it because of all the sensitive and private information it can reveal about our lives. Similarly, genetic data can reveal a whole host of extremely private and sensitive information about people, from their likelihood to inherit specific diseases to where their ancestors are from to whether they have a sister or brother they never knew about. Researchers have at one time or another theorized that DNA may predict race, intelligence, criminality, sexual orientation, and political ideology. Even if such research is later disproved, officials may rely on it to make judgments about and discriminate against people. Because genetic data is so sensitive, we have an expectation of privacy in it, even if other people can access it.

However, whether individual users of genetic genealogy databases have consented to law enforcement searches is somewhat beside the point. In all cases that we know of so far, law enforcement isn’t looking for the person who uploaded their DNA to a consumer site, they are looking for that person’s distant relatives—people who never could have consented to this kind of use of their genetic data because they don’t have any control over the DNA they happen to share with the site’s users.  

These are also dragnet searches. They are not targeted at finding specific users or based on individualized suspicion—a fact the police admit, because they don’t know who their suspect is. They are supported only by the hope that a crime scene sample might somehow be genetically linked to DNA submitted to a genetic genealogy database by a distant relative, which might give officers a lead in a case. That means these searches are nothing more than fishing expeditions through millions of innocent people’s DNA, conducted under “general warrants,” and no different from officers searching every house in a town with a population of 1.3 million on the off chance that one of those houses could contain evidence useful to finding the perpetrator of a crime. With or without a warrant, the Fourth Amendment prohibits searches like this in the physical world, and it should prohibit genetic dragnets like this one as well. There’s a real question whether a warrant that allows this kind of search could ever meet the particularity requirements of the Fourth Amendment.

We need to think long and hard as a society about whether law enforcement should be allowed to access genetic genealogy databases at all—even with a warrant. These searches impact millions of Americans. Although GEDmatch likely only encompasses about 0.5% of the U.S. adult population, research shows 60% of white Americans can already be identified from its 1.3 million users. This same research shows that once GEDmatch’s users encompass just 2% of the U.S. population, 90% of white Americans will be identifiable.
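The scaling in that research can be seen with a deliberately simplified model (our back-of-envelope illustration, not the study’s actual method): if a person has some number of third-cousin-or-closer relatives in the population, and a database samples a fraction of that population at random, the chance that at least one relative is already in it grows very quickly with coverage.

```python
def match_probability(coverage, relatives=800):
    """Probability that at least one of `relatives` relatives appears in
    a database covering fraction `coverage` of the population, assuming
    independent uniform sampling. The relative count is an illustrative
    round figure, and the real studies' estimates differ because
    database participation skews heavily by ancestry and geography."""
    return 1 - (1 - coverage) ** relatives

# Even sub-1% coverage makes a match likely under this toy model:
for cov in (0.001, 0.005, 0.02):
    print(f"{cov:.1%} coverage -> {match_probability(cov):.0%} chance of a match")
```

The point of the model is the shape of the curve, not the exact numbers: identifiability saturates long before a database covers more than a sliver of the population.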

Although many authorities once argued these kinds of searches would only be used as a way to solve cold cases involving the most terrible and serious crimes, that is changing; this year, police used genetic genealogy to implicate a teenager for a sexual assault. Next year it could be used to identify political or environmental protestors. Unlike established criminal DNA databases like the FBI’s CODIS database, there are currently few rules governing how and when genetic genealogy searching may be used.

We should worry about these searches for another reason: they can implicate people for crimes they didn’t commit. Although police used genetic searching to finally identify the man they believe is the “Golden State Killer,” an earlier search in the same case identified a different person. In 2015, a similar search in a different case led police to suspect an innocent man. Even without genetic genealogy searches, DNA matches may lead officers to suspect—and jail—the wrong person, as happened in a California case in 2012. That can happen because we shed DNA constantly and because our DNA may be transferred from one location to another, possibly ending up at the scene of a crime, even if we were never there. 

All of this is made even more concerning by the recent acquisition of GEDmatch by a company whose main purpose is to help the police solve crimes. The ability to research family history and disease risk shouldn’t carry the threat that our data will be accessible to police or others and used in ways we never could have foreseen. Genetic genealogy searches by law enforcement invade our privacy in unique ways—they allow law enforcement to access information about us that we may not even know ourselves, that we have no ability to hide, and that could reveal more about us in the future than scientists know now. These searches should never be allowed—even with a warrant.

Related Cases: Maryland v. King; Carpenter v. United States

June 20 2018


The GDPR and Browser Fingerprinting: How It Changes the Game for the Sneakiest Web Trackers

from EFF’s Deeplinks Blog:

Browser fingerprinting is on a collision course with privacy regulations. For almost a decade, EFF has been raising awareness about this tracking technique with projects like Panopticlick. Compared to more well-known tracking “cookies,” browser fingerprinting is trickier for users and browser extensions to combat: websites can do it without detection, and it’s very difficult to modify browsers so that they are less vulnerable to it. As cookies have become more visible and easier to block, companies have been increasingly tempted to turn to sneakier fingerprinting techniques.

But companies also have to obey the law. And for residents of the European Union, the General Data Protection Regulation (GDPR), which entered into force on May 25th, is intended to cover exactly this kind of covert data collection. The EU has also begun the process of updating its ePrivacy Directive, best known for its mandate that websites must warn you about any cookies they are using. If you’ve ever seen a message asking you to approve a site’s cookie use, that’s likely based on this earlier Europe-wide law.

This leads to a key question: Will the GDPR require companies to make fingerprinting as visible to users as the original ePrivacy Directive required them to make cookies?

The answer, in short, is yes. Where the purpose of fingerprinting is tracking people, it will constitute “personal data processing” and will be covered by the GDPR.

What is browser fingerprinting and how does it work?

When a site you visit uses browser fingerprinting, it can learn enough information about your browser to uniquely distinguish you from all the other visitors to that site. Browser fingerprinting can be used to track users just as cookies do, but using much more subtle and hard-to-control techniques. In a paper EFF released in 2010, we found that a majority of users’ browsers were uniquely identifiable given existing fingerprinting techniques. Those techniques have only gotten more complex and obscure in the intervening years.

By using browser fingerprinting to piece together information about your browser and your actions online, trackers can covertly identify users over time, track them across websites, and build an advertising profile of them. The information that browser fingerprinting reveals typically includes a mixture of HTTP headers (which are delivered as a normal part of every web request) and properties that can be learned about the browser using JavaScript code: your time zone, system fonts, screen resolution, which plugins you have installed, and what platform your browser is running on. Sites can even use techniques such as canvas or WebGL fingerprinting to gain insight into your hardware configuration.

When stitched together, these individual properties tell a unique story about your browser and the details of your browsing interactions. For instance, yours is likely the only browser on central European time with cookies enabled that has exactly your set of system fonts, screen resolution, plugins, and graphics card.
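In code, the server side of this stitching is almost trivial. A hedged sketch (the property names are illustrative stand-ins for values a tracker would collect from HTTP headers and JavaScript APIs such as `navigator` and `screen`):

```python
import hashlib
import json

def fingerprint(properties):
    """Hash a browser's observable properties into one stable identifier."""
    canonical = json.dumps(properties, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

browser = {
    "accept_language": "en-US,en;q=0.9",  # HTTP header
    "timezone": "Europe/Prague",          # JavaScript: Intl API
    "screen": "1920x1080x24",             # JavaScript: screen object
    "fonts": ["DejaVu Sans", "Liberation Serif", "Noto Sans"],
    "plugins": ["PDF Viewer"],
    "platform": "Linux x86_64",           # JavaScript: navigator.platform
}

print(fingerprint(browser))  # same browser, same ID, on every site running this
```

No one attribute above is identifying on its own; it is the combination, hashed into one value, that singles a browser out of the crowd.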

By gathering that information together and storing it on its own servers, a site can track your browsing habits without the use of persistent identifiers stored on your computer, like cookies. Fingerprinting can also be used to recreate a tracking cookie for a user after the user has deleted it. Users that are aware of cookies can remove them within their browser settings, but fingerprinting subverts the built-in browser mechanisms that allow users to avoid being tracked.
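A sketch of why deleting cookies doesn’t help (again our own illustration, not any particular tracker’s code): the profile store is keyed on the fingerprint, which the browser re-presents on every visit whether or not a cookie survives.

```python
import hashlib

profiles = {}  # fingerprint -> list of pages visited

def track_visit(browser_traits, url, cookie=None):
    """Re-identify a visitor from browser traits alone; the cookie,
    present or absent, is irrelevant to linking the visits."""
    fp = hashlib.sha256(browser_traits.encode()).hexdigest()
    profiles.setdefault(fp, []).append(url)
    return fp  # could even be used to "respawn" the deleted cookie

traits = "Linux x86_64|1920x1080|Europe/Prague|DejaVu Sans,Noto Sans"
track_visit(traits, "/news", cookie="session=abc123")
track_visit(traits, "/shop", cookie=None)  # user has cleared all cookies

print(len(profiles))  # 1 -- both visits linked to the same profile
```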

And this doesn’t just apply to the sites you visit directly. The pervasive inclusion of remote resources, like fonts, analytics scripts, or social media widgets on websites means that the third parties behind them can track your browsing habits across the web, rather than just on their own websites.

Aside from the limited case of fraud detection (which needs transparency and opt-in consent for any further processing), browser fingerprinting offers no functionality to users. When the popular social media widget provider AddThis started using canvas fingerprinting in 2014, the negative reaction from their users was so overwhelming that they were forced to stop the practice.

Some fingerprinting tricks are potentially detectable by end-users or their software: for instance, a site changing some text into multiple fonts extremely quickly is probably scanning to see which fonts a user has installed. Privacy Badger, a browser extension that we develop at EFF, detects canvas fingerprinting to determine when a site looks like a tracker. And a W3C guidance document draft for web specification authors advises them to develop their specs with fingerprinting detectability in mind. Unfortunately, however, new and more covert techniques to fingerprint users are being discovered all the time.
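The font-probing behaviour described above is detectable precisely because it is so unusual. A toy heuristic in the spirit of (but not the actual code of) tools like Privacy Badger: flag a page that styles the same text with many distinct fonts within a fraction of a second.

```python
def looks_like_font_probe(events, window=0.5, threshold=20):
    """events: (timestamp_in_seconds, font_name) styling events observed
    on one element. Returns True if `threshold` distinct fonts are
    applied within any `window`-second span -- ordinary pages never do
    this, but a font-enumeration script does."""
    events = sorted(events)
    for i, (start, _) in enumerate(events):
        fonts = {font for t, font in events[i:] if t - start <= window}
        if len(fonts) >= threshold:
            return True
    return False

probe = [(0.01 * i, f"Font{i}") for i in range(30)]  # 30 fonts in 0.3 s
normal = [(i * 2.0, f) for i, f in enumerate(["serif", "sans", "mono"])]
print(looks_like_font_probe(probe), looks_like_font_probe(normal))
```

The window and threshold values here are arbitrary; real detectors must tune such parameters to avoid flagging legitimate dynamic styling.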

Fingerprinting After the GDPR

You’ll struggle to find fingerprinting explicitly mentioned in the GDPR—but that’s because the EU has learned from earlier data protection laws and the current ePrivacy Directive to remain technologically neutral.

Apart from non-binding recitals (like Recital 30, discussing cookies), the GDPR avoids calling out specific technologies or giving exhaustive lists and examples. Instead, it provides general rules that the drafters felt should be neutral, flexible, and keep up with technological development beyond fingerprinting and cookies. Below we explain how those general rules apply to tracking Internet users, no matter what technique is used.

Browser Characteristics as Personal Data

The cornerstone of the GDPR is its broad definition of personal data.[1] Personal data is any information that might be linked to an identifiable individual. This definition not only covers all sorts of online identifiers (such as your computer’s MAC address, your network’s IP address, or an advertising user ID in a cookie) but also less specific features — including the combination of browser characteristics that fingerprinting relies upon. The key condition is that a given element of information relates to an individual who can be directly or indirectly identified.

It is also worth noting that under the GDPR “identification” does not require establishing a user’s identity. It is enough that an entity processing data can indirectly identify a user, based on pseudonymous data, in order to perform certain actions based on such identification (for instance, to present different ads to different users, based on their profiles). This is what EU authorities refer to as singling-out[2], linkability[3], or inference.[4]

The whole point of fingerprinting is the ability of the tracking company (data controller) to indirectly identify unique users among the sea of Internet users in order to track them, create their behavioural profiles and, finally, present them with targeted advertising. If the fingerprinting company has identification as its purpose, the Article 29 Working Party (an advisory board comprised of European data protection authorities) decided over ten years ago, regulators should assume that “the controller … will have the means ‘likely reasonably to be used’ to identify the people,” because “the processing of that information only makes sense if it allows identification of specific individuals.” As the Article 29 Working Party noted, “In fact, to argue that individuals are not identifiable, where the purpose of the processing is precisely to identify them, would be a sheer contradiction in terms.”[5]

Thus, when several information elements are combined (especially unique identifiers such as your set of system fonts) across websites (e.g. for the purposes of behavioral advertising), fingerprinting constitutes the processing of personal data and must comply with GDPR.[6]

Can Fingerprinting Be Legal Under The GDPR?

According to the GDPR, every entity processing personal data (including tracking user behavior online, matching ads with user profiles, or presenting targeted ads on their website) must be able to prove that they have a legitimate reason (by the definitions of the law) to do so.[7] The GDPR gives six possible legal grounds that enable processing data, with two of them being most relevant in the tracking/advertising context: user consent and the “legitimate interest” of whoever is doing the tracking.

How should this work in practice? User consent means an informed, unambiguous action (such as a change of settings from “no” to “yes”).[8] In order to be able to rely on this legal ground, companies that use fingerprinting would have to, in the first place, reveal the fingerprinting before it is executed and, then, wait for a user to give their freely-given informed consent. Since the very purpose of fingerprinting is to escape the user’s control, it is hardly surprising that trackers refuse to apply this standard.
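The consent logic the GDPR demands is simple enough to state in code, which makes trackers’ reluctance telling. A minimal sketch (our illustration of the legal logic, not legal advice): disclosure must come first, the default must be “no”, and only an explicit affirmative act authorizes processing.

```python
def may_fingerprint(disclosed, consent):
    """consent is True (opted in), False (refused), or None (no action
    taken). Silence and pre-ticked boxes are not consent under the
    GDPR, so only an explicit True, after disclosure, permits
    processing."""
    return disclosed and consent is True

print(may_fingerprint(disclosed=True, consent=True))    # lawful basis exists
print(may_fingerprint(disclosed=True, consent=None))    # silence is not consent
print(may_fingerprint(disclosed=False, consent=True))   # consent was not informed
```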

It is more common for companies that use fingerprinting to claim a “legitimate interest” in doing so, whether their own or that of whoever is paying them to fingerprint users.

The concept of legitimate interest in the GDPR has been constructed as a compromise between privacy advocates and business interests.[9] It is much more vague and ambiguous than other legal grounds for processing data. In the coming months, you will see many companies who operate in Europe attempt to build their tracking and data collection of their users on the basis of their “legitimate interest.”

But that path won’t be easy for covert web fingerprinters. To be able to rely on this specific legal ground, every company that considers fingerprinting has to, first, go through a balancing test[10] (that is, verify for itself whether its interest in obscure tracking is not overridden by “the fundamental rights and freedoms of the data subject, including privacy” and whether it is in line with “reasonable expectations of data subjects”[11]) and openly lay out its legitimate interest argument for end-users. Second, and more importantly, the site has to share detailed information with the person that is subjected to fingerprinting, including the scope, purposes, and legal basis of such data processing.[12] Finally, if fingerprinting is done for marketing purposes, all it takes for end-users to stop it (provided they do not agree with the legitimate interest argument that has been made by the fingerprinter) is to say “no.”[13] The GDPR requires no further justification.

Running Afoul of the ePrivacy Rules

Fingerprinting also runs afoul of the ePrivacy Directive, which sets additional conditions on the use of device and browser identifiers. The ePrivacy Directive is a companion law, applying data protection rules more specifically in the area of communications. The Article 29 Working Party emphasised that fingerprinting—even if it does not involve processing personal data—is covered by Article 5(3) of the ePrivacy Directive (the section commonly referred to as the cookie clause) and thus requires user consent:

Parties who wish to process device fingerprints[14] which are generated through the gaining of access to, or the storing of, information on the user’s terminal device must first obtain the valid consent of the user (unless an exemption applies).[15]

While this opinion focused on device fingerprints, the logic applies equally to browser fingerprints. Interpretations vary according to national implementation, which has resulted in an inconsistent and ineffective application of the ePrivacy Directive, but key elements, such as the definition of consent, are controlled by the GDPR, which will update the Directive’s interpretation and operation. The EU aims to pass an updated ePrivacy Regulation in 2019, and current drafts target fingerprinting explicitly.

Looking at how web fingerprinting techniques have been used so far, it is very difficult to imagine companies moving from deliberate obscurity to full transparency and open communication with users. Fingerprinting companies will have to do what their predecessors in the cookie world did before them: either come clean about their practices and face greater detection and exposure, or slink even further behind the curtain and hope to dodge European law.


When EFF first built Panopticlick in 2010, fingerprinting was largely a theoretical threat, in a world that was just beginning to wake up to the more obvious use of tracking cookies. Since then, we’ve seen more and more sites adopt the surreptitious methods we highlighted then, to disguise their behaviour from anti-tracking tools, or to avoid the increasing visibility and legal obligations of using tracking cookies within Europe.

With the GDPR in place, operating below the radar of European authorities and escaping rules that apply to commercial fingerprinting will be very difficult and—potentially—very expensive. To avoid severe penalties, fingerprinting companies should, at least, be more upfront about their practices.

But that’s just in theory. In practice, we don’t expect the GDPR to make fingerprinting disappear any time soon, just as the ePrivacy Directive did not end the use of tracking cookies. The GDPR applies to any company as long as they process the personal data of individuals living within the European Economic Area for commercial purposes, or for any purpose when the behavior is within the EEA. However, many non-EU sites who track individuals in Europe using fingerprinting may decide to ignore European law in the belief that they can escape the consequences. European companies will inevitably claim a “legitimate interest” in tracking, and may be prepared to defend this argument. Consumers may be worn down by requests for consent, or ignore artfully crafted confessions by the tracking companies.

The rationale behind fingerprinting, as it is used today, is to evade transparency and accountability and make tracking impossible to control. If this rationale holds, fingerprinters won’t be able to convince the EU’s courts and regulators that, indeed, it is their legitimate interest to do so. In fact, there’s nothing legitimate about this method of tracking: that’s what privacy laws like the GDPR recognize, and that’s what regulators will act upon. Before we see results of their actions, browser companies, standards organizations, privacy advocates, and technologists will still need to work together to minimize how much third-parties can identify about individual users just from their browsers.

[1] Article 29 Data Protection Working Party, Opinion 4/2007 on the concept of personal data; GDPR Rec. 26 and 30; Art 4 (1)

[2] Article 29 Data Protection Working Party, Opinion 05/2014 on Anonymisation Techniques, pp 11-12. Singling-out: “the possibility to isolate some or all records which identify an individual in the dataset.”

[3] Article 29 Working Party, Opinion 05/2014 on Anonymisation Techniques, pp 11-12. Linkability: “the ability to link, at least, two records concerning the same data subject or a group of data subjects (either in the same database or in two different databases). If an attacker can establish (e.g. by means of correlation analysis) that two records are assigned to a same group of individuals but cannot single out individuals in this group, the technique provides resistance against ‘singling out’ but not against linkability.”

[4] Article 29 Data Protection Working Party, Opinion 05/2014 on Anonymisation Techniques, pp 11-12. Inference: “the possibility to deduce, with significant probability, the value of an attribute from the values of a set of other attributes.”

[5] Article 29 Data Protection Working Party, Opinion 4/2007 on the concept of personal data; see also Article 29 Data Protection Working Party, Opinion 9/2014 on the application of Directive 2002/58/EC to device fingerprinting.

[6] It is possible to collect information on a browser’s fingerprint without allowing for indirect identification of a user, and therefore without implicating “personal data” under the GDPR. For example, this is the case when no further operations, such as tracking user behaviour across the web or collecting data that allows one to link non-unique browser characteristics to other data about the user, take place. This would be unusual outside of rare cases like a fingerprinting research project. In any event, the ePrivacy Directive also applies to non-personal data. See Article 29 Data Protection Working Party, Opinion 9/2014 on the application of Directive 2002/58/EC to device fingerprinting; ePrivacy Directive Art 5(3).

[7] GDPR Rec 40 and Art. 5(1)(a)

[8] GDPR Rec 42 and Art. 4(11); Article 29 Data Protection Working Party, Guidelines on consent under Regulation 2016/679

[9] Article 29 Data Protection Working Party, Opinion 6/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC; GDPR Rec 47 and Art 6(1)(f)

[10] See Recital 47 EU GDPR, "The legitimate interests of a controller, including those of a controller to which the personal data may be disclosed, or of a third party, may provide a legal basis for processing, provided that the interests or the fundamental rights and freedoms of the data subject are not overriding, taking into consideration the reasonable expectations of data subjects based on their relationship with the controller."

[11] Article 29 Data Protection Working Party, Opinion 6/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC; GDPR Rec 47 and Art 6(1)(f)

[12] GDPR Art 13

[13] GDPR Art 21(2)

[14] See Article 29 Data Protection Working Party, Opinion 9/2014 on the application of Directive 2002/58/EC to device fingerprinting "The technology of device fingerprinting is not limited to the configuration parameters of a traditional web browser on a desktop PC. Device fingerprinting is not tied to a particular protocol either, but can be used to fingerprint a broad range of internet connected devices..." (p.4)

[15] Article 29 Data Protection Working Party, Opinion 9/2014 on the application of Directive 2002/58/EC to device fingerprinting

June 21 2017

As the Espionage Act Turns 100, We Condemn Threats Against Wikileaks

The federal law that is commonly used to prosecute leakers marks its 100th birthday on June 15, 2017.

Signed into law on June 15, 1917, the Espionage Act, 18 U.S.C. § 792 et seq., was Congress’s response to a fear that public criticism of U.S. participation in World War I would impede the conscription of soldiers to support the war effort, and to concerns about U.S. citizens undermining the war effort by spying for foreign governments. Although some parts of the law were repealed, many remain in effect 100 years later.

Most pertinent today, the law criminalizes both the disclosure and receipt of certain national security information. As a result, the Espionage Act remains the most common grounds upon which leakers of U.S. governmental information are prosecuted. Indeed, the recent charges against the alleged source of the NSA Russian Election Systems Phishing documents are based on the Espionage Act.

To date, however, the United States has never sought to prosecute a journalistic entity under the Espionage Act for either receiving secret government documents from a source or further disseminating the documents themselves or information from them in the course of reporting. There is nothing in the language of the law that prevents its use against a news organization, but it has been unofficially accepted that it should not apply to the press.

So it is alarming that the Justice Department is reportedly taking a serious look at bringing criminal charges against Wikileaks and Julian Assange for disclosing classified information. In so doing, the Trump administration is threatening to step over a never-crossed line – applying the secret documents provisions of the Espionage Act to journalistic practices. The threat is greatly concerning in the context of prosecuting whistleblowers, and, more broadly, preserving a free press.

Leaks are a vital part of the free flow of information that is essential to our democracy. And reporting on leaked materials, including reporting on classified information, is an essential role of American journalism. The US Supreme Court, in Bartnicki v. Vopper, recognized that those who lawfully obtain information pertaining to a matter of public interest have a near absolute right to publish it even if their source illegally obtained the information. Prosecuting Wikileaks for its role in this fundamental democratic process will undermine these vital protections.

In sections 793(d), (e) and 798 the Espionage Act criminalizes the unauthorized communication of both certain classified information and information “connected with the national defense.” Section 793(c) also prohibits merely obtaining national defense documents “with intent or reason to believe that the information is to be used to the injury of the United States, or to the advantage of any foreign nation.” Whether the principle of Bartnicki v. Vopper would bar a successful prosecution against a news organization under these provisions has never been tested.

A strong defense of Wikileaks is not simply an anti-Trump position. As current events indicate, leaks are non-partisan: those on both sides of the aisle typically embrace leaks that are politically useful and condemn leaks that are politically damaging. President Donald Trump famously praised Wikileaks when disclosures of DNC emails benefitted him. He now threatens to bring the strong arm of the law down on it.

It can be difficult to separate rhetoric from a planned course of action with this administration. But there are strong signs this White House intends to follow through on its bluster.

First, CIA Director Mike Pompeo labeled Wikileaks a “non-state hostile intelligence service” in an April 13, 2017, speech at the Center for Strategic and International Studies. The director then followed up by asserting his “philosophical understanding,” as opposed to a legal conclusion, that Wikileaks and Assange are not exercising First Amendment rights.

About a week later, Attorney General Jeff Sessions explained that his department was “stepping up its efforts” “on all leaks” with the goal being to “put some people in jail.”

President Trump also reportedly urged then-FBI director James Comey to prosecute and imprison journalists who published classified information. Comey’s failure to prioritize this has been cited as one of the reasons for his firing.

Moreover, the president’s reported initial first choice for FBI director, former Senator Joseph Lieberman, has a history of belligerence against both the news media broadly and Wikileaks in particular. In 2010, Lieberman called for an investigation of the New York Times and other news media for publishing Wikileaks documents, proposed an “anti-Wikileaks Law” that would have criminalized the disclosure of intelligence source names, and pressured Amazon and credit card processors to choke off funding for Wikileaks.

Many of the other threats the president and those speaking on his behalf have made against the news media, both during the election and since he took office, require legislative action by either Congress or the states. Unlike his threat to “open up the libel laws”—which would require action by 50 state legislatures or otherwise be subject to Congressional oversight—a federal criminal prosecution is something the executive branch can initiate on its own.

We condemn the threats of prosecution of Wikileaks and call on all to speak out against them.

One hundred years is long enough to let the threat of prosecution under the Espionage Act cast a shadow over our free speech and press freedom protections. Sign our petition, and tell U.S. lawmakers to reform this outdated and overbroad law.

Take Action


Read more about how the Espionage Act came to be and the law's murky legal history.


January 12 2014


Remembering Aaron

One year ago, we lost Aaron Swartz, a dear friend and a leader in the fight for a free and open Internet. The shock was, and remains, a profound one. It's a testament to the power of his commitments and ideals that both in life and in death he has inspired millions around the world, including all of us at EFF, to redouble our own efforts to advance the causes that he believed in, and to untangle the twisted and brutal computer crime laws that were used to persecute him. ...

November 24 2013

The main problem with the TPP is that trade delegates are negotiating this agreement behind closed doors under the undue influence of major entertainment companies and other corporate interests. Most of the 700 members on Trade Advisory Committees are corporate lawyers, and they have almost unlimited access to see and comment on draft texts. Meanwhile, civil society groups must rely on leaked documents to learn what is being proposed in the TPP.

Wikileaks' publication of the ‘Intellectual Property’ chapter is an opportunity for public interest advocates to make these threats known to the public. But it's important to note that there are other chapters on investor rights and e-commerce that are also deeply worrisome. We still do not know how negotiations over those chapters are proceeding and we may not even have a chance to see their text until the agreement is finished and can’t be changed.

The public has a right to know when their government representatives are proposing regulations in their name, especially when it deals with non-trade issues like digital copyright enforcement that will distort or prevent reforms to domestic law. We want to see the drafts of TPP, and the U.S. government’s negotiating position, released after every round of negotiation. Leaks are far from a sufficient substitute for true transparency and a participatory public process.
Civil Society Groups Demand Transparency and User Protections in TPP (Electronic Frontier Foundation, Nov. 22 2013)

October 25 2013


EFF Has Lavabit’s Back in Contempt of Court Appeal

Federal law enforcement officers compromised the backbone of the Internet and violated the Fourth Amendment when they demanded private encryption keys from the email provider Lavabit, the Electronic Frontier Foundation (EFF) argues in a brief submitted Thursday afternoon to the US Court of Appeals for the Fourth Circuit. In the amicus brief, EFF asks the panel to overturn a contempt-of-court finding against Lavabit and its owner Ladar Levison for resisting a government subpoena and search warrant that would have put the private communications and data of Lavabit's 400,000 customers at risk of exposure to the government. ...

September 06 2013


Electronic Frontier Foundation (Sep. 5 2013)

  • Hundreds of Pages of NSA Spying Documents to be Released As Result of EFF Lawsuit

    In a major victory in one of EFF's Freedom of Information Act (FOIA) lawsuits, the Justice Department conceded yesterday that it will release hundreds of pages of documents, including FISA court opinions, related to the government’s secret interpretation of Section 215 of the Patriot Act, the law the NSA has relied upon for years to mass collect the phone records of millions of innocent Americans. ...

  • Leaks Show NSA is Working to Undermine Encrypted Communications, Here's How You Can Fight Back

    In one of the most significant leaks to date regarding National Security Agency (NSA) spying, the New York Times, the Guardian, and ProPublica reported today that the NSA has gone to extraordinary lengths to secretly undermine our secure communications infrastructure, collaborating with GCHQ (Britain's NSA equivalent) and a select few intelligence organizations worldwide.

    These frightening revelations imply that the NSA has not only pursued an aggressive program of obtaining private encryption keys for commercial products—allowing the organization to decrypt vast amounts of Internet traffic that use these products—but that the agency has also attempted to put backdoors into cryptographic standards designed to secure users' communications. Additionally, the leaked documents make clear that companies have been complicit in allowing this unprecedented spying to take place, though the identities of cooperating companies remain unknown. ...


August 11 2013


What It Means to Be An NSA "Target": New Information Shows Why We Need Immediate FISA Amendments Act Reform | Electronic Frontier Foundation

An important New York Times investigation published today, reporting that the NSA "is searching the contents of vast amounts of Americans’ e-mail and text communications into and out of the country," coupled with leaked documents published by the Guardian, seriously calls into question the accuracy of crucial statements made by government officials about NSA surveillance.

The government has previously tried to reassure the public about its use of FISA Amendments Act Section 702 surveillance practices, emphasizing that, under Section 702, the government may not “intentionally target any U.S. citizen, any other U.S. person, or anyone located within the United States." Indeed, Senator Dianne Feinstein, chair of the Senate Intelligence Committee, in a letter to constituents who wrote to her expressing concern about the NSA's spying program, said this: "[T]he government cannot listen to an American’s telephone calls or read their emails without a court warrant issued upon a showing of probable cause."

We’ve written before about the word games the government plays in describing its surveillance practices: “acquire,” “collect,” and “content” are all old government favorites. The New York Times report shows that Feinstein's statement is false, and it's clear it’s time to add “target” to the list of word games as well. ...

August 08 2013

This particular attack appears to affect only Windows users who have not updated to the most recent version of the Tor Browser Bundle. Because of this and a variety of other reasons that make it challenging to use Windows securely, Tor advises that ‘switching away from Windows is probably a good security move.’ If moving to a different platform is not practical, it is especially important to keep up with software updates. The advisory also recommends that users concerned about their security consider disabling JavaScript and installing the Firefox add-on Request Policy, which lets you control which third-party origins a given website is allowed to load.
Tor Browser attacked, users should update software immediately (Electronic Frontier Foundation, Aug. 6 2013)

July 20 2013


Barrett Brown Prosecution Threatens Right to Link, Could Criminalize Routine Journalism Practices | Electronic Frontier Foundation

Horrible story.

Under the government’s theory in Barrett Brown’s case, all journalists (and anyone else for that matter) tweeting out the link to the list of Congressional staffer email addresses and passwords were trafficking in authentication features and are guilty of a felony. While it turns out that many of the passwords in this case may not have been accurate, this lesson holds true anytime someone links to groups of stolen passwords posted online, which seems to happen fairly frequently.

And in this situation, under the Justice Department’s theory, those linking to the list violated the aggravated identity theft statute too because during that crime, they knowingly transferred “without lawful authority, a means of identification of another person”—the email addresses. These are serious charges; aggravated identity theft alone carries a mandatory two-year prison sentence that must run consecutively to any other sentence imposed.

It bears repeating: the government does not allege Brown participated in the hacking of Stratfor at all. Here, Brown didn’t even publish anything, he merely directed other people to where information was already published via a standard hyperlink. The right of journalists—or anyone for that matter—to link to already-public information, including sensitive information, is in serious jeopardy if Brown is convicted.

July 18 2013


Technology to Protect Against Mass Surveillance (Part 1) | Electronic Frontier Foundation

In the past several weeks, EFF has received many requests for advice about privacy tools that provide technological shields against mass surveillance. We've been interested for many years in software tools that help people protect their own privacy; we've defended your right to develop and use cryptographic software, we've supported the development of the Tor software, and we've written privacy software of our own.

This article is part one of a two-part series. In this part, we'll take a brief look at some of the available tools to blunt the effects of mass surveillance. In the second part, we'll discuss the big picture, reasons Internet users have been slow to adopt cryptographic software, and some limitations of existing technology's ability to defend us against government snooping. ...

July 17 2013


rt.com (Jul. 16 2013)

  • Yahoo wins lawsuit to declassify docs proving resistance to PRISM

    Search engine Yahoo has won a court case to release NSA records and potentially prove it resisted handing over customer data to US authorities. The ruling could clear Yahoo’s name following allegations it collaborated with the NSA to spy on citizens. ...

  • Obama administration drowning in lawsuits filed over NSA surveillance

    Attorneys for the Electronic Frontier Foundation have sued the Obama administration and are demanding the White House stop the dragnet surveillance programs operated by the National Security Agency.

    Both the White House and Congress have weighed in on the case of Edward Snowden and the revelations he’s made by leaking National Security Agency documents. Now the courts are having their turn to opine, and with opportunities aplenty. ...

  • Microsoft asks US government for freedom to disclose national security organizations’ requests

    Microsoft has asked the US Attorney General for more freedom to disclose how it handles requests from national security organizations for customer data, the corporation said. The request follows a Guardian report, citing documents leaked by Edward Snowden, that the world's largest software company allowed US security agencies to bypass encryption of Outlook emails and capture Skype online chats, Reuters reports. Microsoft said there were "significant inaccuracies" in media reports last week and argued that it does not give any government direct or indirect access to customers' emails, instant messages or data.

July 07 2013


How To Opt Out Of Twitter's Tailored Advertisements (And More!) | Electronic Frontier Foundation

Earlier, we posted about Twitter's new tailored advertising announcement. We applauded Twitter's commitment to privacy by allowing two opt-out mechanisms—both an internal setting and your browser's Do Not Track capability. To make things easier for you, here's a guide to opt out of Twitter's tailored advertisements and how best to protect yourself from online tracking.
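Under the hood, the Do Not Track mechanism mentioned above is simply an HTTP header (`DNT: 1`) that the browser attaches to each request when the setting is enabled. As a rough illustration of how a site could honor it (a minimal sketch assuming a generic WSGI application; the function names are ours, not Twitter's actual code):

```python
# Minimal sketch of honoring the browser's Do Not Track signal on the
# server side. Assumes a WSGI-style environ dict; browsers with DNT
# enabled send the header "DNT: 1", exposed to WSGI as "HTTP_DNT".

def honors_do_not_track(environ):
    """Return True if the request carries the Do Not Track opt-out."""
    return environ.get("HTTP_DNT") == "1"


def choose_ads(environ):
    # Serve untailored ads when the user has opted out via DNT.
    if honors_do_not_track(environ):
        return "generic-ads"
    return "tailored-ads"
```

The point of the EFF guide is that the site, not the browser, decides whether to respect this header, which is why Twitter's pledge to honor it was noteworthy.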

June 30 2013
