Electronic Privacy Information Center


Law Practice

Washington, District of Columbia 2,003 followers

About us

EPIC is an independent non-profit research center in Washington, DC. EPIC works to protect privacy, freedom of expression, and democratic values, and to promote the Public Voice in decisions concerning the future of the Internet. EPIC pursues a wide range of program activities including public education, litigation, and advocacy. EPIC routinely files amicus briefs in federal courts, pursues open government cases, defends consumer privacy, organizes conferences for NGOs, and speaks before Congress and judicial organizations about emerging privacy and civil liberties issues. EPIC works closely with a distinguished advisory board with expertise in law, technology, and public policy. EPIC maintains two of the most popular privacy websites in the world: epic.org and privacy.org.

Website
http://epic.org/
Industry
Law Practice
Company size
11-50 employees
Headquarters
Washington, District of Columbia
Type
Nonprofit
Founded
1994
Specialties
Privacy, Appellate Advocacy, Civil Liberties, and Open Government

Locations

Employees at Electronic Privacy Information Center

Updates

  • Electronic Privacy Information Center reposted this

    🚨 United Against Spyware Abuse in the EU and Beyond 🚨 Spyware isn’t just a privacy issue — it’s a threat to the very foundations of our democratic values. By undermining independent decision-making, restricting public debate, and silencing journalists and activists, spyware erodes the pillars of a healthy civic space. As new European Union institutions prepare to take office following the EU elections, the growing threat of spyware has become a pressing global concern that demands immediate attention. Today, CDT Europe, alongside 30 civil society and journalists' organisations, publishes a joint statement urging the incoming EU institutions to prioritise action against the misuse of spyware in the new legislative term. Some of our coalition’s key recommendations include: 🔹 A ban on the production, sale, and use of spyware that disproportionately harms fundamental rights. 🔹 Stronger export controls to prevent the misuse of these technologies beyond the EU. 🔹 Transparency and accountability in government contracts involving spyware. As Silvia Lorenzo Perez, Director of CDT Europe’s Security, Surveillance & Human Rights Programme, puts it: "The incoming EU institutions have the opportunity to correct the failures of the last legislature by taking concrete and decisive action against the abuse of spyware surveillance." The new EU institutions must seize this moment to restore public trust, protect our fundamental rights, and uphold the values that define the Union. #StopSpyware #Surveillance #Pegasus #CivilSociety

  • EPIC Adds Ten Leading Scholars and Advocates to Advisory Board. Today the Electronic Privacy Information Center (EPIC) announced the addition of ten members to its Advisory Board. As EPIC celebrates our 30th anniversary, we are thrilled to continue to grow our network of leading scholars, experts, and advocates in the privacy, civil liberties, and cybersecurity space, whose knowledge we draw on to inform and advance our work. Read the press release here: https://lnkd.in/dPpPP6Zu “EPIC is fortunate to have deep expertise on our Advisory Board that enables us to play a leading role in debates on emerging policy issues that shape the future of the internet. The new members joining us as advisors work at the cutting edge of AI policy, cybersecurity, digital democracy, and data protection,” said EPIC Executive Director Alan Butler. EPIC will turn to the expertise of these new members in our research, advocacy, and litigation work at a time when states are passing privacy legislation that establishes meaningful limits on the collection and use of personal data, when tech companies seek to use the First Amendment to overturn laws that would give users privacy rights and prevent harmful platform design choices, when major data security incidents have illuminated gaps in our cybersecurity infrastructure and risks to health and other personal data, and when our democracy and civil rights are under threat from AI disinformation. EPIC’s new members bring a wealth of knowledge and experience in educating the public and working to protect privacy and civil liberties against the various threats that exist today. The new members are: Jim Balsillie, Lorrie Cranor, Serge Egelman, Leah Fowler, Susan Landau, Jessica Levinson, Dawn Nunziato, Spencer Overton, Andrew Selbst, and Zephyr Teachout. EPIC Advisory Board: https://lnkd.in/eWyzj8WW

    PRESS RELEASE: EPIC Adds Ten Leading Scholars and Advocates to Advisory Board 


  • 💡 NEW: As the Consumer Financial Protection Bureau considers adopting new rules under the Fair Credit Reporting Act, EPIC is publishing a series of resources about the rulemaking process and the urgent need to crack down on data brokers through the FCRA. EPIC’s new resources delve into the rules CFPB is considering and how they would help modernize the FCRA to better protect consumers. EPIC’s new materials also highlight how data brokers harm consumers and pose a threat to national security, and they explain how the upcoming FCRA rulemaking can prevent or mitigate those risks. ➡ Check out the landing page for EPIC’s publications on the FCRA rulemaking: https://lnkd.in/eVDKPhBg EPIC has also published three one-pagers on different aspects of the rulemaking and on the harms inflicted by data brokers: ▪ FCRA Rulemaking: A Path to Reining in Data Brokers https://lnkd.in/ezR-VXt7 ▪ CFPB FCRA Rulemaking Explained: Key Rule Proposals https://lnkd.in/eWHsquDg ▪ Data Broker Threats: National Security https://lnkd.in/eaYWUqpK Be on the lookout for more one-pagers soon focusing on ways that data brokers violate our privacy and harm different communities. Future one-pagers will also be housed on the FCRA rulemaking landing page.

  • SCOTUS's decision in the NetChoice cases was a huge blow to Big Tech's strategy of asking for the broadest possible relief based on the barest of records. Tomorrow, the 9th Cir can check Big Tech in two other poorly supported facial challenges. Read more here: https://lnkd.in/e9tUgh5i Over the last few years, Big Tech has asked courts to declare a series of laws entirely invalid under the First Amendment based on little more than vibes. In the process, they have hoped to secure broad constitutional rulings essentially declaring them immune from regulation. In NetChoice v. Bonta, the Big Tech trade org challenged the CA Age-Appropriate Design Code based on sloppy statutory & constitutional interpretation & speculation about the law's application. NetChoice asked for the moon—relief from all data protection regulations—& they got it. In X v. Bonta, X Corp challenged CA's AB 587, which requires social media cos to disclose information about their content moderation practices. X argues the law might be used to jawbone changes to their content moderation—which doesn't come close to justifying facial invalidation. SCOTUS's decision in the NetChoice case decides both of these appeals. The dangerous decision in the NetChoice case should be vacated b/c the judge applied the wrong standard. The decision not to grant X an injunction based on a flimsy record should be affirmed. EPIC filed amicus briefs supporting AG Bonta in both cases. In NetChoice, EPIC distinguished the CAAADC from other state laws that restrict kids' access to social media. (https://lnkd.in/ePJu6hAZ) In X, EPIC explained the importance of transparency laws—& how X's arguments don't support facial invalidation. (https://lnkd.in/e34Vh9-u) We have also written about the dangers & flaws in the district court's decision in NetChoice v. Bonta (https://lnkd.in/eArCTDau)

    Far From a Punt, SCOTUS’s NetChoice Decision Crushes Big Tech’s Big Litigation Dreams


  • In its opinion in Moody v. NetChoice and NetChoice v. Paxton last week, SCOTUS signaled a surprisingly—and healthily—narrow interpretation of platforms’ First Amendment curation rights. Read our takeaways here: https://lnkd.in/eCN34W-P Takeaway 1: A majority of Justices agreed that content moderation is expressive when it is based on a platform’s content and community guidelines. These guidelines most closely resemble the editorial judgment protected for traditional curators like newspaper editors. Takeaway 2: Machine-learning algorithms that enforce content and community guidelines may receive less constitutional protection because using black-box algorithms may attenuate the connection to human editorial judgments. Takeaway 3: The Court was skeptical that using algorithms to curate content outside a platform’s published guidelines is expressive, such as recommending content based on user behavior. This is good news for content-neutral laws that focus on these systems, such as NY's SAFE Act. The general guidance that can be pulled from these cases is that courts need to drill into the expressiveness of a curatorial activity at a granular level of specificity. Not everything companies do to select and display content is inherently expressive, but some of it is.

    In NetChoice Cases, Supreme Court Labels a Surprisingly Narrow Class of Online Platform Company Activities as Protected Expression


  • In a big blow to Big Tech and its allies, SCOTUS just rebuked NetChoice’s extreme, shoot-for-the-moon litigation strategy in Moody v. NetChoice and NetChoice v. Paxton. https://lnkd.in/ezCtJcAq NetChoice sought a pronouncement that platform design choices are wholly protected expression, but SCOTUS refused to take the bait. Instead, it explained that a facial First Amendment challenge must be based on a robust record and careful interpretation of the law's full sweep. This is a rebuke of Big Tech’s recent litigation strategy. They attempt to overturn entire regulatory schemes by misconstruing the laws and warning that those misconstructions could hypothetically be enforced in an unconstitutional way. This is what we have seen in NetChoice’s challenge of California’s AADC or X’s challenge to California’s social media transparency law. (https://lnkd.in/e2ZVENhX https://lnkd.in/e63G9iqA) The majority also provided non-binding guidance on when social media companies’ editorial judgment is implicated. Its decision was another loss for NetChoice, which had argued that all of its members’ platform design choices are protected expression. SCOTUS recognized that some content-based moderation decisions may represent editorial judgment, but it refused to conflate those activities with other platform design activities, meaning that content-neutral platform design and privacy laws are likely constitutional. This narrow course is admirable. Neither of the parties argued for it, and only two out of dozens of amici suggested it: EPIC and the Knight First Amendment Institute at Columbia University. There are many interesting threads to pull. Over the coming days and weeks, EPIC will provide analysis of how the Justices raised and approached interesting questions such as whether and when non-human algorithmic content curation is speech.

    NetChoice v. Paxton / Moody v. NetChoice


  • Today, in Murthy v. Missouri, SCOTUS ruled that the plaintiffs lacked standing to bring their suit. Specifically, they failed to prove that the government was likely to coerce social media companies to remove the plaintiffs’ posts in the future. Read EPIC's previous analysis here: https://lnkd.in/eiKe5NKT   The decision sidestepped the main issue of the case: whether the Biden Administration had unconstitutionally coerced, or “jawboned,” tech companies to suppress speech by consulting with the companies about disinformation campaigns that largely violated the platforms’ own policies. This is an important question in an election year. GenAI-fueled disinformation may be more influential than ever. The government’s expertise can help keep users safe & informed, but the government’s power may present its own dangers if used to suppress disfavored views. https://lnkd.in/eiKe5NKT   But Murthy is still an important ruling for platform governance. Future plaintiffs can’t make generalized assertions about government censorship to stop officials from talking to platforms. To sue the government for jawboning a platform, you need evidence linking gov action to your censorship. It may be hard for future plaintiffs to assemble that evidence because online platforms are famously secretive. This is one reason why EPIC supports platform transparency laws so that the public can better understand how these powerful companies operate. For more on platform transparency, see our amicus brief in X v. Bonta in which we supported a transparency law against tech company attacks. https://lnkd.in/e63G9iqA

    Murthy v. Missouri and the Threat of Election Disinformation


  • 💡 NEW: EPIC published its AI Legislation Scorecard (epic.org/aiscorecard), a first-of-its-kind rubric for lawmakers, journalists, advocates and academics to use when evaluating the strength of state and federal AI bills. EPIC’s AI Legislation Scorecard comes as the United States faces a tidal wave of new AI legislation, with hundreds of AI bills being introduced in at least 40 states and dozens of federal regulations following suit. To ground its review of the growing AI legislative landscape, EPIC set out to create a structured tool for evaluating AI bills. Informed by expert consultations, internal bill analysis, and evidence-backed policy research, EPIC’s AI Legislation Scorecard outlines key provisions that effective AI legislation should contain, including but not limited to data minimization requirements, impact assessment and testing obligations, prohibitions on particularly harmful AI uses, and robust enforcement mechanisms. The launch of EPIC’s AI Legislation Scorecard was paired with an expert panel discussion involving Vermont State Representative Monique Priestley; Nik Marda, Mozilla’s Technical Lead for AI Governance; Alicia Solow-Niederman, Associate Professor at the George Washington University Law School; and Adam Billen, Director of Policy at Encode Justice. As Nik Marda explained, EPIC’s Scorecard “shows that we actually know a lot about how to regulate AI already. While [responsible AI oversight] is framed like a hopeless endeavor, we’ve actually made a lot of progress through the years on how to tackle [AI] harms.” With this Scorecard, EPIC provides a guide for what robust, comprehensive AI legislation can look like. If you have questions or would like EPIC to grade a specific piece of state or federal AI legislation, please contact EPIC Law Fellow Kara Williams.

  • A quick reminder to join EPIC and panelists for our AI Legislation Scorecard: Virtual Panel Discussion and Scorecard Launch. We hope to see you there on June 25 at 12pm ET! Zoom: https://lnkd.in/dbTqFHqr

    To launch our AI Legislation Scorecard, we invite folks to join EPIC and expert panelists for a virtual discussion at 12 p.m. on Tuesday, June 25. Event page: https://lnkd.in/e39Ed-nV Zoom link: https://lnkd.in/eUcjKmRp We are delighted to be joined by: Vermont Representative Monique Priestley; Alicia Solow-Niederman, Associate Professor, George Washington University Law School; Adam Billen, Director of Policy, Encode Justice; and Nik Marda, Technical Lead, AI Governance, Mozilla. The United States is facing a growing wave of AI legislation at both the state and federal levels. Hundreds of bills seeking to regulate AI were introduced in at least 40 states this legislative session, and dozens of federal regulations have been proposed as well. These bills varied widely in their approaches to regulating AI—some tried to set out comprehensive frameworks, some created task forces or commissioned further study, and some focused on regulating narrow or sector-specific AI uses.  EPIC set out to create a tool for evaluating this plethora of AI bills: EPIC’s AI Legislation Scorecard provides a rubric for lawmakers, journalists, advocates, and academics to use to evaluate the strength of AI bills. The scorecard lays out key provisions that any effective AI legislation should contain, including prohibitions on particularly harmful AI uses, data minimization requirements, impact assessment and testing obligations, and robust enforcement mechanisms. During the event, panelists will discuss why strong AI regulation is urgently needed, what effective regulation should look like, and how lawmakers can craft legislation that ensures AI is adopted safely and responsibly. 


  • Join EPIC's Sara Geoghegan and Suzanne Bernstein alongside Ethical Tech Initiative on Wednesday, June 26th from 6-8pm for a panel discussion on Two Years Post-Dobbs: The Legal Landscape of Reproductive Data Privacy. RSVP here: https://lnkd.in/eXzux6c6


    Please RSVP using this form: https://lnkd.in/eXzux6c6 Join us on Wed. June 26th from 6-8pm as the Reproductive Data Privacy Initiative of EthicalTech@GW and the Electronic Privacy Information Center (EPIC) host Two Years Post-Dobbs: The Legal Landscape of Reproductive Data Privacy to launch our joint Reproductive Data Privacy Database. This catered event will feature a panel discussion on the pressing challenges of protecting reproductive data privacy since the Dobbs decision was handed down in June 2022. A networking session for law students interested in privacy law with privacy law attorneys will follow the panel discussion. The event will feature recorded introductory remarks by Congresswoman Sara Jacobs (sponsor of the My Body, My Data Act) and will be moderated by Professor Dawn Nunziato. Professor Sonia Suter and Sara Geoghegan and Suzanne Bernstein of EPIC will serve as expert panelists. The event will be held in the Student Conference Center (SCC) of The George Washington Law School and will be catered with great food and beverages.

