23 Matching Annotations
  1. Mar 2024
    1. Millions of Patient Records at Risk: The Perils of Legacy Protocols

      Sina Yazdanmehr | Senior IT Security Consultant, Aplite GmbH
      Ibrahim Akkulak | Senior IT Security Consultant, Aplite GmbH
      Date: Wednesday, December 6, 2023

      Abstract

      Currently, a concerning situation is unfolding online: a large amount of personal information and medical records belonging to patients is scattered across the internet. Our internet-wide research on DICOM, the decades-old standard protocol for medical imaging, has revealed a distressing fact: many medical institutions have unintentionally made the private data and medical histories of millions of patients accessible to the vast realm of the internet.

      Medical imaging encompasses a range of techniques such as X-rays, CT scans, and MRIs used to visualize internal body structures, with DICOM serving as the standard protocol for storing and transmitting these images. DICOM's security problems stem from exposing this legacy protocol to the internet as the healthcare industry transitions toward cloud-based solutions.

      This talk will explain the security shortcomings of DICOM when it is exposed online and provide insights from our internet-wide research. We'll show how hackers can easily find, access, and exploit the exposed DICOM endpoints, extract all patients' data, and even alter medical records. Additionally, we'll explain how we were able to bypass DICOM security controls by gathering information from the statements provided by vendors and service providers regarding their adherence to DICOM standards.

      We'll conclude by providing practical recommendations for medical institutions, healthcare providers, and medical engineers to mitigate these security issues and safeguard patients' data.

  2. Feb 2024
    1. Harold Abelson, Ross Anderson, Steven M Bellovin, Josh Benaloh, Matt Blaze, Jon Callas, Whitfield Diffie, Susan Landau, Peter G Neumann, Ronald L Rivest, Jeffrey I Schiller, Bruce Schneier, Vanessa Teague, Carmela Troncoso, Bugs in our pockets: the risks of client-side scanning, Journal of Cybersecurity, Volume 10, Issue 1, 2024, tyad020, https://doi.org/10.1093/cybsec/tyad020

      Abstract

      Our increasing reliance on digital technology for personal, economic, and government affairs has made it essential to secure the communications and devices of private citizens, businesses, and governments. This has led to pervasive use of cryptography across society. Despite its evident advantages, law enforcement and national security agencies have argued that the spread of cryptography has hindered access to evidence and intelligence. Some in industry and government now advocate a new technology to access targeted data: client-side scanning (CSS). Instead of weakening encryption or providing law enforcement with backdoor keys to decrypt communications, CSS would enable on-device analysis of data in the clear. If targeted information were detected, its existence and, potentially, its source would be revealed to the agencies; otherwise, little or no information would leave the client device. Its proponents claim that CSS is a solution to the encryption versus public safety debate: it offers privacy—in the sense of unimpeded end-to-end encryption—and the ability to successfully investigate serious crime. In this paper, we argue that CSS neither guarantees efficacious crime prevention nor prevents surveillance. Indeed, the effect is the opposite. CSS by its nature creates serious security and privacy risks for all society, while the assistance it can provide for law enforcement is at best problematic. There are multiple ways in which CSS can fail, can be evaded, and can be abused.

      Right off the bat, these authors are highly experienced and plugged into what is happening with technology.

  3. Mar 2023
    1. Companies that perform surveillance are attempting the same mental trick. They assert that we freely share our data in return for valuable services. But opting out of surveillance capitalism is like opting out of electricity, or cooked foods—you are free to do it in theory. In practice, it will upend your life.

      Opting-out of surveillance capitalism?

  4. Dec 2022
    1. The presence of Twitter’s code — known as the Twitter advertising pixel — has grown more troublesome since Elon Musk purchased the platform. That’s because under the terms of Musk’s purchase, large foreign investors were granted special privileges. Anyone who invested $250 million or more is entitled to receive information beyond what lower-level investors can receive. Among the higher-end investors are a Saudi prince’s holding company and a Qatari fund.

      Twitter investors may get access to user data

      I'm surprised but not surprised that Musk's dealings to get investors in his effort to take Twitter private may include sharing of personal data about users. This article makes it sound almost normal that this kind of information-sharing happens with investors (inclusion of the phrase "information beyond what lower-level investors can receive").

    1. Meta's receipt of tax information via tracking pixels on tax preparer websites is the subject of a federal lawsuit. The tax preparing sites are not participants in the lawsuit (yet?).

  5. Nov 2022
  6. Jun 2022
    1. The goal is to gain “digital sovereignty.”

      the age of borderless data is ending. What we're seeing is a move to digital sovereignty

    1. All wireless devices have small manufacturing imperfections in the hardware that are unique to each device. These fingerprints are an accidental byproduct of the manufacturing process. These imperfections in Bluetooth hardware result in unique distortions, which can be used as a fingerprint to track a specific device. For Bluetooth, this would allow an attacker to circumvent anti-tracking techniques such as constantly changing the address a mobile device uses to connect to Internet networks. 

      Tracking that evades address changes

      An operating system can change the hardware address a device broadcasts in order to avoid tracking. But subtle imperfections in the radio signal itself can still be identified and tracked.
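
      The tracking idea can be illustrated with a toy sketch: even when a device rotates its advertised address, a stable vector of physical-layer features (e.g. carrier frequency offset and I/Q imbalance, as in the research described above) can be matched against previously observed devices by simple nearest-neighbor distance. All device names, feature values, and the threshold below are invented for illustration; real attacks require radio hardware and far more careful statistics.

```python
import math

# Hypothetical physical-layer fingerprints: (carrier frequency offset in ppm,
# I/Q amplitude imbalance, I/Q phase error in degrees). Values are made up.
known_devices = {
    "device_A": (12.3, 0.021, 1.7),
    "device_B": (-4.1, 0.008, 0.4),
    "device_C": (7.9, 0.015, 2.9),
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(observed, fingerprints, threshold=1.0):
    """Match an observed fingerprint to the closest known device.

    Returns the device label, or None if nothing is close enough.
    Note the broadcast address plays no role here, so the match
    survives MAC address randomization.
    """
    best_label, best_dist = None, float("inf")
    for label, fp in fingerprints.items():
        d = distance(observed, fp)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= threshold else None

# A packet from a device that has rotated its address, but whose hardware
# imperfections are (noisily) the same as device_A's:
observation = (12.1, 0.023, 1.8)
print(identify(observation, known_devices))  # device_A
```

      The sketch shows why address randomization alone is insufficient: the identifying signal is a byproduct of manufacturing, not anything the operating system controls.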

    1. the one thing that you have to keep conveying to people about the consequences of surveillance is that it's all very well to say that you have nothing to hide, but when you're spied upon, everybody that's connected to you gets spied upon. And if we don't push back, the most vulnerable people in society, the people that actually keep really massive violations of human rights and illegality in check, they're the people who get most affected.

      "I Have Nothing To Hide" counter-argument

      Even if you have nothing to hide, that doesn't mean that those you are connected with aren't also being surveilled and are part of targeted communities.

  7. Apr 2022
    1. Dorothea Salo (2021) Physical-Equivalent Privacy, The Serials Librarian, DOI: 10.1080/0361526X.2021.1875962

      Permanent Link: http://digital.library.wisc.edu/1793/81297

      Abstract

      This article introduces and applies the concept of “physical-equivalent privacy” to evaluate the appropriateness of data collection about library patrons’ use of library-provided e‑resources. It posits that as a matter of service equity, any data collection practice that causes e‑resource users to enjoy less information privacy than users of an information-equivalent print resource is to be avoided. Analysis is grounded in real-world e‑resource-related phenomena: secure (HTTPS) library websites and catalogs, the Adobe Digital Editions data-leak incident of 2014, and use of web trackers on e‑resource websites. Implications of physical-equivalent privacy for the SeamlessAccess single-sign-on proposal will be discussed.

    1. I thought that the point of disappearing messages was to eat your cake and have it too, by allowing you to send a message to your adversary and then somehow deprive them of its contents. This is obviously a stupid idea. But the threat that Snapchat — and its disappearing message successors — was really addressing wasn’t communication between untrusted parties, it was automating data-retention agreements between trusted parties.

      Why use a disappearing message service

      The point of a disappearing message service is to have the parties to the message agree on the data-retention provisions of a message. The service automates that agreement by deleting the message at the specified time. The point isn't to send a message to an adversary and then delete it so they can't prove that it has been sent. There are too many ways of capturing the contents of a message—as simple as taking a picture of the message with another device.
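
      The retention-agreement idea can be sketched as a toy message store that purges messages once an agreed time-to-live elapses. All names here are hypothetical; this is a minimal illustration of the concept, not any real service's implementation.

```python
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    body: str
    sent_at: float       # seconds since some epoch
    ttl_seconds: float   # the retention period both parties agreed to

    def expired(self, now: float) -> bool:
        return now >= self.sent_at + self.ttl_seconds

class RetentionStore:
    """Toy store that automates an agreed data-retention policy."""

    def __init__(self):
        self._messages: list[Message] = []

    def send(self, sender: str, body: str, ttl_seconds: float, now: float) -> None:
        self._messages.append(Message(sender, body, now, ttl_seconds))

    def purge(self, now: float) -> int:
        """Delete expired messages; return how many were removed."""
        before = len(self._messages)
        self._messages = [m for m in self._messages if not m.expired(now)]
        return before - len(self._messages)

    def visible(self, now: float) -> list[str]:
        return [m.body for m in self._messages if not m.expired(now)]

store = RetentionStore()
store.send("alice", "keep for a day", ttl_seconds=86400, now=0.0)
store.send("alice", "keep for an hour", ttl_seconds=3600, now=0.0)
removed = store.purge(now=7200.0)        # two hours later
print(removed)                           # 1
print(store.visible(now=7200.0))         # ['keep for a day']
```

      The deletion is cooperative: it enforces an agreement between trusted parties, and, as the annotation notes, does nothing against an adversary who simply photographs the screen with another device.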

    1. Weinberg’s tweet announcing the change generated thousands of comments, many of them from conservative-leaning users who were furious that the company they turned to in order to get away from perceived Big Tech censorship was now the one doing the censoring. It didn’t help that the content DuckDuckGo was demoting and calling disinformation was Russian state media, whose side some in the right-wing contingent of DuckDuckGo’s users were firmly on.

      There is an odd sort of self-selected information bubble here. DuckDuckGo promoted itself as privacy-aware, not unfiltered. On their Sources page, they talk about where they get content and how they don't sacrifice privacy to gather search results. Demoting disinformation sources in their algorithms would seem to be a good thing. Except if what you expect to see is disinformation, and then suddenly the search results don't match your expectations.

  8. Dec 2021
    1. About 7 in 10 Americans think their phone or other devices are listening in on them in ways they did not agree to.

      I'm enough of a tinfoil hat wearer to think this might be true. Especially since my Google Home talks to me entirely too much when I'm not talking to it.

  9. Nov 2021
  10. May 2021
  11. Nov 2020
  12. Jul 2020
  13. Jun 2020
  14. Apr 2020
  15. Mar 2020
    1. Right now, if you want to know what data Facebook has about you, you don’t have the right to ask them to give you all of the data they have on you, and the right to know what they’ve done with it. You should have that right. You should have the right to know and have access to your data.
  16. Apr 2018
    1. What can we build that would allow people to 1.) annotate terms of service related to tools they adopt in a classroom? and 2.) see an aggregated list of all current annotations. Last, if we were to start critically analyzing EdTech Terms of Service, what questions should we even ask?

  17. Mar 2017
  18. Jan 2017
    1. Almost half of eight- to 11-year-olds have agreed impenetrable terms and conditions to give social media giants such as Facebook and Instagram control over their data, without any accountability, according to the commissioner’s Growing Up Digital taskforce. The year-long study found children regularly signed up to terms including waiving privacy rights and allowing the content they posted to be sold around the world, without reading or understanding their implications.