738 Matching Annotations
  1. Sep 2024
  2. Jul 2024
    1. First, the complexity of modern federal criminal law, codified in several thousand sections of the United States Code and the virtually infinite variety of factual circumstances that might trigger an investigation into a possible violation of the law, make it difficult for anyone to know, in advance, just when a particular set of statements might later appear (to a prosecutor) to be relevant to some such investigation.

      If the federal government had access to every email you’ve ever written and every phone call you’ve ever made, it’s almost certain that they could find something you’ve done which violates a provision in the 27,000 pages of federal statutes or 10,000 administrative regulations. You probably do have something to hide; you just don’t know it yet.

  3. Jun 2024
    1. People haven’t really come to grips with the fact that it’s not just personal privacy that matters, it’s also institutional privacy
  4. May 2024
    1. So I'm not surprised. And perhaps this is the last question, about user privacy.

      Question about the privacy of the user interacting with the RAG

      It doesn't seem like JSTOR has thought about this thoroughly. In the beta there is kind of an expectation that the service is being validated, so what users are doing is being watched more closely.

  5. Apr 2024
    1. Privacy is realized by the group founder creating a special keypair for the group and secretly sharing the private group key with every new group member. When a member is removed from a group or leaves it, the founder has to renew the group’s keypair

      Having locks on data is a nice idea.

      But given we have dissemination/disclosure among friends only, do we need to encrypt the blocks, given that the communication channels are encrypted?
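
      The renewal scheme in the quote can be sketched in a few lines; the class and function names below are illustrative, not from the source:

      ```js
      // Illustrative sketch (not the paper's code) of the founder-managed group
      // keypair: members share the private key, and removing a member forces a
      // rotation so the departed member loses access to future content.
      const crypto = require('crypto');

      function makeGroupKeypair() {
        return crypto.generateKeyPairSync('x25519');
      }

      class Group {
        constructor(founder) {
          this.members = new Set([founder]);
          this.keypair = makeGroupKeypair();
        }
        add(member) {
          // New members receive the current private group key out of band.
          this.members.add(member);
        }
        remove(member) {
          this.members.delete(member);
          // Renew the keypair so the removed member cannot read new data.
          this.keypair = makeGroupKeypair();
        }
      }

      const g = new Group('founder');
      g.add('alice');
      const before = g.keypair.publicKey.export({ type: 'spki', format: 'pem' });
      g.remove('alice');
      const after = g.keypair.publicKey.export({ type: 'spki', format: 'pem' });
      ```

      Note the asymmetry behind the annotation's question: rotation protects future data, while blocks already disclosed under the old key stay readable.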

    1. What are some top UX design principles when designing for kids? Some important UX design principles when designing for kids are as follows: simplicity and clarity; interactive and engaging elements; age-appropriate content; safety and privacy; consistent feedback and rewards

      There are 5 in this list and there were 4 in the other. I think Safety and Privacy is the one addition, but it's also in my proposal because I am concerned about it too.

  6. Mar 2024
    1. Millions of Patient Records at Risk: The Perils of Legacy Protocols

      Sina Yazdanmehr | Senior IT Security Consultant, Aplite GmbH Ibrahim Akkulak | Senior IT Security Consultant, Aplite GmbH Date: Wednesday, December 6, 2023

      Abstract

      Currently, a concerning situation is unfolding online: a large amount of personal information and medical records belonging to patients is scattered across the internet. Our internet-wide research on DICOM, the decades-old standard protocol for medical imaging, has revealed a distressing fact: many medical institutions have unintentionally made the private data and medical histories of millions of patients accessible to the vast realm of the internet.

      Medical imaging encompasses a range of techniques such as X-Rays, CT scans, and MRIs, used to visualize internal body structures, with DICOM serving as the standard protocol for storing and transmitting these images. The security problems with DICOM are connected to using legacy protocols on the internet as industries strive to align with the transition towards Cloud-based solutions.

      This talk will explain the security shortcomings of DICOM when it is exposed online and provide insights from our internet-wide research. We'll show how hackers can easily find, access, and exploit the exposed DICOM endpoints, extract all patients' data, and even alter medical records. Additionally, we'll explain how we were able to bypass DICOM security controls by gathering information from the statements provided by vendors and service providers regarding their adherence to DICOM standards.

      We'll conclude by providing practical recommendations for medical institutions, healthcare providers, and medical engineers to mitigate these security issues and safeguard patients' data.

    1. Critics say this amounts to users having to pay for their privacy.

      "If you aren't paying for it, you're the product."

      "If you pay for it, you're paying for fundamental rights."

  7. Feb 2024
    1. Harold Abelson, Ross Anderson, Steven M Bellovin, Josh Benaloh, Matt Blaze, Jon Callas, Whitfield Diffie, Susan Landau, Peter G Neumann, Ronald L Rivest, Jeffrey I Schiller, Bruce Schneier, Vanessa Teague, Carmela Troncoso, Bugs in our pockets: the risks of client-side scanning, Journal of Cybersecurity, Volume 10, Issue 1, 2024, tyad020, https://doi.org/10.1093/cybsec/tyad020

      Abstract

      Our increasing reliance on digital technology for personal, economic, and government affairs has made it essential to secure the communications and devices of private citizens, businesses, and governments. This has led to pervasive use of cryptography across society. Despite its evident advantages, law enforcement and national security agencies have argued that the spread of cryptography has hindered access to evidence and intelligence. Some in industry and government now advocate a new technology to access targeted data: client-side scanning (CSS). Instead of weakening encryption or providing law enforcement with backdoor keys to decrypt communications, CSS would enable on-device analysis of data in the clear. If targeted information were detected, its existence and, potentially, its source would be revealed to the agencies; otherwise, little or no information would leave the client device. Its proponents claim that CSS is a solution to the encryption versus public safety debate: it offers privacy—in the sense of unimpeded end-to-end encryption—and the ability to successfully investigate serious crime. In this paper, we argue that CSS neither guarantees efficacious crime prevention nor prevents surveillance. Indeed, the effect is the opposite. CSS by its nature creates serious security and privacy risks for all society, while the assistance it can provide for law enforcement is at best problematic. There are multiple ways in which CSS can fail, can be evaded, and can be abused.

      Right off the bat, these authors are highly experienced and plugged into what is happening with technology.

  8. Jan 2024
    1. Also just by observing what they’re doing it becomes pretty clear. For example: Facebook recently purchased full-page ads on major newspapers entirely dedicated to “denounce” Apple. Why? Because Apple has built a system-level feature on iPhones that allows users to very easily disable every kind of advertising tracking and profiling. Facebook absolutely relies on being able to track you and profile your interests, so they immediately cooked up some cynical reasons why Apple shouldn’t be allowed to do this. But the truth is: if Facebook is fighting against someone on privacy matters, that someone is probably doing the right thing.
  9. Nov 2023
    1. To improve user privacy, display moment notifications are intentionally delayed a random amount of time when FedCM is enabled.

      How does that improve privacy?
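
      One plausible answer, sketched below as an assumption: random jitter decorrelates the notification's display time from the underlying network request, so an identity provider cannot match its server-side logs against exact display timestamps. The function name and delay bound are illustrative, not Chrome internals:

      ```js
      // Hypothetical sketch: delay a notification by a random amount so its
      // timestamp cannot be correlated with the request that triggered it.
      function showDelayed(showFn, maxJitterMs = 2000) {
        const jitterMs = Math.random() * maxJitterMs;
        return new Promise((resolve) => {
          setTimeout(() => resolve(showFn()), jitterMs);
        });
      }

      // The notification appears 0-2000 ms after the trigger by default.
      showDelayed(() => 'notification shown', 50).then((msg) => console.log(msg));
      ```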

  10. Oct 2023
    1. "Without the right to tinker and explore, we risk becoming enslaved by technology; and the more we exercise the right to hack, the harder it will be to take that right away" - Andre "Bunnie" Huang

      hah, we are already "enslaved by technology". ask Ted Kaczynski

      our enemies already have hardware backdoors, compromising emissions (tempest), closed-source firmware/drivers/hardware, ... but sure, "feel free"

    1. ```
      {
        version: 4,

        info: {
          reason: <string>, // what triggered this ping: "saved-session", "environment-change", "shutdown", ...
          revision: <string>, // the Histograms.json revision
          timezoneOffset: <integer>, // time-zone offset from UTC, in minutes, for the current locale
          previousBuildId: <string>, // null if this is the first run, or the previous build ID is unknown

          sessionId: <uuid>, // random session id, shared by subsessions
          subsessionId: <uuid>, // random subsession id
          previousSessionId: <uuid>, // session id of the previous session, null on first run
          previousSubsessionId: <uuid>, // subsession id of the previous subsession (even if it was in a different session), null on first run

          subsessionCounter: <unsigned integer>, // the running no. of this subsession since the start of the browser session
          profileSubsessionCounter: <unsigned integer>, // the running no. of all subsessions for the whole profile life time

          sessionStartDate: <ISO date>, // hourly precision, ISO date in local time
          subsessionStartDate: <ISO date>, // hourly precision, ISO date in local time
          sessionLength: <integer>, // the session length until now in seconds, monotonic
          subsessionLength: <integer>, // the subsession length in seconds, monotonic

          addons: <string>, // obsolete, use ``environment.addons``
        },

        processes: {...},
        simpleMeasurements: {...},

        // The following properties may all be null if we fail to collect them.
        histograms: {...},
        keyedHistograms: {...},
        chromeHangs: {...}, // removed in Firefox 62
        threadHangStats: [...], // obsolete in Firefox 57, use the 'bhr' ping
        log: [...], // obsolete in Firefox 61, use Event Telemetry or Scalars
        gc: {...},
        fileIOReports: {...},
        lateWrites: {...},
        addonDetails: {...},
        UIMeasurements: [...], // Android only
        slowSQL: {...},
        slowSQLstartup: {...},
      }
      ```

    1. ```
      {
        type: <string>, // "main", "activation", "optout", "saved-session", ...
        id: <UUID>, // a UUID that identifies this ping
        creationDate: <ISO date>, // the date the ping was generated
        version: <number>, // the version of the ping format, currently 4

        application: {
          architecture: <string>, // build architecture, e.g. x86
          buildId: <string>, // "20141126041045"
          name: <string>, // "Firefox"
          version: <string>, // "35.0"
          displayVersion: <string>, // "35.0b3"
          vendor: <string>, // "Mozilla"
          platformVersion: <string>, // "35.0"
          xpcomAbi: <string>, // e.g. "x86-msvc"
          channel: <string>, // "beta"
        },

        clientId: <UUID>, // optional
        environment: { ... }, // optional, not all pings contain the environment
        payload: { ... }, // the actual payload data for this ping type
      }
      ```
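
      A quick structural check of this envelope; the field names come from the quoted docs, but the validation itself is just a sketch:

      ```js
      // Minimal shape check for the common ping envelope quoted above.
      function looksLikePing(ping) {
        return (
          typeof ping.type === 'string' &&
          typeof ping.id === 'string' &&
          typeof ping.version === 'number' &&
          ping.payload !== undefined
        );
      }

      console.log(looksLikePing({
        type: 'main',
        id: '8b1b2f3c-0000-4000-8000-000000000000',
        creationDate: '2023-10-01T00:00:00.000Z',
        version: 4,
        payload: {},
      })); // true
      ```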

  11. Sep 2023
    1. Google claims this new API addresses FLoC’s serious privacy issues. Unfortunately, it does anything but. The Topics API only touches the smallest, most minor privacy issues in FLoC, while leaving its core intact. At issue is Google’s insistence on sharing information about people’s interests and behaviors with advertisers, trackers, and others on the Web that are hostile to privacy.
    1. If a site you visit queries the Topics API, it may learn of this interest from Chrome and decide to serve you an advert about bonds or retirement funds. It also means websites can fetch your online interests straight from your browser.

      The Topics API is worse than third-party cookies: anyone can query a user's ad profile:

      ```js
      // document.browsingTopics() returns an array of BrowsingTopic objects.
      const topics = await document.browsingTopics();

      // Get data for an ad creative.
      const response = await fetch('https://ads.example/get-creative', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify(topics),
      });

      // Get the JSON from the response.
      const creative = await response.json();

      // Display the ad. (or not)
      ```

  12. Aug 2023
    1. We lived in a relatively unregulated digital world until now. It was great until the public realized that a few companies wield too much power today in our lives. We will see significant changes in areas like privacy, data protection, algorithm and architecture design guidelines, and platform accountability, etc. which should reduce the pervasiveness of misinformation, hate and visceral content over the internet.
      • for: quote, quote - Prateek Raj, quote - internet regulation, quote - reducing misinformation, fake news, indyweb - support
      • quote
        • We lived in a relatively unregulated digital world until now.
        • It was great until the public realized that a few companies wield too much power today in our lives.
        • We will see significant changes in areas like
          • privacy,
          • data protection,
          • algorithm and
          • architecture design guidelines, and
          • platform accountability, etc.
        • which should reduce the pervasiveness of
          • misinformation,
          • hate and visceral content
        • over the internet.
        • These steps will also reduce the power wielded by digital giants.
        • Beyond these immediate effects, it is difficult to say if these social innovations will create a more participative and healthy society.
        • These broader effects are driven by deeper underlying factors, like
          • history,
          • diversity,
          • cohesiveness and
          • social capital, and also
          • political climate and
          • institutions.
        • In other words,
          • just as digital world is shaping the physical world,
          • physical world shapes our digital world as well.
      • author: Prateek Raj
        • assistant professor in strategy, Indian Institute of Management, Bangalore
    1. A spec to optimize ad targeting (respectful of privacy, they say... 😂🤣).

      Fuck you Google with your dystopian API:

      ```js
      // document.browsingTopics() returns an array of BrowsingTopic objects.
      const topics = await document.browsingTopics();

      // Get data for an ad creative.
      const response = await fetch('https://ads.example/get-creative', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify(topics),
      });

      // Get the JSON from the response.
      const creative = await response.json();

      // Display the ad. (or not)
      ```

    1. On-device ad auctions to serve remarketing and custom audiences, without cross-site third-party tracking.

      Naming a thing with a meaning opposite to what the named thing is...

      Google is insatiable when it comes to accessing users' private data. Let's block that bullshit.

  13. Jul 2023
    1. Such efforts to protect data privacy go beyond the abilities of the technology involved to also encompass the design process. Some Indigenous communities have created codes of use that people must follow to get access to community data. And most tech platforms created by or with an Indigenous community follow that group’s specific data principles. Āhau, for example, adheres to the Te Mana Raraunga principles of Māori data sovereignty. These include giving Māori communities authority over their information and acknowledging the relationships they have with it; recognizing the obligations that come with managing data; ensuring information is used for the collective benefit of communities; practicing reciprocity in terms of respect and consent; and exercising guardianship when accessing and using data. Meanwhile Our Data Indigenous is committed to the First Nations principles of ownership, control, access and possession (OCAP). “First Nations communities are setting their own agenda in terms of what kinds of information they want to collect,” especially around health and well-being, economic development, and cultural and language revitalization, among others, Lorenz says. “Even when giving surveys, they’re practicing and honoring local protocols of community interaction.”

      Colonized groups such as these Indigenous peoples feel an urgency to avoid colonization of their data, and are doing something about it

  14. Apr 2023
    1. Seeing how powerful AI can be for cracking passwords is a good reminder to not only make sure you're using strong passwords but also check:

      • You're using 2FA/MFA (non-SMS-based whenever possible)

      • You're not re-using passwords across accounts

      • Use auto-generated passwords when possible

      • Update passwords regularly, especially for sensitive accounts

      • Refrain from using public WiFi, especially for banking and similar accounts

    2. Now Home Security Heroes has published a study showing how scary powerful the latest generative AI is at cracking passwords. The company used the new password cracker PassGAN (password generative adversarial network) to process a list of over 15,000,000 credentials from the Rockyou dataset and the results were wild. 51% of all common passwords were cracked in less than one minute, 65% in less than an hour, 71% in less than a day, and 81% in less than a month.
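
      Those cracking times are easy to sanity-check with back-of-the-envelope entropy math. The guess rate below is an assumed figure for illustration, not one taken from the study:

      ```js
      // Rough password strength math: entropy in bits, and years to exhaust the
      // keyspace at an assumed guess rate.
      function entropyBits(length, alphabetSize) {
        return length * Math.log2(alphabetSize);
      }

      function yearsToCrack(bits, guessesPerSecond = 1e10) {
        return 2 ** bits / guessesPerSecond / (3600 * 24 * 365);
      }

      console.log(entropyBits(8, 26).toFixed(1)); // "37.6": 8 lowercase chars
      console.log(yearsToCrack(entropyBits(16, 94)) > 1e6); // true: 16 full-ASCII chars
      ```

      By this math an 8-character lowercase password falls in seconds at the assumed rate, consistent with the study's "less than one minute" bucket, while a long random password stays out of reach.
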
  15. Mar 2023
    1. Time to dive a little deeper to see what information the barcodes actually contain. For this I will break down the previously extracted information into smaller pieces.

      Information contained within boarding pass barcodes

    1. Companies that perform surveillance are attempting the same mental trick. They assert that we freely share our data in return for valuable services. But opting out of surveillance capitalism is like opting out of electricity, or cooked foods—you are free to do it in theory. In practice, it will upend your life.

      Opting-out of surveillance capitalism?

    1. Does the EDL/EID card transmit my personal information? No. The RFID tag embedded in your card doesn't contain any personal identifying information, just a unique reference number.

      Can this unique reference number be used to identify me (assuming they've already identified me another way and associated this number with me)? Yes!!

      So this answer is a bit incomplete/misleading...
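
      The point is mechanical: any static identifier, even one carrying "no personal information", supports tracking once a single sighting is tied to a person. A toy illustration with entirely invented reader names and UIDs:

      ```js
      // Toy example: independent RFID readers logging a static card UID can be
      // joined into a movement profile. Data here is entirely invented.
      const sightings = [
        { reader: 'border-crossing', uid: 'E200-1234' },
        { reader: 'parking-garage', uid: 'E200-9999' },
        { reader: 'border-crossing', uid: 'E200-9999' },
      ];

      function profile(uid) {
        return sightings.filter((s) => s.uid === uid).map((s) => s.reader);
      }

      console.log(profile('E200-9999')); // ['parking-garage', 'border-crossing']
      ```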

  16. Feb 2023
    1. It means that everything AI makes would immediately enter the public domain and be available to every other creator to use, as they wish, in perpetuity and without permission.

      One issue with blanket, automatic entry of AI-generated works to the public domain is privacy: A human using AI could have good reasons not to have the outputs of their use made public.

    1. Exercising Your Rights: California residents can exercise the above privacy rights by emailing us at: support@openai.com.

      Does that mean that any California resident can email to request a record of all the information OpenAI has collected about them?

    2. Affiliates: We may share Personal Information with our affiliates, meaning an entity that controls, is controlled by, or is under common control with OpenAI. Our affiliates may use the Personal Information we share in a manner consistent with this Privacy Policy.

      This would include Microsoft.

    3. improve and/or analyze the Services

      Does that mean that we are agreeing for them to use personal information in any way they choose if they deem it to help them improve their software?

    1. Your access of the website and/or use of our services, after modification, addition or deletion of the privacy policy shall be deemed to constitute acceptance by you of the modification, addition or deletion.

      This sounds bad. Users can't be held to have agreed to arbitrary changes to a privacy policy, if we are not even notified about the changes.

      If you make significant changes to the privacy policy you should give users 30 days' notice, and preferably get their consent again.

      Here's an article about it: https://www.privacypolicies.com/blog/privacy-policy-update-notices/

    1. You may opt-out of the telemetry by setting Storybook's configuration element disableTelemetry to true, using the --disable-telemetry flag, or setting the environment variable STORYBOOK_DISABLE_TELEMETRY to 1.
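
      The config-file variant of the opt-out looks like this in `.storybook/main.js` (sketched from the quoted description; check your Storybook version's docs for the exact layout):

      ```js
      // .storybook/main.js
      module.exports = {
        core: {
          // Equivalent to --disable-telemetry or STORYBOOK_DISABLE_TELEMETRY=1.
          disableTelemetry: true,
        },
      };
      ```
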
  17. Jan 2023
    1. How did it work? GNUAsk (the aspirational, mostly unreleased search engine UI) relied on hundreds of bots, running as daemons, and listening in on conversations within AOL AIM, IRC, Skype, and Yahoo public chat rooms and recording all the textual conversations.
  18. Dec 2022
    1. “Berla devices position CBP and ICE to perform sweeping searches of passengers’ lives, with easy access to cars' location history and most visited places and to passengers’ family and social contacts, their call logs, and even their social media feeds,” she said.
    2. Cybersecurity researcher Curry told Forbes that, after seeing what could be done with just a VIN, it was “terrifying” that those identifying numbers were public.
    1. Economists explain that markets work best with “perfect information.” And visibility feeds this market by translating and sharing skills. But the price of transparency in the modern age is invaded privacy, as well as bias inherent in automated products and services
    1. The presence of Twitter’s code — known as the Twitter advertising pixel — has grown more troublesome since Elon Musk purchased the platform. That’s because under the terms of Musk’s purchase, large foreign investors were granted special privileges. Anyone who invested $250 million or more is entitled to receive information beyond what lower-level investors can receive. Among the higher-end investors are a Saudi prince’s holding company and a Qatari fund.

      Twitter investors may get access to user data

      I'm surprised but not surprised that Musk's dealings to get investors in his effort to take Twitter private may include sharing of personal data about users. This article makes it sound almost normal that this kind of information-sharing happens with investors (inclusion of the phrase "information beyond what lower-level investors can receive").

    1. Meta's receipt of tax information via tracking pixels on tax preparer websites is the subject of a federal lawsuit. The tax preparing sites are not participants in the lawsuit (yet?).

  19. Nov 2022
    1. From a technical point of view, the IndieWeb people have worked on a number of simple, easy to implement protocols, which provide the ability for web services to interact openly with each other, but in a way that allows for a website owner to define policy over what content they will accept.

      Thought you might like Web Monetization.

    1. Donations

      To add some other intermediary services:

      To add a service for groups:

      To add a service that enables fans to support the creators directly and anonymously via microdonations or small donations by pre-charging their Coil account to spend on content streaming or tipping the creators' wallets via a layer containing JS script following the Interledger Protocol proposed to W3C:

      If you want to know more, head to Web Monetization or Community or Explainer

      Disclaimer: I am a recipient of a grant from the Interledger Foundation, so there would be a Conflict of Interest if I edited directly. Plus, sharing on Hypothesis allows other users to chime in.

    1. Hidden below all of this is the normalization of surveillance that consistently targets marginalized communities. The difference between a smartwatch and an ankle monitor is, in many ways, a matter of context: Who wears one for purported betterment, and who wears one because they are having state power enacted against them?
    2. The conveniences promised by Amazon’s suite of products may seem divorced from this context; I am here to tell you that they’re not. These “smart” devices all fall under the umbrella of what the digital-studies scholar David Golumbia and I call “luxury surveillance”—that is, surveillance that people pay for and whose tracking, monitoring, and quantification features are understood by the user as benefits.
    1. Some of the sensitive data collection analyzed by The Markup appears linked to default behaviors of the Meta Pixel, while some appears to arise from customizations made by the tax filing services, someone acting on their behalf, or other software installed on the site. For example, Meta Pixel collected health savings account and college expense information from H&R Block’s site because the information appeared in webpage titles and the standard configuration of the Meta Pixel automatically collects the title of a page the user is viewing, along with the web address of the page and other data. It was able to collect income information from Ramsey Solutions because the information appeared in a summary that expanded when clicked. The summary was detected by the pixel as a button, and in its default configuration the pixel collects text from inside a clicked button. The pixels embedded by TaxSlayer and TaxAct used a feature called “automatic advanced matching.” That feature scans forms looking for fields it thinks contain personally identifiable information like a phone number, first name, last name, or email address, then sends detected information to Meta. On TaxSlayer’s site this feature collected phone numbers and the names of filers and their dependents. On TaxAct it collected the names of dependents.

      Meta Pixel default behavior is to parse and send sensitive data

      Wait, wait, wait... the software has a feature that scans for personally identifiable information and sends that detected info to Meta? And in other cases, the users of the Meta Pixel decided to send private information to Meta?
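
      A hedged sketch (not Meta's actual code) of what "automatic advanced matching" amounts to: scan form fields for names that look like PII and forward the matches. The field-name heuristics below are guesses:

      ```js
      // Illustrative PII field scan in the spirit of "automatic advanced matching".
      const PII_HINTS = /(email|phone|first.?name|last.?name)/i;

      function scanFields(fields) {
        // fields: [{ name, value }] as extracted from a form
        return fields.filter((f) => PII_HINTS.test(f.name));
      }

      const matched = scanFields([
        { name: 'first_name', value: 'Pat' },
        { name: 'phone_number', value: '555-0100' },
        { name: 'refund_amount', value: '1200' }, // not PII by this heuristic
      ]);
      console.log(matched.length); // 2
      ```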

    1. Although complicated, Gen Z’s relationship with data privacy should be a consideration for brands when strategizing their data privacy policies and messaging for the future. Expectations around data privacy are shifting from something that sets companies apart in consumers’ minds to something that people expect the same way one might expect a service or product to work as advertised. For Gen Zers, this takes the form of skepticism that companies will keep their data safe, and their reluctance to give companies credit for getting it right means that good data privacy practices will increasingly be more about maintaining trust than building it.

      Gen-Z expectations are complicated

      The Gen-Z generation has notably different expectations about data privacy than previous generations. "Libraries" wasn't among the industries that showed up in their survey results. That Gen-Z expects privacy built in makes it a less differentiating characteristic than it is for older generations. It might also be harder to win back trust from Gen-Z users if libraries surprise them with data handling practices they didn't expect.

    1. “You have to assume that things can go wrong,” shared Waymo’s head of cybersecurity, Stacy Janes. “You can’t just design for this success case – you have to design for the worst case.”

      Future proofing by asking "what if we're wrong?"

  20. Oct 2022
    1. A Midwestern hospital system is treating its use of Google and Facebook web tracking technologies as a data breach, notifying 3 million individuals that the computing giants may have obtained patient information.

      Substitute “library” for “hospital”

      In an alternate universe: “A Midwestern library system is treating its use of Google and Facebook web tracking technologies as a data breach, notifying 3 million individuals that the computing giants may have obtained search and borrowing histories.”

    1. On the other end, there was The Good Phone Foundation, a not-for-profit organization founded with a mission to create an open, transparent, and secure mobile ecosystem outside of Big Tech’s reach, who just released their own Android-based mobile OS and were looking for apps to rely on. They contacted me, and after a couple of calls, we realized that partnering up on the smartphone makes a lot of sense for both of us. So, here we are, introducing you to our brand new Simple Phone. Only having control over both software and hardware ensures the ultimate privacy and security. The target audience consists of more privacy-oriented people that do not want to be tracked or rely on big corporations, Google Play, etc. It focuses on people who just want to get things done in a simple way without having to keep closing ads and wondering what does the system do in the background. Not to mention consistency again as the core apps are developed by us. Hope you will like it just like we do 🙂

      Simple Phone's effort to release its own mobile OS is promising for ordinary users. Because Simple Mobile Tools represents a full suite of basic Android applications, it can, ideally, provide a privacy-friendly and user-friendly alternative to stock Android by providing a unified suite of apps. /e/ OS (aka Murena) is attempting something similar, but its app collection is not quite as unified as the Simple Mobile suite.

    1. in many ways, Law 25 is the most stringent of the three regimes
    2. Impact assessments: Law 25 is broad and requires a PIR to be carried out whenever conditions are met, regardless of the level of risk. The GDPR is less stringent, only requiring assessments in cases where processing is likely to result in a ‘high risk’ to rights and freedoms. Because the CCPA does not specifically focus on accountability-related obligations, it does not mandate impact assessments.
    3. Privacy by default: Bill 64’s “confidentiality by default” clause is far broader in scope and significantly more stringent than the “privacy by design” concept under the GDPR. The CCPA does not provide for this concept at all, instead taking an “after-the-event” remedial approach. 
    1. requires a PIA (privacy impact assessment) to be carried out whenever the situation calls for it, regardless of the level of risk
    2. Privacy by default: Bill 64's "confidentiality by default" clause is far broader in scope and much more stringent than the "privacy by design" concept under the GDPR. The CCPA instead takes an "after-the-event" remedial approach.
    1. In the event of non-compliance with the Law, the Commission d'accès à l'information may impose significant penalties, which could reach $25M or 4% of worldwide turnover. The penalty will be proportional to, among other things, the seriousness of the breach and the company's ability to pay.
  21. Sep 2022
    1. Denmark’s data protection regulator found that local schools did not really understand what Google was doing with students’ data and as a result blocked around 8,000 students from using the Chromebooks that had become a central part of their daily education.

      Danish data regulator puts a temporary ban on Google education products

    1. If someone came up to you in the street, said they’re from an online service provider and requested you store all of the above data about you with them, I imagine for many the answer would be a resounding NO!
  22. Aug 2022
    1. University can’t scan students’ rooms during remote tests, judge rules

      An Ohio court ruled that room scans by proctoring tools violate the protection against unreasonable searches (US Const., 4th Amendment). The university's defense was "it's standard industry practice" and "others did not complain". In other words, no actual moral consideration was made by the university. This is bad enough even before considering that a third party keeps the recording of the video scan of this student's bedroom.

      Where there's a need for remote test taking, why try to copy over the cheating controls from in-person test taking? How about adapting the test content on the assumption that students will have material available to them during the test, reducing the proctoring need to only assuring the actual student is taking the test.

    1. Your personal data will be shared both within Beekeeper associated offices globally and with Greenhouse Software, Inc., a cloud services provider located in the United States of America and engaged by Controller to help manage its recruitment and hiring process on Controller’s behalf.

      Personal data will be accessible to different branches (i.e., national affiliates) of Beekeeper.

    1. NETGEAR is committed to providing you with a great product and choices regarding our data processing practices. You can opt out of the use of the data described above by contacting us at analyticspolicy@netgear.com

      You may opt out of these data use situations by emailing analyticspolicy@netgear.com.

    2. Marketing. For example, information about your device type and usage data may allow us to understand other products or services that may be of interest to you.

      All of the information above that has been consented to, can be used by NetGear to make money off consenting individuals and their families.

    3. USB device

      This gives Netgear permission to know what you plug into your computer, be it a FitBit, a printer, scanner, microphone, headphones, webcam — anything not attached to your computer.

  23. Jul 2022
    1. In the wake of Roe v. Wade being overturned, online privacy is on everyone's minds. But according to privacy experts, the entire way we think about and understand what 'privacy' actually means... is wrong. In this new Think Again, NBC News Correspondent Andrew Stern dives deep into digital privacy — what it really means, how we got to this point, how it impacts every facet of our lives, and how little of it we actually have.

  24. www.mojeek.com
    1. Mojeek

      Mojeek is the 4th-largest English-language web search engine after Google, Bing and Yandex, with its own index, crawler and algorithm. Its index has passed 5.7 billion pages and is growing. Privacy-based.

      It uses its own index with no backfill from others.

    1. she’s being dragged into the public eye nonetheless.

      Wow... I have heard of instances similar to this one. A stranger narrates the life of another stranger online, it goes viral, and the identities of everyone are revealed. To me it seemed meaningless; I never thought that the people involved could just want their privacy. I find it very scary how this can happen to anyone. Another reason why I limit my social media usage. I found myself engaging with very negative commentary and it really affected my mental health. I wonder how this woman's mental health is going. Pretty much the whole world knows about her now, unfortunately.

    1. Something has shifted online: We’ve arrived at a new era of anonymity, in which it feels natural to be inscrutable and confusing—forget the burden of crafting a coherent, persistent personal brand. There just isn’t any good reason to use your real name anymore. “In the mid 2010s, ambiguity died online—not of natural causes, it was hunted and killed,” the writer and podcast host Biz Sherbert observed recently. Now young people are trying to bring it back. I find this sort of exciting, but also unnerving. What are they going to do with their newfound freedom?
  25. Jun 2022
    1. Companies need to actually have an ethics panel, and discuss what the issues are and what the needs of the public really are. Any ethics board must include a diverse mix of people and experiences. Where possible, companies should look to publish the results of these ethics boards to help encourage public debate and to shape future policy on data use.

    1. The goal is to gain “digital sovereignty.”

      the age of borderless data is ending. What we're seeing is a move to digital sovereignty

    1. Using the network-provided DNS servers is the best way to blend in with other users. Networks and web sites can fingerprint and track users based on a non-default DNS configuration.
    1. All wireless devices have small manufacturing imperfections in the hardware that are unique to each device. These fingerprints are an accidental byproduct of the manufacturing process. These imperfections in Bluetooth hardware result in unique distortions, which can be used as a fingerprint to track a specific device. For Bluetooth, this would allow an attacker to circumvent anti-tracking techniques such as constantly changing the address a mobile device uses to connect to Internet networks. 

      Tracking that evades address changes

      An operating system can change the hardware address it broadcasts to avoid tracking. But subtle differences in the signal itself can still be identified and tracked.

    1. Free public projects private projects starting at $9/month per project

      For many tools and apps payment for privacy is becoming the norm.

      Examples: - Kumu.io - Github for private repos - ...

      pros: - helps to encourage putting things into the commons

      cons: - Normalizes the idea of payment for privacy which can be a toxic tool.

      discuss...

    1. the one thing that you have to keep conveying to people about the consequences of surveillance is that it's all very well to say that you have nothing to hide, but when you're spied upon, everybody that's connected to you gets spied upon. And if we don't push back, the most vulnerable people in society, the people that actually keep really massive violations of human rights and illegality in check, they're the people who get most affected.

      "I Have Nothing To Hide" counter-argument

      Even if you have nothing to hide, that doesn't mean that those you are connected with aren't also being surveilled and are part of targeted communities.

  26. May 2022
    1. For example, we know one of the ways to make people care about negative externalities is to make them pay for it; that’s why carbon pricing is one of the most efficient ways of reducing emissions. There’s no reason why we couldn’t enact a data tax of some kind. We can also take a cautionary tale from pricing externalities, because you have to have the will to enforce it. Western Canada is littered with tens of thousands of orphan wells that oil production companies said they would clean up and haven’t, and now the Canadian government is chipping in billions of dollars to do it for them. This means we must build in enforcement mechanisms at the same time that we’re designing principles for data governance, otherwise it’s little more than ethics-washing.

      Building in pre-payments or a tax on data leaks to prevent companies neglecting negative externalities could be an important stick in government regulation.

      While it should apply across the board, it should be particularly onerous for for-profit companies.

    2. Even with data that’s less fraught than our genome, our decisions about what we expose to the world have externalities for the people around us.

      We need to think more about the externalities of our data decisions.

  27. Apr 2022
    1. This Playbuzz Privacy Policy (“Policy”) outlines what personal information is collected by Playbuzz Ltd. (“Playbuzz”, “we”, “us” or “our”), how we use such personal information, the choices you have with respect to such personal information, and other important information.

      We keep your personal information personal and private. We will not sell, rent, share, or otherwise disclose your personal information to anyone except as necessary to provide our services or as otherwise described in this Policy.

    1. Dorothea Salo (2021) Physical-Equivalent Privacy, The Serials Librarian, DOI: 10.1080/0361526X.2021.1875962

      Permanent Link: http://digital.library.wisc.edu/1793/81297

      Abstract

      This article introduces and applies the concept of “physical-equivalent privacy” to evaluate the appropriateness of data collection about library patrons’ use of library-provided e‑resources. It posits that as a matter of service equity, any data collection practice that causes e‑resource users to enjoy less information privacy than users of an information-equivalent print resource is to be avoided. Analysis is grounded in real-world e‑resource-related phenomena: secure (HTTPS) library websites and catalogs, the Adobe Digital Editions data-leak incident of 2014, and use of web trackers on e‑resource websites. Implications of physical-equivalent privacy for the SeamlessAccess single-sign-on proposal will be discussed.

    1. a child had gone missing in our town and the FBI came to town to investigate immediately and had gone to the library. They had a tip and wanted to seize and search the library’s public computers. And the librarians told the FBI that they needed to get a warrant. The town was grief stricken and was enraged that the library would, at a time like that, demand that the FBI get a warrant. Like everyone in town was like, are you kidding me? A child is missing and you’re– and what? This town meeting afterwards, the library budget, of course, is up for discussion as it is every year, and the people were still really angry with the library, but a patron and I think trustee of the library – again, a volunteer, someone living in town – an elderly woman stood up and gave the most passionate defense of the Fourth Amendment and civil liberties to the people on the floor that I have ever witnessed.

      An example of how a library in Vermont stood up to a warrantless request from the FBI to seize and search public library computers. This could have impacted the library's budget when the issue was brought to a town meeting, but a library patron was a passionate advocate for the 4th amendment.

    1. K-Anonymity, L-Diversity, and T-ClosenessIn this section, I will introduce three techniques that can be used to reduce the probability that certain attacks can be performed. The simplest of these methods is k-anonymity, followed by l-diversity, and then followed by t-closeness. Other methods have been proposed to form a sort of alphabet soup, but these are the three most commonly utilized. With each of these, the analysis that must be performed on the dataset becomes increasingly complex and undeniably has implications on the statistical validity of the dataset.

      privacy metrics
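A minimal sketch of the simplest of these metrics, k-anonymity, using made-up data and hypothetical column names: a table is k-anonymous with respect to a set of quasi-identifier columns if every combination of their values appears in at least k rows.

```python
# Toy k-anonymity check: find the smallest equivalence class over the
# quasi-identifier columns; that size is the k the table achieves.
from collections import Counter

def k_anonymity(rows, quasi_identifiers):
    """Return the largest k for which the table is k-anonymous."""
    counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return min(counts.values())

records = [
    {"zip": "130**", "age": "<30", "diagnosis": "flu"},
    {"zip": "130**", "age": "<30", "diagnosis": "asthma"},
    {"zip": "148**", "age": "30+", "diagnosis": "flu"},
    {"zip": "148**", "age": "30+", "diagnosis": "diabetes"},
]

print(k_anonymity(records, ["zip", "age"]))  # -> 2 (each class has 2 rows)
```

Note that even a 2-anonymous table like this can leak information when the sensitive values inside a class are not diverse, which is exactly the gap l-diversity and t-closeness try to close.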

    1. Privacy is not secrecy. A private matter is something one doesn't want the whole world to know, but a secret matter is something one doesn't want anybody to know.

      Privacy is the power to decide when and what is secret, and to whom.

    1. I thought that the point of disappearing messages was to eat your cake and have it too, by allowing you to send a message to your adversary and then somehow deprive them of its contents. This is obviously a stupid idea.But the threat that Snapchat — and its disappearing message successors —was really addressing wasn’t communication between untrusted parties, it was automating data-retention agreements between trusted parties.

      Why use a disappearing message service

      The point of a disappearing message service is to have the parties to the message agree on the data-retention provisions of a message. The service automates that agreement by deleting the message at the specified time. The point isn't to send a message to an adversary and then delete it so they can't prove that it has been sent. There are too many ways of capturing the contents of a message—as simple as taking a picture of the message with another device.

    1. Weinberg’s tweet announcing the change generated thousands of comments, many of them from conservative-leaning users who were furious that the company they turned to in order to get away from perceived Big Tech censorship was now the one doing the censoring. It didn’t help that the content DuckDuckGo was demoting and calling disinformation was Russian state media, whose side some in the right-wing contingent of DuckDuckGo’s users were firmly on.

      There is an odd sort of self-selected information bubble here. DuckDuckGo promoted itself as privacy-aware, not unfiltered. On their Sources page, they talk about where they get content and how they don't sacrifice privacy to gather search results. Demoting disinformation sources in their algorithms would seem to be a good thing. Except if what you expect to see is disinformation, and then suddenly the search results don't match your expectations.

  28. Mar 2022
    1. Thus, information about people and their behaviour is made visible to other people, systems and companies.

      "Data trails"—active information and passive telemetry—provide a web of details about a person's daily life, and the analysis of that data is a form of knowledge about a person.

  29. Feb 2022
    1. Others because they want privacy

      AIUI, your account's contribution graph and feed are still public, not private, without a way to opt out—just like on GitHub.

  30. Dec 2021
    1. Efforts to clarify and disseminate the differences between "privacy as advocacy" (e.g., privacy is a fundamental right; privacy is an ethical norm) and "privacy as compliance" (e.g., ensuring privacy policies and laws are followed; privacy programs train, monitor, and measure adherence to rules) help frame conversations and set expectations.

      This is an interesting distinction... privacy-because-it-is-the-right-thing-to-do versus privacy-because-you-must. I think the latter is where most institutions are today. It will take a lot more education to get institutions to the former.

    2. As informed and engaged stakeholders, students understand how and why their institutions use academic and personal data.

      Interesting that there is a focus here on advocacy from an active student body. Is it the expectation that change from some of the more stubborn areas of the campus would be driven by informed student push-back? This section on "Students, Faculty, and Staff" doesn't have the same advocacy role from the other portions of the campus community.

    1. Questions, comments and requests, including any complaints, regarding us or this privacy policy are welcomed and should be addressed to privacy@marugroup.net.

      However, if you do this then your email, IP, browser, etc. will be collected and shared as per the information above. To be safer, I would write a letter, stick a stamp on the envelope and send it in.

    2. stored at, a destination outside the European Economic Area ("EEA").

      Why? Is that allowed? I don't think I would be happy about that, as I am not reassured that 'taking reasonable steps' is actually appropriate, considering one of those steps would be to host within the regions specified by the GDPR.

    3. third party, in which case personal data held by it about its customers will be one of the transferred assets.

      I was going to respond to the survey until I saw this. I am offering to provide feedback for free and yet my personal information is collected and becomes part of the sale of the business in the form of an asset. The question is why is my personal information being held for any length of time after I have completed the survey? Isn't that a violation of GDPR?

    4. In the event that we sell or buy any business or assets, in which case we will disclose your personal data to the prospective seller or buyer of such business or assets.

      Why? I came across this privacy policy as I had been asked to respond to a survey about the website. Not only am I giving my feedback for free but then they want to take my personal information and give it away to unknown buyers and sellers (third parties) all of whom are fictitious and in the future?

    1. About 7 in 10 Americans think their phone or other devices are listening in on them in ways they did not agree to.

      I'm enough of a tinfoil hat wearer to think this might be true. Especially since my Google Home talks to me entirely too much when I'm not talking to it.

  31. Nov 2021
    1. There Is No Antimimetics Division (qntm): This is the best new sci fi I've read in recent memory, I think because it feels fresh and modern, tackling some of the hardest social questions that the world is facing today. It's about antimemes, defined as "an idea with self-censoring properties...which, by its intrinsic nature, discourages or prevents people from spreading it."

      I like the idea of antimemes. The tougher question is how to actually implement it on the web?

      Is this just the idea of a digital secret?

      "The only way for two computers to keep a secret on the web is if all the computers are dead."—Chris Aldrich

    1. Pretty much anything that can be remembered can be cracked. There’s still one scheme that works. Back in 2008, I described the “Schneier scheme”: So if you want your password to be hard to guess, you should choose something that this process will miss. My advice is to take a sentence and turn it into a password. Something like “This little piggy went to market” might become “tlpWENT2m”. That nine-character password won’t be in anyone’s dictionary. Of course, don’t use this one, because I’ve written about it. Choose your own sentence — something personal.

      Good advice on creating secure passwords.
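The Schneier scheme deliberately relies on personal, irregular substitutions, so it can't be fully mechanized; still, a toy first-letter mnemonic (with a hypothetical substitution table, not Schneier's actual rules) shows the basic transform from sentence to password:

```python
# Toy illustration of the sentence-to-password idea: take the first letter
# of each word, with a few ad-hoc word substitutions. Schneier's point is
# that YOUR substitutions should be personal and irregular; a fixed public
# rule like this one is itself guessable, so treat this as a sketch only.
SUBS = {"to": "2", "for": "4", "and": "&"}

def mnemonic(sentence):
    return "".join(SUBS.get(w.lower(), w[0]) for w in sentence.split())

print(mnemonic("This little piggy went to market"))  # -> Tlpw2m
```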

    1. ISO 29100/Privacy Framework [2] defines the privacy principles as: 1. Consent and choice, 2. Purpose legitimacy and specification, 3. Collection limitation, 4. Data minimization, 5. Use, retention and disclosure limitation, 6. Accuracy and quality, 7. Openness, transparency and notice, 8. Individual participation and access, 9. Accountability, 10. Information security, and 11. Privacy compliance.
  32. Oct 2021
    1. A screenshot from the document providing an overview of different data retention periods. Image: Motherboard.

      Is it possible that the FBI stores this data on us?

    1. We will also show you how to de-link your Chrome profile from your Google account(s) by stopping Chrome from syncing with Google in the first place. This will help keep your Chrome profile separate from your Google account and enhance your online privacy.
    2. To do that, Chrome automatically links your Chrome profile to a Google account when you sign in to any Google service on the web. That helps Google deliver a ‘seamless experience’ across all devices by letting you sync your history, bookmarks, passwords, etc., across multiple devices. Meanwhile, privacy-conscious users see this as a major threat to their online privacy and advise users to remove their Google account from Chrome.
    3. As mentioned already, Chrome automatically signs you in to your Google account every time you sign into a Google service, like Gmail, YouTube, Google Photos, etc. It also links your current Chrome profile to that account. While Google says that it does so to offer a ‘seamless experience’, it is a privacy nightmare for many users.
  33. Sep 2021
  34. Aug 2021
    1. You can request that Zoom delete any and all information they hold on you. Information on your data rights and how to get in contact with Zoom to request they erase your data can be found in their privacy policy. Once you have made the request, follow up to ensure you get confirmation that your data has been removed from their servers.
    1. U.S. Senate Subcommittee on Communications, Technology, Innovation, and the Internet, "Optimizing for Engagement: Understanding the Use of Persuasive Technology on Internet Platforms," 25 June 2019, www.commerce.senate.gov/2019/6/optimizing-for-engagement-understanding-the-use-of-persuasive-technology-on-internet-platforms.

      Perhaps we need plurality in the areas for which social data are aggregated?

      What if we didn't optimize for engagement, but optimized for privacy, security, or other axes in the space?

  35. Jul 2021
    1. whereas now, they know that user@domain.com was subscribed to xyz.net at some point and is unsubscribing. Information is gold. Replace user@domain with abcd@senate and xyz.net with warezxxx.net and you've got tabloid gold.
    1. Roberts noted that the risks of physical danger to donors are heightened “with each passing year” as changes in technology enables “anyone with access to a computer” to “compile a wealth of information about” anyone.

      He's going to be shocked at what's in his Facebook (shadow) profile...

    1. consumer friendly

      Including the "consumer" here is a red herring. We're meant to identify as the consumer and so take from this statement that our rights and best interests have been written into these BigTech-crafted laws.

      But a "consumer" is different from a "citizen," a "person," we the people.

    2. passage in March of a consumer data privacy law in Virginia, which Protocol reported was originally authored by Amazon

      From the article:

      Marsden and Virginia delegate Cliff Hayes held meetings with other large tech companies, including Microsoft; financial institutions, including Capital One; and non-profit groups and small businesses...

      Which all have something at stake here: the ability to monitor people and mine their data in order to sell it.

      Weak privacy laws give the illusion of privacy while maintaining the corporate panopticon.

    3. consumers would have to opt out of rather than into tracking

      Example of a dark pattern.

  36. Jun 2021
    1. But after using it for a few days you quickly realize that there is one major privacy issue that has been installed consciously by Amazon and Ring. The Ring app allows you to delete videos on the system, but it does not allow you to delete motion sensor and window sensor history. So Amazon/Ring knows everything that happens inside your home and there is no way for you to delete that history. They know when you're inside, they know when you open your door, they know when you closed it, etc. So they essentially know everything about you and your motions within your home. This is a major privacy issue. And it is not some mistake that was overlooked. This was a conscious choice on Amazon/Ring's part to track the motions of you and your family inside your own home. I spoke with the customer service rep from Ring and she admitted that many, many people call up and complain that they can't delete sensor history. Of course it would've been much more ethical to explain to potential customers BEFORE they buy Ring products that this breach of privacy has been installed. But Amazon/Ring does not warn their customers about this privacy breach. They don't warn customers because they created the privacy breach and will continue to have very personal information on the motions of your family inside your own home. If you care about your privacy, don't buy Ring products.
    1. That they should be subject to regulation is not some radical position; it is the position that Facebook's founder and owner Mark Zuckerberg has expressed before the US Congress: "My position is not that there should be no regulation. I believe the real question, as the internet becomes more important in people's lives, is what is the right way to regulate, not whether regulation is necessary."

      Tsakalotos at his best, arguing against the ideology of privatism on Fb.

    1. Yet books are curious objects: their strength is to be both intensely private and intensely social — and marginalia is a natural bridge between these two states.

      Books represent a dichotomy in being both intensely private and intensely social at the same time.

      Are there other objects that have this property?

      Books also have the quality of providing people with identities.

  37. May 2021
    1. <small><cite class='h-cite via'> <span class='p-author h-card'>jenny (phire) zhang</span> in jenny (phire) zhang on Twitter: "@markpopham the OSS/indieweb world falls into this trap a lot imo!! thank you for reading <3" / Twitter (<time class='dt-published'>05/06/2021 07:20:50</time>)</cite></small>

    2. In 1962, a book called Silent Spring by Rachel Carson documenting the widespread ecological harms caused by synthetic pesticides went off like a metaphorical bomb in the nascent environmental movement.

      Where is the Silent Spring in the data, privacy, and social media space?

    3. Amidst the global pandemic, this might sound not dissimilar to public health. When I decide whether to wear a mask in public, that’s partially about how much the mask will protect me from airborne droplets. But it’s also—perhaps more significantly—about protecting everyone else from me. People who refuse to wear a mask because they’re willing to risk getting Covid are often only thinking about their bodies as a thing to defend, whose sanctity depends on the strength of their individual immune system. They’re not thinking about their bodies as a thing that can also attack, that can be the conduit that kills someone else. People who are careless about their own data because they think they’ve done nothing wrong are only thinking of the harms that they might experience, not the harms that they can cause.

      What lessons might we draw from public health and epidemiology to improve our privacy lives in an online world? How might we wear social media "masks" to protect our friends and loved ones from our own posts?

    4. In an individual model of privacy, we are only as private as our least private friend.

      So don't have any friends?

      Obviously this isn't a thing, but the implications of this within privacy models can be important.

      Are there ways to create this as a ceiling instead of as a floor? How might we use topology to flip this script?
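One way to make the "least private friend" point concrete is a toy floor model, where effective privacy is the minimum over you and your connections (all names, numbers, and the graph itself are made up):

```python
# Toy "floor" model of networked privacy: your effective privacy is dragged
# down to the minimum level among you and your connections. A "ceiling"
# model would instead cap exposure, e.g. aggregate with max() or average.
privacy_level = {"alice": 0.9, "bob": 0.2, "carol": 0.8}
friends = {"alice": ["bob", "carol"], "bob": ["alice"], "carol": ["alice"]}

def effective_privacy(person):
    """Minimum privacy level across a person and their direct connections."""
    return min(privacy_level[p] for p in [person] + friends[person])

print(effective_privacy("alice"))  # -> 0.2 (dragged down by bob)
print(effective_privacy("carol"))  # -> 0.8 (only connected to alice)
```

Under this sketch, the only lever an individual controls is the graph itself: who they connect to, and what flows across each edge.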

    5. 130 years on, privacy is still largely conceived of as an individual thing, wherein we get to make solo decisions about when we want to be left alone and when we’re comfortable being trespassed upon.

      How could one design a mathematical balancing system to help separate individuals embedded within a variety of societies or publics to enforce a balance of levels of privacy.

      • There's the interpersonal level between the individuals
      • There's the person's individual privacy and the public's reaction/response to the thing captured, for which the public may shun or not
      • There's the takers rights (possibly a journalist or news outlet) to inform the broader public which may shame or not
      • There's the publics' potential right to know, the outcome may effect them or dramatically change society as a whole
      • others facets?
      • how many facets?
      • how to balance all these to create an optimum outcome for all parties?
      • How might the right to forget look like and be enforced?
      • How do economic incentives play out (paparazzi, journalism, social media, etc.?)
    1. Draft notes, E-mail, plans, source code, to-do lists, what have you

      The personal nature of this information means that users need control of their information. Tim Berners-Lee's Solid (Social Linked Data) project looks like it could do some of this stuff.

    1. The seminal 1890 Harvard Law Review article The Right to Privacy—which every essay about data privacy is contractually obligated to cite—argued that the right of an individual to object to the publication of photographs ought to be considered part of a general ‘right to be let alone’.

      <small><cite class='h-cite via'> <span class='p-author h-card'>Jenny</span> in left alone, together | The Roof is on Phire (<time class='dt-published'>05/08/2021 18:32:41</time>)</cite></small>

      See also: https://en.wikipedia.org/wiki/The_Right_to_Privacy_(article)

    1. “For one of the most heavily guarded individuals in the world, a publicly available Venmo account and friend list is a massive security hole. Even a small friend list is still enough to paint a pretty reliable picture of someone's habits, routines, and social circles,” Gebhart said.

      Massive how? He's such a public figure that most of these connections are already widely reported in the media or easily guessable by a private investigator. The bigger issue is the related data of transactions, which might open them up for other abuses or potential leverage, as in the other examples.

    1. Although I believe people have a right to secure and private communication, I disagree with those who extrapolate from this that we have a right to anonymous property transfer. It’s totally in the public’s legitimate interest to keep track of who owns what, and to settle which transfers of ownership are legitimate, for instance by disallowing coerced ones.

      I found this thought helpful. I had feelings like this but could not articulate them before.

  38. Apr 2021
    1. People can take the conversations with willing co-workers to Signal, Whatsapp, or even a personal Basecamp account, but it can't happen where the work happens anymore.

      Do note that two of the three systems that Fried use for examples are private. In other words, only people who you explicitly want to see what you're writing will see just that.

      This goes against his previous actions somewhat, e.g. https://twitter.com/jasonfried/status/1168986962704982016

  39. Mar 2021
    1. Not only are these websites breaking my trust—when I visit your website, I entered into contact with you, not 80 other websites—but they are loading content from websites they neither know nor trust. Some of which have been known to spread malware.

      The contract of a healthy community: basic respect for one another.

    1. a data donation platform that allows users of browsers to donate data on their usage of specific services (eg Youtube, or Facebook) to a platform.

      This seems like a really promising pattern for many data-driven problems. Browsers can support opt-in donation to contribute their data to improve Web search, social media, recommendations, lots of services that implicitly require lots of operational data.

    2. The idea is that many smaller tech companies would allow for more choice between services. This solution is flawed. For one, services like search or social media benefit from network effects. Having large datasets to train on, means search recommendations get better. Having all your friends in one place, means you don’t need five apps to contact them all. I would argue those are all things we like and might lose when Big Tech is broken up. What we want is to be able to leave Facebook and still talk to our friends, instead of having many Facebooks.

      I'd be interested to better understand this concern or critique. I think the goal of smaller, interoperable services is exactly the idea of being able to communicate with our Facebook friends even if we leave Facebook. Perhaps that is an argument for combining deconsolidation with interoperability.

    1. Our new feature, Total Cookie Protection, works by maintaining a separate “cookie jar” for each website you visit. Any time a website, or third-party content embedded in a website, deposits a cookie in your browser, that cookie is confined to the cookie jar assigned to that website, such that it is not allowed to be shared with any other website.
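The per-site "cookie jar" idea can be sketched as storage keyed by the pair (top-level site, cookie origin), so a tracker embedded on two different sites gets two unrelated jars. This is a conceptual model, not Firefox's actual implementation:

```python
# Sketch of partitioned cookie storage: the same tracker cookie set from
# two different top-level sites lands in two separate jars, so the tracker
# cannot correlate a user's visits across those sites.
from collections import defaultdict

class PartitionedCookieStore:
    def __init__(self):
        # key: (top_level_site, cookie_origin) -> {cookie_name: value}
        self.jars = defaultdict(dict)

    def set_cookie(self, top_level_site, cookie_origin, name, value):
        self.jars[(top_level_site, cookie_origin)][name] = value

    def get_cookies(self, top_level_site, cookie_origin):
        return self.jars[(top_level_site, cookie_origin)]

store = PartitionedCookieStore()
# tracker.example embedded on news.example sets an ID cookie...
store.set_cookie("news.example", "tracker.example", "uid", "abc123")
# ...but the same tracker embedded on shop.example sees an empty jar:
print(store.get_cookies("shop.example", "tracker.example"))  # -> {}
```

An unpartitioned store, by contrast, would key only on `cookie_origin`, which is what lets a third-party cookie follow you from site to site.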
  40. Feb 2021
  41. www.joinhoney.com
    1. Honey does not track your search engine history, emails, or your browsing on any site that is not a retail website (a site where you can shop and make a purchase). When you are on a pre-approved retail site, to help you save money, Honey will collect information about that site that lets us know which coupons and promos to find for you. We may also collect information about pricing and availability of items, which we can share with the rest of the Honey community.
    1. (F)unctional Sifting: A Privacy-Preserving Reputation System Through Multi-Input Functional Encryption (extended version)
  42. Jan 2021
    1. Despite some implementation challenges, patient portals have allowed millions of patients to access to their medical records, read physicians’ notes, message providers, and contribute valuable information and corrections.

      I wonder if patients have the ability to edit - or at least flag - information in their record?

    1. In our bedrooms, we want to have power over who has access to us; in our bathrooms, we just want others deprived of that access.

      Reiman highlights two types of privacy.

      The privacy we want to have in the bathroom, which is the power to deprive others of access to us.

      And the privacy we want to have in the bedroom, which is the power to control who has access to us.

    2. By privacy, I understand the condition in which other people are deprived of access to either some information about you or some experience of you. For the sake of economy, I will shorten this and say that privacy is the condition in which others are deprived of access to you.

      Reiman defines privacy as the condition in which others are deprived of access to you (information (e.g. location) or experience (e.g. watching you shower))

    3. No doubt privacy is valuable to people who have mischief to hide, but that is not enough to make it generally worth protecting. However, it is enough to remind us that whatever value privacy has, it also has costs. The more privacy we have, the more difficult it is to get the information that

      Privacy is valuable to people who have mischief to hide. This is not enough to make it worth protecting, but it tells us that there is also a cost.

    1. As you already noticed, the extension does not go in and manipulate the hrefs/urls in the DOM itself. While it may seem scary to you that an extension may manipulate a URL you're navigating to in-flight, I think it's far scarier to imagine an extension reading and manipulating all of the HTML on all of the pages you go to (bank accounts, utilities, crypto, etc) in order to provide a smidgeon of privacy for the small % of times you happen to click a link with some UTM params.
  43. Dec 2020
    1. I haven't met anyone who makes this argument who then says that a one stop convenient, reliable, private and secure online learning environment can’t be achieved using common every day online systems

      Reliable: As a simple example, I'd trust Google to maintain data reliability over my institutional IT support.

      And you'd also need to make the argument for why learning needs to be "private", etc.

    1. And then there was what Lanier calls “data dignity”; he once wrote a book about it, called Who Owns the Future? The idea is simple: What you create, or what you contribute to the digital ether, you own.

      See Tim Berners-Lee's SOLID project.

    1. “Being under constant surveillance in the workplace is psychological abuse,” Heinemeier Hansson added. “Having to worry about looking busy for the stats is the last thing we need to inflict on anyone right now.”

      I really like the Basecamp approach (I forget where I heard this...could have been in one of the Rework podcasts):

      Don't try to get the most out of everyone; try to get the best out of them.

      If you're looking for ways to build trust in a team, I can't recommend the following books published by Basecamp enough:

      • Rework
      • Remote
      • It doesn't have to be crazy at work
    2. For example, to help maintain privacy and trust, the user data provided in productivity score is aggregated over a 28-day period.

      So the fact that the metrics are collected over a 28-day period is meant to maintain privacy and trust. How?
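      One charitable reading of the 28-day claim is that aggregation over a window reports a single total rather than per-day activity, hiding which days a user was or wasn't active. A toy sketch of that idea, purely illustrative and not Microsoft's actual method (the function name and window handling are assumptions for the example):

```python
# Toy sketch: reporting only a rolling 28-day total instead of per-day
# counts hides daily activity patterns (e.g. which days someone was idle).
# Not Microsoft's actual aggregation; purely illustrative.
def aggregate_activity(daily_counts, window=28):
    """Return the total over the last `window` days instead of per-day data."""
    return sum(daily_counts[-window:])
```

      Whether a 28-day total is enough to "maintain privacy and trust" is exactly the question the annotation raises; aggregation reduces granularity, but a total can still reveal who is least active overall.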

    1. Recent patent filings show that Microsoft has been exploring additional ideas to monitor workers in the interest of organizational productivity. One filing describes a “meeting insight computing system” that would generate a quality score for a meeting using data such as body language, facial expressions, room temperature, time of day, and number of people in the meeting.

      So this will require you to have video turned on. How will they sell this to employees? "You need to turn your video on so that the algorithm can generate an accurate meeting quality score using your body language and facial expression."

      Sounds perfect. Absolutely no concerns about privacy violations, etc. in this product.

  44. Nov 2020
    1. Online Exams & Proctoring (In Addition to Guidance Listed Above) Requiring students to turn on their camera to be watched or recorded at home during an exam poses significant privacy concerns and should not be undertaken lightly. Several proctoring services use machine learning, AI, eye-tracking, key-logging, and other technologies to detect potential cheating; these should be used only when no feasible alternatives exist. If instructors are using a proctoring service during the COVID-19 measures, they must provide explicit notice to the students before the exam. Instructors are encouraged to work with the Digital Learning Hub in the Commons and the Academic Integrity Office to consider privacy-protective options, including how to use question banks (in Canvas), that will uphold integrity and good assessment design. Proctors and instructors are strongly discouraged from requiring students to show their surroundings on camera. Computers are available in labs for students who do not have a computer to take their final exams. Finals CANNOT be held in a lab, that is, instructors cannot be present nor can students from a specific class be asked to gather there for a final. This is only for those students who need a computer to drop in and complete their exam.
    1. anonymous imageboard

      4chan is reasonably unique in the current online landscape, in that it permits conversation by totally anonymous users. This allows its users to post without much thought about their privacy status, which they often take for granted. This unique level of privacy fostered by anonymity, in a way, partially delivers on the Cyberspace rhetoric of the 1990s in that people can't be judged by their physical identities unless they offer identifying information up themselves. That's not to say that 4chan is a welcoming space for all (or even most) users, though, as it has been acknowledged, even later here in Ellis' article, that 4chan houses plenty of white supremacist tendencies, but, strictly speaking, as far as one's ideas go, they are judged purely based on their merit so long as no additional personal identifiers are offered. As Dillon Ludemann notes in his paper, /pol/emics: Ambiguity, scales, and digital discourse on 4chan, white supremacy, as well as other, "practiced and perceived deviancy is due to the default blanket of anonymity, and the general discourse of the website encourages users to remain unnamed. This is further enforced and embodied as named users, colloquially known as 'namefags,' are often vilified for their separation from the anonymous collective community" (Ludemann, 2018).

      Hypothetically, since all users start out as anonymous, one could also present their identity however they so please on the platform, and in theory what this means is that the technology behind the site promotes identity exploration (and thus cyberspace rhetoric), even though in practice, what most users experience is latent racism that depends on users' purposefully offered identifying information or generalized white supremacist posts that are broadcasted for all on the site to see.

      Work Cited:

      Ludemann, D. (2018). /pol/emics: Ambiguity, scales, and digital discourse on 4chan. Discourse, Context & Media, 24, 92-98. doi: 10.1016/j.dcm.2018.01.010

    1. A long term key is as secure as the minimum common denominator of your security practices over its lifetime. It's the weak link.

      Good phrasing of the idea: "you're only as secure as your weakest link".

      You are only as secure as the minimum common denominator of your security practices.

    1. The first is that the presence of surveillance means society cannot experiment with new things without fear of reprisal, and that means those experiments—if found to be inoffensive or even essential to society—cannot slowly become commonplace, moral, and then legal. If surveillance nips that process in the bud, change never happens. All social progress—from ending slavery to fighting for women’s rights—began as ideas that were, quite literally, dangerous to assert. Yet without the ability to safely develop, discuss, and eventually act on those assertions, our society would not have been able to further its democratic values in the way that it has. Consider the decades-long fight for gay rights around the world. Within our lifetimes we have made enormous strides to combat homophobia and increase acceptance of queer folks’ right to marry. Queer relationships slowly progressed from being viewed as immoral and illegal, to being viewed as somewhat moral and tolerated, to finally being accepted as moral and legal. In the end it was the public nature of those activities that eventually slayed the bigoted beast, but the ability to act in private was essential in the beginning for the early experimentation, community building, and organizing. Marijuana legalization is going through the same process: it’s currently sitting between somewhat moral, and—depending on the state or country in question—tolerated and legal. But, again, for this to have happened, someone decades ago had to try pot and realize that it wasn’t really harmful, either to themselves or to those around them. Then it had to become a counterculture, and finally a social and political movement. If pervasive surveillance meant that those early pot smokers would have been arrested for doing something illegal, the movement would have been squashed before inception. 
Of course the story is more complicated than that, but the ability for members of society to privately smoke weed was essential for putting it on the path to legalization. We don’t yet know which subversive ideas and illegal acts of today will become political causes and positive social change tomorrow, but they’re around. And they require privacy to germinate. Take away that privacy, and we’ll have a much harder time breaking down our inherited moral assumptions.

      One reason privacy is important is because society makes moral progress by experimenting with things on the fringe of what is legal.

      This is reminiscent of Signal's founder's argument that we should want law enforcement not to be 100% effective, because how else would we find out that gay sex and marijuana use don't actually hurt anybody?

    1. For both the tailor-customer and doctor-patient examples, personal data is an input used to improve an output (dress, suit, medical treatment) such that the improvement directly serves the interests of the person whose information is being used.

      This reminds me of "Products are functions", where your personal data is a variable that enters into the function to determine the output.
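      The "products are functions" framing can be made literal: personal data is an input that improves the output for the same person whose data it is. A minimal illustration (the function name, fields, and the ease allowance are made up for the example):

```python
# Toy illustration of "products are functions": the customer's personal
# data (measurements) directly shapes the output that serves them.
# All names and numbers here are invented for the example.
def tailor_suit(base_pattern, measurements):
    """Return a suit spec adjusted to the customer's measurements."""
    return {
        "pattern": base_pattern,
        "chest_cm": measurements["chest_cm"],
        "sleeve_cm": measurements["arm_cm"] + 2,  # assumed ease allowance
    }
```

      The contrast with surveillance advertising is that here the data subject is also the beneficiary of the function's output.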

    1. Preserving user privacy is difficult when detecting more nuanced forms of censorship. Some forms of soft censorship might involve intentional performance degradation or content manipulation. Detecting this type of behavior would require comparing performance or content across groups of users, but such a comparison also implies that each user must reveal their browsing history.

      If you want to investigate whether content for a user was manipulated or performance was degraded, there may be no other way but to access detailed logs of their usage. This might raise privacy concerns.

      Not only is personalization difficult to disambiguate from manipulation and censorship, personalization also makes it more costly to compare the personalized experience to some baseline value to determine if manipulation or performance degradation has taken place.

    1. If the EU is set to mandate encryption backdoors to enable law enforcement to pursue bad actors on social media, and at the same time intends to continue to pursue the platforms for alleged bad practices, then entrusting their diplomatic comms to those platforms, while forcing them to have the tools in place to break encryption as needed would seem a bad idea.

      One explanation for the shift away from WhatsApp (based on the Signal protocol) is that the EU itself is seeking legislation to force a backdoor into consumer tech.

    1. But as long as the most important measure of success is short-term profit, doing things that help strengthen communities will fall by the wayside. Surveillance, which allows individually targeted advertising, will be prioritized over user privacy. Outrage, which drives engagement, will be prioritized over feelings of belonging. And corporate secrecy, which allows Facebook to evade both regulators and its users, will be prioritized over societal oversight.

      Schneier is saying here that as long as the incentives are still pointing in the direction of short-term profit, privacy will be neglected.

      Surveillance, which allows for targeted advertising will win out over user privacy. Outrage, will be prioritized over more wholesome feelings. Corporate secrecy will allow Facebook to evade regulators and its users.

    2. Increased pressure on Facebook to manage propaganda and hate speech could easily lead to more surveillance. But there is pressure in the other direction as well, as users equate privacy with increased control over how they present themselves on the platform.

      Two forces acting on the big tech platforms.

      One, towards more surveillance, to stop hate and propaganda.

      The other, towards less surveillance, stemming from people wanting more privacy and more control.

    1. At the same time, working through these principles is only the first step in building out a privacy-focused social platform. Beyond that, significant thought needs to go into all of the services we build on top of that foundation -- from how people do payments and financial transactions, to the role of businesses and advertising, to how we can offer a platform for other private services.

      This is what Facebook is really after. They want to build the trust to be able to offer payment services on top of Facebook.

    2. People want to be able to choose which service they use to communicate with people. However, today if you want to message people on Facebook you have to use Messenger, on Instagram you have to use Direct, and on WhatsApp you have to use WhatsApp. We want to give people a choice so they can reach their friends across these networks from whichever app they prefer.We plan to start by making it possible for you to send messages to your contacts using any of our services, and then to extend that interoperability to SMS too. Of course, this would be opt-in and you will be able to keep your accounts separate if you'd like.

      Facebook plans to make messaging interoperable across Instagram, Facebook and Whatsapp. It will be opt-in.