676 Matching Annotations
  1. Last 7 days
    1. Hidden below all of this is the normalization of surveillance that consistently targets marginalized communities. The difference between a smartwatch and an ankle monitor is, in many ways, a matter of context: Who wears one for purported betterment, and who wears one because they are having state power enacted against them?
    2. The conveniences promised by Amazon’s suite of products may seem divorced from this context; I am here to tell you that they’re not. These “smart” devices all fall under the umbrella of what the digital-studies scholar David Golumbia and I call “luxury surveillance”—that is, surveillance that people pay for and whose tracking, monitoring, and quantification features are understood by the user as benefits.
    1. Some of the sensitive data collection analyzed by The Markup appears linked to default behaviors of the Meta Pixel, while some appears to arise from customizations made by the tax filing services, someone acting on their behalf, or other software installed on the site. For example, Meta Pixel collected health savings account and college expense information from H&R Block’s site because the information appeared in webpage titles and the standard configuration of the Meta Pixel automatically collects the title of a page the user is viewing, along with the web address of the page and other data. It was able to collect income information from Ramsey Solutions because the information appeared in a summary that expanded when clicked. The summary was detected by the pixel as a button, and in its default configuration the pixel collects text from inside a clicked button. The pixels embedded by TaxSlayer and TaxAct used a feature called “automatic advanced matching.” That feature scans forms looking for fields it thinks contain personally identifiable information like a phone number, first name, last name, or email address, then sends detected information to Meta. On TaxSlayer’s site this feature collected phone numbers and the names of filers and their dependents. On TaxAct it collected the names of dependents.

      Meta Pixel default behavior is to parse and send sensitive data

      Wait, wait, wait... the software has a feature that scans for personally identifiable information and sends that detected info to Meta? And in other cases, the users of the Meta Pixel decided to send private information to Meta?
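      The general technique described above (a default-on script heuristically classifying form fields as PII before sending them off) can be sketched in a few lines. This is a hypothetical illustration of field-matching heuristics, not Meta's actual code; every field name and pattern here is an assumption.

```python
import re

# Hypothetical field-name keywords a pixel-style script might match on.
PII_FIELD_KEYWORDS = {
    "email": re.compile(r"e[-_]?mail", re.I),
    "phone": re.compile(r"phone|tel|mobile", re.I),
    "first_name": re.compile(r"first[-_]?name|fname", re.I),
    "last_name": re.compile(r"last[-_]?name|lname|surname", re.I),
}

# Value-shape checks as a fallback when the field name is uninformative.
VALUE_PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "phone": re.compile(r"^\+?[\d\s().-]{7,}$"),
}

def classify_form_fields(fields):
    """Return {pii_type: value} for form fields that look like PII."""
    detected = {}
    for name, value in fields.items():
        # First try to recognize the field by its name...
        for pii_type, pattern in PII_FIELD_KEYWORDS.items():
            if pattern.search(name):
                detected[pii_type] = value
                break
        else:
            # ...then fall back to the shape of the value itself.
            for pii_type, pattern in VALUE_PATTERNS.items():
                if pattern.match(value):
                    detected[pii_type] = value
                    break
    return detected

form = {"fname": "Ada", "contact": "ada@example.com", "q1": "yes"}
print(classify_form_fields(form))
# {'first_name': 'Ada', 'email': 'ada@example.com'}
```

      The unsettling part is how little the site operator has to do: a heuristic like this runs on every form by default, and anything it guesses to be PII leaves the page.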

  2. Nov 2022
    1. Although complicated, Gen Z’s relationship with data privacy should be a consideration for brands when strategizing their data privacy policies and messaging for the future. Expectations around data privacy are shifting from something that sets companies apart in consumers’ minds to something that people expect the same way one might expect a service or product to work as advertised. For Gen Zers, this takes the form of skepticism that companies will keep their data safe, and their reluctance to give companies credit for getting it right means that good data privacy practices will increasingly be more about maintaining trust than building it.

      Gen Z expectations are complicated

      Gen Z has notably different expectations about data privacy than previous generations. "Libraries" wasn't among the industries that showed up in their survey results. Because Gen Z expects privacy to be built in, good privacy practice is a less differentiating characteristic for them than for older generations. It might also be harder to win back trust from Gen Z users if libraries surprise them with data handling practices they didn't expect.

    1. “You have to assume that things can go wrong,” shared Waymo’s head of cybersecurity, Stacy Janes. “You can’t just design for this success case – you have to design for the worst case.”

      Future proofing by asking "what if we're wrong?"

  3. Oct 2022
    1. A Midwestern hospital system is treating its use of Google and Facebook web tracking technologies as a data breach, notifying 3 million individuals that the computing giants may have obtained patient information.

      Substitute “library” for “hospital”

      In an alternate universe: “A Midwestern library system is treating its use of Google and Facebook web tracking technologies as a data breach, notifying 3 million individuals that the computing giants may have obtained search and borrowing histories.”

    1. On the other end, there was The Good Phone Foundation, a not-for-profit organization founded with a mission to create an open, transparent, and secure mobile ecosystem outside of Big Tech’s reach, who just released their own Android-based mobile OS and were looking for apps to rely on. They contacted me, and after a couple of calls, we realized that partnering up on the smartphone makes a lot of sense for both of us. So, here we are, introducing you to our brand new Simple Phone. Only having control over both software and hardware ensures the ultimate privacy and security. The target audience consists of more privacy-oriented people that do not want to be tracked or rely on big corporations, Google Play, etc. It focuses on people who just want to get things done in a simple way without having to keep closing ads and wondering what does the system do in the background. Not to mention consistency again as the core apps are developed by us. Hope you will like it just like we do 🙂

      Simple Phone's effort to release its own mobile OS is promising for ordinary users. Because Simple Mobile Tools represents a full suite of basic Android applications, it can, ideally, provide a privacy-friendly and user-friendly alternative to stock Android by providing a unified suite of apps. /e/ OS (aka Murena) is attempting something similar, but its app collection is not quite as unified as the Simple Mobile suite.

    1. in many ways, Law 25 is the most stringent of the three regimes
    2. Impact assessments: Law 25 is broad and requires a PIR to be carried out whenever conditions are met, regardless of the level of risk. The GDPR is less stringent, only requiring assessments in cases where processing is likely to result in a ‘high risk’ to rights and freedoms. Because the CCPA does not specifically focus on accountability-related obligations, it does not mandate impact assessments.
    3. Privacy by default: Bill 64’s “confidentiality by default” clause is far broader in scope and significantly more stringent than the “privacy by design” concept under the GDPR. The CCPA does not provide for this concept at all, instead taking an “after-the-event” remedial approach. 
    1. requires that a privacy impact assessment (AIPD) be carried out whenever the situation calls for it, regardless of the level of risk
    2. Privacy by default: Bill 64's "confidentiality by default" clause is far broader in scope and much stricter than the "privacy by design" concept under the GDPR. The CCPA instead takes an "after-the-fact" remedial approach.
    1. In the event of non-compliance with the Act, the Commission d’accès à l’information may impose significant penalties, which could reach up to $25 million or 4% of worldwide revenue. The penalty will be proportional to, among other things, the seriousness of the violation and the company’s ability to pay.
  4. Sep 2022
    1. Denmark’s data protection regulator found that local schools did not really understand what Google was doing with students’ data and as a result blocked around 8,000 students from using the Chromebooks that had become a central part of their daily education.

      Danish data regulator puts a temporary ban on Google education products

    1. If someone came up to you in the street, said they’re from an online service provider and requested you store all of the above data about you with them, I imagine for many the answer would be a resounding NO!
  5. Aug 2022
    1. University can’t scan students’ rooms during remote tests, judge rules

      An Ohio court ruled that room scans by proctoring tools violate the protection against unreasonable searches (U.S. Constitution, Fourth Amendment). The university's defense was that 'it's standard industry practice' and that 'others did not complain'. In other words, no actual moral consideration was made by the university. This is bad enough even before considering that a third party keeps the recording of the video scan of this student's bedroom.

      Where there's a need for remote test taking, why try to copy over the cheating controls from in-person test taking? How about adapting the test content on the assumption that students will have material available to them during the test, reducing the proctoring need to merely assuring that the actual student is the one taking the test?

    1. Your personal data will be shared both within Beekeeper associated offices globally and with Greenhouse Software, Inc., a cloud services provider located in the United States of America and engaged by Controller to help manage its recruitment and hiring process on Controller’s behalf.

      Personal data will be accessible to different branches (i.e., national affiliates) of Beekeeper.

    1. NETGEAR is committed to providing you with a great product and choices regarding our data processing practices. You can opt out of the use of the data described above by contacting us at analyticspolicy@netgear.com

      You may opt out of these data use situations by emailing analyticspolicy@netgear.com.

    2. Marketing. For example, information about your device type and usage data may allow us to understand other products or services that may be of interest to you.

      All of the information above that has been consented to can be used by NETGEAR to make money off consenting individuals and their families.

    3. USB device

      This gives NETGEAR permission to know what you plug into your computer, be it a FitBit, a printer, scanner, microphone, headphones, webcam — anything not built into your computer.

  6. Jul 2022
    1. In the wake of Roe v. Wade being overturned, online privacy is on everyone's minds. But according to privacy experts, the entire way we think about and understand what 'privacy' actually means... is wrong. In this new Think Again, NBC News Correspondent Andrew Stern dives deep into digital privacy — what it really means, how we got to this point, how it impacts every facet of our lives, and how little of it we actually have.


  7. www.mojeek.com
    1. Mojeek

      Mojeek is the 4th largest English-language web search engine after Google, Bing, and Yandex that has its own index, crawler, and algorithm. The index has passed 5.7 billion pages and is growing. Privacy-focused.

      It uses its own index with no backfill from others.

    1. she’s being dragged into the public eye nonetheless.

      Wow... I have heard of instances similar to this one: a stranger narrates the life of another stranger online, it goes viral, and the identities of everyone are revealed. To me it seemed meaningless; I never thought that the people involved could just want their privacy. I find it very scary how this can happen to anyone. Another reason why I limit my social media usage: I found myself engaging with very negative commentary and it really affected my mental health. I wonder how this woman's mental health is going. Pretty much the whole world knows about her now, unfortunately.

    1. Something has shifted online: We’ve arrived at a new era of anonymity, in which it feels natural to be inscrutable and confusing—forget the burden of crafting a coherent, persistent personal brand. There just isn’t any good reason to use your real name anymore. “In the mid 2010s, ambiguity died online—not of natural causes, it was hunted and killed,” the writer and podcast host Biz Sherbert observed recently. Now young people are trying to bring it back. I find this sort of exciting, but also unnerving. What are they going to do with their newfound freedom?
  8. Jun 2022
    1. Companies need to actually have an ethics panel, and discuss what the issues are and what the needs of the public really are. Any ethics board must include a diverse mix of people and experiences. Where possible, companies should look to publish the results of these ethics boards to help encourage public debate and to shape future policy on data use.

    1. The goal is to gain “digital sovereignty.”

      the age of borderless data is ending. What we're seeing is a move to digital sovereignty

    1. Using the network-provided DNS servers is the best way to blend in with other users. Networks and web sites can fingerprint and track users based on a non-default DNS configuration.
    1. All wireless devices have small manufacturing imperfections in the hardware that are unique to each device. These fingerprints are an accidental byproduct of the manufacturing process. These imperfections in Bluetooth hardware result in unique distortions, which can be used as a fingerprint to track a specific device. For Bluetooth, this would allow an attacker to circumvent anti-tracking techniques such as constantly changing the address a mobile device uses to connect to Internet networks. 

      Tracking that evades address changes

      An operating system can change the hardware address it broadcasts to avoid tracking. But subtle differences in the signal itself can still be identified and tracked.

    1. Free public projects; private projects starting at $9/month per project

      For many tools and apps, payment for privacy is becoming the norm.

      Examples:
      - Kumu.io
      - Github for private repos
      - ...

      Pros:
      - helps to encourage putting things into the commons

      Cons:
      - normalizes the idea of payment for privacy, which can be a toxic tool

      discuss...

    1. the one thing that you have to keep conveying to people about the consequences of surveillance is that it's all very well to say that you have nothing to hide, but when you're spied upon, everybody that's connected to you gets spied upon. And if we don't push back, the most vulnerable people in society, the people that actually keep really massive violations of human rights and illegality in check, they're the people who get most affected.

      "I Have Nothing To Hide" counter-argument

      Even if you have nothing to hide, that doesn't mean that those you are connected with aren't also being surveilled and are part of targeted communities.

  9. May 2022
    1. For example, we know one of the ways to make people care about negative externalities is to make them pay for it; that’s why carbon pricing is one of the most efficient ways of reducing emissions. There’s no reason why we couldn’t enact a data tax of some kind. We can also take a cautionary tale from pricing externalities, because you have to have the will to enforce it. Western Canada is littered with tens of thousands of orphan wells that oil production companies said they would clean up and haven’t, and now the Canadian government is chipping in billions of dollars to do it for them. This means we must build in enforcement mechanisms at the same time that we’re designing principles for data governance, otherwise it’s little more than ethics-washing.

      Building in pre-payments or a tax on data leaks, to prevent companies from neglecting negative externalities, could be an important stick in government regulation.

      While it should apply across the board, it should be particularly onerous for for-profit companies.

    2. Even with data that’s less fraught than our genome, our decisions about what we expose to the world have externalities for the people around us.

      We need to think more about the externalities of our data decisions.

  10. Apr 2022
    1. This Playbuzz Privacy Policy (“Policy”) outlines what personal information is collected by Playbuzz Ltd. (“Playbuzz”, “we”, “us” or “our”), how we use such personal information, the choices you have with respect to such personal information, and other important information.

      We keep your personal information personal and private. We will not sell, rent, share, or otherwise disclose your personal information to anyone except as necessary to provide our services or as otherwise described in this Policy.

    1. Dorothea Salo (2021) Physical-Equivalent Privacy, The Serials Librarian, DOI: 10.1080/0361526X.2021.1875962

      Permanent Link: http://digital.library.wisc.edu/1793/81297

      Abstract

      This article introduces and applies the concept of “physical-equivalent privacy” to evaluate the appropriateness of data collection about library patrons’ use of library-provided e‑resources. It posits that as a matter of service equity, any data collection practice that causes e‑resource users to enjoy less information privacy than users of an information-equivalent print resource is to be avoided. Analysis is grounded in real-world e‑resource-related phenomena: secure (HTTPS) library websites and catalogs, the Adobe Digital Editions data-leak incident of 2014, and use of web trackers on e‑resource websites. Implications of physical-equivalent privacy for the SeamlessAccess single-sign-on proposal will be discussed.

    1. a child had gone missing in our town and the FBI came to town to investigate immediately and had gone to the library. They had a tip and wanted to seize and search the library’s public computers. And the librarians told the FBI that they needed to get a warrant. The town was grief stricken and was enraged that the library would, at a time like that, demand that the FBI get a warrant. Like everyone in town was like, are you kidding me? A child is missing and you’re– and what? This town meeting afterwards, the library budget, of course, is up for discussion as it is every year, and the people were still really angry with the library, but a patron and I think trustee of the library – again, a volunteer, someone living in town – an elderly woman stood up and gave the most passionate defense of the Fourth Amendment and civil liberties to the people on the floor that I have ever witnessed.

      An example of how a library in Vermont stood up to a warrantless request from the FBI to seize and search public library computers. This could have impacted the library's budget when the issue was brought to a town meeting, but a library patron was a passionate advocate for the 4th amendment.

    1. K-Anonymity, L-Diversity, and T-Closeness: In this section, I will introduce three techniques that can be used to reduce the probability that certain attacks can be performed. The simplest of these methods is k-anonymity, followed by l-diversity, and then followed by t-closeness. Other methods have been proposed to form a sort of alphabet soup, but these are the three most commonly utilized. With each of these, the analysis that must be performed on the dataset becomes increasingly complex and undeniably has implications on the statistical validity of the dataset.

      privacy metrics
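      These metrics are easy to make concrete: k-anonymity is the size of the smallest group of rows sharing the same quasi-identifier values, and l-diversity is the smallest number of distinct sensitive values within any such group. A minimal sketch (the toy records and column names are invented for illustration):

```python
from collections import Counter

def k_anonymity(rows, quasi_identifiers):
    """Size of the smallest equivalence class over the quasi-identifiers.
    The dataset is k-anonymous for any k up to this value."""
    classes = Counter(tuple(r[c] for c in quasi_identifiers) for r in rows)
    return min(classes.values())

def l_diversity(rows, quasi_identifiers, sensitive):
    """Smallest number of distinct sensitive values in any equivalence class."""
    groups = {}
    for r in rows:
        key = tuple(r[c] for c in quasi_identifiers)
        groups.setdefault(key, set()).add(r[sensitive])
    return min(len(values) for values in groups.values())

# Toy, already-generalized records (zip truncated, age bucketed).
records = [
    {"zip": "537**", "age": "20-29", "diagnosis": "flu"},
    {"zip": "537**", "age": "20-29", "diagnosis": "asthma"},
    {"zip": "537**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "537**", "age": "30-39", "diagnosis": "diabetes"},
]
print(k_anonymity(records, ["zip", "age"]))              # 2
print(l_diversity(records, ["zip", "age"], "diagnosis")) # 2
```

      t-closeness goes further by comparing each group's distribution of sensitive values to the distribution in the whole table, which requires choosing a distance measure and is omitted here.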

    1. Privacy is not secrecy. A private matter is something one doesn't want the whole world to know, but a secret matter is something one doesn't want anybody to know.

      Privacy is the power to decide when and what is kept secret, and from whom.

    1. I thought that the point of disappearing messages was to eat your cake and have it too, by allowing you to send a message to your adversary and then somehow deprive them of its contents. This is obviously a stupid idea. But the threat that Snapchat — and its disappearing message successors — was really addressing wasn’t communication between untrusted parties, it was automating data-retention agreements between trusted parties.

      Why use a disappearing message service

      The point of a disappearing message service is to have the parties to the message agree on the data-retention provisions of a message. The service automates that agreement by deleting the message at the specified time. The point isn't to send a message to an adversary and then delete it so they can't prove that it has been sent. There are too many ways of capturing the contents of a message—as simple as taking a picture of the message with another device.

    1. Weinberg’s tweet announcing the change generated thousands of comments, many of them from conservative-leaning users who were furious that the company they turned to in order to get away from perceived Big Tech censorship was now the one doing the censoring. It didn’t help that the content DuckDuckGo was demoting and calling disinformation was Russian state media, whose side some in the right-wing contingent of DuckDuckGo’s users were firmly on.

      There is an odd sort of self-selected information bubble here. DuckDuckGo promoted itself as privacy-aware, not unfiltered. On their Sources page, they talk about where they get content and how they don't sacrifice privacy to gather search results. Demoting disinformation sources in their algorithms would seem to be a good thing. Except if what you expect to see is disinformation, and then suddenly the search results don't match your expectations.

  11. Mar 2022
    1. Thus, information about people and their behaviour is made visible to other people, systems and companies.

      "Data trails"—active information and passive telemetry—provide a web of details about a person's daily life, and the analysis of that data is a form of knowledge about a person.

  12. Feb 2022
    1. Others because they want privacy

      AIUI, your account's contribution graph and feed are still public, not private, without a way to opt out—just like on GitHub.

  13. Dec 2021
    1. Efforts to clarify and disseminate the differences between “privacy as advocacy” (e.g., privacy is a fundamental right; privacy is an ethical norm) and “privacy as compliance” (e.g., ensuring privacy policies and laws are followed; privacy programs train, monitor, and measure adherence to rules) help frame conversations and set expectations.

      This is an interesting distinction... privacy-because-it-is-the-right-thing-to-do versus privacy-because-you-must. I think the latter is where most institutions are today. It will take a lot more education to get institutions to the former.

    2. As informed and engaged stakeholders, students understand how and why their institutions use academic and personal data.

      Interesting that there is a focus here on advocacy from an active student body. Is it the expectation that change from some of the more stubborn areas of the campus would be driven by informed student push-back? This section on "Students, Faculty, and Staff" doesn't have the same advocacy role from the other portions of the campus community.

    1. Questions, comments and requests, including any complaints, regarding us or this privacy policy are welcomed and should be addressed to privacy@marugroup.net.

      However, if you do this then your email, IP, browser, etc. will be collected and shared as per the information above. To be safer, I would write a letter, stick a stamp on the envelope, and send it in.

    2. stored at, a destination outside the European Economic Area ("EEA").

      Why? Is that allowed? I don't think that I would be happy about that, as I am not reassured that 'taking reasonable steps' is actually appropriate; one of those steps would be to host within the regions specified by the GDPR.

    3. third party, in which case personal data held by it about its customers will be one of the transferred assets.

      I was going to respond to the survey until I saw this. I am offering to provide feedback for free and yet my personal information is collected and becomes part of the sale of the business in the form of an asset. The question is why is my personal information being held for any length of time after I have completed the survey? Isn't that a violation of GDPR?

    4. In the event that we sell or buy any business or assets, in which case we will disclose your personal data to the prospective seller or buyer of such business or assets.

      Why? I came across this privacy policy because I had been asked to respond to a survey about the website. Not only am I giving my feedback for free, but then they want to take my personal information and give it away to unknown future buyers and sellers (third parties)?

    1. About 7 in 10 Americans think their phone or other devices are listening in on them in ways they did not agree to.

      I'm enough of a tinfoil hat wearer to think this might be true. Especially since my Google Home talks to me entirely too much when I'm not talking to it.

  14. Nov 2021
    1. There Is No Antimimetics Division (qntm): This is the best new sci fi I've read in recent memory, I think because it feels fresh and modern, tackling some of the hardest social questions that the world is facing today. It's about antimemes, defined as "an idea with self-censoring properties...which, by its intrinsic nature, discourages or prevents people from spreading it."

      I like the idea of antimemes. The tougher question is how to actually implement it on the web?

      Is this just the idea of a digital secret?

      "The only way for two computers to keep a secret on the web is if all the computers are dead."—Chris Aldrich

    1. Pretty much anything that can be remembered can be cracked. There’s still one scheme that works. Back in 2008, I described the “Schneier scheme”: So if you want your password to be hard to guess, you should choose something that this process will miss. My advice is to take a sentence and turn it into a password. Something like “This little piggy went to market” might become “tlpWENT2m”. That nine-character password won’t be in anyone’s dictionary. Of course, don’t use this one, because I’ve written about it. Choose your own sentence — something personal.

      Good advice on creating secure passwords.
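      The mechanical part of the scheme (first letters plus a few word substitutions) is easy to sketch; Schneier's point is that the personal, irregular tweaks you add by hand are what keep the result out of cracking dictionaries. A toy sketch, with an invented sentence and substitution table:

```python
def sentence_to_password(sentence, substitutions=None):
    """Turn a memorable sentence into a mnemonic password by taking the
    first letter of each word, with a few word-level substitutions.
    A toy sketch of the 'Schneier scheme'; for real use, add personal,
    irregular tweaks that no cracking script can predict."""
    substitutions = substitutions or {"to": "2", "for": "4", "and": "&"}
    parts = []
    for word in sentence.split():
        stripped = word.strip(".,!?")
        if not stripped:
            continue  # skip bare punctuation
        # Substitute whole words where a rule exists, else keep the
        # first character (preserving its original case).
        parts.append(substitutions.get(stripped.lower(), stripped[0]))
    return "".join(parts)

print(sentence_to_password("When I was seven we moved to 42 Elm Street"))
# WIwswm24ES
```

      As in the quoted advice, the sentence itself must be personal and never published; anything mechanical enough to be fully reproduced by a script like this one is also mechanical enough to be guessed.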

    1. ISO 29100/Privacy Framework [2] defines the privacy principles as: 1. Consent and choice, 2. Purpose legitimacy and specification, 3. Collection limitation, 4. Data minimization, 5. Use, retention and disclosure limitation, 6. Accuracy and quality, 7. Openness, transparency and notice, 8. Individual participation and access, 9. Accountability, 10. Information security, and 11. Privacy compliance.
  15. Oct 2021
    1. A screenshot from the document providing an overview of different data retention periods. Image: Motherboard.

      Is it possible that the FBI stores this data on us?

    1. We will also show you how to de-link your Chrome profile from your Google account(s) by stopping Chrome from syncing with Google in the first place. This will help keep your Chrome profile separate from your Google account and enhance your online privacy.
    2. To do that, Chrome automatically links your Chrome profile to a Google account when you sign in to any Google service on the web. That helps Google deliver a ‘seamless experience’ across all devices by letting you sync your history, bookmarks, passwords, etc., across multiple devices. Meanwhile, privacy-conscious users see this as a major threat to their online privacy and advise users to remove their Google account from Chrome.
    3. As mentioned already, Chrome automatically signs you in to your Google account every time you sign into a Google service, like Gmail, YouTube, Google Photos, etc. It also links your current Chrome profile to that account. While Google says that it does so to offer a ‘seamless experience’, it is a privacy nightmare for many users.
  16. Sep 2021
  17. Aug 2021
    1. You can request that Zoom delete any and all information they hold on you. Information on your data rights and how to get in contact with Zoom to request they erase your data can be found in their privacy policy. Once you have made the request, follow up to ensure you get confirmation that your data has been removed from their servers.
    1. U.S. Senate Subcommittee on Communications, Technology, Innovation, and the Internet, "Optimizing for Engagement: Understanding the Use of Persuasive Technology on Internet Platforms," 25 June 2019, www.commerce.senate.gov/2019/6/optimizing-for-engagement-understanding-the-use-of-persuasive-technology-on-internet-platforms.

      Perhaps we need plurality in the areas for which social data are aggregated?

      What if we didn't optimize for engagement, but optimized for privacy, security, or other axes in the space?

  18. Jul 2021
    1. whereas now, they know that user@domain.com was subscribed to xyz.net at some point and is unsubscribing. Information is gold. Replace user@domain with abcd@senate and xyz.net with warezxxx.net and you've got tabloid gold.
    1. Roberts noted that the risks of physical danger to donors are heightened “with each passing year” as changes in technology enables “anyone with access to a computer” to “compile a wealth of information about” anyone.

      He's going to be shocked at what's in his Facebook (shadow) profile...

    1. consumer friendly

      Including the "consumer" here is a red herring. We're meant to identify as the consumer and so take from this statement that our rights and best interests have been written into these BigTech-crafted laws.

      But a "consumer" is different from a "citizen," a "person," we the people.

    2. passage in March of a consumer data privacy law in Virginia, which Protocol reported was originally authored by Amazon

      From the article:

      Marsden and Virginia delegate Cliff Hayes held meetings with other large tech companies, including Microsoft; financial institutions, including Capital One; and non-profit groups and small businesses...

      Which all have something at stake here: the ability to monitor people and mine their data in order to sell it.

      Weak privacy laws give the illusion of privacy while maintaining the corporate panopticon.

    3. consumers would have to opt out of rather than into tracking

      Example of a dark pattern.

  19. Jun 2021
    1. But after using it for a few days you quickly realize that there is one major privacy issue that has been installed consciously by Amazon and Ring. The Ring app allows you to delete videos on the system but it does not allow you to delete motion sensor and window sensor history. So Amazon/Ring knows everything that happens inside your home and there is no way for you to delete that history. They know when you’re inside, they know when you open your door, they know when you closed it, etc. etc. etc. So they essentially know everything about you and your motions within your home. This is a major privacy issue. And it is not some mistake that was overlooked. This was a conscious choice on Amazon/Ring’s part to track the motions of you and your family inside your own home. I spoke with the customer service rep from Ring and she admitted that many, many people call up and complain that they can’t delete sensor history. Of course it would’ve been much more ethical to explain to potential customers BEFORE they buy Ring products that this breach of privacy has been installed. But Amazon/Ring does not warn their customers about this privacy breach. They don’t warn customers because they created the privacy breach and will continue to always have very personal information on the motions of your family inside your own home. If you care about your privacy, don’t buy Ring products.
    1. That they should be subject to regulation is not some radical position; it is the position expressed before the US Congress by Fb's founder and owner Mark Zuckerberg: “My position is not that there should be no regulation. I believe the real question, as the internet becomes ever more important in people’s lives, is what is the right way to regulate, not whether regulation is necessary.”

      Τσακαλώτος στα καλύτερά του, επιχειρηματολογέι εναντια στην ιδεολογία της ιδιώτευσης στο Fb.

    1. Yet books are curious objects: their strength is to be both intensely private and intensely social — and marginalia is a natural bridge between these two states.

      Books represent a dichotomy in being both intensely private and intensely social at the same time.

      Are there other objects that have this property?

      Books also have the quality of providing people with identities.

  20. May 2021
    1. <small><cite class='h-cite via'> <span class='p-author h-card'>jenny (phire) zhang</span> in jenny (phire) zhang on Twitter: "@markpopham the OSS/indieweb world falls into this trap a lot imo!! thank you for reading <3" / Twitter (<time class='dt-published'>05/06/2021 07:20:50</time>)</cite></small>

    2. In 1962, a book called Silent Spring by Rachel Carson documenting the widespread ecological harms caused by synthetic pesticides went off like a metaphorical bomb in the nascent environmental movement.

      Where is the Silent Spring in the data, privacy, and social media space?

    3. Amidst the global pandemic, this might sound not dissimilar to public health. When I decide whether to wear a mask in public, that’s partially about how much the mask will protect me from airborne droplets. But it’s also—perhaps more significantly—about protecting everyone else from me. People who refuse to wear a mask because they’re willing to risk getting Covid are often only thinking about their bodies as a thing to defend, whose sanctity depends on the strength of their individual immune system. They’re not thinking about their bodies as a thing that can also attack, that can be the conduit that kills someone else. People who are careless about their own data because they think they’ve done nothing wrong are only thinking of the harms that they might experience, not the harms that they can cause.

      What lessons might we draw from public health and epidemiology to improve our privacy lives in an online world? How might we wear social media "masks" to protect our friends and loved ones from our own posts?

    4. In an individual model of privacy, we are only as private as our least private friend.

      So don't have any friends?

      Obviously this isn't a thing, but the implications of this within privacy models can be important.

      Are there ways to create this as a ceiling instead of as a floor? How might we use topology to flip this script?

    5. 130 years on, privacy is still largely conceived of as an individual thing, wherein we get to make solo decisions about when we want to be left alone and when we’re comfortable being trespassed upon.

      How could one design a mathematical balancing system to help individuals embedded within a variety of societies or publics enforce a balance of levels of privacy?

      • There's the interpersonal level between the individuals
      • There's the person's individual privacy and the public's reaction/response to the thing captured, for which the public may shun or not
      • There's the taker's rights (possibly a journalist or news outlet) to inform the broader public, which may shame or not
      • There's the public's potential right to know; the outcome may affect them or dramatically change society as a whole
      • other facets?
      • how many facets?
      • how to balance all these to create an optimum outcome for all parties?
      • What might the right to be forgotten look like, and how would it be enforced?
      • How do economic incentives play out (paparazzi, journalism, social media, etc.)?
    1. Draft notes, E-mail, plans, source code, to-do lists, what have you

      The personal nature of this information means that users need control of their information. Tim Berners-Lee's Solid (Social Linked Data) project looks like it could do some of this stuff.

    1. The seminal 1890 Harvard Law Review article The Right to Privacy—which every essay about data privacy is contractually obligated to cite—argued that the right of an individual to object to the publication of photographs ought to be considered part of a general ‘right to be let alone’.

      <small><cite class='h-cite via'> <span class='p-author h-card'>Jenny</span> in left alone, together | The Roof is on Phire (<time class='dt-published'>05/08/2021 18:32:41</time>)</cite></small>

      See also: https://en.wikipedia.org/wiki/The_Right_to_Privacy_(article)

    1. “For one of the most heavily guarded individuals in the world, a publicly available Venmo account and friend list is a massive security hole. Even a small friend list is still enough to paint a pretty reliable picture of someone's habits, routines, and social circles,” Gebhart said.

      Massive how? He's such a public figure that most of these connections are already widely reported in the media or easily guessable by a private investigator. The bigger issue is the related transaction data, which might open them up to other abuses or potential leverage, as in the other examples.

    1. Although I believe people have a right to secure and private communication, I disagree with those who extrapolate from this that we have a right to anonymous property transfer. It’s totally in the public’s legitimate interest to keep track of who owns what, and to settle which transfers of ownership are legitimate, for instance by disallowing coerced ones.

      I found this thought helpful. I had feelings like this but could not articulate them before.

  21. Apr 2021
    1. People can take the conversations with willing co-workers to Signal, Whatsapp, or even a personal Basecamp account, but it can't happen where the work happens anymore.

      Do note that two of the three systems Fried uses as examples are private. In other words, only the people you explicitly want to see what you're writing will see just that.

      This goes against his previous actions somewhat, e.g. https://twitter.com/jasonfried/status/1168986962704982016

  22. Mar 2021
    1. Not only are these websites breaking my trust—when I visit your website, I entered into a contract with you, not 80 other websites—but they are loading content from websites they neither know nor trust. Some of which have been known to spread malware.

      The contract of a healthy community: basic respect for one another.

    1. a data donation platform that allows users of browsers to donate data on their usage of specific services (eg Youtube, or Facebook) to a platform.

      This seems like a really promising pattern for many data-driven problems. Browsers can support opt-in donation to contribute their data to improve Web search, social media, recommendations, lots of services that implicitly require lots of operational data.

    2. The idea is that many smaller tech companies would allow for more choice between services. This solution is flawed. For one, services like search or social media benefit from network effects. Having large datasets to train on, means search recommendations get better. Having all your friends in one place, means you don’t need five apps to contact them all. I would argue those are all things we like and might lose when Big Tech is broken up. What we want is to be able to leave Facebook and still talk to our friends, instead of having many Facebooks.

      I'd be interested to better understand this concern or critique. I think the goal of smaller, interoperable services is exactly the idea of being able to communicate with our Facebook friends even if we leave Facebook. Perhaps that is an argument for combining deconsolidation with interoperability.

    1. Our new feature, Total Cookie Protection, works by maintaining a separate “cookie jar” for each website you visit. Any time a website, or third-party content embedded in a website, deposits a cookie in your browser, that cookie is confined to the cookie jar assigned to that website, such that it is not allowed to be shared with any other website.
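      The partitioning Mozilla describes can be modeled as a two-level map keyed by the top-level site the user is visiting. A minimal sketch (hypothetical class and method names, not Firefox's actual implementation):

```python
# Minimal model of per-site "cookie jars": a cookie set by an embedded
# third party is stored under the top-level site that embedded it, so
# the same tracker sees a different, empty jar on every other site.
from collections import defaultdict

class PartitionedCookieStore:
    def __init__(self):
        # jars[top_level_site][cookie_origin][name] = value
        self.jars = defaultdict(lambda: defaultdict(dict))

    def set_cookie(self, top_level_site, cookie_origin, name, value):
        self.jars[top_level_site][cookie_origin][name] = value

    def get_cookie(self, top_level_site, cookie_origin, name):
        return self.jars[top_level_site][cookie_origin].get(name)

store = PartitionedCookieStore()
# tracker.example sets a cookie while embedded in site-a.example...
store.set_cookie("site-a.example", "tracker.example", "id", "abc123")
# ...which is visible again on site-a.example,
assert store.get_cookie("site-a.example", "tracker.example", "id") == "abc123"
# but not when the same tracker is embedded in site-b.example.
assert store.get_cookie("site-b.example", "tracker.example", "id") is None
```

      The key point of the design is that the tracker's origin alone is no longer a sufficient lookup key: without a shared jar across top-level sites, a third party cannot correlate a user's visits between them.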
  23. Feb 2021
  24. www.joinhoney.com
    1. Honey does not track your search engine history, emails, or your browsing on any site that is not a retail website (a site where you can shop and make a purchase). When you are on a pre-approved retail site, to help you save money, Honey will collect information about that site that lets us know which coupons and promos to find for you. We may also collect information about pricing and availability of items, which we can share with the rest of the Honey community.
    1. (F)unctional Sifting: A Privacy-Preserving Reputation System Through Multi-Input Functional Encryption (extended version)
  25. Jan 2021
    1. Despite some implementation challenges, patient portals have allowed millions of patients to access their medical records, read physicians’ notes, message providers, and contribute valuable information and corrections.

      I wonder if patients can edit - or at least flag - information in their records?

    1. In our bedrooms, we want to have power over who has access to us; in our bathrooms, we just want others deprived of that access.

      Reiman highlights two types of privacy.

      The privacy we want to have in the bathroom, which is the power to deprive others of access to us.

      And the privacy we want to have in the bedroom, which is the power to control who has access to us.

    2. By privacy, I understand the condition in which other people are deprived of access to either some information about you or some experience of you. For the sake of economy, I will shorten this and say that privacy is the condition in which others are deprived of access to you.

      Reiman defines privacy as the condition in which others are deprived of access to you (information (e.g. location) or experience (e.g. watching you shower))

    3. No doubt privacy is valuable to people who have mischief to hide, but that is not enough to make it generally worth protecting. However, it is enough to remind us that whatever value privacy has, it also has costs. The more privacy we have, the more difficult it is to get the information that

      Privacy is valuable to people who have mischief to hide. This is not enough to make it worth protecting, but it tells us that there is also a cost.

    1. As you already noticed, the extension does not go in and manipulate the hrefs/urls in the DOM itself. While it may seem scary to you that an extension may manipulate a URL you're navigating to in-flight, I think it's far scarier to imagine an extension reading and manipulating all of the HTML on all of the pages you go to (bank accounts, utilities, crypto, etc.) in order to provide a smidgeon of privacy for the small % of times you happen to click a link with some UTM params.
  26. Dec 2020
    1. I haven't met anyone who makes this argument who then says that a one-stop convenient, reliable, private and secure online learning environment can’t be achieved using common everyday online systems

      Reliable: As a simple example, I'd trust Google to maintain data reliability over my institutional IT support.

      And you'd also need to make the argument for why learning needs to be "private", etc.

    1. And then there was what Lanier calls “data dignity”; he once wrote a book about it, called Who Owns the Future? The idea is simple: What you create, or what you contribute to the digital ether, you own.

      See Tim Berners-Lee's SOLID project.

    1. “Being under constant surveillance in the workplace is psychological abuse,” Heinemeier Hansson added. “Having to worry about looking busy for the stats is the last thing we need to inflict on anyone right now.”

      I really like the Basecamp approach (I forget where I heard this...could have been in one of the Rework podcasts):

      Don't try to get the most out of everyone; try to get the best out of them.

      If you're looking for ways to build trust in a team, I can't recommend the following books published by Basecamp enough:

      • Rework
      • Remote
      • It doesn't have to be crazy at work
    2. For example, to help maintain privacy and trust, the user data provided in productivity score is aggregated over a 28-day period.

      So the fact that the metrics are collected over a 28-day period is meant to maintain privacy and trust. How?

    1. Recent patent filings show that Microsoft has been exploring additional ideas to monitor workers in the interest of organizational productivity. One filing describes a “meeting insight computing system” that would generate a quality score for a meeting using data such as body language, facial expressions, room temperature, time of day, and number of people in the meeting.

      So this will require that you have video turned on. How will they sell this to employees? "You need to turn your video on so that the algorithm can generate an accurate meeting quality score using your body language and facial expression."

      Sounds perfect. Absolutely no concerns about privacy violations, etc. in this product.

  27. Nov 2020
    1. Online Exams & Proctoring (In Addition to Guidance Listed Above)

      Requiring students to turn on their camera to be watched or recorded at home during an exam poses significant privacy concerns and should not be undertaken lightly. Several proctoring services use machine learning, AI, eye-tracking, key-logging, and other technologies to detect potential cheating; these should be used only when no feasible alternatives exist. If instructors are using a proctoring service during the COVID-19 measures, they must provide explicit notice to the students before the exam. Instructors are encouraged to work with the Digital Learning Hub in the Commons and the Academic Integrity Office to consider privacy-protective options, including how to use question banks (in Canvas), that will uphold integrity and good assessment design.

      Proctors and instructors are strongly discouraged from requiring students to show their surroundings on camera. Computers are available in labs for students who do not have a computer to take their final exams. Finals CANNOT be held in a lab; that is, instructors cannot be present, nor can students from a specific class be asked to gather there for a final. This is only for those students who need a computer to drop in and complete their exam.
    1. anonymous imageboard

      4chan is reasonably unique in the current online landscape, in that it permits conversation by totally anonymous users. This allows its users to post without much thought about their privacy status, which they often take for granted. This unique level of privacy fostered by anonymity, in a way, partially delivers on the Cyberspace rhetoric of the 1990s in that people can't be judged by their physical identities unless they offer identifying information up themselves. That's not to say that 4chan is a welcoming space for all (or even most) users, though, as it has been acknowledged, even later here in Ellis' article, that 4chan houses plenty of white supremacist tendencies, but, strictly speaking, as far as one's ideas go, they are judged purely based on their merit so long as no additional personal identifiers are offered. As Dillon Ludemann notes in his paper, /pol/emics: Ambiguity, scales, and digital discourse on 4chan, white supremacy, as well as other, "practiced and perceived deviancy is due to the default blanket of anonymity, and the general discourse of the website encourages users to remain unnamed. This is further enforced and embodied as named users, colloquially known as 'namefags,' are often vilified for their separation from the anonymous collective community" (Ludemann, 2018).

      Hypothetically, since all users start out as anonymous, one could also present their identity however they so please on the platform, and in theory what this means is that the technology behind the site promotes identity exploration (and thus cyberspace rhetoric), even though in practice, what most users experience is latent racism that depends on users' purposefully offered identifying information or generalized white supremacist posts that are broadcasted for all on the site to see.

      Work Cited:

      Ludemann, D. (2018). /pol/emics: Ambiguity, scales, and digital discourse on 4chan. Discourse, Context & Media, 24, 92-98. doi: 10.1016/j.dcm.2018.01.010