732 Matching Annotations
  1. Jan 2020
    1. “Privacy in law means various things,” he writes; “and one of the things it means is protection from intrusion.” He argues that in advertising, open performance, and public-address systems, “these may validly be regulated” to prevent porn from being thrust upon the unsuspecting and unwilling. It is an extension of broadcast regulation. And that is something we grapple with still: What is shown to us, whether we want it shown to us, and how it gets there: by way of algorithm or editor or bot. What is our right not to see?

      Privacy as "freedom from" is an important idea. I like it.

    1. Now, Google has to change its practices and prompt users to choose their own default search engine when setting up a European Android device that has the Google Search app built in. Not all countries will have the same options, however, as the choices included in Google’s new prompt all went to the highest bidders. As it turns out, DuckDuckGo must have bid more aggressively than other Google competitors, as it’s being offered as a choice across all countries in the EU.
    1. A Microsoft programme to transcribe and vet audio from Skype and Cortana, its voice assistant, ran for years with “no security measures”, according to a former contractor who says he reviewed thousands of potentially sensitive recordings on his personal laptop from his home in Beijing over the two years he worked for the company.

      Wonderful. This, combined with the fact that Skype users can—fairly easily—find out which contacts another person has, is horrifying.

      Then again, most people know that Microsoft have colluded with American authorities to divulge chat/phone history for a long time, right?

  2. Dec 2019
    1. any protester who brings a phone to a public demonstration is tracked and that person’s presence at the event is duly recorded in commercial datasets. At the same time, political parties are beginning to collect and purchase phone location for voter persuasion.
    2. “Privacy needs to start being seen as a human right.”
    1. I understand that GitHub uses "Not Found" where it means "Forbidden" in some circumstances to prevent inadvertently revealing the existence of a private repository.

      "Requests that require authentication will return 404 Not Found, instead of 403 Forbidden, in some places. This is to prevent the accidental leakage of private repositories to unauthorized users." --GitHub

      This is a fairly common practice around the web; indeed, it is defined:

      "The 404 (Not Found) status code indicates that the origin server did not find a current representation for the target resource or is not willing to disclose that one exists." --6.5.4. 404 Not Found, RFC 7231 HTTP/1.1 Semantics and Content (emphasis mine)
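      A minimal sketch of that pattern, written as an illustrative Express-style handler rather than GitHub's actual code (the in-memory repo store and auth helpers below are hypothetical stand-ins):

      const express = require("express");
      const app = express();

      // Hypothetical data and helpers, only to keep the sketch self-contained.
      const repos = { "alice/secret": { owner: "alice", name: "secret", private: true } };
      const findRepo = (owner, name) => repos[`${owner}/${name}`];
      const authenticate = (req) => (req.get("Authorization") === "token alice" ? "alice" : null);
      const canRead = (user, repo) => !repo.private || user === repo.owner;

      app.get("/repos/:owner/:name", (req, res) => {
        const repo = findRepo(req.params.owner, req.params.name);
        const user = authenticate(req);
        if (!repo || (repo.private && !canRead(user, repo))) {
          // Both "doesn't exist" and "exists but you may not see it" return 404:
          // a 403 here would confirm that a private repository exists.
          return res.sendStatus(404);
        }
        res.json(repo);
      });

      app.listen(3000);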
    1. Google found 1,494 device identifiers in SensorVault, sending them to the ATF to comb through. In terms of numbers, that’s unprecedented for this form of search. It illustrates how Google can pinpoint a large number of mobile phones in a brief period of time and hand over that information to the government
    2. Google found 1,494 device identifiers in SensorVault, sending them to the ATF to comb through. In terms of numbers, that’s unprecedented for this form of search. It illustrates how Google can pinpoint a large number of mobile phones in a brief period of time and hand over that information to the government
  3. Nov 2019
    1. Loading this iframe allows Facebook to know that this specific user is currently on your website. Facebook therefore knows about user browsing behaviour without user’s explicit consent. If more and more websites adopt Facebook SDK then Facebook would potentially have user’s full browsing history! And as with “With great power comes great responsibility”, it’s part of our job as developers to protect users privacy even when they don’t ask for.
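      One way for developers to act on that responsibility, sketched below: don't inject a third-party SDK script (and its hidden iframe) until the visitor has explicitly opted in. The button id, storage key and SDK URL here are illustrative placeholders, not Facebook's documented embed code.

      // Load a third-party SDK only after explicit opt-in, so the vendor's servers
      // are never contacted for visitors who haven't consented.
      function loadThirdPartySdk(src) {
        const script = document.createElement("script");
        script.src = src; // placeholder URL for the social SDK
        script.async = true;
        document.head.appendChild(script);
      }

      const SDK_URL = "https://third-party.example/sdk.js";                    // placeholder
      const consentButton = document.querySelector("#enable-social-widgets");  // hypothetical button

      consentButton.addEventListener("click", () => {
        localStorage.setItem("social-widgets-consent", "yes");
        loadThirdPartySdk(SDK_URL);
      });

      // On later visits, honour the stored choice.
      if (localStorage.getItem("social-widgets-consent") === "yes") {
        loadThirdPartySdk(SDK_URL);
      }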
    1. half of iPhone users don’t know there’s a unique ID on their phone (called an IDFA, for “identifier for advertisers”) tracking their app activity and sending it to third-party advertisers by default.
    1. How you use our services and your devices. This includes: call records containing phone numbers you call and receive calls from, websites you visit, text records, wireless location, application and feature usage, product and device-specific information and identifiers, router connections, service options you choose, mobile and device numbers, video streaming and video packages and usage, movie rental and purchase data, TV and video viewership, and other similar information.
    2. Demographic and interest data. For example, this information could include gender, age range, education level, sports enthusiast, frequent diner and other demographics and interests.
    3. Information from social media platforms. This may include interests, "likes" and similar information you permit social media companies to share in this way.
    4. Information from Verizon Media. For example, we may receive information from Verizon Media to help us understand your interests to help make our advertising more relevant to you.
    5. Learn about the information Verizon collects about you, your devices and your use of products and services we provide. We collect information when you interact with us and use our products and services. The types of information we collect depends on your use of our products and services and the ways that you interact with us. This may include information about: contact, billing and other information you provide; how you use our services and your devices; how you use our websites and apps; how our network and your devices are working; and the location of your wireless devices.

      Verizon Privacy Policy

    1. Google has confirmed that it partnered with health heavyweight Ascension, a Catholic health care system based in St. Louis that operates across 21 states and the District of Columbia.

      What happened to 'thou shalt not steal'?

    1. Found a @facebook #security & #privacy issue. When the app is open it actively uses the camera. I found a bug in the app that lets you see the camera open behind your feed.

      So, the Facebook app uses your camera even when you're not actively using it.

    1. Speaking with MIT Technology Review, Rohit Prasad, Alexa’s head scientist, has now revealed further details about where Alexa is headed next. The crux of the plan is for the voice assistant to move from passive to proactive interactions. Rather than wait for and respond to requests, Alexa will anticipate what the user might want. The idea is to turn Alexa into an omnipresent companion that actively shapes and orchestrates your life. This will require Alexa to get to know you better than ever before.

      This is some next-level onslaught.

    1. From Peg Cheechi, an instructional designer at Rush University: informing faculty members about the advantages of working with experts in course design.

      The Chronicle of Higher Education is a website and newspaper informing students and faculty of college affairs and news.

      Rating: 9/10

    1. An explosive trove of nearly 4,000 pages of confidential internal Facebook documentation has been made public, shedding unprecedented light on the inner workings of the Silicon Valley social networking giant.

      I can't even start telling you how much schadenfreude I feel at this. Even though this paints a vulgar picture, Facebook are still doing it, worse and worse.

      Talk about hiding in plain sight.

    1. Clear affirmative action means someone must take deliberate action to opt in, even if this is not expressed as an opt-in box. For example, other affirmative opt-in methods might include signing a consent statement, oral confirmation, a binary choice presented with equal prominence, or switching technical settings away from the default. The key point is that all consent must be opt-in consent – there is no such thing as ‘opt-out consent’. Failure to opt out is not consent. You may not rely on silence, inactivity, default settings, pre-ticked boxes or your general terms and conditions, or seek to take advantage of inertia, inattention or default bias in any other way.

      On opt in vs opt out in GDPR.
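      In code terms the difference is simple, as in this illustrative sketch (the form, checkbox id and consent endpoint are hypothetical): the checkbox is rendered unchecked, and consent is recorded only if the user deliberately ticks it; a pre-ticked box would be opt-out and would not count as consent.

      const sendToServer = (payload) =>
        fetch("/consent", { method: "POST", body: JSON.stringify(payload) }); // hypothetical endpoint

      const form = document.querySelector("#signup-form");          // hypothetical form
      const marketingBox = form.querySelector("#marketing-opt-in"); // checkbox rendered unchecked by default

      form.addEventListener("submit", () => {
        const consented = marketingBox.checked; // false unless the user ticked it themselves
        // Record the decision with a timestamp, for accountability.
        sendToServer({ marketingConsent: consented, recordedAt: new Date().toISOString() });
      });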

    1. Although the GDPR doesn’t specifically ban opt-out consent, the Information Commissioner’s Office (ICO) says that opt-out options “are essentially the same as pre-ticked boxes, which are banned”.

      On opt in vs opt out in GDPR.

    1. Somewhere in a cavernous, evaporative cooled datacenter, one of millions of blinking Facebook servers took our credentials, used them to authenticate to our private email account, and tried to pull information about all of our contacts. After clicking Continue, we were dumped into the Facebook home page, email successfully “confirmed,” and our privacy thoroughly violated.
    1. We are disturbed by the idea that search inquiries are systematically monitored and stored by corporations like AOL, Yahoo!, Google, etc. and may even be available to third parties. Because the Web has grown into such a crucial repository of information and our search behaviors profoundly reflect who we are, what we care about, and how we live our lives, there is reason to feel they should be off-limits to arbitrary surveillance. But what can be done?
    1. If the apparatus of total surveillance that we have described here were deliberate, centralized, and explicit, a Big Brother machine toggling between cameras, it would demand revolt, and we could conceive of a life outside the totalitarian microscope.
  4. Oct 2019
    1. Blockchain Memory. We let L be the blockchain memory space, represented as the hashtable L: {0,1}^256 → {0,1}^N, where N ≫ 256 and can store sufficiently-large documents. We assume this memory to be tamperproof under the same adversarial model used in Bitcoin and other blockchains. To intuitively explain why such a trusted data-store can be implemented on any blockchain (including Bitcoin), consider the following simplified, albeit inefficient, implementation: A blockchain is a sequence of timestamped transactions, where each transaction includes a variable number of output addresses (each address is a 160-bit number). L could then be implemented as follows - the first two outputs in a transaction encode the 256-bit memory address pointer, as well as some auxiliary meta-data. The rest of the outputs construct the serialized document. When looking up L[k], only the most recent transaction is returned, which allows update and delete operations in addition to inserts.

      This paragraph explains how a blockchain can protect an individual's identity and privacy while still giving them a secure way to use their funds. In my opinion, a lot of ransomware attacks are paid for with blockchain-based coins; this paragraph and one other here are really interesting reads on how blockchains can help protect personal data. I also relate this to hacking, corruption, and money laundering.
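      A minimal sketch of the memory abstraction the quoted paragraph describes, assuming we model the chain as a plain append-only list of timestamped transactions and ignore mining/consensus entirely (the class and method names are illustrative, not from the paper):

      // Append-only "blockchain memory" L: key (the 256-bit address pointer) -> document.
      // Reads return only the most recent transaction for a key, which is what makes
      // update and delete possible on top of an append-only log.
      class BlockchainMemory {
        constructor() {
          this.transactions = []; // stand-in for the tamper-proof ledger
        }
        write(key, doc) {
          this.transactions.push({ timestamp: Date.now(), key, doc });
        }
        read(key) { // L[k]: the most recent transaction wins
          for (let i = this.transactions.length - 1; i >= 0; i--) {
            if (this.transactions[i].key === key) return this.transactions[i].doc;
          }
          return null;
        }
        remove(key) {
          this.write(key, null); // a "delete" is just an update to an empty value
        }
      }

      const L = new BlockchainMemory();
      L.write("addr-1", { policy: "service may read location data" });
      L.write("addr-1", { policy: "access revoked" });
      console.log(L.read("addr-1")); // -> only the most recent value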

    1. Per Bloomberg, which cited a memo from an anonymous Google staffer, employees discovered that the company was creating the new tool as a Chrome browser extension that would be installed on all employees’ systems and used to monitor their activities.

      From the Bloomberg article:

      Earlier this month, employees said they discovered that a team within the company was creating the new tool for the custom Google Chrome browser installed on all workers’ computers and used to search internal systems. The concerns were outlined in a memo written by a Google employee and reviewed by Bloomberg News and by three Google employees who requested anonymity because they aren’t authorized to talk to the press.

  5. Sep 2019
    1. On social media, we are at the mercy of the platform. It crops our images the way it wants to. It puts our posts in the same, uniform grids. We are yet another profile contained in a platform with a million others, pushed around by the changing tides of a company's whims. Algorithms determine where our posts show up in people’s feeds and in what order, how someone swipes through our photos, where we can and can’t post a link. The company decides whether we're in violation of privacy laws for sharing content we created ourselves. It can ban or shut us down without notice or explanation. On social media, we are not in control.

      This is why I love personal web sites. They're your own: you control them and you can do whatever you want with them. Nothing is owned by anyone else.

      That's not the case with Facebook, Microsoft, Slack, Jira, whatever.

    1. There is already a lot of information Facebook can assume from that simple notification: that you are probably a woman, probably menstruating, possibly trying to have (or trying to avoid having) a baby. Moreover, even though you are asked to agree to their privacy policy, Maya starts sharing data with Facebook before you get to agree to anything. This raises some serious transparency concerns.

      Privacy International are highlighting how period-tracking apps are violating users' privacy.

  6. Aug 2019
    1. Even if you choose not to use Wi-Fi services we make available at MGM Resorts, we may still collect information concerning the precise physical location of your mobile device within and around MGM Resorts for non-marketing purposes. 

      Holy cow

    1. Now, I'd rather pay for a product that sticks around than have my personal data sold to use a free product that may not be around tomorrow. I value my privacy much more today. If you're not paying for the product... you are the product being sold.
  7. Jul 2019
    1. Even if we never see this brain-reading tech in Facebook products (something that would probably cause just a little concern), researchers could use it to improve the lives of people who can’t speak due to paralysis or other issues.
    2. That’s very different from the system Facebook described in 2017: a noninvasive, mass-market cap that lets people type more than 100 words per minute without manual text entry or speech-to-text transcription.
    3. Their work demonstrates a method of quickly “reading” whole words and phrases from the brain — getting Facebook slightly closer to its dream of a noninvasive thought-typing system.
    1. If Bluetooth is ON on your Apple device everyone nearby can understand current status of your device, get info about battery, device name, Wi-Fi status, buffer availability, OS version and even get your mobile phone number
    1. Comparison between web browsers

      This is one of the best resources on web privacy I've ever seen. I warmly recommend it!

    1. Two years ago, when he moved from Boston to London, he had to register with a general practitioner. The doctor’s office gave him a form to sign saying that his medical data would be shared with other hospitals he might go to, and with a system that might distribute his information to universities, private companies and other government departments.The form added that although the data are anonymized, “there are those who believe a person can be identified through this information.”“That was really scary,” Dr. de Montjoye said. “We are at a point where we know a risk exists and count on people saying they don’t care about privacy. It’s insane.”
    2. Scientists at Imperial College London and Université Catholique de Louvain, in Belgium, reported in the journal Nature Communications that they had devised a computer algorithm that can identify 99.98 percent of Americans from almost any available data set with as few as 15 attributes, such as gender, ZIP code or marital status.

      This goes to show that one should not trust companies and organisations which claim to "anonymise" your data.

  8. Jun 2019
  9. educatorinnovator.org educatorinnovator.org
    1. snafus, like those of privacy settings

      I'm struck by the choice of "snafu" to describe "privacy settings." I worry describing privacy as a snafu undermines the seriousness with which teachers and students should evaluate a technology's privacy settings when choosing to incorporate the technology into a classroom and other learning environment.

  10. www.joinhoney.com www.joinhoney.com
    1. Honey’s products do not support Do Not Track requests at this time, which means that we collect information about your online activity while you are using Honey’s products in the manner described above.

      So even if you ask us not to track you, we will anyway.
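      For contrast, here is roughly what honouring Do Not Track could look like on the client, as a sketch (initAnalytics is a hypothetical stand-in; the DNT signal itself is deprecated but still sent by many browsers):

      const initAnalytics = () => console.log("analytics initialised"); // hypothetical stand-in

      // Check the browser's Do Not Track signal before starting any tracking code.
      const dntEnabled = navigator.doNotTrack === "1" || window.doNotTrack === "1";

      if (dntEnabled) {
        console.log("Do Not Track requested: skipping analytics.");
      } else {
        initAnalytics();
      }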

    2. Once you delete your profile, there is no longer any data attributable to you.

      Which means they do not delete all your information.

    3. After you have terminated your use of Honey’s products, we will store your information in an aggregated and anonymised format.

      We keep your info forever, in other words.

    4. as long as is required

      Which is?

    5. That means while you are using the Extension and Honey is saving you money,

      Slickly written. These dudes are good!

    6. While you are using the Extension, this does NOT include any information from your search engine history or from your email.

      Trust us!

  11. May 2019
    1. They’ve learned, and that’s more dangerous than caring, because that means they’re rationally pricing these harms. The day that 20% of consumers put a price tag on privacy, freemium is over and privacy is back.

      Google want you to say yes, not because they're inviting positivity more than ever, but because they want you to purchase things and make them richer. This is the essence of capitalism.

    1. Unsurprisingly living up to its reputation, Facebook refuses to comply with my GDPR Subject Access Requests in an appropriate manner.

      Facebook has never cared about the privacy of individuals. This is highly interesting.

    1. Now, how does that mother build an online scrapbook of all the items that were poured into the system?

      The assumptions here are interesting. Does mom have the right to every picture taken at her party? Do the guests have the right to take pictures and post them on the web?

  12. Apr 2019
    1. The report also noted a 27 percent increase in the number of foreigners whose communications were targeted by the NSA during the year. In total, an estimated 164,770 foreign individuals or groups were targeted with search terms used by the NSA to monitor their communications, up from 129,080 on the year prior.
    1. we get some of it by collecting data about your interactions, use and experiences with our products. The data we collect depends on the context of your interactions with Microsoft and the choices that you make, including your privacy settings and the products and features that you use. We also obtain data about you from third parties.
    1. Washington state Attorney General Bob Ferguson said Thursday that Motel 6 shared the information of about 80,000 guests in the state from 2015 to 2017. That led to targeted investigations of guests with Latino-sounding names, according to Ferguson. He said many guests faced questioning from ICE, detainment or deportation as a result of the disclosures. It's the second settlement over the company's practice in recent months.

      If you stay at Motel 6, prepare to have your data handed over to authorities who are looking to harm you permanently, especially if your name sounds Latino.

    1. LastPass is run by LogMeIn, Inc. which is based in United States. So let’s say the NSA knocks on their door: “Hey, we need your data on XYZ so we can check their terrorism connections!” As we know by now, NSA does these things and it happens to random people as well, despite not having any ties to terrorism. LastPass data on the server is worthless on its own, but NSA might be able to pressure the company into sending a breach notification to this user.
    1. Facebook users are being interrupted by an interstitial demanding they provide the password for the email account they gave to Facebook when signing up. “To continue using Facebook, you’ll need to confirm your email,” the message demands. “Since you signed up with [email address], you can do that automatically …”A form below the message asked for the users’ “email password.”

      So, Facebook tries to get users to give them their private and non-Facebook e-mail-account password.

      This practice is called spear phishing.

  13. Mar 2019
    1. As one of 13 million officially designated “discredited individuals,” or laolai in Chinese, 47-year-old Kong is banned from spending on “luxuries,” whose definition includes air travel and fast trains.
    2. Discredited individuals have been barred from taking a total of 17.5 million flights and 5.5 million high-speed train trips as of the end of 2018, according to the latest annual report by the National Public Credit Information Center.The list of “discredited individuals” was introduced in 2013, months before the State Council unveiled a plan in 2014 to build a social credit system by 2020.

      This is what surveillance capitalism brings. This is due to what is called China's "Golden Shield", a credit-scoring system that, for example, brings your credit level down if you search for terms such as "Tiananmen Square protest" or post "challenging" pictures on Facebook.

      This is surveillance capitalism at its worst, creating a new lower class for the likes of Google, Facebook, Microsoft, Amazon, and insurance companies. Keep the rabble away, as it were.

    1. Amazon has been beta testing the ads on Apple Inc.’s iOS platform for several months, according to people familiar with the plan. A similar product for Google’s Android platform is planned for later this year, said the people, who asked not to be identified because they’re not authorized to share the information publicly.

      Sounds like one of the best reasons I've ever heard to run Brave Browser both on desktop and mobile. https://brave.com/

    1. Sharing of user data is routine, yet far from transparent. Clinicians should be conscious of privacy risks in their own use of apps and, when recommending apps, explain the potential for loss of privacy as part of informed consent. Privacy regulation should emphasise the accountabilities of those who control and process user data. Developers should disclose all data sharing practices and allow users to choose precisely what data are shared and with whom.

      Horrific conclusion, which clearly states that "sharing of user data is routine" where the medical profession is concerned.

    2. To investigate whether and how user data are shared by top rated medicines related mobile applications (apps) and to characterise privacy risks to app users, both clinicians and consumers.

      "24 of 821 apps identified by an app store crawling program. Included apps pertained to medicines information, dispensing, administration, prescribing, or use, and were interactive."

  14. Feb 2019
    1. Less than a third of the apps that collect identifiers take only the Advertising ID, as recommended by Google's best practices for developers.

      Put differently: more than two thirds of the apps that collect identifiers go beyond the Advertising ID, against Google's own best practices.

    1. “It’s, like, maybe you could have a conversation about whether you should be able to pay and not see ads. That doesn’t feel like a moral question to me. But the question of whether you can pay to have different privacy controls feels wrong.”

      surveillance capitalism or pay-for-privacy capitalism knocking on the door...

    2. though it might break Facebook’s revenue machine by pulling the most affluent and desired users out of the ad targeting pool.

      I doubt the vast majority of the most active FB users are "affluent"

    1. Growing Focus on Measuring Learning

      This topic belongs here, but I would have liked to see an acknowledgement about privacy concerns related to measuring learning. How are we engaging students in the design of this work?

    1. Nearly half of FBI rap sheets failed to include information on the outcome of a case after an arrest—for example, whether a charge was dismissed or otherwise disposed of without a conviction, or if a record was expunged

      This explains my personal experience here: https://hyp.is/EIfMfivUEem7SFcAiWxUpA/epic.org/privacy/global_entry/default.html (Why someone who had Global Entry was flagged for a police incident before he applied for Global Entry).

    2. Applicants also agree to have their fingerprints entered into DHS’ Automatic Biometric Identification System (IDENT) “for recurrent immigration, law enforcement, and intelligence checks, including checks against latent prints associated with unsolved crimes.

      The mention of "intelligence checks" is very concerning here, as it suggests pretty much what has already been leaked: that the US runs complex autonomous screening of all of this data all the time. This also opens up the possibility of discriminatory algorithms, since most of these are probably rooted in machine-learning techniques, and the criminal justice system in the US already tends to be biased against certain groups of people.

    3. It cited research, including some authored by the FBI, indicating that “some of the biometrics at the core of NGI, like facial recognition, may misidentify African Americans, young people, and women at higher rates than whites, older people, and men, respectively.

      This reaffirms the previous annotation: the training data behind the intelligence checks the US runs on Global Entry data is biased against certain groups of people.

    4. for as long as your fingerprints and associated information are retained in NGI, your information may be disclosed pursuant to your consent or without your consent.

      Meaning they can share your information with or without your consent.

    5. people enrolled in, or applying to, the program consent to have their personal data added to the FBI’s Next Generation Identification (NGI) database, shared with “federal, state, local, tribal, territorial, or foreign government agencies”, and DHS third-party “grantees, experts, [and] consultants” forever.

      So it's not just shared with the US government but with government agencies of any country. And "third-party experts" pretty much opens the door for personal information to be shared with anyone.

    1. as part of the application process, TSA collects a cache of personal information about you, including your prints. They’re held in a database for 75 years, and the database is queried by the FBI and state and local law enforcement as needed to solve crimes at which fingerprints are lifted from crime scenes, according to Nojeim. The prints may also be used for background checks.

      While Global Entry itself only lasts for 4 years, the data you give them and allow them to store lasts for almost your entire life.

    1. by providing their passport information and a copy of their fingerprints. According to CBP, registrants must also pass a background check and an interview with a CBP officer before they may be enrolled in the program

      I was at my Global Entry interview (not at all sure I made the right decision to apply) and a person who already had Global Entry came into the room because he had gotten flagged. The lady at the desk asked him if he had ever been arrested, he said no. She said their new system (they continuously update it with new algorithms to find this info) had flagged a police incident that had happened prior to him applying for Global Entry. He hadn’t been arrested, wasn’t guilty of any crime but his name had apparently made it into some police report and that gave them cause to question him when he re-entered his country.

    2. including data breaches and bankruptcy, experienced by “Clear,” a similar registered traveler program

      Clear was another registered traveler program that had a breach of travelers' personal information, so it is not unreasonable to be cautious of Global Entry, which holds the same information under the same legal protections (or lack thereof).

    1. Both afford us the opportunity to learn with others, but they are very different environments with different potential risks and benefits.

      As mentioned earlier in this article, experiences that incorporate private and public contexts can help people advance their understanding and facility in negotiating these different spaces.

  15. Jan 2019
    1. What is being fixed isn't these internet giants but blockchain itself. The cryptocurrency startups that promised to liberate the world from the shackles of capitalism now can't even guarantee the earnings of their own employees. Facebook's approach is to pick up the pieces of blockchain and follow the trend, making it easier for shareholders to accept.

      Comment: Lu Xun once wrote, "In my courtyard there are two trees: one is a jujube tree, and the other is also a jujube tree." Interestingly, Facebook, which built its business on "extracting" the commercial value of users' privacy, has a founder, Zuckerberg, who bought the four houses surrounding his own to avoid harassment by paparazzi. Now we can answer: what is beside the walled garden? It's another walled garden.

  16. Dec 2018
    1. Instagram, another network it owns. "Why should anyone keep believing in Facebook?" was one of the articles published.

      Willing to find refuge, to escape from one's own mind. The high winners: their realities replicated in millions of minds.

  17. Nov 2018
    1. Does the widespread and routine collection of student data in ever new and potentially more-invasive forms risk normalizing and numbing students to the potential privacy and security risks?

      What happens if we turn this around - given a widespread and routine data collection culture which normalizes and numbs students to risk as early as K-8, what are our responsibilities (and strategies) to educate around this culture? And how do our institutional practices relate to that educational mission?

  18. Oct 2018
    1. how do we help students navigate privacy issues in learning spaces augmented with social/digital media. There was a specific request for examples to walk students through this. Here is what I do.

      I'm a little unnerved by the semi-legal nature of the "Interactive Project Release Form" but I think it's a great model (whether really legally enforceable or just a class constitution-type document).

  19. Sep 2018
    1. // Download a json but don't reveal who is downloading it
       fetch("sneaky.json", {referrerPolicy: "no-referrer"})
         .then(function(response) { /* consume the response */ });

       // Download a json but pretend another page is downloading it
       fetch("sneaky.json", {referrer: "https://example.site/fake.html"})
         .then(function(response) { /* consume the response */ });

       // You can only set same-origin referrers.
       fetch("sneaky.json", {referrer: "https://cross.origin/page.html"})
         .catch(function(exc) {
           // exc.name == "TypeError"
           // exc.message == "Referrer URL https://cross.origin/page.html cannot be cross-origin to the entry settings object (https://example.site)."
         });

       // Download a potentially cross-origin json and don't reveal
       // the full referrer URL across origins
       fetch(jsonURL, {referrerPolicy: "origin-when-cross-origin"})
         .then(function(response) { /* consume the response */ });

       // Download a potentially cross-origin json and reveal a
       // fake referrer URL on your own origin only.
       fetch(jsonURL, {referrer: "https://example.site/fake.html", referrerPolicy: "origin-when-cross-origin"})
         .then(function(response) { /* consume the response */ });
  20. Aug 2018
    1. A file containing personal information of 14.8 million Texas residents was discovered on an unsecured server. It is not clear who owns the server, but the data was likely compiled by Data Trust, a firm created by the GOP.

    1. By last year, Google’s parent, Alphabet, was spending more money on lobbyists than any other corporation in America.
    2. “Under this law, the attorney general of California will become the chief privacy officer of the United States of America,” Mactaggart argued.
    3. “Silicon Valley’s model puts the onus on the user to decide if the bargain is fair,” Soltani told me recently. “It’s like selling you coffee and making it your job to decide if the coffee has lead in it.” When it comes to privacy, he said, “we have no baseline law that says you can’t put lead in coffee.”

      An interesting analogy for privacy

    1. Google also says location records stored in My Activity are used to target ads. Ad buyers can target ads to specific locations — say, a mile radius around a particular landmark — and typically have to pay more to reach this narrower audience. While disabling “Web & App Activity” will stop Google from storing location markers, it also prevents Google from storing information generated by searches and other activity. That can limit the effectiveness of the Google Assistant, the company’s digital concierge. Sean O’Brien, a Yale Privacy Lab researcher with whom the AP shared its findings, said it is “disingenuous” for Google to continuously record these locations even when users disable Location History. “To me, it’s something people should know,” he said.
    2. Sen. Mark Warner of Virginia told the AP it is “frustratingly common” for technology companies “to have corporate practices that diverge wildly from the totally reasonable expectations of their users,” and urged policies that would give users more control of their data. Rep. Frank Pallone of New Jersey called for “comprehensive consumer privacy and data security legislation” in the wake of the AP report.
    3. Google says that will prevent the company from remembering where you’ve been. Google’s support page on the subject states: “You can turn off Location History at any time. With Location History off, the places you go are no longer stored.” That isn’t true. Even with Location History paused, some Google apps automatically store time-stamped location data without asking. (It’s possible, although laborious, to delete it.)
    4. Storing your minute-by-minute travels carries privacy risks and has been used by police to determine the location of suspects — such as a warrant that police in Raleigh, North Carolina, served on Google last year to find devices near a murder scene. So the company lets you “pause” a setting called Location History.
    1. I am not, and will never be, a simple writer. I have sought to convict, accuse, comfort, and plead with my readers. I’m leaving the majority of my flaws online: Go for it, you can find them if you want. It’s a choice I made long ago.
  21. Jul 2018
    1. where applicable, any rating in the form of a data trust score that may be assigned to the data fiduciary under section 35; and

      A Data Trust Score. Thankfully, it isn't mandatory to have a data trust score, which means that apps and services can exist without one.

    2. the period for which the personal data will be retained in terms of section 10 or where such period is not known, the criteria for determining such period;

      This defines the terms for data retention. From a company perspective, they are likely to keep this as broad as possible.

    3. Upon receipt of notification, the Authority shall determine whether such breach should be reported by the data fiduciary to the data principal, taking into account the severity of the harm that may be caused to such data principal or whether some action is required on the part of the data principal to mitigate such harm.

      This means that users aren't always informed about a breach of their data; whether to notify them is the prerogative of the Data Protection Authority rather than mandatory, even where notification would be in the user's interest.

    4. “Personal data breach” means any unauthorised or accidental disclosure, acquisition, sharing, use, alteration, destruction, loss of access to, of personal data that compromises the confidentiality, integrity or availability of personal data to a data principal;

      Personal data breach here includes "accidental disclosure" as well.

    5. Notwithstanding anything contained in sub-sections (1) and (2), the Act shall not apply to processing of anonymised data.

      The law isn't applicable to anonymised data. However, it doesn't deal with pseudonymised data.

    6. in connection with any activity which involves profiling of data principals within the territory of India.

      This clause gives the law jurisdiction over data of Indian residents or visitors, processed beyond the physical boundaries of India

    7. in connection with any business carried on in India, or any systematic activity of offering goods or services to data principals within the territory of India; or

      Since the Internet is boundary-less, this law will apply to all online services that are being consumed in India: apps downloaded, websites viewed.

    8. Where the data principal withdraws consent for the processing of any personal data necessary for the performance of a contract to which the data principal is a party, all legal consequences for the effects of such withdrawal shall be borne by the data principal.

      How does it serve public interest and individual rights to hold people liable for the withdrawal of consent to the processing of their personal data?

    1. Privacy

      Privacy is super important! I'm glad they reference this.

    1. challenging and time-consuming

      I'd agree with all of the challenges identified here. Understanding these is useful in designing ways to help support faculty and staff regarding OEP. An additional challenge that emerged in my recent research on OEP was faculty concerns regarding privacy and identity -- this included defining (and continually negotiating) personal/professional & teacher/student boundaries in their open practice. Exploring such tensions is an important part of supporting faculty and staff consideration/exploration of open practices.

    1. But Blair is not just posting about her own life; she has taken non-consenting parties along for the ride.
    1. Privacy advocates tried to explain that persuasion was just the tip of the iceberg. Commercial databases were juicy targets for spies and identity thieves, to say nothing of blackmail for people whose data-trails revealed socially risky sexual practices, religious beliefs, or political views.
  22. Jun 2018
  23. inst-fs-iad-prod.inscloudgate.net inst-fs-iad-prod.inscloudgate.net
    1. IDEAS FOR TECHNICAL MECHANISMS

      A technique called differential privacy provides a way to measure the likelihood of negative impact and also a way to introduce plausible deniability, which in many cases can dramatically reduce risk exposure for sensitive data.

      Modern encryption techniques allow a user’s information to be fully encrypted on their device, but using it becomes unwieldy. Balancing the levels of encryption is challenging, but can create strong safety guarantees. Homomorphic encryption can allow certain types of processing or aggregation to happen without needing to decrypt the data.

      Creating falsifiable security claims allows independent analysts to validate those claims, and invalidate them when they are compromised. For example, by using subresource integrity to lock the code on a web page, the browser will refuse to load any compromised code. By then publishing the code’s hash in an immutable location, any compromise of the page is detectable easily (and automatically, with a service worker or external monitor).

      Taken to their logical conclusion these techniques suggest building our applications in a more decentralized way, which not only provides a higher bar for security, but also helps with scaling: if everyone is sharing some of the processing, the servers can do less work. In this model your digital body is no longer spread throughout servers on the internet; instead the applications come to you and you directly control how they interact with your data.
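      The differential-privacy idea above can be made concrete with the classic Laplace mechanism. Below is a minimal sketch in JavaScript, assuming a single count query with sensitivity 1; the function names are illustrative, not taken from the quoted report.

      // Laplace mechanism: add noise calibrated to the query's sensitivity and
      // the privacy budget epsilon (smaller epsilon = more noise = more privacy).
      function laplaceNoise(scale) {
        // Sample Laplace(0, scale) via the inverse-CDF method.
        const u = Math.random() - 0.5;
        return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
      }

      function noisyCount(trueCount, epsilon) {
        const sensitivity = 1; // adding/removing one person changes a count by at most 1
        return trueCount + laplaceNoise(sensitivity / epsilon);
      }

      // Example: report how many users enabled a sensitive setting, with
      // plausible deniability for any individual (result is randomized per run).
      console.log(noisyCount(1042, 0.5));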
  24. May 2018
  25. Apr 2018
    1. What can we build that would allow people to 1.) annotate terms of service related to tools they adopt in a classroom? and 2.) see an aggregated list of all current annotations. Last, if we were to start critically analyzing EdTech Terms of Service, what questions should we even ask?

    1. A purpose that is vague or general, such as for instance ‘Improving users’ experience’, ‘marketing purposes’, or ‘future research’ will – without further detail – usually not meet the criteria of being ‘specific’”.

      I see a lot of cookie notices that give vague reasons like "improving user experience". Specifically disallowed by GDPR?

    2. The GDPR permits the opt-out approach when the purposes that the companies want to use the data for are “compatible” with the original purpose for which personal data were shared by users.[6] In addition to the opt-out notice, users also have to be told of their right to object at any time to the use of their data for direct marketing.[7]

      GDPR can allow opt out rather than opt in.

    1. The alternative, of a regulatory patchwork, would make it harder for the West to amass a shared stock of AI training data to rival China’s.

      Fascinating geopolitical suggestion here: Trans-Atlantic GDPR-like rules as the NATO of data privacy to effectively allow "the West" to compete against the People's Republic of China in the development of artificial intelligence.

    1. Data Re-Use. Contractor agrees that any and all Institutional Data exchanged shall be used expressly and solely for the purposes enumerated in the Agreement. UH Institutional Data shall not be distributed, repurposed or shared across other applications, environments, or business units of the Contractor. The Contractor further agrees that no Institutional Data of any kind shall be revealed, transmitted, exchanged or otherwise passed to other vendors or interested parties except on a case-by-case basis as specifically agreed to in writing by a University officer with designated data, security, or signature authority.

      Like this clause. Wonder if this is the exception or the rule in Uni procurement deals these days?

  26. Mar 2018
  27. Feb 2018
    1. We will not require a child to provide more information than is reasonably necessary in order to participate in an online activity.
  28. Jan 2018
  29. Dec 2017
    1. Starting Tuesday, any time someone uploads a photo that includes what Facebook thinks is your face, you’ll be notified even if you weren’t tagged.

      This is eerily reminiscent of the book The Circle, where facial recognition is run over all photos and video on the web, including CCTV. No more secrets.

    1. Projects by IF is a limited company based in London, England. We run this website (projectsbyif.com) and its subdomains. We also use third party services to publish work, keep in touch with people and understand how we can do those things better. Many of those services collect some data about people who are interested in IF, come to our events or work with us. Here you can find out what those services are, how we use them and how we store the information they collect. If you’ve got any questions, or want to know more about data we might have collected about you, email hello@projectsbyif.com This page was published on 25 August 2017. You can see any revisions by visiting the repository on Github.

      As you'd expect, IF's privacy page is fantastic.

  30. Nov 2017
    1. While the teacher can correlate individual responses with the children’s names, no one else—not the app, not the museum—has any personal information about the learners.
    2. creates a highly personalized experience for the children while simultaneously alleviating privacy concerns.
    1. The users of a website known as Ashleymadison.com, which was used by people who wanted to have secret relationships, had 30 million of its users' names released. This resulted in 2 suicides which were linked to the disclosure. The article talks about the "illusion" of internet security and how, if someone knows how to, they can steal sensitive data and ruin lives. It shows that internet data is never truly safe.

      I completely agree that today hackers can get into almost any device or software, and this needs to be dealt with ASAP, as people who are exposed suffer miserably, not only from depression but from enormous pressure too.

    1. Yes, it is very probable that you can due to the high probability that there is only one person of a specific gender and D.O.B., living in your zip code. However, it is possible that there could be a few people of the same demographic all living in a larger city.

      I do agree with the possibility of being able to trace someone based on all three attributes. I never really considered the likelihood of several people sharing the same demographics in a larger city.
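      A rough back-of-the-envelope check of that intuition (a sketch with assumed, illustrative population figures, not data from the thread):

      // Expected number of people sharing one (gender, date of birth, ZIP) combination,
      // assuming gender and birthdays are spread roughly evenly across ~80 birth years.
      function expectedMatches(zipPopulation) {
        const combinations = 2 * 365 * 80; // gender x birthday x birth year ≈ 58,400 bins
        return zipPopulation / combinations;
      }

      console.log(expectedMatches(10000));  // ≈ 0.17: in a typical ZIP most combinations are unique
      console.log(expectedMatches(100000)); // ≈ 1.7: in a dense urban ZIP a few people may match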

    1. Every site you access and every vendor you purchase from keeps data on you and so does your computer. I think it is very important for everyone to be aware of this. If you access unreputable sites, it could be used against you in a job search for instance.

      I agree that everyone should be aware that every activity on the Internet is monitored. With that data, other people can interfere with our lives; for example, our credit card information could be hacked.

    1. It is likely that you can. Because odds are there is only one person that is the same age and birth day that lives in your zip code. But it is possible that you would only have a couple options.

      I agree; one study found that credit card users could be re-identified 90% of the time based only on information that was not personally identifying.

  31. Oct 2017
    1. Some scholars have challenged the sorting effects of the Google search engine to highlight that its operation (1) is based on decisions inscribed into algorithms that favour and discriminate content, (2) is subject to personalization, localization, and selection, and (3) threatens privacy
    2. Openness in relation to sharing thus has multiple meanings and is a matter of political contestation that belies the positive formulations of it as a founding imaginary of cyberspace. On the one hand, it means making governments transparent, democratizing knowledge, collaborating and co-producing, and improving well-being but on the other, exposing, making visible, and opening up subjects to various known and unknown practices and interventions.[76] Along with participating and connecting, sharing generates these tensions, especially in relation to what is often reduced to questions of privacy. This tension that openness generates increasingly creates additional demands that citizens secure themselves from and be responsible for the potential and even unknowable consequences of their digital conduct.
    3. Acts of connecting respond to a calling that persists even in light of the traceability of digital actions and concerns about privacy. Those who are making rights claims to privacy and data ownership are by far outnumbered by those who continue to share data without concern. That a data trace is a material that can be mined, shared, analysed, and acted upon by numerous people makes the imaginary of openness vulnerable to often unknown or unforeseeable acts. But digital traces also introduce another tension. Another calling, that of sharing digital content and traces, is a demand that evokes the imaginary of openness fundamental to the very architecture of shared resources and gift economy that formed the once-dominant logic of cyberspace
    1. The learning analytics and education data mining discussed in this handbook hold great promise. At the same time, they raise important concerns about security, privacy, and the broader consequences of big data-driven education. This chapter describes the regulatory framework governing student data, its neglect of learning analytics and educational data mining, and proactive approaches to privacy. It is less about conveying specific rules and more about relevant concerns and solutions. Traditional student privacy law focuses on ensuring that parents or schools approve disclosure of student information. They are designed, however, to apply to paper “education records,” not “student data.” As a result, they no longer provide meaningful oversight. The primary federal student privacy statute does not even impose direct consequences for noncompliance or cover “learner” data collected directly from students. Newer privacy protections are uncoordinated, often prohibiting specific practices to disastrous effect or trying to limit “commercial” use. These also neglect the nuanced ethical issues that exist even when big data serves educational purposes. I propose a proactive approach that goes beyond mere compliance and includes explicitly considering broader consequences and ethics, putting explicit review protocols in place, providing meaningful transparency, and ensuring algorithmic accountability.
  32. Sep 2017
    1. As Ronald Deibert recently suggested, while the Internet used to be characterized as a network of networks it is perhaps more appropriate now to see it as a network of filters and chokepoints.[4] The struggle over the things we say and do through the Internet is now a political struggle of our times, and so is the Internet itself.


    1. Co-regulation encompasses initiatives in which government and industry share responsibility for drafting and enforcing regulatory standards
    2. policy makers and scholars should explore an alternative approach known as “co-regulation.
    1. extremely cool, but...

      comparing with tahoe-lafs:

      clearly separates writecap from readcap, but... does it grok readcap as separate from idcap?

      client-side encryption?

      n-of-k erasure encoding?

    1. State does have a legitimate interest when it monitors the web to secure the nation against cyber attacks and the activities of terrorists.

      Legitimate state interest

    2. Apart from national security, the state may have justifiable reasons for the collection and storage of data. In a social welfare state, the government embarks upon programmes which provide benefits to impoverished and marginalised sections of society. There is a vital state interest in ensuring that scarce public resources are not dissipated by the diversion of resources to persons who do not qualify as recipients

      Limits on privacy, national security and public good

    3. Liberty has a broader meaning of which privacy is a subset. All liberties may not be exercised in privacy. Yet others can be fulfilled only within a private space. Privacy enables the individual to retain the autonomy of the body and mind. The autonomy of the individual is the ability to make decisions on vital matters of concern to life

      Privacy as subset of liberty

    4. Privacy attaches to the person and not to the place where it is associated
    5. Privacy is an intrinsic recognition of heterogeneity, of the right of the individual to be different and to stand against the tide of conformity in creating a zone of solitude.

      privacy and heterogeneity

    6. The concept is founded on the autonomy of the individual. The ability of an individual to make choices lies at the core of the human personality. The notion of privacy enables the individual to assert and control the human element which is inseparable from the personality of the individual. The inviolable nature of the human personality is manifested in the ability to make decisions on matters intimate to human life. The autonomy of the individual is associated over matters which can be kept private. These are concerns over which there is a legitimate expectation of privacy. The body and the mind are inseparable elements of the human personality. The integrity of the body and the sanctity of the mind can exist on the foundation that each individual possesses an inalienable ability and right to preserve a private space in which the human personality can develop. Without the ability to make choices, the inviolability of the personality would be in doubt. Recognizing a zone of privacy is but an acknowledgment that each individual must be entitled to chart and pursue the course of development of personality. Hence privacy is a postulate of human dignity itself.

      privacy and autonomy. Privacy a postulate of human dignity

    7. right to privacy must be forsaken in the interest of welfare entitlements provided by the State

      privacy is an elitist concern

    8. that there is a statutory regime by virtue of which the right to privacy is adequately protected and hence it is not necessary to read a constitutional right to privacy into the fundamental rights. This submission is sought to be fortified by contending that privacy is merely a common law right and the statutory protection is a reflection of that position

      A statutory and common law right to privacy negates the need for a constitutional right

    9. Anita Allen

      Anita Allen

      • spatial
      • informational
      • decisional
      • reputational
      • associational
    10. Roger Clarke

      Clarke's Maslow-pyramid classification

      • bodily privacy
      • spatial privacy
      • privacy of communication
      • privacy of personal data
    11. Alan Westin

      Westin's four states of privacy - solitude, intimacy, anonymity, reservation

    12. dangers of privacy when it is used to cover up physical harm done to women by perpetrating their subjection.

      Feminist critique of privacy

    13. privacy should be protected only when access to information would reduce its value such as when a student is allowed access to a letter of recommendation for admission, rendering such a letter less reliable. According to Posner, privacy when manifested as control over information about oneself, is utilised to mislead or manipulate others

      Economic critique of privacy - Posner

    14. Judith Jarvis Thomson,in an article published in 1975, noted that while there is little agreement on the content of privacy, ultimately privacy is a cluster of rights which overlap with property rights or the right to bodily security. In her view, the right to privacy is derivative in the sense that a privacy violation is better understood as violation of a more basic right

      Reductionist critique of privacy - JJ Thomson; used by respondents to support the argument that privacy itself is not a right, but that privacy violations are better understood as violations of other, more basic rights.

    15. The purpose of elevating certain rights to the stature of guaranteed fundamental rights is to insulate their exercise from the disdain of majorities, whether legislative or popular. The guarantee of constitutional rights does not depend upon their exercise being favourably regarded by majoritarian opinion. The test of popular acceptance does not furnish a valid basis to disregard rights which are conferred with the sanctity of constitutional protection

      Need for fundamental rights, and statutory protection not being sufficient

    16. narrow tailoring of the regulation to meet the needs of a compelling interest

      Narrow tailoring + compelling interest test

    17. The view about the absence of a right to privacy is an isolated observation which cannot coexist with the essential determination rendered on the first aspect of the regulation. Subsequent Benches of this Court in the last five decades and more, have attempted to make coherent doctrine out of the uneasy coexistence between the first and the second parts of the decision in Kharak Singh

      Kharak Singh - the observation on the absence of a right to privacy an isolated one, at variance with the first part?

    18. The observation in regard to the absence of the right to privacy in our Constitution was strictly speaking, not necessary for the decision of the Court in M P Sharma and the observation itself is no more than a passing observation.

      Observations on privacy in MP Sharma not part of the ratio

    19. adverted to international conventions acceded to by India including the UDHR and ICCPR. Provisions in these conventions which confer a protection against arbitrary and unlawful interference with a person’s privacy, family and home would, it was held, be read in a manner which harmonizes the fundamental rights contained in Articles 14, 15, 19 and 21 with India’s international obligations

      Nalsa - recognition of international conventions in interpreting FRs

    20. our considered opinion that subjecting a person to the impugned techniques in an involuntary manner violates the prescribed boundaries of privacy. Forcible interference with a person's mental processes is not provided for under any statute and it most certainly comes into conflict with the “right against self-incrimination”

      Narco analysis, polygraph etc. The right not to be compelled to give evidence is seen as part of privacy as well, especially where investigative techniques involve interference with mental processes.

    21. The crucial consideration is that a woman's right to privacy, dignity and bodily integrity should be respected

      Woman's right to choose

    22. Also, a large number of people are non-vegetarian and they cannot be compelled to become vegetarian for a long period. What one eats is one's personal affair and it is a part of his right to privacy which is included in Article 21 of our Constitution as held by several decisions of this Court.

      Hinsa Virodhak Sangh - aside from the right to practise trade under 19(1)(g), the right to make one's eating choices was also invoked - an example of privacy including decisional autonomy.

      Important to note that this principle is qualified: it applies only if the ban is for a considerable period of time

    23. reasonable expectation that it will be utilised

      Does the constitutional right to privacy envisage the purpose-limitation principle? Does it apply only to the state (or private parties acting on its behalf), or to purely horizontal relationships as well?

    24. access to bank records to the Collector does not permit a delegation of those powers by the Collector to a private individual. Hence even when the power to inspect and search is validly exercisable by an organ of the state, necessary safeguards would be required to ensure that the information does not travel to unauthorised private hands.

      Where responsibilities are delegated, there is a need for proper safeguards. A very relevant observation in the context of PPP models of governance and of data collection/processing

    25. India’s international commitments under the Universal Declaration of Human Rights (UDHR) and International Covenant on Civil and Political Rights(ICCPR)

      The ICCPR and UDHR as instrumental foundations for the affirmation of privacy

    26. The significance of the judgment in Canara Bank lies first in its reaffirmation of the right to privacy as emanating from the liberties guaranteed by Article 19 and from the protection of life and personal liberty under Article 21

      privacy derived from the freedoms under Article 19, as well as life and liberty under Article 21

    27. Court repudiated the notion that a person who places documents with a bank would, as a result, forsake an expectation of confidentiality. In the view of the Court, even if the documents cease to be at a place other than in the custody and control of the customer, privacy attaches to persons and not places and hence the protection of privacy is not diluted

      2 important observations

      • recognition that privacy attaches to persons and not places (moving beyond a propertarian view of privacy)

      • sharing of information does not lead to forsaking a reasonable expectation of privacy. Without naming it, a repudiation of the third-party doctrine; privacy is not equivalent to secrecy.

    28. penumbras created by the Bill of Rights resulting in a zone of privacy [101] leading up eventually to a “reasonable expectation of privacy”

      Canara Bank - reference made to penumbra of rights creating a zone of privacy

    29. The need to read the fundamental constitutional guarantees with a purpose illuminated by India’s commitment to the international regime of human rights’ protection also weighed in the decision

      reading FRs in light of international commitments

    30. Article 21, in the view of the Court, has to be interpreted in conformity with international law

      International law

    31. While it is true that in Rajagopal it is a private publisher who was seeking to publish an article about a death row convict, it is equally true that the Court dealt with a prior restraint on publication imposed by the

      DYC responds to Bhatia's critique of Rajagopal. While Rajagopal dealt with private actions, FRs are invoked because of state action, in the form of the restraint placed on the publication by the state and prison officials.

    32. The right to privacy is implicit in the right to life and liberty guaranteed to the citizens of this country by Article 21

      Rajagopal - recognition of privacy as implicit in liberty

    33. bodily integrity of a woman, as an incident of her privacy.

      Maharashtra v. Madhukar, two important observations: a woman of "easy virtue" is also entitled to the same constitutional protections. This furthers the view that rights are available to all citizens (counter to the view in Malkani, which said that privacy is not there to protect the guilty).

      More importantly, it established a woman's bodily integrity as a part of privacy

    34. observations in Malak Singh on the issue of privacy indicate that an encroachment on privacy infringes personal liberty under Article 21 and the right to the freedom of movement under Article 19(1)(d). Without specifically holding that privacy is a protected constitutional value under Article 19 or Article 21, the judgment of this Court indicates that serious encroachments on privacy impinge upon personal liberty and the freedom of movement

      Malak Singh is in line with the view advanced by the respondents, that some violations of privacy could infringe other recognised rights, such as personal liberty under Article 21 or the freedom of movement under Article 19(1)(d)