732 Matching Annotations
  1. Sep 2017
    1. This analysis of the decision in Gobind assumes significance because subsequent decisions of smaller Benches have proceeded on the basis that Gobind does indeed recognise a right to privacy

      Subsequent judgments follow the mistaken belief that Gobind recognises a right to privacy

    2. implicit in the concept of ordered liberty

      Gobind traces privacy to ordered liberty in Article 21

    3. Yet a close reading of the decision in Gobind would indicate that the Court eventually did not enter a specific finding on the existence of a right to privacy under the Constitution

      Gobind did not expressly recognize the right to privacy

    4. the personal intimacies of the home, the family, marriage, motherhood, procreation and child rearing

      dimensions of privacy inclusively enumerated in Gobind

    5. Gobind

      Gobind was decided in the post-Griswold, post-Roe v. Wade world. Penumbral rights are created by specific rights.

  2. Aug 2017
    1. Embracing a culture of sharing that breaks down silos while maintaining ethical and privacy standards will be paramount.

      This is gnarly stuff though and deserves its own deep dive/bullet point.

    2. The embedding of maker culture in K–12 education has made students active contributors to the knowledge ecosystem rather than merely participants and consumers of knowledge.

      How does this get balanced with privacy concerns? I have yet to see an argument or practice that successfully navigates this tension.

    1. Surveillance is the business model of the internet. Everyone is under constant surveillance by many companies, ranging from social networks like Facebook to cellphone providers. This data is collected, compiled, analyzed, and used to try to sell us stuff. Personalized advertising is how these companies make money, and is why so much of the internet is free to users. We’re the product, not the customer.

      Nice succinct statement on the issue.

    1. The request from the DOJ demands that DreamHost hand over 1.3 million visitor IP addresses — in addition to contact information, email content, and photos of thousands of people — in an effort to determine who simply visited the website. (Our customer has also been notified of the pending warrant on the account.)

      That information could be used to identify any individuals who used this site to exercise and express political speech protected under the Constitution’s First Amendment. That should be enough to set alarm bells off in anyone’s mind.

  3. Jul 2017
    1. Security & privacy technology research and engineering

      Researcher in security and privacy technology.

    1. 1 Privacy Considerations for Web Protocols

      An interesting document about privacy

    1. If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged.

      https://www.youtube.com/watch?v=CINVwWHlzTY

      Why The Government Shouldn't Break WhatsApp

  4. Jun 2017
    1. In January 2018, the World Press Photo Foundation, in association with Newcastle University’s Open Lab, University of California Irvine and ArtEZ in the Netherlands will launch Talking Pictures, an open course in critical visual storytelling.

    1. The discussion on renewing FISA 702, international intelligence-sharing agreements and the electronic sharing of health data.
  5. May 2017
    1. We're vulnerable to state-sponsored attacks, he says, because we are too narrowly technological in our solutions.

      I refer to this sentence in my annotation above, as it seems at odds with Mike's earlier statement that this is a tools debate, not a legal one.

    2. People want to turn this into a legal debate, but it's not. It's a tools debate, and the main product of a builder of social tools is not the tool itself but the culture that it creates. So what sort of society do you want to create?

      Not trying to nitpick, but I'm a bit confused between this statement and the one below where Mike says "We're vulnerable to state-sponsored attacks, he says, because we are too narrowly technological in our solutions."

      So far in this debate I've been thinking that we are too quick to jump to technical solutions (as Mike's latter point would suggest) when I don't think the issues online are categorically different than they are offline. While certainly tools can help shape social relations and culture, we also have social/cultural mechanisms to deal with situations generated via online tools.

      Abuse is not limited to online activity and remedies for abuse are not purely technological. If a person abuses another offline, we have (imperfect) mechanisms to address that abuse. Are we considering those offline mechanisms in our confrontation with online abuse?

  6. Apr 2017
    1. The Echo Look suffers from two dovetailing issues: the overwhelming potential for invasive data collection, and Amazon’s lack of a clear policy on how it might prevent that.

      Important to remember. Amazon shares very little about what it collects and what it does with what it collects.

    1. A cynical illustration of states' powerlessness to regulate this concentration: Google has more to fear from its direct competitors, whose financial means far exceed those of states, when it comes to blocking its progress in the markets. Hence the agreement between Microsoft and Google, who have agreed to settle their disputes from now on only in private, under their own rules, so as to focus solely on market competition and no longer on legislation.

      Having grown too big, the GAFAM no longer feel subject to the law. They work things out among themselves.

    2. By producing free (or very affordable), high-performing, high-value-added services in exchange for the data they generate, these companies capture a gigantic share of users' digital activity. They thereby become the main service providers that governments must deal with if they want to enforce the law, particularly in the context of population surveillance and security operations.

      This is why the GAFAM are as powerful as states, if not more so.

    3. In fact, I think most people don't want Google to answer their questions. They want Google to tell them what they should do next.

      Who said Google answers your questions? For a long time now, Google has been supplying both the questions and the answers (at your expense).

    1. Privacy tech doesn’t take the place of having the law on your side.

      Nowadays, to protect your privacy you need to:

      have the law on your side + trust service providers + use privacy tech

  7. Mar 2017
    1. “At the heart of that First Amendment protection is the right to browse and purchase expressive materials anonymously, without fear of government discovery,” Amazon wrote in its memorandum of law.  

      Amazon won't provide information about a murder, claiming it's protecting users from the government. This must be a joke!

    1.  Interior enforcement of our Nation's immigration laws is critically important to the national security and public safety of the United States.  Many aliens who illegally enter the United States and those who overstay or otherwise violate the terms of their visas present a significant threat to national security and public safety.  This is particularly so for aliens who engage in criminal conduct in the United States.

      Like so.

    1. You can delete the data. You can limit its collection. You can restrict who sees it. You can inform students. You can encourage students to resist. Students have always resisted school surveillance.

      The first three of these can be tough for the individual faculty member to accomplish, but informing students and raising awareness around these issues can be done and is essential.

  8. Feb 2017
    1. and developing solutions to real challenges

      Lots of mention of engaging in real world issues/solutions/etc. Is this at odds with the mandates of FERPA and privacy/security in general that govern ed-tech integration in education?

    1. Instead of his usual gear, the Seattle-based security researcher and founder of a stealth security startup brings a locked-down Chromebook and an iPhone SE that’s set up to sync with a separate, non-sensitive Apple account.

      You do what you have to do...

    1. All along the way, or perhaps somewhere along the way, we have confused surveillance for care. And that’s my takeaway for folks here today: when you work for a company or an institution that collects or trades data, you’re making it easy to surveil people and the stakes are high. They’re always high for the most vulnerable. By collecting so much data, you’re making it easy to discipline people. You’re making it easy to control people. You’re putting people at risk. You’re putting students at risk.
  9. Jan 2017
    1. e) We also may make use of third party tracking pixels used by advertising or analytical partners. Some such partners include, but are not limited to: (i) Google Analytics: Used to track statistical information such as page visits and traffic source information allowing us to improve the performance and quality of the Site. For more information please visit: http://www.google.com/analytics/learn/privacy.html. (ii) Google Advertising: Used to track conversions from advertisements on the Google Search and Google Display network. For more information please visit: http://www.google.com/policies/technologies/ads/. Third party pixels and content may make use of cookies. We do not have access or control over these third party cookies and this Policy does not cover the use of third party cookies.

      When the VPN client you intend to use is in fact the one that will leak your personal data!

      What a shame!

    1. The open architecture of the internet reflected the liberal worldview of its creators. As well as being decentralised, the internet was also deliberately designed to be a dumb network.

      Open, decentralised and not meant to know what is transmitted: that's how the Internet has been created. Perfect to protect privacy!

      Sad there are so many people who fight against the Internet today...

    1. I shouldn’t have daisy-chained two such vital accounts — my Google and my iCloud account — together.

      Lesson learned: don't chain different accounts together by "logging in with" another service (most of the time Google, Facebook, or Twitter)

    2. In short, the very four digits that Amazon considers unimportant enough to display in the clear on the web are precisely the same ones that Apple considers secure enough to perform identity verification.

      When companies weigh the same data's sensitivity differently, the mismatch itself becomes a security flaw!

    1. Almost half of eight- to 11-year-olds have agreed impenetrable terms and conditions to give social media giants such as Facebook and Instagram control over their data, without any accountability, according to the commissioner’s Growing Up Digital taskforce. The year-long study found children regularly signed up to terms including waiving privacy rights and allowing the content they posted to be sold around the world, without reading or understanding their implications.
  10. Dec 2016
    1. Former members of the Senate Church Committee write to President Obama and Attorney General Loretta Lynch requesting leniency for Edward Snowden.

      https://www.brennancenter.org/sites/default/files/news/Snowden_memo.pdf

  11. Nov 2016
    1. Mike Pompeo is Trump's pick for CIA director. In January 2016, Pompeo advocated "re-establishing collection of all metadata, and combining it with publicly available financial and lifestyle information into a comprehensive, searchable database. Legal and bureaucratic impediments to surveillance should be removed" (At least they acknowledge that backdoors in US hardware and software would do little good.)

      Oh, cute. Pompeo made a name for himself during the Benghazi investigation.

      http://www.nytimes.com/2016/11/19/us/politics/donald-trump-mike-pompeo-cia.html

    1. EFF guide to attending protests, especially how to handle smartphones. (Part of the guide to surveillance self-defense.)

    1. Do students recognize the importance of password-protecting their devices and having different passwords across platforms?

      I'm curious to know if the answer to this question would differ from Generation Y to Generation Z.

    2. risks of blogging/tweeting, which include opening avenues for abuse.
  12. Oct 2016
    1. Hemisphere isn’t a “partnership” but rather a product AT&T developed, marketed, and sold at a cost of millions of dollars per year to taxpayers. No warrant is required to make use of the company’s massive trove of data, according to AT&T documents, only a promise from law enforcement to not disclose Hemisphere if an investigation using it becomes public.

      ...

      Once AT&T provides a lead through Hemisphere, then investigators use routine police work, like getting a court order for a wiretap or following a suspect around, to provide the same evidence for the purpose of prosecution. This is known as “parallel construction.”

    1. Outside of the classroom, universities can use connected devices to monitor their students, staff, and resources and equipment at a reduced operating cost, which saves everyone money.
    1. For G Suite users in primary/secondary (K-12) schools, Google does not use any user personal information (or any information associated with a Google Account) to target ads.

      In other words, Google does use everyone's information (data as the new oil) and can use it to target ads in higher education.

  13. Sep 2016
    1. Proposed changes to "Rule 41" will make it too easy for government agents to get permission to hack remote computers. Petition Congress to prevent this.

    1. Provider will store and process Data in accordance with industry best practice

      Sounds like we're good here.

    2. Provider has a limited, nonexclusive license solely for the purpose of performing its obligations as outlined in the Agreement

      Here we are good and much better than, say, Genius:

      When you post User Content to the Service or otherwise submit it to us, you hereby grant, and you represent and warrant that you have the right to grant, to Genius an irrevocable, perpetual, non-exclusive, transferable, fully paid, worldwide license (with the right to sublicense through multiple tiers) to use, reproduce, publicly perform, publicly display, modify, translate, excerpt (in whole or in part), create derivative works of, distribute and otherwise fully exploit all Intellectual Property Rights in and to such User Content for purposes of providing, operating and promoting the Service or otherwise conducting the business of Genius.

    3. all intellectual property rights, shall remain the exclusive property of the [School/District],

      This is definitely not the case. Even in private groups, would it ever make sense to say this?

    4. Access

      This really just extends the issue of "transfer" mentioned in 9.

    5. Data Transfer or Destruction

      This is the first line item I don't feel like we have a proper contingency for or understand exactly how we would handle it.

      It seems important to address not just due to FERPA but to contracts/collaborations like that we have with eLife:

      What if eLife decides to drop h? Would we, could we delete all data/content related to their work with h? Even outside of contract termination, would we/could we transfer all their data back to them?

      The problem with our current relationship with schools is that we don't have institutional accounts whereby we might at least technically be able to collect all related data.

      Students could be signing up for h with personal email addresses.

      They could be using their h account outside of school so that their data isn't fully in the purview of the school.

      Question: if AISD starts using h on a big scale, 1) would we delete all AISD related data if they asked--say everything related to a certain email domain? 2) would we share all that data with them if they asked?

    6. Data cannot be shared with any additional parties without prior written consent of the User except as required by law.”

      Something like this should probably be added to our PP.

    7. Data Collection

      I'm really pleased with how hypothes.is addresses the issues on this page in our Privacy Policy.

    8. There is nothing wrong with a provider using de-identified data for other purposes; privacy statutes, after all, govern PII, not de-identified data.

      Key point.

    9. Modification of Terms of Service

      We cover this in the TOS but not the Privacy Policy.

    1. Responsible Use

      Again, this is probably a more felicitous wording than “privacy protection”. Sure, it takes as a given that some use of data is desirable. And the preceding section makes it sound like Learning Analytics advocates mostly need ammun… arguments to push their agenda. Still, the notion that we want to advocate for responsible use is more likely to find common ground than this notion that there’s a “data faucet” that should be switched on or off depending on certain stakeholders’ needs. After all, there exists a set of data use practices which are either uncontroversial or, at least, accepted as “par for the course” (no pun intended). For instance, we probably all assume that a registrar should receive the grade data needed to grant degrees and we understand that such data would come from other sources (say, a learning management system or a student information system).

    2. captures values such as transparency and student autonomy

      Indeed. “Privacy” makes it sound like a single factor, hiding the complexity of the matter and the importance of learners’ agency.

    1. “We need much more honesty, about what data is being collected and about the inferences that they’re going to make about people. We need to be able to ask the university ‘What do you think you know about me?’”
  14. Aug 2016
    1. What if, as the cybersecurity consultant Matt Tait asked last month in relation to the DNC emails, a source — like, say, a hacker working for a Russian intelligence agency — provided WikiLeaks with a cache of documents that was tampered with in order to smear a political candidate?
    1. (Apple, Google and Samsung confirm that they don't collect any information about your shopping habits.)
  15. Jul 2016
    1. I could have easily chosen a different prepositional phrase. "Convivial Tools in an Age of Big Data.” Or “Convivial Tools in an Age of DRM.” Or “Convivial Tools in an Age of Venture-Funded Education Technology Startups.” Or “Convivial Tools in an Age of Doxxing and Trolls."

      The Others.

    2. demanded by education policies — for more data
    3. education technologies that are not built upon control and surveillance
  16. Jun 2016
    1. Whenever possible, it is important to give creators the right of refusal if they do not wish their work to be highly visible. Because of the often highly personal content of zines, creators may object to having their material being publicly accessible. Zinesters (especially those who created zines before the Internet era) typically create their work without thought to their work ending up in institutions or being read by large numbers of people. To some, exposure to a wider audience is exciting, but others may find it unwelcome

      makes the important distinction between the widely acknowledged categories of visibility - public and private - and equally important, but less widely recognized points on the spectrum of visibility - more and less visible, digitized and searchable versus digitized and not searchable, among others

    1. Snapchat does not currently respond to do-not-track signals that may be sent from your device.

      Protecting yourself with do-not-track requests doesn't matter: Snapchat simply ignores them.

    2. Revoking Permissions. If you change your mind about our ongoing ability to collect information from certain sources that you have already consented to, such as your phonebook, camera, photos, or location services, you can simply revoke your consent by changing the settings on your device if your device offers those options.

      What happens to the data already shared if permission is revoked?

    3. we cannot promise that deletion will occur within a specific timeframe. And we may also retain certain information in backup for a limited period of time or as required by law.

      "Deletion" is a term used loosely in this case.

    4. If you submit content to one of our inherently public features, such as Live, Local, or any other crowd-sourced service, we may retain the content indefinitely.

      Is there a clear distinction in the flow of the app for the common user that there is a significant difference between the methods of sharing images or video?

    5. With third parties as part of a merger or acquisition.

      Another indication that the data collected by Snapchat is a business asset, not something used for the benefit of the users.

    6. With our affiliates. We may share information with entities within the Snapchat family of companies.

      What is the "Snapchat family of companies?"

    7. Provide you with an amazing set of products and services that we relentlessly improve.

      The value proposition tries to hide the functioning of the organization. Note that the real reasons (ads, data collection, and tracking) are buried in the list below.

    8. enhance the safety and security of our products and services

      Collecting more information about users makes the service less secure. A breach, however small, can release PII because of the variety of metrics collected. Less data = more security.

    9. For example, if another user allows us to collect information from their device phonebook—and you’re one of that user’s contacts—we may combine the information we collect from that user’s phonebook with other information we have collected about you.

      I may choose not to allow my phone number to be shared, but a friend allows Snapchat to access their phonebook. My once-private information is now shared with a company without my consent.

    10. Information We Get When You Use Our Services

      It's important to remember that all of the following falls under "data which is necessary to provide a service."

      The quantity of information gathered here is enormous. Most of it seems innocuous, but it's all very personal and, implicit in their use of data for advertising, stored on Snapchat's servers with no time for deletion noted.

      This section outlines how Snapchat will continue to stay in business - by farming the information of its users and selling it for advertising.

    11. Of course, you’ll also provide us whatever information you send through the services,

      How many people forget that Snapchat actually sees everything sent? Is this explicitly shared with students when we're teaching?

      We cannot assume they're aware of the transfer of data that takes place when that data is sent through an app or website.

    1. or group to seclude themselves

      Group privacy is much more infrequently discussed than individual privacy. At least in Euro-American contexts. Quite likely a very important bias.

  17. May 2016
    1. including the use of passwords to gain access to your personal information.

      Why is this noteworthy?

    2. If Turnitin is involved in a merger, acquisition, or sale of all or a portion of its assets, you will be notified via email and/or a prominent notice on our website, of any change in ownership, uses of your personal information, and choices you may have regarding your personal information.

      A. A merger voids any privacy pledges. The new owner has no requirement to follow these policies.

      B. The information is kept for an extended period of time. Once a teacher leaves a school district, their email will be invalid. It's a vacuous promise to say that they'll be contacted.

    3. etc.

      What??? With "etc." tacked on, the "following personal information" can be anything else at all.

    4. we make it available on our homepage and at every point where we request your personal information

      Why is this noteworthy?

    5. TRUSTe Privacy Seal

      "TRUSTe-certified sites are more than twice as likely to be untrustworthy." http://www.benedelman.org/news/092506-1.html

  18. Apr 2016
    1. Short URLs can be brute forced. They should not be used for pages that contain personal information, or pages that allow anyone with the URL to upload files.
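      The scale of the risk is easy to quantify. A minimal sketch (the 5-character token, 62-character alphabet, and `https://sho.rt/` base are illustrative assumptions, not any specific service):

```python
import string
from itertools import product

ALPHABET = string.ascii_letters + string.digits  # 62 characters (assumed)
TOKEN_LENGTH = 5                                 # hypothetical token length

# Every possible token: 62**5 = 916,132,832 candidates
keyspace = len(ALPHABET) ** TOKEN_LENGTH

# At a modest 1,000 requests per second, the whole space falls in ~11 days
days_to_exhaust = keyspace / 1000 / 86400

def candidate_urls(base="https://sho.rt/"):
    """Enumerate every possible short URL: the essence of a brute-force scan."""
    for chars in product(ALPHABET, repeat=TOKEN_LENGTH):
        yield base + "".join(chars)
```

      Longer random tokens push the keyspace out of reach, but any URL short enough to be convenient is short enough to enumerate, which is why short links should never gate private pages.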

  19. Feb 2016
    1. Some plural User subject that is conjoined by a proxy link or other means could be composed of different types of addressable subjects: two humans in different countries, or a human and a sensor, a sensor and a bot, a human and a robot and a sensor, a whatever and a whatever. In principle, any one of these subcomponents could not only be part of multiple conjoined positions, but might not even know or need to know which meta-User they contribute to, any more than the microbial biome in your gut needs to know your name.

      Anonymity is not a binary, it is the limit of dissolution into a coherent plural subject.

  20. Jan 2016
    1. Vigilant Solutions, a surveillance technology company, is making shady deals with police departments in Texas. They lend the police equipment and database access. The police use it to spot people with outstanding warrants, whom they can stop and take payments from by credit card -- with a 25% processing fee tacked on for the tech company. The company also intends to keep all the license plate data collected by the police.

    1. One of the drawbacks of anonymity on the Web is romance scams. Scammers set up fake personas on social media. They often use photos stolen from a real person's accounts

      When the same person has their photos stolen repeatedly, Facebook could prevent this easily. But they don't.

    1. “traffic analysis.” Its basis lies in observing all message traffic traveling on a network and discerning who’s communicating with whom, how much, and when.

      The strategy seems to be archive everything in case traffic analysis finds something worth going back and reading.

      Defense against traffic analysis:

      Messages from users must be padded to be uniform in size and combined into relatively large “batches,” then shuffled by some trustworthy means, with the resulting items of the randomly ordered output batch then distributed to their respective destinations. (Technically, decryption needs to be included in the shuffling.) Mix network

      He then talks about limited anonymity and pairwise pseudonyms as ways of solving problems with complete anonymity versus public identification. There is an article in Wired about his proposed system.
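      The pad-batch-shuffle defence described above can be sketched in a few lines. This is a toy under assumed parameters (256-byte packets, batches of 4); note it deliberately omits the layered decryption the source says must be included in the shuffling, which is what stops an observer from matching input bytes to output bytes in a real mix network:

```python
import os
import random

PACKET_SIZE = 256  # uniform padded size in bytes (assumed for the sketch)
BATCH_SIZE = 4     # messages per batch (real mixes use far larger batches)

def pad(msg: bytes) -> bytes:
    """Pad a message to a fixed size so its length reveals nothing."""
    body = len(msg).to_bytes(2, "big") + msg  # length prefix + payload
    if len(body) > PACKET_SIZE:
        raise ValueError("message too long for fixed packet size")
    return body + os.urandom(PACKET_SIZE - len(body))

def unpad(packet: bytes) -> bytes:
    """Recover the original message from a padded packet."""
    n = int.from_bytes(packet[:2], "big")
    return packet[2:2 + n]

def mix(packets: list) -> list:
    """Hold packets until a full batch arrives, then emit them in random
    order, severing the link between arrival order and departure order."""
    if len(packets) != BATCH_SIZE:
        raise ValueError("mix only fires on a full batch")
    shuffled = list(packets)
    random.shuffle(shuffled)
    return shuffled
```

      Because every output packet is the same size and the batch leaves in random order, an eavesdropper watching the wire learns only that four messages passed through, not which sender produced which output.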

    1. State legislators in New York and California have introduced bills that would require smartphone vendors to be able to decrypt users' phones.

    1. from Hawaii to Alabama to New Hampshire, a diverse, bipartisan coalition of state legislators will simultaneously announce state legislative proposals that, although varied, are all aimed at empowering their constituents to #TakeCTRL of their personal privacy. These bills would go far in ensuring students, employees, and everyone else has more of a say over who can know their whereabouts, track their activities online, and view information they share with friends.
    1. Finding [Silk Road founder Ross] Ulbricht really boiled down to this: a bunch of Google searches done by an investigator for the Internal Revenue Service (IRS).

      . . .

      His preferred tool: Google. Particularly the advanced search option that lets you focus in on a date range.

      . . .

      Alford couldn’t be at Ulbricht’s arrest, but he did receive a plaque. The NYT reports that Alford’s superiors had it inscribed with this quote from Sherlock Holmes: "The world is full of obvious things which nobody by chance ever observes."

  21. Dec 2015
    1. So you think mass surveillance isn't a problem, since you have nothing to hide? There are so many federal crimes that it is impossible to count them. If the government decides to focus on you, they can probably find a crime that fits your actions.

    1. A personal API builds on the domain concept—students store information on their site, whether it’s class assignments, financial aid information or personal blogs, and then decide how they want to share that data with other applications and services. The idea is to give students autonomy in how they develop and manage their digital identities at the university and well into their professional lives
    1. despite having promised not to track students, Google is abusing its position of power as a provider of some educational services to profit off of students’ data when they use other Google services—services that Google has arbitrarily decided don’t deserve any protection.
    1. Congress on Friday adopted a $1.15 trillion spending package that included a controversial cybersecurity measure that only passed because it was slipped into the US government's budget legislation. House Speaker Paul Ryan, a Republican of Wisconsin, inserted the Cybersecurity Information Sharing Act (CISA) into the Omnibus Appropriations Bill—which includes some $620 billion in tax breaks for business and low-income wage earners. Ryan's move was a bid to prevent lawmakers from putting a procedural hold on the CISA bill and block it from a vote. Because CISA was tucked into the government's overall spending package on Wednesday, it had to pass or the government likely would have had to cease operating next week.

      House 316-113
      Senate 65-33

      The Verge "This morning, Congress passed the Cybersecurity Information Sharing Act of 2015, attached as the 14th rider to an omnibus budget bill. The bill is expected to be signed into law by the president later today."

      Techdirt 15 Dec

      1. Allows data to be shared directly with the NSA and DOD, rather than first having to go through DHS.
      2. Removes restrictions on using the data for surveillance activities.
      3. Removes limitation on using the data for cybersecurity purposes, and allows it to be used for investigating other crimes -- making it likely that the DEA and others will abuse CISA.
      4. Removes the requirement to "scrub" the data of personal information unrelated to a cybersecurity threat before sharing the data.

      ACLU

    1. Manhattan district attorney Cyrus R. Vance Jr. says that law enforcement agencies want Google and Apple to return to systems without full-disk encryption -- those before iOS 8 and Android Lollipop -- which they could unlock in compliance with a warrant.

      He says that's all they're asking. If that's true, they should be speaking out loudly against mass surveillance and FBI demands for backdoors.

    1. Negotiated in secret and tucked in legislation thousands of pages long, Congress is about to pass an awful surveillance bill under the guise of “cybersecurity” that could open the door to the NSA acquiring much more private information of Americans.
    1. And the latest is that it's getting worse. Not only is Congress looking to include it in the end of year omnibus bill -- basically a "must pass" bill -- to make sure it gets passed, but it's clearly dropping all pretense that CISA isn't about surveillance. Here's what we're hearing from people involved in the latest negotiations. The latest version of CISA that they're looking to put into the omnibus:
    1. The Senate’s recently passed bill, known as the Cybersecurity Information Sharing Act (CISA), is expected to serve as the basis for the finished language. The compromise text will also likely include elements from a bill that originated in the House Intelligence Committee, observers said. This completed product would mostly sideline the privacy advocate-preferred bill from the House Homeland Security Committee. They believe the Homeland Security bill includes the strongest provisions to protect people’s sensitive data from falling into the NSA's hands. Specifically, the Homeland Security bill would give the greatest role to the Department of Homeland Security (DHS) for collecting cyber threat data from the private sector and disseminating it throughout the government. It’s believed the DHS is best suited to scrub data sets of personal information.

      It seems necessary to encourage -- or force -- industrial and financial firms to share information with the government about hacks and attempted hacks. But that should not be used as license to transfer and collect customer metadata.

    1. The San Bernardino shootings are also being cited by some Republicans, including presidential candidate Sen. Marco Rubio, as a reason to reinstate the warrantless bulk collection of domestic telephone data — the one program that was shut down by Congress after NSA whistleblower Edward Snowden revealed a massive, secret surveillance dragnet. An Associated Press story on Saturday added fuel to the fire when it claimed that as a result of the shutdown, the government could no longer access historical call records by the San Bernardino couple. But as Emptywheel blogger Marcy Wheeler amply explained, the FBI has plenty of other ways of getting the information.
    1. The National Security Letter (NSL) is a potent surveillance tool that allows the government to acquire a wide swath of private information—all without a warrant. Federal investigators issue tens of thousands of them each year to banks, ISPs, car dealers, insurance companies, doctors, and you name it. The letters don't need a judge's signature and come with a gag to the recipient, forbidding the disclosure of the NSL to the public or the target.
  22. Oct 2015
    1. From wiretapping to censorship of pornography, governmental entities infringe upon the private or, at the very least, draw and redraw the line between the public and the private.

      Yes....in an age of networks, the private and public spheres give way to the private and public sectors, of which the latter (the public sector) is increasingly encroached upon by the former (private-sector networks, infrastructures, data, etc.).

    2. Hospitality is the defining ethical predicament of networked life, as it describes the difficulties surrounding “what it is that turns up, what comes our way by e-mail or the Internet.”[15] Life in a networked society means that terms such as place, home, host, and guest are thrown into question.

      And with it, issues of what counts as private and public.

    3. But Facebook privacy functions shift and Dropbox usernames are hacked, meaning that the concept of the invitation is mostly a pleasant fiction.

      This is an idea that Rob Gehl and I wrote about a little, concerning the shifting nature of terms of service agreements as contracts, but I LOVE how Brown is setting it up here as perhaps the grounding for a networked ethics. I'll be tagging such references as I go along.

      http://thenewinquiry.com/essays/cookie-cutters/

    1. When I told him about my NSA excursion, he sighed and shook his head. Surveillance, he said, was pointless, a total waste. The powers that be should instead invite people to confess their secrets willingly. He envisioned vast centers equipped with mics and headphones where people could speak in detail and at length about their experiences, thoughts, and feelings, delivering in the form of monologues what the eavesdroppers could gather only piecemeal.

      Reminds me about THX 1138.

    2. Referring to the notion of abandoning all hope of preserving privacy: instead of protesting about governments gathering intelligence without our permission, design a society where people simply confess what's in their minds in huge specially designated centres.

    1. Internet Commons

      European Parliament conference on “Internet as a Commons: Public Space in the Digital Age”, organised in cooperation with Commons Network and Heinrich Böll Foundation. Discussing how to re-decentralize and reclaim the Internet for all.

      [ Prologue ]

      The Internet as a whole has become an important part of our global public sphere. The Internet provides access to a wealth of information and knowledge, and the possibility to participate, create and communicate. This public space made up of internet infrastructures is increasingly threatened from two sides: by the centralization and commercialization through the dominant positions held by giant telecom and Internet companies, and by an increasing trend in state regulation and censorship of the net. This poses important questions about how we choose to organize and regulate our digital societies, and how Internet governance models can be developed and implemented to ensure fair and democratic participation.

      When it comes to the future of the Internet, a key discussion is one of infrastructures; who owns, runs and controls them. The question of regulation, and who oversees the regulators, is made complicated by the transnational nature of the net.

      As much as people expect a broadly and equitably accessible Internet open to diversity, we are, slowly but surely, moving away from it. Monopolization of Internet infrastructures and services by companies such as Facebook and Google has gone hand in hand with privacy intrusions, surveillance and the unbounded use of personal data for commercial gain. As we all interact in these centralized commercial platforms that monetize our actions, we see an effective enclosure and manipulation of our public spaces. Decentralization and democratization of Internet infrastructure and activities is essential to keep a free, open and democratic Internet for all to enjoy equitably. But can the “small is beautiful” idea be compatible with building state-of-the-art, successful infrastructure in the future?

      The debates around net neutrality, infrastructure neutrality and Internet monopolies reflect the important choices that are to be made. It is essential that the EU formulate a comprehensive vision of the internet that addresses the protection of civil liberties such as free speech and privacy, but also the growing commercialization of our digital public spaces and the commodification of personal data, with the effect of the market encroaching on all aspects of our daily lives. Only then can it make relevant interventions regarding the Internet and its governance.

      Let´s discuss how to re-decentralize and reclaim the Internet for all.

      [ Introduction ]

      Opening remarks from Benkler & Bloemen:

      2:16 Yochai Benkler (Harvard Professor)

      The two major challenges of 21st Century Capitalism are the result of the impact of increasing well-being and welfare throughout the globe: the impact on the natural environment and the impact on the social environment.

      And while the last forty years have seen a steady struggle to increase understanding of the threat to the natural environment, we've actually seen over the same period a retreat in the understanding of the impact on the social environment.

      Throughout the industrialised world in particular, we've seen increased inequality and a series of ideas around Neoliberalism, initially finding root in the United States and the United Kingdom, then expanding to liberalisation in Europe and ultimately translating into the Washington consensus as a core development policy.

      These were anchored in a set of ideas, which we largely think of as Neoliberalism, that argued that uncertainty and complexity make centralised economic planning impossible, and so prices and decentralised decisions in markets by individuals will produce good information.

      They modelled universal rationality as self-interested, self-maximising human behaviour. They understood collective behaviour as always failing, always corrupting into illegitimate power. And that then meant that deregulation and freeing of markets from social and legal controls were the way to increase both welfare and liberty.

      What we've seen in the last twenty-five years is that the idea of the Commons is beginning to offer a framework to respond to these deeply corrosive ideas, and beginning to allow us to create frameworks that teach us how we can increase human welfare and improve the human condition without undermining social relations in the way that has been so corrosive for the last forty years.

      Three schools of the Commons: the work that came out of Elinor Ostrom's work and the Ostrom School, the Global Commons work coming out of the environmental movement, and, most relevant to us here in today's meeting, the Internet Commons.

      The thing that became clear with the Internet Commons, is that even at the heart of the most advanced economies, at the cutting edge of technology and in the areas of greatest economic growth and innovation, commons are at the very heart.

      From the Internet Engineering Task Force that created the internet protocols, through the World Wide Web, to core infrastructure such as spectrum commons like WiFi, or software, all the way to that great knowledge facility, Wikipedia.

      We've seen commons work, we've seen how they work, we've seen their limitations, we've been able to learn how to make them operate, and we continue to learn about them. But fundamentally, they offer an existence proof that there is another way.

      The past quarter century of commons, both on and offline, has taught us that people can effectively act collectively to govern their own utilisation of resources. It has taught us, in great detail, that people respond to diverse motivations and that economic utility is valuable, but it's only part of a range of social, emotional, and rational ethical commitments.

      Property and markets versus State planning and ownership don't exhaust the possibilities; we live with a much more diverse set of ways of organising economic production. In particular, voluntaristic action in commons can support growth and innovation, and can be more efficient, while at the same time being sustainable and socially more integrated.

      At a higher level of abstraction, we have come to understand that production and resource management are socially embedded activities. Social embeddedness is not something from which we need to free markets; it is instead something we need to achieve.

      Freedom is self-governance, individual and collective, not free choice in the market, and property-based markets, as we saw in copyright and patents and in a variety of other areas, can actually undermine freedom in both of these senses.

      So what are we to do?

      Our experience of Internet Commons tells us that three major shifts need to happen before the challenge of 21st century capitalism can be answered in a socially sustainable way.

      We need to increase our use of peer cooperativism, taking the experience we've garnered over the last fifteen years with commons based peer production and translating it in a way that expands to ever larger proportions of provisioning, so that it can provide a practical anchor and a normative anchor to material production in the market.

      We also cannot give up on socially embedded market production; there is no one right path to market production. There is genuine room for ethical choice, not only on the environmental side, not only on the rights side in terms of human rights, but also on the side of economic equality and social sustainability.

      And finally, we need to turn our political understanding to one of peer pragmatism, which understands the limitations of the traditional State while also understanding the limitations of the Market. One that builds on our experience in self-governing communities like Wikipedia, with their overlapping and nested relationships and the distinct, continued ethical commitment of Citizens to their practices; with continuous challenging, but also with distribution of power to much more local bases, to form a new political theory, based in our commons based practices, of our relations as Citizens and the State.

      So however important a particular part of the Internet Commons may be at a practical level, at the level of ideas our experience of Internet Commons over the last quarter century is beginning to teach us how to shape Capitalism for the 21st Century, so that it is not only sustainable from the natural environment perspective, but also embedded in and supportive of its social environment.

      9:25 Sophie Bloemen (Commons Network)

      The Commons is a perspective that looks at stewardship, equitable access and sustainability, and it looks at the collective good beyond exclusively individual rights. So instead of conceiving of Society as a collection of atomised individuals, principally living as consumers, the Commons points to the reality of people's lives being deeply embedded in social relationships: communities, histories, traditions.

      So this perspective is very helpful when conceiving of the Internet as a public space, as a common good, and when asking how we might want to organise this public space: what kind of infrastructure is provided, and who controls that infrastructure. This is what it insists on: the protection of the Internet as a public space, accessible to everyone. Just like a bridge or a street, it is an infrastructure, and it must be controlled and managed in the interests of Citizens.

      The central issue of the debate on net neutrality has also been whether the internet will continue to be managed as a mixed-use commons, or whether discriminatory tiers of service will transform it into a predominantly commercial system for production and distribution.

      So the key questions are: Who controls the infrastructure? What are the terms and conditions under which the public gets access? This has far-reaching implications for our society.

      The domination of the Internet by several large actors raises important policy questions about how to manage it. The thwarting of net neutrality rules in Europe suggests just how vulnerable the open internet really is, and it's therefore necessary for policy makers to have a real vision that acknowledges the gravity of these issues.

      It was reading professor Benkler's book 'Wealth of Networks' years ago that gave me key insights into why and how we are living in a time of deep economic change, a change in the modes of production due to digital technologies, and into what the role of social peer production can be, might be.

      But also, that it's not a given in which direction we will go. It's not pre-determined, we have to give it a certain shape.

      What he also alluded to now is that our institutional frameworks, to a certain extent, reflect outdated conceptions of human agency: the idea of the rational individual who is just out there to increase his material gain through rational calculation. We create and we share because of curiosity, because of social connectedness, because of psychological well-being; there is an element of cooperation and human reciprocity there as well.

      So this human capability has really been brought out by the Internet, by digital technology, but these forms of cooperation and collective action are also taking shape offline: lots of commoning initiatives, community gardening, co-housing, ethical financing.

      So to go back to these institutional frameworks: professor Benkler named three things. How can we increase the use of peer cooperativism, and how can we make sure there's a shift towards socially embedded market production where there's self-governance as well, which is community based? The third point he made is to enhance the political understanding of these commons based practices that are beyond the Market and beyond the State, and I guess that's partly what we're doing here, enhancing this political understanding.

      So how do we need to tweak the institutional frameworks? What do we have to take away, what do we have to add? That's also why, in the analysis in our paper 'A Commons Perspective on European Knowledge Policy', we discuss this and talk about copyright legislation, net neutrality, and European positions at the World Intellectual Property Organisation, which are all relevant to this.

      What kind of sharing economy do we want? Do we want a democratised one where we empower everyone to be a producer, or are most of us still consumers in this economy? Are we producers just in the sense that we share our data, while all our actions online and offline are commodified and we pay with our privacy to be part of it?

      So in order to get a good grip on where we should go, how to go ahead, we should take a step back. Take a step back and see what kind of society we would like.

      And a key question is: How can we create a structural environment that enables society to fully reap the benefits of knowledge sharing and collaborative production, in a way that's also socially sustainable?

      And what could the role of the EU be? At this moment, the European Parliament is considering a new copyright framework; there's a digital single market strategy; there are the data regulations; lots of things going on. So the next panels will set out some big ideas and will also give some very practical examples of people engaged in building these peer-to-peer networks or other initiatives, which will make what we are talking about more concrete.

    1. It’s free. 

      Free as in “tracked”. Sure, Google signed the privacy pledge and they don’t use data to advertise directly to students. But there are many loopholes. As rms makes very clear, GAfE is the exact opposite of Free Software. It’s “not having to pay for your own enslavement”.

  23. Sep 2015
    1. In fact, he finds privacy is achieved more often through rules regulating interpersonal behavior rather than by direct manipulation of the environment

      Maybe instead of actually building things that provide privacy, creating a societal structure that respects the need for privacy..?

  24. Aug 2015
    1. The United States Federal Trade Commission and any of the 50 state attorneys general (or even a privacy commissioner in one of the many countries that now have privacy commissioners to enforce privacy laws) could go after Google or one of the thousands of other websites that have posted deceptive P3P policies. However, to date, no regulators have announced that they are investigating any website for a deceptive P3P policy. For their part, a number of companies and industry groups have said that circumventing IE’s privacy controls is an acceptable thing to do because they consider the P3P standard to be dead (even though Microsoft still makes active use of it in the latest version of their browser and W3C has not retired it).

      That seems pretty dead.

  25. Jun 2015
    1. Apple’s superior position on privacy needs to be the icing on the cake, not their primary selling point.

      Yeah... 'cause apparently no one actually cares...

  26. Apr 2015
    1. Requests made from a document, and for navigations away from that document are associated with a referer header. While the header can be suppressed for links with the noreferrer link type, authors might wish to control the referer header more directly for a number of reasons:
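
      Not from the quoted draft itself, but a minimal sketch of the referrer-control mechanisms it standardises; `example.com` is a placeholder:

      ```html
      <!-- Document-wide policy: suppress the Referer header for all requests -->
      <meta name="referrer" content="no-referrer">

      <!-- Per-link suppression via the noreferrer link type mentioned above -->
      <a href="https://example.com/" rel="noreferrer">external link</a>

      <!-- Finer-grained control: send only the origin, not the full URL -->
      <a href="https://example.com/" referrerpolicy="origin">external link</a>
      ```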
    1. For a long time, I’ve argued that the debate about privacy isn’t about privacy per se, but that it’s really about control. The first step is making people feel comfortable that they are in control of their data. The second step is building tools to help them monetize their own data.