  1. Nov 2020
    1. Online Exams & Proctoring (In Addition to Guidance Listed Above) Requiring students to turn on their camera to be watched or recorded at home during an exam poses significant privacy concerns and should not be undertaken lightly. Several proctoring services use machine learning, AI, eye-tracking, key-logging, and other technologies to detect potential cheating; these should be used only when no feasible alternatives exist. If instructors are using a proctoring service during the COVID-19 measures, they must provide explicit notice to the students before the exam. Instructors are encouraged to work with the Digital Learning Hub in the Commons and the Academic Integrity Office to consider privacy-protective options, including how to use question banks (in Canvas), that will uphold integrity and good assessment design. Proctors and instructors are strongly discouraged from requiring students to show their surroundings on camera. Computers are available in labs for students who do not have a computer to take their final exams. Finals CANNOT be held in a lab, that is, instructors cannot be present nor can students from a specific class be asked to gather there for a final. This is only for those students who need a computer to drop in and complete their exam.
    1. anonymous imageboard

      4chan is nearly unique in the current online landscape in that it permits conversation by totally anonymous users. This allows its users to post without much thought about their privacy status, which they often take for granted. The level of privacy fostered by this anonymity, in a way, partially delivers on the Cyberspace rhetoric of the 1990s: people can't be judged by their physical identities unless they offer identifying information up themselves. That's not to say that 4chan is a welcoming space for all (or even most) users; as Ellis acknowledges later in this article, 4chan houses plenty of white supremacist tendencies. Strictly speaking, though, as far as one's ideas go, they are judged purely on their merit so long as no additional personal identifiers are offered. As Dillon Ludemann notes in his paper, /pol/emics: Ambiguity, scales, and digital discourse on 4chan, white supremacy, as well as other, "practiced and perceived deviancy is due to the default blanket of anonymity, and the general discourse of the website encourages users to remain unnamed. This is further enforced and embodied as named users, colloquially known as 'namefags,' are often vilified for their separation from the anonymous collective community" (Ludemann, 2018).

      Hypothetically, since all users start out as anonymous, one could also present their identity however they please on the platform. In theory, this means the technology behind the site promotes identity exploration (and thus cyberspace rhetoric). In practice, though, what most users encounter is latent racism keyed to whatever identifying information users purposefully offer, alongside generalized white supremacist posts broadcast for all on the site to see.

      Work Cited:

      Ludemann, D. (2018). /pol/emics: Ambiguity, scales, and digital discourse on 4chan. Discourse, Context & Media, 24, 92-98. doi: 10.1016/j.dcm.2018.01.010

    1. A long term key is as secure as the minimum common denominator of your security practices over its lifetime. It's the weak link.

      Good phrasing of the idea: "you're only as secure as your weakest link".

      You are only as secure as the minimum common denominator of your security practices.

    1. The first is that the presence of surveillance means society cannot experiment with new things without fear of reprisal, and that means those experiments—if found to be inoffensive or even essential to society—cannot slowly become commonplace, moral, and then legal. If surveillance nips that process in the bud, change never happens. All social progress—from ending slavery to fighting for women’s rights—began as ideas that were, quite literally, dangerous to assert. Yet without the ability to safely develop, discuss, and eventually act on those assertions, our society would not have been able to further its democratic values in the way that it has. Consider the decades-long fight for gay rights around the world. Within our lifetimes we have made enormous strides to combat homophobia and increase acceptance of queer folks’ right to marry. Queer relationships slowly progressed from being viewed as immoral and illegal, to being viewed as somewhat moral and tolerated, to finally being accepted as moral and legal. In the end it was the public nature of those activities that eventually slayed the bigoted beast, but the ability to act in private was essential in the beginning for the early experimentation, community building, and organizing. Marijuana legalization is going through the same process: it’s currently sitting between somewhat moral, and—depending on the state or country in question—tolerated and legal. But, again, for this to have happened, someone decades ago had to try pot and realize that it wasn’t really harmful, either to themselves or to those around them. Then it had to become a counterculture, and finally a social and political movement. If pervasive surveillance meant that those early pot smokers would have been arrested for doing something illegal, the movement would have been squashed before inception. Of course the story is more complicated than that, but the ability for members of society to privately smoke weed was essential for putting it on the path to legalization. We don’t yet know which subversive ideas and illegal acts of today will become political causes and positive social change tomorrow, but they’re around. And they require privacy to germinate. Take away that privacy, and we’ll have a much harder time breaking down our inherited moral assumptions.

      One reason privacy is important is that society makes moral progress by experimenting with things on the fringe of what is legal.

      This is reminiscent of Signal founder Moxie Marlinspike's argument that we should want law enforcement not to be 100% effective, because how else are we going to find out that gay sex and marijuana use don't hurt anybody?

    1. For both the tailor-customer and doctor-patient examples, personal data is an input used to improve an output (dress, suit, medical treatment) such that the improvement directly serves the interests of the person whose information is being used.

      This reminds me of "Products are functions", where your personal data is a variable that enters into the function to determine the output.

    1. Preserving user privacy is difficult when detecting more nuanced forms of censorship. Some forms of soft censorship might involve intentional performance degradation or content manipulation. Detecting this type of behavior would require comparing performance or content across groups of users, but such a comparison also implies that each user must reveal their browsing history.

      If you want to investigate whether content for a user was manipulated or performance was degraded, there may be no other way but to access detailed logs of their usage. This might raise privacy concerns.

      Not only is personalization difficult to disambiguate from manipulation and censorship, personalization also makes it more costly to compare the personalized experience to some baseline value to determine if manipulation or performance degradation has taken place.
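      As a rough sketch of that comparison (names and data invented for illustration; this is not the paper's actual methodology): users in different groups could each report a digest of the content they received for the same URL, and divergent digests would flag possible manipulation. But every report also reveals that its sender visited the page, which is exactly the privacy cost described above.

      ```python
      import hashlib
      from collections import Counter

      def content_digest(page_body: bytes) -> str:
          # Short digest of the page a user actually received.
          return hashlib.sha256(page_body).hexdigest()[:12]

      # Hypothetical reports: each entry reveals that the user visited the URL.
      reports = {
          "user_in_region_A": content_digest(b"<html>original article</html>"),
          "user_in_region_B": content_digest(b"<html>original article</html>"),
          "user_in_region_C": content_digest(b"<html>altered article</html>"),
      }

      # Treat the majority digest as the baseline; outliers may have been
      # served manipulated content (or merely legitimate personalization).
      tally = Counter(reports.values())
      majority_digest, _ = tally.most_common(1)[0]
      outliers = [user for user, d in reports.items() if d != majority_digest]
      print(outliers)  # ['user_in_region_C']
      ```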

    1. If the EU is set to mandate encryption backdoors to enable law enforcement to pursue bad actors on social media, and at the same time intends to continue to pursue the platforms for alleged bad practices, then entrusting their diplomatic comms to those platforms, while forcing them to have the tools in place to break encryption as needed would seem a bad idea.

      One explanation for the shift away from WhatsApp (which is built on the Signal protocol) is that the EU itself is seeking legislation to force a backdoor into consumer tech.

    1. But as long as the most important measure of success is short-term profit, doing things that help strengthen communities will fall by the wayside. Surveillance, which allows individually targeted advertising, will be prioritized over user privacy. Outrage, which drives engagement, will be prioritized over feelings of belonging. And corporate secrecy, which allows Facebook to evade both regulators and its users, will be prioritized over societal oversight.

      Schneier is saying here that as long as the incentives are still pointing in the direction of short-term profit, privacy will be neglected.

      Surveillance, which allows for targeted advertising, will win out over user privacy. Outrage will be prioritized over more wholesome feelings. Corporate secrecy will let Facebook evade both regulators and its users.

    2. Increased pressure on Facebook to manage propaganda and hate speech could easily lead to more surveillance. But there is pressure in the other direction as well, as users equate privacy with increased control over how they present themselves on the platform.

      Two forces acting on the big tech platforms.

      One, towards more surveillance, to stop hate and propaganda.

      The other, towards less surveillance, stemming from people wanting more privacy and more control.

    1. At the same time, working through these principles is only the first step in building out a privacy-focused social platform. Beyond that, significant thought needs to go into all of the services we build on top of that foundation -- from how people do payments and financial transactions, to the role of businesses and advertising, to how we can offer a platform for other private services.

      This is what Facebook is really after. They want to build the trust to be able to offer payment services on top of Facebook.

    2. People want to be able to choose which service they use to communicate with people. However, today if you want to message people on Facebook you have to use Messenger, on Instagram you have to use Direct, and on WhatsApp you have to use WhatsApp. We want to give people a choice so they can reach their friends across these networks from whichever app they prefer. We plan to start by making it possible for you to send messages to your contacts using any of our services, and then to extend that interoperability to SMS too. Of course, this would be opt-in and you will be able to keep your accounts separate if you'd like.

      Facebook plans to make messaging interoperable across Instagram, Facebook, and WhatsApp. It will be opt-in.

    3. An important part of the solution is to collect less personal data in the first place, which is the way WhatsApp was built from the outset.

      Zuckerberg claims WhatsApp was built from the outset with the goal of collecting as little data as possible.

    4. As we build up large collections of messages and photos over time, they can become a liability as well as an asset. For example, many people who have been on Facebook for a long time have photos from when they were younger that could be embarrassing. But people also really love keeping a record of their lives.

      Large collections of photos are both a liability and an asset. They might be embarrassing but it might also be fun to look back.

    5. We increasingly believe it's important to keep information around for shorter periods of time. People want to know that what they share won't come back to hurt them later, and reducing the length of time their information is stored and accessible will help.

      In addition to a focus on privacy, Zuckerberg underlines a focus on impermanence, allaying people's fears that their content will come back to haunt them.

    6. I understand that many people don't think Facebook can or would even want to build this kind of privacy-focused platform -- because frankly we don't currently have a strong reputation for building privacy protective services, and we've historically focused on tools for more open sharing.

      Zuckerberg acknowledges that Facebook is not known for its reputation on privacy and has focused on open sharing in the past.

    7. But now, with all the ways people also want to interact privately, there's also an opportunity to build a simpler platform that's focused on privacy first.

      Zuckerberg says there's an opportunity for a platform that is focused on privacy first.

    8. Today we already see that private messaging, ephemeral stories, and small groups are by far the fastest growing areas of online communication.

      According to Zuckerberg, in 2019 we're seeing private messaging, ephemeral stories and small groups as the fastest growing areas of online communication.

    9. As I think about the future of the internet, I believe a privacy-focused communications platform will become even more important than today's open platforms. Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks.

      Mark Zuckerberg claims he believes privacy-focused communications will become even more important than today's open platforms (like Facebook).

    1. Now let me get back to your question. The FBI presents its conflict with Apple over locked phones as a case of privacy versus security. Yes, smartphones carry a lot of personal data—photos, texts, email, and the like. But they also carry business and account information; keeping that secure is really important. The problem is that if you make it easier for law enforcement to access a locked device, you also make it easier for a bad actor—a criminal, a hacker, a determined nation-state—to do so as well. And that's why this is a security vs. security issue.

      The debate should not be framed as privacy-vs-security because when you make it easier for law enforcement to access a locked device, you also make it easier for bad actors to do so as well. Thus it is a security-vs-security issue.

    1. The FBI — along with many other law enforcement and surveillance agents — insists that it is possible to make crypto that will protect our devices, transactions, data, communications and lives, but which will fail catastrophically whenever a cop needs it to. When technologists explain that this isn't a thing, the FBI insists that they just aren't nerding hard enough.

      The infosec community has a name for the government's argument that there must be a solution to the dilemma of wanting secure consumer tech while also granting access to government officials: "nerd harder."

    1. Barr makes the point that this is about “consumer cybersecurity” and not “nuclear launch codes.” This is true, but it ignores the huge amount of national security-related communications between those two poles. The same consumer communications and computing devices are used by our lawmakers, CEOs, legislators, law enforcement officers, nuclear power plant operators, election officials and so on. There’s no longer a difference between consumer tech and government tech—it’s all the same tech.

      The US government's defence for wanting to introduce backdoors into consumer encryption is that in doing so they would not be weakening the encryption for, say, nuclear launch codes.

      Schneier holds that this distinction between government and consumer tech no longer exists. Weakening consumer tech amounts to weakening government tech. Therefore it's not worth doing.

    1. The answers for law enforcement, social networks, and medical data won’t be the same. As we move toward greater surveillance, we need to figure out how to get the best of both: how to design systems that make use of our data collectively to benefit society as a whole, while at the same time protecting people individually.

      Each challenge needs to be treated on its own. The trade off to be made will be different for law enforcement vs. social media vs. health officials. We need to figure out "how to design systems that make use of our data collectively to benefit society as a whole, while at the same time protecting people individually."

    1. A future in which privacy would face constant assault was so alien to the framers of the Constitution that it never occurred to them to call out privacy as an explicit right. Privacy was inherent to the nobility of their being and their cause.

      When we wrote our constitutions, privacy was a given. It was inconceivable then that there would one day be a threat of being surveilled everywhere we go.

    2. We do nothing wrong when we make love or go to the bathroom. We are not deliberately hiding anything when we seek out private places for reflection or conversation. We keep private journals, sing in the privacy of the shower, and write letters to secret lovers and then burn them. Privacy is a basic human need.

      Privacy is a basic human, psychological need.

    3. Privacy protects us from abuses by those in power, even if we’re doing nothing wrong at the time of surveillance.

      Privacy is what protects us from abuse at the hands of the powerful, even if we're doing nothing wrong at the time we're being surveilled.

    4. My problem with quips like these — as right as they are — is that they accept the premise that privacy is about hiding a wrong. It’s not. Privacy is an inherent human right, and a requirement for maintaining the human condition with dignity and respect.

      Common retorts to "If you aren't doing anything wrong, what do you have to hide?" accept the premise that privacy is about having a wrong to hide.

      Bruce Schneier posits that privacy is an inherent human right, and "a requirement for maintaining the human condition with dignity and respect".

  2. Oct 2020
    1. Legislation to stem the tide of Big Tech companies' abuses, and laws—such as a national consumer privacy bill, an interoperability bill, or a bill making firms liable for data-breaches—would go a long way toward improving the lives of the Internet users held hostage inside the companies' walled gardens. But far more important than fixing Big Tech is fixing the Internet: restoring the kind of dynamism that made tech firms responsive to their users for fear of losing them, restoring the dynamic that let tinkerers, co-ops, and nonprofits give every person the power of technological self-determination.
    1. In fact, these platforms have become inseparable from their data: we use “Facebook” to refer to both the application and the data that drives that application. The result is that nearly every Web app today tries to ask you for more and more data again and again, leading to dangling data on duplicate and inconsistent profiles we can no longer manage. And of course, this comes with significant privacy concerns.
    1. I find it somewhat interesting to note that with 246 public annotations on this page using Hypothes.is, that from what I can tell as of 4/2/2019 only one of them is a simple highlight. All the rest are highlights with an annotation or response of some sort.

      It makes me curious to know what the percentage distribution of these two types is on the platform. Is it the case that in classroom settings, in which many of these annotations appear to have been made, much of the use of the platform dictates more annotations (versus simple highlights) due to the performative nature of the process?

      Is it possible that there are a significant number of highlights which are simply hidden because the platform automatically defaults these to private? Is the friction of making highlights so high that people don't bother?

      I know that Amazon will indicate heavily highlighted passages in e-books as a feature to draw attention to the interest relating to those passages. Perhaps it would be useful/nice if Hypothes.is would do something similar, but make the author of the highlights anonymous? (From a privacy perspective, this may not work well on articles with a small number of annotators, as the presumption could be that the "private" highlights would most likely be directly attributed to those who also made public annotations.)

      Perhaps the better solution is to default highlights to public and provide friction-free UI to make them private?

      A heavily highlighted section by a broad community can be a valuable thing, but surfacing it can be a difficult thing to do.

    1. A friend of mine asked if I’d thought through the contradiction of criticizing Blair publicly like this, when she’s another not-quite public figure too.

      Did this really happen? Or is the author inventing it to defuse potential criticism, since she's writing about the same story herself and only helping to propagate it?

      There's definitely a need to write about this issue, so kudos for that. Ella also deftly leaves out the name of the mystery woman, I'm sure on purpose. But she does include enough breadcrumbs to make the rest of the story discoverable, so that one could jump from here to participate in the piling on. I do appreciate that it doesn't appear that she's given Blair any links in the process, which for a story like this is some subtle internet shade.

    2. Even when the attention is positive, it is overwhelming and frightening. Your mind reels at the possibility of what they could find: your address, if your voting records are logged online; your cellphone number, if you accidentally included it on a form somewhere; your unflattering selfies at the beginning of your Facebook photo archive. There are hundreds of Facebook friend requests, press requests from journalists in your Instagram inbox, even people contacting your employer when they can’t reach you directly. This story you didn’t choose becomes the main story of your life. It replaces who you really are as the narrative someone else has written is tattooed onto your skin.
    3. the woman on the plane has deleted her own Instagram account after receiving violent abuse from the army Blair created.

      Feature request: the ability to make one's social media account "disappear" temporarily while a public "attack" like this is happening.

      We need a great name for this. Publicity ghosting? Fame cloaking?

  3. Sep 2020
    1. The subjugated man is not merely constrained; he consents to his constraint.

      …but does the "subjugated" person really consent in full knowledge of the facts?

      A study shows that the more aware people are of what is at work, the more reluctant they are to use services that exploit the data of their private lives.

  4. Aug 2020
  5. Jul 2020
    1. a new kind of power

      This is what Shoshana Zuboff argues in The Age of Surveillance Capitalism: a new kind of power which can, at first, be apprehended through Marx’s lenses; but as a new form of capitalism, it “cannot be reduced to known harms—monopoly, privacy—and therefore do not easily yield to known forms of combat.”

      It is “a new form of capitalism on its own terms and in its own words”, which therefore requires new conceptual frameworks to be understood and negotiated.

    1. Our membership inference attack exploits the observation that machine learning models often behave differently on the data that they were trained on versus the data that they “see” for the first time.

      How well would this work on some of the more recent zero-shot models?
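      To make that observation concrete, here is a minimal sketch of the confidence-gap intuition behind such attacks (the distributions are invented, and the paper's actual attack trains shadow models rather than picking a fixed threshold):

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      # Invented top-class confidences: models tend to be more confident on
      # training members than on data they see for the first time.
      member_conf = rng.beta(8, 2, size=1000)     # mean ~0.80
      nonmember_conf = rng.beta(4, 3, size=1000)  # mean ~0.57

      def guess_member(confidence, threshold=0.7):
          # Predict "was in the training set" when confidence is unusually high.
          return confidence > threshold

      tpr = guess_member(member_conf).mean()     # members correctly flagged
      fpr = guess_member(nonmember_conf).mean()  # non-members falsely flagged
      print(f"attack TPR={tpr:.2f}, FPR={fpr:.2f}")  # the gap is the leak
      ```

      A zero-shot model never sees task-specific training data at all, so the member/non-member confidence gap this attack depends on may be much smaller there.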

    1. It is the natural trajectory of business to seek out new ways to drive revenue from products like microwaves, televisions, refrigerators, and speakers. And now that microwaves and TVs can effectively operate as mini-computers, it feels inevitable that manufacturers would look to collect potentially valuable data — whether for resale, for product optimization, or to bring down the sticker price of the device.
  6. Jun 2020
    1. This advertising system is designed to enable hyper-targeting, which has many unintended consequences that have dominated the headlines in recent years, such as the ability for bad actors to use the system to influence elections, to exclude groups in a way that facilitates discrimination, and to expose your personal data to companies you’ve never even heard of.

      Where your Google data goes.

    2. Alarmingly, Google now deploys hidden trackers on 76% of websites across the web to monitor your behavior and Facebook has hidden trackers on about 25% of websites, according to the Princeton Web Transparency & Accountability Project. It is likely that Google and/or Facebook are watching you on most sites you visit, in addition to tracking you when using their products.

    1. Of course, with Facebook being Facebook, there is another, more commercial outlet for this type of metadata analysis. If the platform knows who you are, and knows what you do based on its multi-faceted internet tracking tools, then knowing who you talk to and when could be a commercial goldmine. Person A just purchased Object 1 and then chatted to Person B. Try to sell Object 1 to Person B. All of which can be done without any messaging content being accessed.
    1. As uber-secure messaging platform Signal has warned, “Signal is recommended by the United States military. It is routinely used by senators and their staff. American allies in the EU Commission are Signal users too. End-to-end encryption is fundamental to the safety, security, and privacy of conversations worldwide.”
  7. May 2020
    1. Someone had taken control of my iPad, blasting through Apple’s security restrictions and acquiring the power to rewrite anything that the operating system could touch. I dropped the tablet on the seat next to me as if it were contagious. I had an impulse to toss it out the window. I must have been mumbling exclamations out loud, because the driver asked me what was wrong. I ignored him and mashed the power button. Watching my iPad turn against me was remarkably unsettling. This sleek little slab of glass and aluminum featured a microphone, cameras on the front and back, and a whole array of internal sensors. An exemplary spy device.
    1. We present results from technical experiments which reveal that WeChat communications conducted entirely among non-China-registered accounts are subject to pervasive content surveillance that was previously thought to be exclusively reserved for China-registered accounts.

      WeChat's content surveillance extends beyond China-registered accounts.

    1. Google encouraging site admins to put reCaptcha all over their sites, and then sharing the resulting risk scores with those admins is great for security, Perona thinks, because he says it “gives site owners more control and visibility over what’s going on” with potential scammer and bot attacks, and the system will give admins more accurate scores than if reCaptcha is only using data from a single webpage to analyze user behavior. But there’s the trade-off. “It makes sense and makes it more user-friendly, but it also gives Google more data,”
    1. “Until CR 1.0 there was no effective privacy standard or requirement for recording consent in a common format and providing people with a receipt they can reuse for data rights.  Individuals could not track their consents or monitor how their information was processed or know who to hold accountable in the event of a breach of their privacy,” said Colin Wallis, executive director, Kantara Initiative.  “CR 1.0 changes the game.  A consent receipt promises to put the power back into the hands of the individual and, together with its supporting API — the consent receipt generator — is an innovative mechanism for businesses to comply with upcoming GDPR requirements.  For the first time individuals and organizations will be able to maintain and manage permissions for personal data.”
  8. Apr 2020
    1. Don’t share any private, identifiable information on social media It may be fun to talk about your pets with your friends on Instagram or Twitter, but if Fluffy is the answer to your security question, then you shouldn’t share that with the world. This may seem quite obvious, but sometimes you get wrapped up in an online conversation, and it is quite easy to let things slip out. You may also want to keep quiet about your past home or current home locations or sharing anything that is very unique and identifiable. It could help someone fake your identity.
    2. Don’t share vacation plans on social media Sharing a status of your big trip to the park on Saturday may be a good idea if you are looking to have a big turnout of friends to join you, but not when it comes to home and personal safety. For starters, you have just broadcasted where you are going to be at a certain time, which can be pretty dangerous if you have a stalker or a crazy ex. Secondly, you are telling the time when you won’t be home, which can make you vulnerable to being robbed. This is also true if you are sharing selfies of yourself on the beach with a caption that states “The next 2 weeks are going to be awesome!” You have just basically told anyone who has the option to view your photo and even their friends that you are far away from home and for how long.
    1. Finally, from a practical point of view, we suggest the adoption of "privacy label," food-like notices, that provide the required information in an easily understandable manner, making the privacy policies easier to read. Through standard symbols, colors and feedbacks — including yes/no statements, where applicable — critical and specific scenarios are identified. For example, whether or not the organization actually shares the information, under what specific circumstances this occurs, and whether individuals can oppose the share of their personal data. This would allow some kind of standardized information. Some of the key points could include the information collected and the purposes of its collection, such as marketing, international transfers or profiling, contact details of the data controller, and distinct differences between organizations’ privacy practices, and to identify privacy-invasive practices.
    1. people encountering public Hypothesis annotations anywhere don’t have to worry about their privacy.

      In the Privacy Policy document there is an annotation that says:

      I decided against using hypothes.is as the commenting system for my blog, since I don't want my readers to be traceable by a third party I choose on their behalf

      Although this annotation is a bit old (from 2016), I understand that the Hypothes.is server would in fact get information from these readers through HTTP requests, correct? Such as IP address, browser user agent, etc. I wonder whether this is the traceability the annotator was referring to.

      Anyway, I think this wouldn't be much different from how an embedded image hosted elsewhere would be displayed on one such site. And Hypothes.is' Privacy Policy states that

      This information is collected in a log file and retained for a limited time

    1. Before we get to passwords, surely you already have in mind that Google knows everything about you. It knows what websites you’ve visited, it knows where you’ve been in the real world thanks to Android and Google Maps, it knows who your friends are thanks to Google Photos. All of that information is readily available if you log in to your Google account. You already have good reason to treat the password for your Google account as if it’s a state secret.
    1. Alas, you'll have to manually visit each site in turn and figure out how to actually delete your account. For help, turn to JustDelete.me, which provides direct links to the cancellation pages of hundreds of services.
    1. When you visit a website, you are allowing that site to access a lot of information about your computer's configuration. Combined, this information can create a kind of fingerprint — a signature that could be used to identify you and your computer. Some companies use this technology to try to identify individual computers.
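      A minimal sketch of the mechanism (the attribute names and values are invented; real fingerprinting scripts gather many more signals, such as canvas rendering and audio-stack quirks): each value is common on its own, but hashed together they can be nearly unique.

      ```python
      import hashlib
      import json

      # Hypothetical configuration details a site can read from the browser.
      attributes = {
          "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
          "screen_resolution": "2560x1440",
          "timezone": "America/New_York",
          "language": "en-US",
          "fonts": ["Arial", "DejaVu Sans", "Noto Serif"],
      }

      # A stable identifier that survives clearing cookies.
      fingerprint = hashlib.sha256(
          json.dumps(attributes, sort_keys=True).encode()
      ).hexdigest()
      print(fingerprint[:16])
      ```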
    1. Our approach strikes a balance between privacy, computation overhead, and network latency. While single-party private information retrieval (PIR) and 1-out-of-N oblivious transfer solve some of our requirements, the communication overhead involved for a database of over 4 billion records is presently intractable. Alternatively, k-party PIR and hardware enclaves present efficient alternatives, but they require user trust in schemes that are not widely deployed yet in practice. For k-party PIR, there is a risk of collusion; for enclaves, there is a risk of hardware vulnerabilities and side-channels.
    2. Privacy is at the heart of our design: Your usernames and passwords are incredibly sensitive. We designed Password Checkup with privacy-preserving technologies to never reveal this personal information to Google. We also designed Password Checkup to prevent an attacker from abusing Password Checkup to reveal unsafe usernames and passwords. Finally, all statistics reported by the extension are anonymous. These metrics include the number of lookups that surface an unsafe credential, whether an alert leads to a password change, and the web domain involved for improving site compatibility.
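      For intuition about how a lookup can avoid revealing the credential, here is a sketch of the simpler k-anonymity "bucket" scheme used by Have I Been Pwned's range API. Password Checkup layers blinded hashing and private set intersection on top of this kind of idea, so treat this as an analogy rather than Google's actual protocol:

      ```python
      import hashlib
      import urllib.request

      def pwned_count(password: str) -> int:
          digest = hashlib.sha1(password.encode()).hexdigest().upper()
          prefix, suffix = digest[:5], digest[5:]
          # Only the 5-character prefix leaves the machine; the server learns
          # a bucket of hashes, never which credential (if any) we hold.
          url = f"https://api.pwnedpasswords.com/range/{prefix}"
          with urllib.request.urlopen(url) as resp:
              body = resp.read().decode()
          # Matching against the returned bucket happens locally.
          for line in body.splitlines():
              candidate, _, count = line.partition(":")
              if candidate == suffix:
                  return int(count)
          return 0

      print(pwned_count("password123"))  # nonzero: widely breached
      ```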
    1. "If someone knows your old passwords, they can catch onto your system. If you're in the habit of inventing passwords with the name of a place you've lived and the zip code, for example, they could find out where I have lived in the past by mining my Facebook posts or something."Indeed, browsing through third-party password breaches offers glimpses into the things people hold dear — names of spouses and children, prayers, and favorite places or football teams. The passwords may no longer be valid, but that window into people's secret thoughts remains open.
    1. “A phone number is worth more on the dark web than a Social Security number. Your phone is so much more rich with data,” says J.D. Mumford, who runs Anonyome Labs Inc. in Salt Lake City.

      “Facial recognition technology is now cheap enough where you can put it in every Starbucks and have your coffee ready when you’re in the front of the line,” says Lorrie Cranor, a computer science professor at Carnegie Mellon University who runs its CyLab Usable Privacy and Security Laboratory in Pittsburgh. In March, the New York Times put three cameras on a rooftop in Manhattan, spent $60 on Amazon’s Rekognition system, and identified several people. I took an Air France flight that had passengers board using our faceprints, taken from our passports without our permission.

      Private companies such as Vigilant Solutions Inc., headquartered in the Valley, have cameras that have captured billions of geotagged photos of cars on streets and in parking lots that they sell on the open market, mostly to police and debt collectors.

      Project Kovr runs a similar workshop at schools, in which it assigns some kids to stalk another child from a distance so they can create a data profile and tailor an ad campaign for the stalkee. Baauw has also been planning a project in which he chisels a statue of Facebook Chief Executive Officer Mark Zuckerberg as a Roman god. “He’s the Zeus of our time,” he says.

      Until people demand a law that makes privacy the default, I’m going to try to remember, each time I click on something, that free things aren’t free. That when I send an email or a text outside of Signal or MySudo, I should expect those messages to one day be seen.

    1. Someone, somewhere has screwed up to the extent that data got hacked and is now in the hands of people it was never intended to be. No way, no how does this give me license to then treat that data with any less respect than if it had remained securely stored and I reject outright any assertion to the contrary. That's a fundamental value I operate under
    1. Unlike Zoom, Apple’s FaceTime video conference service is truly end-to-end encrypted. Group FaceTime calls offer a privacy-conscious alternative for up to 32 participants. The main caveat is that this option only works if everyone on the call has an Apple device that currently supports this feature.
    1. Google's move to release location data highlights concerns around privacy. According to Mark Skilton, director of the Artificial Intelligence Innovation Network at Warwick Business School in the UK, Google's decision to use public data "raises a key conflict between the need for mass surveillance to effectively combat the spread of coronavirus and the issues of confidentiality, privacy, and consent concerning any data obtained."
    1. Thousands of enterprises around the world have done exhaustive security reviews of our user, network, and data center layers and confidently selected Zoom for complete deployment. 

      This doesn't really account for the fact that Zoom have committed some atrociously heinous acts, such as (and not limited to):

  9. Mar 2020
    1. This is known as transport encryption, which is different from end-to-end encryption because the Zoom service itself can access the unencrypted video and audio content of Zoom meetings. So when you have a Zoom meeting, the video and audio content will stay private from anyone spying on your Wi-Fi, but it won’t stay private from the company.
    1. Data privacy now a global movement: We're pleased to say we're not the only ones who share this philosophy; web browsing companies like Brave have made it possible for you to browse the internet more privately, and you can use a search engine like DuckDuckGo to search with the freedom you deserve.
  10. www.graphitedocs.com
    1. When you think about data law and privacy legislation, cookies easily come to mind as they're directly related to both. This often leads to the common misconception that the Cookie Law (ePrivacy Directive) has been repealed by the General Data Protection Regulation (GDPR), which in fact, it has not. Instead, you can think of the ePrivacy Directive and GDPR as working together and complementing each other, where, in the case of cookies, the ePrivacy Directive generally takes precedence.
    1. What information is being collected? Who is collecting it? How is it collected? Why is it being collected? How will it be used? Who will it be shared with? What will be the effect of this on the individuals concerned? Is the intended use likely to cause individuals to object or complain?
    1. If your agreement with Google incorporates this policy, or you otherwise use a Google product that incorporates this policy, you must ensure that certain disclosures are given to, and consents obtained from, end users in the European Economic Area along with the UK. If you fail to comply with this policy, we may limit or suspend your use of the Google product and/or terminate your agreement.
    1. Google Analytics created an option to remove the last octet (the last group of 3 numbers) from your visitor's IP address. This is called 'IP Anonymization'. Although this isn't complete anonymization, the GDPR demands you use this option if you want to use Analytics without prior consent from your visitors. Some countries (e.g. Germany) demand that this setting be enabled at all times.
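      A minimal sketch of what that masking amounts to (my own illustration, not Google's code; Google documents zeroing the last octet for IPv4 and the last 80 bits for IPv6):

      ```python
      import ipaddress

      def anonymize_ip(ip: str) -> str:
          addr = ipaddress.ip_address(ip)
          if addr.version == 4:
              # Zero the last octet (the last group of numbers).
              return str(ipaddress.IPv4Address(int(addr) & 0xFFFFFF00))
          # IPv6: zero the last 80 bits.
          return str(ipaddress.IPv6Address(int(addr) >> 80 << 80))

      print(anonymize_ip("203.0.113.42"))    # -> 203.0.113.0
      print(anonymize_ip("2001:db8::1234"))  # -> 2001:db8::
      ```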
    1. A data subject should have the right of access to personal data which have been collected concerning him or her, and to exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing
    2. Every data subject should therefore have the right to know and obtain communication in particular with regard to the purposes for which the personal data are processed, where possible the period for which the personal data are processed, the recipients of the personal data, the logic involved in any automatic personal data processing
    1. As a condition of use of this site, all users must give permission for CIVIC to use its access logs to attempt to track users who are reasonably suspected of gaining, or attempting to gain, unauthorised access. All log file information collected by CIVIC is kept secure and no access to raw log files is given to any third party.