425 Matching Annotations
  1. May 2020
    1. What is more frightening than being merely watched, though, is being controlled. When Facebook can know us better than our parents with only 150 likes, and better than our spouses with 300 likes, the world appears quite predictable, both for governments and for businesses. And predictability means control.

      "Predictability means control"

    1. Chu, H. Y., Englund, J. A., Starita, L. M., Famulare, M., Brandstetter, E., Nickerson, D. A., Rieder, M. J., Adler, A., Lacombe, K., Kim, A. E., Graham, C., Logue, J., Wolf, C. R., Heimonen, J., McCulloch, D. J., Han, P. D., Sibley, T. R., Lee, J., Ilcisin, M., … Bedford, T. (2020). Early Detection of Covid-19 through a Citywide Pandemic Surveillance Platform. New England Journal of Medicine, NEJMc2008646. https://doi.org/10.1056/NEJMc2008646

  2. Apr 2020
    1. Edward Snowden disclosed in 2013 that the US government's Upstream program was collecting data on people reading Wikipedia articles. This revelation had a significant impact on readers' self-censorship, as shown by the fact that there were substantially fewer views for articles related to terrorism and security.[12] The court case Wikimedia Foundation v. NSA has since followed.
    1. Google's move to release location data highlights concerns around privacy. According to Mark Skilton, director of the Artificial Intelligence Innovation Network at Warwick Business School in the UK, Google's decision to use public data "raises a key conflict between the need for mass surveillance to effectively combat the spread of coronavirus and the issues of confidentiality, privacy, and consent concerning any data obtained."
    1. Thousands of enterprises around the world have done exhaustive security reviews of our user, network, and data center layers and confidently selected Zoom for complete deployment. 

      This doesn't really account for the fact that Zoom have committed some atrociously heinous acts, such as (and not limited to):

  3. Mar 2020
    1. This is known as transport encryption, which is different from end-to-end encryption because the Zoom service itself can access the unencrypted video and audio content of Zoom meetings. So when you have a Zoom meeting, the video and audio content will stay private from anyone spying on your Wi-Fi, but it won’t stay private from the company.
    2. But despite this misleading marketing, the service actually does not support end-to-end encryption for video and audio content, at least as the term is commonly understood. Instead it offers what is usually called transport encryption, explained further below
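
      To make the distinction concrete, here is a minimal sketch of the two models in Python, using the cryptography library's Fernet. This illustrates the general idea only, not Zoom's actual protocol: with transport encryption the service holds the key and can read the content, whereas with end-to-end encryption only the participants hold the key.

      ```python
      # Minimal sketch, not Zoom's protocol. Requires: pip install cryptography
      from cryptography.fernet import Fernet

      # --- Transport encryption (terminated at the service) ---
      transport_key = Fernet.generate_key()          # key shared with the server
      ciphertext_on_wire = Fernet(transport_key).encrypt(b"meeting audio frame")
      # A Wi-Fi eavesdropper sees only ciphertext, but the service can decrypt:
      print(Fernet(transport_key).decrypt(ciphertext_on_wire))

      # --- End-to-end encryption (key known only to the participants) ---
      participant_key = Fernet.generate_key()        # never given to the server
      ciphertext_relayed = Fernet(participant_key).encrypt(b"meeting audio frame")
      # The service can store or forward this blob but cannot read it;
      # only the other participant, holding participant_key, can:
      print(Fernet(participant_key).decrypt(ciphertext_relayed))
      ```
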
    1. How can you tell whether a pupil is doing the work on their own? Pedagogical continuity is meant to ensure that pupils continue school activities that allow them to progress in their learning. The aim is to draw pupils' attention to the importance and regularity of personal work, whatever the activity, even if it is carried out with the help of a peer or a third party. Regular assignments, assessed regularly, contribute to this. However, the teacher cannot monitor diligence in this context, nor penalize any lapse in it.
    1. And if people were really cool about sharing their personal and private information with anyone, and totally fine about being tracked everywhere they go and having a record kept of all the people they know and have relationships with, why would the ad tech industry need to spy on them in the first place? They could just ask up front for all your passwords.
    2. The deception enabled by dark pattern design not only erodes privacy but has the chilling effect of putting web users under pervasive, clandestine surveillance; it also risks enabling damaging discrimination at scale.
    1. According to the Police Authority's guidelines, an impact assessment must be carried out before new police tools are introduced, if they involve sensitive processing of personal data. No such assessment has been made for the tool in question.

      Swedish police have used Clearview AI without any such impact assessment having been performed.

      In other words, Swedish police have used a facial-recognition system without being allowed to do so.

      This is a clear breach of human rights.

      Swedish police have lied about this, as reported by Dagens Nyheter.

    1. Mastercard acquired NuData Security in 2017 and it has been making advances in biometric identification.
    2. The payment provider told MarketWatch that everyone has a unique walk, and it is investigating innovative behavioral biometrics such as gait, face, heartbeat and veins for cutting edge payment systems of the future.

      This is a true invasion of people's lives.

      Remember: this is a credit-card company. We use them to pay for stuff. They shouldn't know what we look like, how we walk, how our hearts beat, nor how our 'vein technology' works.

  4. Feb 2020
    1. Last year, Facebook said it would stop listening to voice notes in messenger to improve its speech recognition technology. Now, the company is starting a new program where it will explicitly ask you to submit your recordings, and earn money in return.

      Given Facebook's history of breaking laws and paying billions of USD in damages (a joke of a penalty for them), selling ads to advertisers who explicitly wanted to target people who hate Jews, and spending millions of USD every year solely on lobbying, don't sell your personal experiences and behaviours to them.

      Facebook is nefarious and psychopathic.

    1. I suspect that Wacom doesn’t really think that it’s acceptable to record the name of every application I open on my personal laptop. I suspect that this is why their privacy policy doesn’t really admit that this is what they do.
  5. Jan 2020
    1. A Microsoft programme to transcribe and vet audio from Skype and Cortana, its voice assistant, ran for years with “no security measures”, according to a former contractor who says he reviewed thousands of potentially sensitive recordings on his personal laptop from his home in Beijing over the two years he worked for the company.

      Wonderful. This, combined with the fact that Skype users can—fairly easily—find out which contacts another person has, is horrifying.

      Then again, most people know that Microsoft have colluded with American authorities to divulge chat/phone history for a long time, right?

  6. Dec 2019
    1. We are barrelling toward a country with 350 million serfs serving 3 million lords. We attempt to pacify the serfs with more powerful phones, bigger TVs, great original scripted television, and Mandalorian action figures delivered to your doorstep within the hour. The delivery guy might be forced to relieve himself in your bushes if not for the cameras his boss installed on every porch.
    1. Google found 1,494 device identifiers in SensorVault, sending them to the ATF to comb through. In terms of numbers, that’s unprecedented for this form of search. It illustrates how Google can pinpoint a large number of mobile phones in a brief period of time and hand over that information to the government
  7. Nov 2019
    1. half of iPhone users don’t know there’s a unique ID on their phone (called an IDFA, for “identifier for advertisers”) tracking their app activity and sending it to third-party advertisers by default.
    1. Google has confirmed that it partnered with health heavyweight Ascension, a Catholic health care system based in St. Louis that operates across 21 states and the District of Columbia.

      What happened to 'thou shalt not steal'?

    1. Found a @facebook #security & #privacy issue. When the app is open it actively uses the camera. I found a bug in the app that lets you see the camera open behind your feed.

      So, Facebook uses your camera even when you're not actively using it.

    1. Speaking with MIT Technology Review, Rohit Prasad, Alexa’s head scientist, has now revealed further details about where Alexa is headed next. The crux of the plan is for the voice assistant to move from passive to proactive interactions. Rather than wait for and respond to requests, Alexa will anticipate what the user might want. The idea is to turn Alexa into an omnipresent companion that actively shapes and orchestrates your life. This will require Alexa to get to know you better than ever before.

      This is some next-level onslaught.

    1. Somewhere in a cavernous, evaporative cooled datacenter, one of millions of blinking Facebook servers took our credentials, used them to authenticate to our private email account, and tried to pull information about all of our contacts. After clicking Continue, we were dumped into the Facebook home page, email successfully “confirmed,” and our privacy thoroughly violated.
    1. In 2013, Facebook began offering a “secure” VPN app, Onavo Protect, as a way for users to supposedly protect their web activity from prying eyes. But Facebook simultaneously used Onavo to collect data from its users about their usage of competitors like Twitter. Last year, Apple banned Onavo from its App Store for violating its Terms of Service. Facebook then released a very similar program, now dubbed variously “Project Atlas” and “Facebook Research.” It used Apple’s enterprise app system, intended only for distributing internal corporate apps to employees, to continue offering the app to iOS users. When the news broke this week, Apple shut down the app and threw Facebook into some chaos when it (briefly) booted the company from its Enterprise Developer program altogether.
    1. If the apparatus of total surveillance that we have described here were deliberate, centralized, and explicit, a Big Brother machine toggling between cameras, it would demand revolt, and we could conceive of a life outside the totalitarian microscope.
    1. The FBI is currently collecting data about our faces, irises, walking patterns, and voices, permitting the government to pervasively identify, track, and monitor us. The agency can match or request a match of our faces against at least 640 million images of adults living in the U.S. And it is reportedly piloting Amazon’s flawed face recognition surveillance technology.

      The FBI and Amazon are being sued over their surveillance of people living in the USA.

    1. Senior government officials in multiple U.S.-allied countries were targeted earlier this year with hacking software that used Facebook Inc’s (FB.O) WhatsApp to take over users’ phones, according to people familiar with the messaging company’s investigation.
  8. Oct 2019
    1. Per Bloomberg, which cited a memo from an anonymous Google staffer, employees discovered that the company was creating the new tool as a Chrome browser extension that would be installed on all employees’ systems and used to monitor their activities.

      From the Bloomberg article:

      Earlier this month, employees said they discovered that a team within the company was creating the new tool for the custom Google Chrome browser installed on all workers’ computers and used to search internal systems. The concerns were outlined in a memo written by a Google employee and reviewed by Bloomberg News and by three Google employees who requested anonymity because they aren’t authorized to talk to the press.

    1. A highly interesting article where a well-known company prefers blood money to allowing employees to talk about politics. This is capitalism at its core: all profit, no empathy.

    2. Meanwhile at Microsoft's GitHub, employees at both companies have objected to GitHub's business with ICE, not to mention Microsoft's government contracts. Employees at Amazon have also urged the company not to sell its facial recognition technology to police and the military.
    3. If you can see how people might respond to IBM, infamous for providing technology that helped the Nazis in World War II, saying, "Who has time to look into the source of this hard German currency?" you can imagine how GitLab's policy amendment has been received.
    1. there's still the issue of user IP addresses, which Tencent would see for those using devices with mainland China settings. That's a privacy concern, but it's one among many given that other Chinese internet companies – ISPs, app providers, cloud service providers, and the like – can be assumed to collect that information and provide it to the Chinese surveillance state on demand.
    1. This system will apply to foreign owned companies in China on the same basis as to all Chinese persons, entities or individuals. No information contained on any server located within China will be exempted from this full coverage program. No communication from or to China will be exempted. There will be no secrets. No VPNs. No private or encrypted messages. No anonymous online accounts. No trade secrets. No confidential data. Any and all data will be available and open to the Chinese government. Since the Chinese government is the shareholder in all SOEs and is now exercising de facto control over China’s major private companies as well, all of this information will then be available to those SOEs and Chinese companies. See e.g. China to place government officials inside 100 private companies, including Alibaba. All this information will be available to the Chinese military and military research institutes. The Chinese are being very clear that this is their plan.

      At least the current Chinese government are clear about how all-intrusive they will be, so that people can avoid them. IF people can avoid them.

    1. Amazon doesn’t tell customers much about its troubleshooting process for Cloud Cam. In its terms and conditions, the company reserves the right to process images, audio and video captured by devices to improve its products and services.
    2. Nowhere in the Cloud Cam user terms and conditions does Amazon explicitly tell customers that human beings are training the algorithms behind their motion detection software.
    3. An Amazon team also transcribes and annotates commands recorded in customers’ homes by the company’s Alexa digital assistant
    4. Dozens of Amazon workers based in India and Romania review select clips captured by Cloud Cam, according to five people who have worked on the program or have direct knowledge of it.
    1. We recently discovered that when you provided an email address or phone number for safety or security purposes (for example, two-factor authentication) this data may have inadvertently been used for advertising purposes, specifically in our Tailored Audiences and Partner Audiences advertising system. 

      Twitter may have sold your e-mail address to people.

      Twitter has only done this with people who have added their e-mail address for security purposes.

      Security purposes for Twitter = sell your e-mail address to a third-party company.

      Spam for you = security purposes for Twitter.

    1. In case you wanted to be even more skeptical of Mark Zuckerberg and his cohorts, Facebook has now changed its advertising policies to make it easier for politicians to lie in paid ads. Donald Trump is taking full advantage of this policy change, as popular info reports.
    2. The claim in this ad was ruled false by those Facebook-approved third-party fact-checkers, but it is still up and running. Why? Because Facebook changed its policy on what constitutes misinformation in advertising. Prior to last week, Facebook’s rule against “false and misleading content” didn’t leave room for gray areas: “Ads, landing pages, and business practices must not contain deceptive, false, or misleading content, including deceptive claims, offers, or methods.”
  9. Sep 2019
  10. Aug 2019
    1. I think Netflix would’ve avoided this controversy if it had plainly told subscribers what it was doing somewhere in the app or with a notification. Instead, people discovered that Netflix was utilizing Android’s physical activity permission, which is strange behavior from a video streaming app. In some instances, it was doing this without asking users to approve the move first, as was the case for The Next Web’s Ivan Mehta. You’ve got to be transparent if you want to monitor anyone’s movements. Netflix was unable to immediately answer whether it will be removing the physical activity recognition permission from its app now that the test is done.

      It's great that sites like The Verge and The Next Web are calling surveillance capitalists out.

  11. Jul 2019
    1. Even if we never see this brain-reading tech in Facebook products (something that would probably cause just a little concern), researchers could use it to improve the lives of people who can’t speak due to paralysis or other issues.
    2. That’s very different from the system Facebook described in 2017: a noninvasive, mass-market cap that lets people type more than 100 words per minute without manual text entry or speech-to-text transcription.
    3. Their work demonstrates a method of quickly “reading” whole words and phrases from the brain — getting Facebook slightly closer to its dream of a noninvasive thought-typing system.
    1. SZ: We are not users. I say we are bound in new psychological, social, political, as well as, economic interests. That we have not yet invented the words to describe the ways that we are bound. We have not yet invented the forms of collective action to express the interests that bind us.  And that that is a big part of the work that must follow in this year and the next year and the year after that, if we are to ultimately interrupt and outlaw what I view as a pernicious rogue capitalism that has no business dominating our society.
    1. “We are a nation with a tradition of reining in monopolies, no matter how well-intentioned the leaders of these companies may be.” Mr. Hughes went on to describe the power held by Facebook and its leader Mr. Zuckerberg, his former college roommate, as “unprecedented.” He added, “It is time to break up Facebook.”
    1. AWS announced the general availability of Amazon Personalize, a fully-managed machine learning service that trains, tunes, and deploys custom, private machine learning models.

      Is this more commoditisation of human experience so that Jeff Bezos can get even richer?

    2. The number of Alexa-compatible smart home devices continues to grow, with more than 60,000 smart home products from over 7,400 unique brands
    3. Amazon introduced the all-new Echo Show 5
    1. According to Shoshana Zuboff, professor emerita at Harvard Business School, the Cambridge Analytica scandal was a landmark moment, because it revealed a micro version “of the larger phenomenon that is surveillance capitalism”. Zuboff is responsible for formulating the concept of surveillance capitalism, and published a magisterial, indispensable book with that title soon after the scandal broke. In the book, Zuboff creates a framework and a language for understanding this new world. She believes The Great Hack is an important landmark in terms of public understanding, and that Noujaim and Amer capture “what living under the conditions of surveillance capitalism means. That every action is being repurposed as raw material for behavioural data. And that these data are being lifted from our lives in ways that are systematically engineered to be invisible. And therefore we can never resist.”

      Shoshana Zuboff's comments on The Great Hack.

    1. Comparison between web browsers

      This is one of the best resources on web privacy I've ever seen. I warmly recommend it!

    1. Two years ago, when he moved from Boston to London, he had to register with a general practitioner. The doctor’s office gave him a form to sign saying that his medical data would be shared with other hospitals he might go to, and with a system that might distribute his information to universities, private companies and other government departments. The form added that although the data are anonymized, “there are those who believe a person can be identified through this information.” “That was really scary,” Dr. de Montjoye said. “We are at a point where we know a risk exists and count on people saying they don’t care about privacy. It’s insane.”
    2. Scientists at Imperial College London and Université Catholique de Louvain, in Belgium, reported in the journal Nature Communications that they had devised a computer algorithm that can identify 99.98 percent of Americans from almost any available data set with as few as 15 attributes, such as gender, ZIP code or marital status.

      This goes to show that one should not trust companies and organisations which claim to "anonymise" your data.
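
      For intuition about why so few attributes suffice, here is a toy counting exercise in Python (not the statistical model used in the paper, and the records are made up): it measures how quickly combinations of quasi-identifiers become unique once names are removed.

      ```python
      # Toy illustration (not the paper's method): count how often a combination
      # of quasi-identifiers is unique in an "anonymized" dataset. A unique
      # combination can be matched against any other dataset holding the same
      # few attributes, re-identifying the person.
      from collections import Counter
      from itertools import combinations

      # Hypothetical records: names removed, quasi-identifiers kept.
      records = [
          {"zip": "10001", "gender": "F", "marital": "single",  "birth_year": 1984},
          {"zip": "10001", "gender": "F", "marital": "married", "birth_year": 1984},
          {"zip": "10001", "gender": "M", "marital": "single",  "birth_year": 1990},
          {"zip": "94107", "gender": "F", "marital": "single",  "birth_year": 1984},
      ]

      def fraction_unique(rows, attrs):
          """Fraction of rows whose attribute combination appears exactly once."""
          counts = Counter(tuple(r[a] for a in attrs) for r in rows)
          return sum(counts[tuple(r[a] for a in attrs)] == 1 for r in rows) / len(rows)

      for k in (1, 2, 3, 4):
          for attrs in combinations(("zip", "gender", "marital", "birth_year"), k):
              print(attrs, fraction_unique(records, attrs))
      # Uniqueness climbs toward 1.0 as attributes are added; the paper reports
      # 99.98% of Americans re-identifiable from 15 attributes.
      ```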

    1. the question we should really be discussing is “How many years should Mark Zuckerberg and Sheryl Sandberg ultimately serve in prison?”
    2. “those who were responsible for ensuring the accuracy ‘did not give a shit.’” Another individual, “a former Operations Contractor with Facebook, stated that Facebook was not concerned with stopping duplicate or fake accounts.”
    1. In contrast to such pseudonymous social networking, Facebook is notable for its longstanding emphasis on real identities and social connections.

      Lack of anonymity also increases Facebook's ability to properly link shadow profiles purchased from other data brokers.

    2. Within this larger context, Facebook, Google (YouTube, Google+, Blogger), and Twitter have grown from small projects mocked up on sketchbooks and developed in college dorms to global networks of billions, garnering attention from venture capitalists who invested in pursuit of growth in revenues and profits and ultimately public offerings of stock. Facebook, Google, and Twitter are thus articulated into a particular political economy of the Internet, one dependent on surveillance of user activities, the construction of user data profiles, and the sale of user attention to an increasingly sophisticated Internet marketing industry (Langlois, McKelvey, Elmer, & Werbin, 2009).
    1. What should lawmakers do? First, interrupt and outlaw surveillance capitalism’s data supplies and revenue flows. This means, at the front end, outlawing the secret theft of private experience. At the back end, we can disrupt revenues by outlawing markets that trade in human futures knowing that their imperatives are fundamentally anti-democratic. We already outlaw markets that traffic in slavery or human organs. Second, research over the past decade suggests that when “users” are informed of surveillance capitalism’s backstage operations, they want protection, and they want alternatives. We need laws and regulation designed to advantage companies that want to break with surveillance capitalism. Competitors that align themselves with the actual needs of people and the norms of a market democracy are likely to attract just about every person on Earth as their customer. Third, lawmakers will need to support new forms of collective action, just as nearly a century ago workers won legal protection for their rights to organise, to bargain collectively and to strike. Lawmakers need citizen support, and citizens need the leadership of their elected officials.

      Shoshana Zuboff's answer to surveillance capitalism

  12. Jun 2019
  13. May 2019
    1. They’ve learned, and that’s more dangerous than caring, because that means they’re rationally pricing these harms. The day that 20% of consumers put a price tag on privacy, freemium is over and privacy is back.

      Google want you to say yes, not because they're inviting positivity more than ever, but because they want you to purchase things and make them richer. This is the essence of capitalism.

  14. Apr 2019
    1. The report also noted a 27 percent increase in the number of foreigners whose communications were targeted by the NSA during the year. In total, an estimated 164,770 foreign individuals or groups were targeted with search terms used by the NSA to monitor their communications, up from 129,080 on the year prior.
    2. The data, published Tuesday by the Office of the Director of National Intelligence (ODNI), revealed a 28 percent rise in the number of targeted search terms used to query massive databases of collected Americans’ communications.
    1. we get some of it by collecting data about your interactions, use and experiences with our products. The data we collect depends on the context of your interactions with Microsoft and the choices that you make, including your privacy settings and the products and features that you use. We also obtain data about you from third parties.
    1. drivers delivering Amazon packages have reported feeling so pressured that they speed through neighborhoods, blow by stop signs, and pee in bottles in the trucks or outside
    2. Amazon's system tracks a metric called "time off task," meaning how much time workers pause or take breaks, The Verge reported. It has been previously reported that some workers feel so pressured that they don't take bathroom breaks.
    3. Amazon employs a system that not only tracks warehouse workers' productivity but also can automatically fire them for failing to meet expectations.

      The bots now fire humans. AI 2.0.

    1. Facebook said on Wednesday that it expected to be fined up to $5 billion by the Federal Trade Commission for privacy violations. The penalty would be a record by the agency against a technology company and a sign that the United States was willing to punish big tech companies.

      This is where surveillance capitalism brings you.

      Sure, five billion American Dollars won't make much of a difference to Facebook, but it's notable.

    1. So far, according to the Times and other outlets, this technique is being used by the FBI and police departments in Arizona, North Carolina, California, Florida, Minnesota, Maine, and Washington, although there may be other agencies using it across the country.
    2. In a new article, the New York Times details a little-known technique increasingly used by law enforcement to figure out everyone who might have been within certain geographic areas during specific time periods in the past. The technique relies on detailed location data collected by Google from most Android devices as well as iPhones and iPads that have Google Maps and other apps installed. This data resides in a Google-maintained database called “Sensorvault,” and because Google stores this data indefinitely, Sensorvault “includes detailed location records involving at least hundreds of millions of devices worldwide and dating back nearly a decade.”

      Google is passing on location data to law enforcement without letting users know.

    1. Google says that will prevent the company from remembering where you’ve been. Google’s support page on the subject states: “You can turn off Location History at any time. With Location History off, the places you go are no longer stored.” That isn’t true. Even with Location History paused, some Google apps automatically store time-stamped location data without asking. (It’s possible, although laborious, to delete it.)
    1. Per a Wednesday report in Business Insider, Facebook has now said that it automatically extracted contact lists from around 1.5 million email accounts it was given access to via this method without ever actually asking for their permission. Again, this is exactly the type of thing one would expect to see in a phishing attack.

      Facebook are worse than Nixon, when he said "I'm not a crook".

    1. We may share information as discussed below, but we won’t sell it to advertisers or other third parties.

      Notice Dropbox states up front that it does not sell your info to advertisers and third parties. This line is crucial to your data privacy.

    1. (iii) Information we collect from other sources: From time to time, we may obtain information about you or your Contacts from third-party sources, such as public databases, social media platforms, third-party data providers and our joint marketing partners. We take steps to ensure that such third parties are legally or contractually permitted to disclose such information to us.

      So while this is a free site, they can mine your data including your social media account. All of this in the name of providing you better service.

    1. “In contrast to Dr. Wood’s claims, bias found in one system is cause for concern in the other, particularly in use cases that could severely impact people’s lives, such as law enforcement applications,” they wrote.

      This is more important than most people probably realise. Once these systems are deployed at substantial scale, which isn't far away, recognition bias will decide whether a person lives or dies.

    1. The highlight of today’s announcements is the beta launch of the company’s AI Platform. The idea here is to offer developers and data scientists an end-to-end service for building, testing and deploying their own models.
    1. AMP is a set of rules that publishers (typically news and analysis content providers) must abide by in order to appear in the “Top Stories” section of Google’s search results, a lucrative position at the top of the page.

      This is just one of many reasons for not using Google's search engine. Or most of their products.

      Monotheistic and, more importantly, monopolistic thinking like this drags us all down.

    1. Amazon.com Inc. is positioning Alexa, its artificial-intelligence assistant, to track consumers’ prescriptions and relay personal health information, in a bid to insert the technology into everyday health care.

      Surveillance capitalism, anyone?

    1. “They are morally bankrupt pathological liars who enable genocide (Myanmar), facilitate foreign undermining of democratic institutions. “[They] allow the live streaming of suicides, rapes, and murders, continue to host and publish the mosque attack video, allow advertisers to target ‘Jew haters’ and other hateful market segments, and refuse to accept any responsibility for any content or harm. “They #dontgiveazuck” wrote Edwards.

      Well, I don't think he should have deleted his tweets.

    1. Amazon’s technology struggles more than some peers’ to identify the gender of individuals with darker skin, prompting fears of unjust arrests. Amazon has defended its work and said all users must follow the law.

      Draw any parallel to "The Handmaid's Tale" and you're right.

    2. U.S. securities regulators shot down attempts by Amazon.com Inc to stop its investors from considering two shareholder proposals about the company’s controversial sale of a facial recognition service, a sign of growing scrutiny of the technology.

      Surveillance capitalism at its worst; this behemoth tries to prevent the people who own it from making decisions.

      Capitalism is like Skynet, an organism that's taken flight on its own, bound to make solipsistic and egoistic judgments and choices.

    1. Digital sociology needs more big theory as well as testable theory.

      I can't help but think here about the application of digital technology to large bodies of literature in the creation of the field of corpus linguistics.

      If traditional sociology means anything, then a digital incarnation of it should create physical and trackable means that can potentially be more easily studied as a result. Just the same way that Mark Dredze has been able to look at Twitter data to analyze public health data like influenza, we should be able to more easily quantify sociological phenomena in aggregate by looking at larger and richer data sets of online interactions.

      There's also likely some value in studying the quantities of digital exhaust that companies like Google, Amazon, Facebook, etc. are using for surveillance capitalism.

    1. “Prison labor” is usually associated with physical work, but inmates at two prisons in Finland are doing a new type of labor: classifying data to train artificial intelligence algorithms for a startup. Though the startup in question, Vainu, sees the partnership as a kind of prison reform that teaches valuable skills, other experts say it plays into the exploitative economics of prisoners being required to work for very low wages.

      Naturally, this is exploitative; the inmates do not learn a skill that they can take out into the real world.

      I'd be surprised if they'd not have to sign an NDA for this.

  15. Mar 2019
    1. If you do not like the price you’re being offered when you shop, do not take it personally: many of the prices we see online are being set by algorithms that respond to demand and may also try to guess your personal willingness to pay. What’s next? A logical next step is that computers will start conspiring against us. That may sound paranoid, but a new study by four economists at the University of Bologna shows how this can happen.
    1. Mention McDonald’s to someone today, and they're more likely to think about Big Mac than Big Data. But that could soon change: The fast-food giant has embraced machine learning, in a fittingly super-sized way. McDonald’s is set to announce that it has reached an agreement to acquire Dynamic Yield, a startup based in Tel Aviv that provides retailers with algorithmically driven "decision logic" technology. When you add an item to an online shopping cart, it’s the tech that nudges you about what other customers bought as well. Dynamic Yield reportedly had been recently valued in the hundreds of millions of dollars; people familiar with the details of the McDonald’s offer put it at over $300 million. That would make it the company's largest purchase since it acquired Boston Market in 1999.

      McDonald's are getting into machine learning. Beware.

    1. As one of 13 million officially designated “discredited individuals,” or laolai in Chinese, 47-year-old Kong is banned from spending on “luxuries,” whose definition includes air travel and fast trains.
    2. Discredited individuals have been barred from taking a total of 17.5 million flights and 5.5 million high-speed train trips as of the end of 2018, according to the latest annual report by the National Public Credit Information Center. The list of “discredited individuals” was introduced in 2013, months before the State Council unveiled a plan in 2014 to build a social credit system by 2020.

      This is what surveillance capitalism brings. This is due to what is called China's "Golden Shield", a credit-scoring system that, for example, lowers your credit level if you search for terms such as "Tiananmen Square Protest" or post "challenging" pictures on Facebook.

      This is surveillance capitalism at its worst, creating a new lower class for the likes of Google, Facebook, Microsoft, Amazon, and insurance companies. Keep the rabble away, as it were.

    1. Amazon has been beta testing the ads on Apple Inc.’s iOS platform for several months, according to people familiar with the plan. A similar product for Google’s Android platform is planned for later this year, said the people, who asked not to be identified because they’re not authorized to share the information publicly.

      Sounds like one of the best reasons I've ever heard to run Brave Browser both on desktop and mobile. https://brave.com/

    1. Sharing of user data is routine, yet far from transparent. Clinicians should be conscious of privacy risks in their own use of apps and, when recommending apps, explain the potential for loss of privacy as part of informed consent. Privacy regulation should emphasise the accountabilities of those who control and process user data. Developers should disclose all data sharing practices and allow users to choose precisely what data are shared and with whom.

      Horrific conclusion, which clearly states that "sharing of user data is routine" where the medical profession is concerned.

    2. To investigate whether and how user data are shared by top rated medicines related mobile applications (apps) and to characterise privacy risks to app users, both clinicians and consumers.

      "24 of 821 apps identified by an app store crawling program. Included apps pertained to medicines information, dispensing, administration, prescribing, or use, and were interactive."

    1. While employees were up in arms because of Google’s “Dragonfly” censored search engine with China and its Project Maven’s drone surveillance program with DARPA, there exist very few mechanisms to stop these initiatives from taking flight without proper oversight. The tech community argues they are different than Big Pharma or Banking. Regulating them would strangle the internet.

      This is an old maxim with corporations, Google, Facebook, and Microsoft alike: if they can't simply do what they want because of, well, greed (laws be damned), then "evolution" is being hampered.

      Evolution of their wallets, yes.

    2. Amy Webb, Author of  “The Big Nine: How the Tech Titans and their Thinking Machines could Warp Humanity” refers not only to G-MAFIA but also BAT (the consortium that has led the charge in the highly controversial Social Credit system to create a trust value among its Chinese citizens). She writes: We stop assuming that the G-MAFIA (Google, Apple, Facebook, IBM, and Amazon) can serve its DC and Wall Street masters equally and that the free markets and our entrepreneurial spirit will produce the best possible outcomes for AI and humanity

      This is discussed by Shoshana Zuboff in her masterfully written "The Age of Surveillance Capitalism".

    1. A speech-detecting accelerometer recognizes when you’re speaking and works with a pair of beamforming microphones to filter out external noise and focus on the sound of your voice.

      I'll translate this for you: "This enables Apple to constantly listen to you, record your behaviour, and sell your behaviour data."

    1. we don’t want to fund teachers and manageable class sizes, so we outsource the plagiarism problem to a for-profit company that has a side gig of promoting the importance of the problem it promises to solve.

      Yet another example of a misdirected "solution" to a manufactured problem that ends up being more costly - in terms of monetary expense AND student learning AND faculty engagement - than it would have been to invest in human interaction and learner-centered pedagogies.

  16. Feb 2019
    1. It is no longer enough to automate information flows about us; the goal now is to automate us. These processes are meticulously designed to produce ignorance by circumventing individual awareness and thus eliminate any possibility of self-determination. As one data scientist explained to me, “We can engineer the context around a particular behaviour and force change that way… We are learning how to write the music, and then we let the music make them dance.”
    2. Larry Page grasped that human experience could be Google’s virgin wood, that it could be extracted at no extra cost online and at very low cost out in the real world. For today’s owners of surveillance capital the experiential realities of bodies, thoughts and feelings are as virgin and blameless as nature’s once-plentiful meadows, rivers, oceans and forests before they fell to the market dynamic. We have no formal control over these processes because we are not essential to the new market action. Instead we are exiles from our own behaviour, denied access to or control over knowledge derived from its dispossession by others for others. Knowledge, authority and power rest with surveillance capital, for which we are merely “human natural resources”. We are the native peoples now whose claims to self-determination have vanished from the maps of our own experience.
    3. The combination of state surveillance and its capitalist counterpart means that digital technology is separating the citizens in all societies into two groups: the watchers (invisible, unknown and unaccountable) and the watched. This has profound consequences for democracy because asymmetry of knowledge translates into asymmetries of power.
    1. No one is forced on Twitter, naturally, but if you aren’t on Twitter, then your audience is (probably) smaller, while if you are on Twitter, they can steal your privacy, which I deeply resent. This is a big dilemma to me. Beyond that, I simply don’t think anybody should have as much power as the social media giants have over us today. I think it’s increasingly politically important to decentralize social media.

      This is an important point! And nothing puts a finer point on it than Shoshana Zuboff's recent book on surveillance capitalism.

  17. Jan 2019
    1. Turnitin’s practices have been ruled as fair use in federal court. But to Morris and Stommel, the ceding of control of students' work -- and their ownership over that work -- to a corporation is a moral issue, even if it's legally sound. Time spent on checking plagiarism reports is time that would be better spent teaching students how to become better writers in the first place, they argue. “This is ethical, activist work. While not exactly the Luddism of the 19th century, we must ask ourselves, when we’re choosing ed-tech tools, who profits and from what?” they wrote in the essay. “The gist: when you upload work to Turnitin, your property is, in no reasonable sense, your property. Every essay students submit -- representing hours, days or even years of work -- becomes part of the Turnitin database, which is then sold to universities.”

      This is a key issue for me - and we talked about this last week in GEDI when someone brought up the case of wide-scale cheating on the quiz/test that students took online.

      I'd like teachers to focus on teaching and helping students learn. And I think the question about who profits and who benefits from ed-tech tools like TurnitIn need to be asked.

  18. Nov 2018
    1. The Chinese place a higher value on community good versus individual rights, so most feel that, if social credit will bring a safer, more secure, more stable society, then bring it on
  19. Oct 2018
    1. The idea that researchers can, and should, quantify something as slippery as “engagement” is a red flag for many of the experts I talked to. As Alper put it, “anyone who has spent time in any kind of classroom will know that attention isn’t something well-measured by the face. The body as a whole provides many more cues.”
    2. The NYCLU found nothing in the documents outlining policies for accessing data collected by the cameras, or what faces would be fed to the system in the first place. And based on emails acquired through the same FOIL request, the NYCLU noted, Lockport administrators appeared to have a poor grasp on how to manage access to internal servers, student files, and passwords for programs and email accounts. “The serious lack of familiarity with cybersecurity displayed in the email correspondence we received and complete absence of common sense redactions of sensitive private information speaks volumes about the district’s lack of preparation to safely store and collect biometric data on the students, parents and teachers who pass through its schools every day,” an editor’s note to the NYCLU’s statement on the Lockport documents reads.
    3. A school using the platform installs a set of high-quality cameras, good enough to detect individual student faces, before determining exactly which biometrics it thinks must set off the system. Crucially, it’s up to each school to input these facial types, which it might source from local police and mug-shot databases, or school images of former students it doesn’t want on its premises. With those faces loaded, the Aegis system goes to work, scanning each face it sees and comparing it with the school’s database. If no match is found, the system throws that face away. If one is, Aegis sends an alert to the control room.
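
      The match-or-discard loop described here is easy to picture; below is a rough sketch using the open-source face_recognition library. This is not Aegis's actual code, and the file names and watchlist are made up.

      ```python
      # Rough sketch of a watchlist-matching loop, NOT Aegis's implementation.
      # Requires: pip install face_recognition
      import face_recognition

      # Faces the school has chosen to flag (e.g., sourced from mug-shot databases).
      # Assumes one clear face per watchlist image.
      watchlist_images = ["flagged_person_1.jpg", "flagged_person_2.jpg"]
      watchlist_encodings = [
          face_recognition.face_encodings(face_recognition.load_image_file(p))[0]
          for p in watchlist_images
      ]

      def scan_frame(frame_path):
          """Encode every face in a camera frame and alert on watchlist matches."""
          frame = face_recognition.load_image_file(frame_path)
          for encoding in face_recognition.face_encodings(frame):
              matches = face_recognition.compare_faces(watchlist_encodings, encoding)
              if any(matches):
                  print("ALERT: watchlist match in", frame_path)  # sent to control room
              # Non-matching faces are simply discarded, as the article describes.

      scan_frame("hallway_camera_frame.jpg")
      ```
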
    4. It might sound like dystopian science fiction, but this could be the not-too-distant future for schools across America and beyond. Researchers at the University of California, San Diego, for instance, have already begun publishing models for how to use facial recognition and machine learning to predict student engagement. A Seattle company recently offered up an open-source facial recognition system for use in schools, while startups are already selling “engagement detectors” to online learning courses in France and China. Advocates for these systems believe the technology will make for smarter students, better teachers, and safer schools. But not everyone is convinced this kind of surveillance apparatus belongs in the classroom, that these applications even work, or that they won’t unfairly target minority faces.
    1. The end game of a surveillance society, from the perspective of those being watched, is to be subjected to whims of black-boxed code extended to the navigation of spaces, which are systematically stripped of important social and cultural clues. The personalized surveillance tech, meanwhile, will not make people less racist; it will make them more comfortable and protected in their racism.
    2. What would it look like to be constantly coded as different in a hyper-surveilled society — one where there was large-scale deployment of surveillant technologies with persistent “digital epidermalization” writing identity on to every body within the scope of its gaze?
  20. Aug 2018
    1. But the entire business model — what the philosopher and business theorist Shoshana Zuboff calls “surveillance capitalism” — rests on untrammeled access to your personal data.

      Is Shoshana Zuboff the originator of surveillance capitalism?

      According to Wikipedia, no: surveillance capitalism is a term first introduced by John Bellamy Foster and Robert W. McChesney in Monthly Review in 2014 and later popularized by academic Shoshana Zuboff; it denotes a new genus of capitalism that monetizes data acquired through surveillance.

    1. Since the data is already being collected on a regular basis by ubiquitous private firms, it is thought to contain information that will increase opportunities for intelligence gathering and thereby security. This marks a shift from surveillance to ‘dataveillance’ (van Dijck 2014), where the impetus for data processing is no longer motivated by specific purposes or suspicions, but opportunistic discovery of anomalies that can be investigated. For crisis management this could mean benefits such as richer situation awareness, increased capacity for risk assessment, anticipation and prediction, as well as more agile response

      Dataveillance definition.

      The supposed benefits for crisis management don't correspond to the earlier criticisms about data quality, loss of contextualization, and predictive analytics accuracy.

      The following paragraph clears up some of the overly optimistic promises. Perhaps this section is simply overstated for rhetorical purposes.

    2. Although Snowden’s revelations shocked the world and prompted calls for a public debate on issues of privacy and transparency

      I understand the desire to use a topical hook to explain a complex topic but referring to the highly contentious Snowden scandal as a frame seems risky (alienating) and could potentially undermine an important argument about the surveillance state should new revelations emerge about his motives/credibility.

    3. While seemingly avoiding the traps of exerting top-down power over people the state does not yet have formal control over, and simultaneously providing support for self-determination and choice to empower individuals for self-sufficiency rather than defining them as vulnerable and passive recipients of top-down protection (Meier 2013), tying individual aid to mobile tracking puts refugees in a situation where their security is dependent upon individual choice and the private sector. Apart from disrupting traditional dynamics of responsibility for aid and protection, public–private sharing of intelligence brings new forms of dataveillance

      If the goal is to improve rapid/efficient response to those in need, is it necessarily only a dichotomy of top-down institutional action vs private sector/market-driven reaction? Surely, we can do better than this.

      Data/predictive analytics abuses by the private sector are legion.

      How does social construction vs technological determinism fit here? In what ways are the real traumas suffered by crisis-affected people not being taken into account during the response/relief/resiliency phases?

    4. However, with these big data collections, the focus becomes not the individual’s behaviour but social and economic insecurities, vulnerabilities and resilience in relation to the movement of such people. The shift acknowledges that what is surveilled is more complex than an individual person’s movements, communications and actions over time.

      The shift from INGO emergency response/logistics to state-sponsored, individualized resilience via the private sector seems profound here.

      There's also a subtle temporal element here of surveilling need and collecting data over time.

      Again, raises serious questions about the use of predictive analytics, data quality/classification, and PII ethics.

    5. Andrejevic and Gates (2014: 190) suggest that ‘the target becomes the hidden patterns in the data, rather than particular individuals or events’. National and local authorities are not seeking to monitor individuals and discipline their behaviour but to see how many people will reach the country and when, so that they can accommodate them, secure borders, and identify long-term social outlooks such as education, civil services, and impacts upon the host community (Pham et al. 2015).

      This seems like a terribly naive conclusion about mass data collection by the state.

      Also:

      "Yet even if capacities to analyse the haystack for needles more adequately were available, there would be questions about the quality of the haystack, and the meaning of analysis. For ‘Big Data is not self-explanatory’ (Bollier 2010: 13, in boyd and Crawford 2012). Neither is big data necessarily good data in terms of quality or relevance (Lesk 2013: 87) or complete data (boyd and Crawford 2012)."

    6. as boyd and Crawford argue, ‘without taking into account the sample of a data set, the size of the data set is meaningless’ (2012: 669). Furthermore, many techniques used by the state and corporations in big data analysis are based on probabilistic prediction which, some experts argue, is alien to, and even incomprehensible for, human reasoning (Heaven 2013). As Mayer-Schönberger stresses, we should be ‘less worried about privacy and more worried about the abuse of probabilistic prediction’ as these processes confront us with ‘profound ethical dilemmas’ (in Heaven 2013: 35).

      Primary problems to resolve regarding the use of "big data" in humanitarian contexts: dataset size/sampling, probabilistic prediction that is alien to human reasoning, and ethical abuses of PII.

    7. Second, this tracking and tracing of refugees has become a deeply ambiguous process in a world riven by political conflict, where ‘migration’ increasingly comes to be discussed in co-location with terrorism.

      Data collection process for refugees is underscored as threat surveillance, whether it is intended or not.

    8. Surveillance studies have tracked a shift from discipline to control (Deleuze 1992; Haggerty and Ericson 2000; Lyon 2014) exemplified by the shift from monitoring confined populations (through technologies such as the panopticon) to using new technologies to keep track of mobile populations.

      Design implication for ICT4D and ICT for humanitarian response: moving beyond controlled-environment surveillance to ubiquitous, omnipresent surveillance.

  21. Jul 2018
    1. Tega Brain and Sam Lavigne, two Brooklyn-based artists whose work explores the intersections of technology and society, have been hearing a lot of stories like mine. In June, they launched a website called New Organs, which collects first-hand accounts of these seemingly paranoiac moments. The website is comprised of a submission form that asks you to choose from a selection of experiences, like “my phone is eavesdropping on me” to “I see ads for things I dream about.” You’re then invited to write a few sentences outlining your experience and why you think it happened to you.
    1. This is one of the reasons why the masses might welcome surveillance. The fear of disease.

  22. Mar 2018
  23. Jan 2018
  24. virginia-eubanks.com
    1. On EBT cards at the dawn of the modern surveillance state.

      That's the moment when I realized that a lot of the most innovative, cutting-edge technologies in the United States are first tested on poor and working-class people.

      Virginia Eubanks, on PBS, discussing Automating Inequality

      https://www.youtube.com/watch?v=Avxm7JYjk8M&start=170&end=219

  25. Oct 2017
    1. The distinction between assessment and surveillance seems really blurry to me.

      on the fine line between assessment and surveillance

  26. Jul 2017
    1. For example, an observer or eavesdropper that conducts traffic analysis may be able to determine what type of traffic is present (real-time communications or bulk file transfers, for example) or which protocols are in use, even if the observed communications are encrypted or the communicants are unidentifiable. This kind of surveillance can adversely impact the individuals involved by causing them to become targets for further investigation or enforcement activities

      A good example of surveillance via traffic analysis.
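
      As a toy example of the point, an eavesdropper can guess the traffic type from packet sizes and timing alone, without decrypting anything. The thresholds below are illustrative assumptions, not a real classifier.

      ```python
      # Toy traffic-analysis heuristic: classify an encrypted flow from packet
      # metadata alone (sizes in bytes, inter-arrival gaps in seconds).
      # The thresholds are assumptions chosen for illustration only.
      from statistics import mean

      def classify_flow(packet_sizes, inter_arrival_gaps):
          avg_size = mean(packet_sizes)
          avg_gap = mean(inter_arrival_gaps)
          # Real-time audio/video tends to be many small packets at a steady cadence;
          # bulk file transfer tends to be large, back-to-back packets.
          if avg_size < 400 and avg_gap < 0.05:
              return "real-time communication (VoIP/video?)"
          if avg_size > 1000:
              return "bulk file transfer?"
          return "unclear"

      # The observer never decrypts anything, yet still learns the traffic type:
      print(classify_flow([160, 180, 172, 168], [0.02, 0.02, 0.02]))          # real-time
      print(classify_flow([1500, 1500, 1460, 1500], [0.001, 0.001, 0.001]))   # bulk
      ```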

  27. Mar 2017
    1. You can delete the data. You can limit its collection. You can restrict who sees it. You can inform students. You can encourage students to resist. Students have always resisted school surveillance.

      The first three of these can be tough for the individual faculty member to accomplish, but informing students and raising awareness around these issues can be done and is essential.

  28. Dec 2016
    1. Selling user data should be illegal. And the customer data a company is allowed to collect and store should be very limited.

      Under the guidance of Jared Kushner, a senior campaign advisor and son-in-law of President-Elect Trump, Parscale quietly began building his own list of Trump supporters. Trump’s revolutionary database, named Project Alamo, contains the identities of 220 million people in the United States, and approximately 4,000 to 5,000 individual data points about the online and offline life of each person. Funded entirely by the Trump campaign, this database is owned by Trump and continues to exist.

      Trump’s Project Alamo database was also fed vast quantities of external data, including voter registration records, gun ownership records, credit card purchase histories, and internet account identities. The Trump campaign purchased this data from certified Facebook marketing partners Experian PLC, Datalogix, Epsilon, and Acxiom Corporation. (Read here for instructions on how to remove your information from the databases of these consumer data brokers.)

    2. Trump's campaign used carefully targeted negative ads to suppress voter turnout.

      With Project Alamo as ammunition, the Trump digital operations team covertly executed a massive digital last-stand strategy using targeted Facebook ads to ‘discourage’ Hillary Clinton supporters from voting. The Trump campaign poured money and resources into political advertisements on Facebook, Instagram, the Facebook Audience Network, and Facebook data-broker partners.

      “We have three major voter suppression operations under way,” a senior Trump official explained to reporters from BusinessWeek. “They’re aimed at three groups Clinton needs to win overwhelmingly: idealistic white liberals, young women, and African Americans.”

    1. You should assume that a printer (and probably cameras, or just about any product) includes unique identifying data. With printers, it's encoded as nearly invisible yellow dots.

    1. Former members of the Senate Church Committee write to President Obama and Attorney General Loretta Lynch requesting leniency for Edward Snowden.

      https://www.brennancenter.org/sites/default/files/news/Snowden_memo.pdf

  29. Nov 2016
    1. Mike Pompeo is Trump's pick for CIA director. In January 2016, Pompeo advocated "re-establishing collection of all metadata, and combining it with publicly available financial and lifestyle information into a comprehensive, searchable database. Legal and bureaucratic impediments to surveillance should be removed" (At least they acknowledge that backdoors in US hardware and software would do little good.)

      Oh, cute. Pompeo made a name for himself during the Benghazi investigation.
      http://www.nytimes.com/2016/11/19/us/politics/donald-trump-mike-pompeo-cia.html

    1. EFF guide to attending protests, especially how to handle smartphones. (Part of the guide to surveillance self-defense.)

  30. Oct 2016
    1. Hemisphere isn’t a “partnership” but rather a product AT&T developed, marketed, and sold at a cost of millions of dollars per year to taxpayers. No warrant is required to make use of the company’s massive trove of data, according to AT&T documents, only a promise from law enforcement to not disclose Hemisphere if an investigation using it becomes public.

      ...

      Once AT&T provides a lead through Hemisphere, then investigators use routine police work, like getting a court order for a wiretap or following a suspect around, to provide the same evidence for the purpose of prosecution. This is known as “parallel construction.”

    1. If you’re white, you don’t usually need to worry about being monitored by the police.

      Interesting. That NYPD surveillance tower on Pitt Street. Will append a sanitized photo at some point. Sad, because so many children coming from the Masaryk and Brandeis communities cross Pitt on their way to school.

    1. Patrick G. Eddington of the Cato Institute says a report from the DoJ shows that Edward Snowden's actions were in the public interest, and President Obama should pardon him, or at least dismiss the charges.

  31. Sep 2016
    1. Proposed changes to "Rule 41" will make it too easy for government agents to get permission to hack remote computers. Petition Congress to prevent this.

    1. Even some of the world's largest companies live in constant "fear of Google"; sudden banishment from search results, YouTube, AdWords, Adsense, or a dozen other Alphabet-owned platforms can be devastating.
    1. In theory the editorial writers speak for the publisher. In practice the publisher does not routinely tell them what to say or even see their copy in advance. In this case I have on good authority that neither the publisher, Fred Ryan, nor the owner, Jeff Bezos, had any idea that this editorial was coming. I would be very surprised to learn that either of them agrees with the proposition that our principal stories on the NSA should not have been published. For sure I can tell you that this is not the position of the newsroom’s leadership or any reporter I know. Marty Baron, the executive editor, has said again and again how proud he is of the paper’s coverage of Ed Snowden and the NSA.

      -- Barton Gellman

  32. Jul 2016
    1. I could have easily chosen a different prepositional phrase. "Convivial Tools in an Age of Big Data.” Or “Convivial Tools in an Age of DRM.” Or “Convivial Tools in an Age of Venture-Funded Education Technology Startups.” Or “Convivial Tools in an Age of Doxxing and Trolls."

      The Others.

    2. demanded by education policies — for more data
    3. education technologies that are not built upon control and surveillance
  33. Jun 2016
  34. Feb 2016
    1. Patrick Ball—a data scientist and the director of research at the Human Rights Data Analysis Group—who has previously given expert testimony before war crimes tribunals, described the NSA's methods as "ridiculously optimistic" and "completely bullshit." A flaw in how the NSA trains SKYNET's machine learning algorithm to analyse cellular metadata, Ball told Ars, makes the results scientifically unsound.
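      One way to see how a metadata classifier run over an entire population can produce scientifically unsound results is the base-rate problem: with almost no confirmed positives to learn from, even a tiny false positive rate flags far more innocent people than real targets. The back-of-the-envelope sketch below uses purely hypothetical numbers (population size, rates, and target counts are assumptions, not figures from the article or from Ball).

      ```python
      # Illustrative base-rate arithmetic for a rare-event classifier.
      # All numbers are hypothetical.
      population = 55_000_000        # assumed monitored population
      true_targets = 100             # assumed number of actual targets
      false_positive_rate = 0.0001   # assumed: 0.01% of innocents mis-flagged
      true_positive_rate = 0.5       # assumed: half of real targets detected

      false_alarms = (population - true_targets) * false_positive_rate
      true_hits = true_targets * true_positive_rate
      precision = true_hits / (true_hits + false_alarms)

      print(f"innocent people flagged: {false_alarms:,.0f}")   # ~5,500
      print(f"actual targets flagged:  {true_hits:,.0f}")      # 50
      print(f"chance a flagged person is a real target: {precision:.2%}")  # ~0.9%
      ```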
    1. At some dark day in the future, when considered versus the Google Caliphate, the NSA may even come to be seen by some as the “public option.” “At least it is accountable in principle to some parliamentary limits,” they will say, “rather than merely stockholder avarice and flimsy user agreements.”

      In the last few years I've come to understand that my tolerance for most forms of surveillance should be considered in terms of my confidence in the judiciary.

  35. Jan 2016
    1. Vigilant Solutions, a surveillance technology company, is making shady deals with police departments in Texas. They lend the police equipment and database access. The police use it to spot people with outstanding warrants, whom they can stop and take payments from by credit card -- with a 25% processing fee tacked on for the tech company. The company also intends to keep all the license plate data collected by the police.

    1. “traffic analysis.” Its basis lies in observing all message traffic traveling on a network and discerning who’s communicating with whom, how much, and when.

      The strategy seems to be archive everything in case traffic analysis finds something worth going back and reading.

      Defense against traffic analysis:

      Messages from users must be padded to be uniform in size and combined into relatively large “batches,” then shuffled by some trustworthy means, with the resulting items of the randomly ordered output batch then distributed to their respective destinations. (Technically, decryption needs to be included in the shuffling.) This is the core idea of a mix network; see the sketch below.

      He then talks about limited anonymity and pairwise pseudonyms as ways of solving problems with complete anonymity versus public identification. There is an article in Wired about his proposed system.
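      A minimal sketch of a single mix node, assuming a toy pad/batch/shuffle pipeline (the decryption step is a stand-in; a real mix strips a layer of public-key encryption so inputs and outputs are bitwise unlinkable):

      ```python
      # Toy Chaum-style mix node: pad to a fixed size, collect a batch,
      # remove one encryption layer (stubbed here), and release the batch
      # in random order so arrival order can't be matched to departure order.
      import random

      BATCH_SIZE = 8        # illustrative; real mixes use much larger batches
      MESSAGE_SIZE = 1024   # every message leaving the mix has the same length
      _rng = random.SystemRandom()

      def pad(msg: bytes) -> bytes:
          if len(msg) > MESSAGE_SIZE:
              raise ValueError("message too large for the fixed-size slot")
          return msg + b"\x00" * (MESSAGE_SIZE - len(msg))

      def strip_layer(msg: bytes) -> bytes:
          # Stand-in for decrypting one onion layer with the mix's private key.
          return msg

      class MixNode:
          def __init__(self):
              self.pool = []  # (recipient, padded message) pairs awaiting release

          def accept(self, recipient: str, msg: bytes):
              self.pool.append((recipient, strip_layer(pad(msg))))

          def flush(self):
              """Release a full batch in random order; otherwise hold everything."""
              if len(self.pool) < BATCH_SIZE:
                  return []
              batch, self.pool = self.pool[:BATCH_SIZE], self.pool[BATCH_SIZE:]
              _rng.shuffle(batch)
              return batch
      ```

      Uniform sizes defeat size-based correlation, batching defeats timing correlation, and layered decryption defeats bit-pattern correlation; in a chain of mixes, anonymity holds as long as at least one node does not collude with the observer.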

    1. from Hawaii to Alabama to New Hampshire, a diverse, bipartisan coalition of state legislators will simultaneously announce state legislative proposals that, although varied, are all aimed at empowering their constituents to #TakeCTRL of their personal privacy. These bills would go far in ensuring students, employees, and everyone else has more of a say over who can know their whereabouts, track their activities online, and view information they share with friends.
  36. Dec 2015
    1. So you think mass surveillance isn't a problem, since you have nothing to hide? There are so many federal crimes that it is impossible to count them. If the government decides to focus on you, they can probably find a crime that fits your actions.

    1. Congress on Friday adopted a $1.15 trillion spending package that included a controversial cybersecurity measure that only passed because it was slipped into the US government's budget legislation. House Speaker Paul Ryan, a Republican of Wisconsin, inserted the Cybersecurity Information Sharing Act (CISA) into the Omnibus Appropriations Bill—which includes some $620 billion in tax breaks for business and low-income wage earners. Ryan's move was a bid to prevent lawmakers from putting a procedural hold on the CISA bill and block it from a vote. Because CISA was tucked into the government's overall spending package on Wednesday, it had to pass or the government likely would have had to cease operating next week.

      House: 316-113
      Senate: 65-33

      The Verge "This morning, Congress passed the Cybersecurity Information Sharing Act of 2015, attached as the 14th rider to an omnibus budget bill. The bill is expected to be signed into law by the president later today."

      Techdirt 15 Dec

      1. Allows data to be shared directly with the NSA and DOD, rather than first having to go through DHS.
      2. Removes restrictions on using the data for surveillance activities.
      3. Removes limitation on using the data for cybersecurity purposes, and allows it to be used for investigating other crimes -- making it likely that the DEA and others will abuse CISA.
      4. Removes the requirement to "scrub" the data of personal information unrelated to a cybersecurity threat before sharing the data.

      ACLU

    1. The Intercept has obtained a secret, internal U.S. government catalogue of dozens of cellphone surveillance devices used by the military and by intelligence agencies.

      Many of the devices in the catalogue, including the Stingrays and dirt boxes, are cell-site simulators, which operate by mimicking the towers of major telecom companies like Verizon, AT&T, and T-Mobile.

      Today nearly 60 law enforcement agencies in 23 states are known to possess a Stingray or some form of cell-site simulator, though experts believe that number likely underrepresents the real total. In some jurisdictions, police use cell-site simulators regularly. The Baltimore Police Department, for example, has used Stingrays more than 4,300 times since 2007.

      “The same grant programs that paid for local law enforcement agencies across the country to buy armored personnel carriers and drones have paid for Stingrays,” said Soghoian.

      Police cite the war on terror as their reason for purchasing surveillance equipment. But they use it for domestic cases, including minor ones.

      “The full extent of the secrecy surrounding cell-site simulators is completely unjustified and unlawful,” said EFF’s Lynch. “No police officer or detective should be allowed to withhold information from a court or criminal defendant about how the officer conducted an investigation.”

    1. Negotiated in secret and tucked in legislation thousands of pages long, Congress is about to pass an awful surveillance bill under the guise of “cybersecurity” that could open the door to the NSA acquiring much more private information of Americans.