42 Matching Annotations
  1. Sep 2022
    1. He doubted most students knew they were signing up for long-term monitoring when they clicked to connect to the campus WiFi.

      Intentionally unclear and confusing terms of service

    2. The systems, he added, are isolating for students who don’t own smartphones,

      Impoverished: Not everyone has a smartphone. This creates yet another hurdle for impoverished people to clear in order to receive an education.

    3. He said he squandered several of his early lectures trying to convince the app he was present, toggling his settings in desperation as professors needled him to put the phone away. He then had to defend himself to campus staff members, who believed the data more than him.

      Tech fails...it happens. But when tech is regarded as infallible, and we rely so heavily on its records, then a failed tech record has consequences, and people are left without recourse.

    4. We know where you live.’ These days, it’s, ‘We know where you are,’ ” Purdue University president Mitch Daniels wrote last year about his school’s location-tracking software. “Isn’t technology wonderful?”

      It's clear that we all have different philosophies and inherent values. I, for one, do not think this sort of technology is wonderful.

    5. In Sasha’s case, Benz said, the university sent an adviser to knock on her door

      Sasha, an "at risk" student. Advisers knock on students' doors if their "at risk" score warrants it.

    6. It also generates a “risk score” for students based around factors such as how much time they spent in community centers or at the gym.

      Who is qualified to design such general parameters to assess an individual's "risk" score?

    7. “At every school, there are lots of Sashas,” he said. “And the bigger you are, the more Sashas that you have.”

      And, what is to be done about Sasha? Surely, Sasha feels even worse knowing that University systems are tracking her malaise.

    8. A student avoiding the cafeteria might suffer from food insecurity or an eating disorder; a student skipping class might be grievously depressed. The data isn’t conclusive, Benz said, but it can “shine a light on where people can investigate, so students don’t slip through the cracks.”

      Is it really the place of a university to build profiles of people's personal struggles and preemptively intervene?

    9. using systems that calculate personalized “risk scores” based on factors such as whether the student is going to the library enough.

      Using private data to create personality profiles of their students.

    10. Graduates will be well prepared … to embrace 24/7 government tracking and social credit systems

      University students become accustomed to this type of surveillance, and their values adapt

    11. — have also led some to worry whether anyone will truly know when all this surveillance has gone too far.

      To what end?

    12. a trade-off of future worries for the immediacy of convenience, comfort and ease. If a tracking system can make students be better, one college adviser said, isn’t that a good thing?

      Good?(ish): convenience, assistance, better performance. Bad: Loss of privacy, accountability, discipline

    13. The tracking systems, they worry, will infantilize students in the very place where they’re expected to grow into adults, further training them to see surveillance as a normal part of living, whether they like it or not.

      Bad: Students lose a sense of choice, which is an important avenue through which to grow.

    14. Dozens of schools now use such technology to monitor students’ academic performance, analyze their conduct or assess their mental health.

      Bad: Issues being that this sort of tracking is incredibly invasive, at best.

    15. They want those points,” he said. “They know I’m watching and acting on it. So, behaviorally, they change.

      Good: Easy way for professors to enforce attendance in large classes. Attendance is one of the key indicators of student performance.

    16. When Syracuse University freshmen walk into professor Jeff Rubin’s Introduction to Information Technologies class, seven small Bluetooth beacons hidden around the Grant Auditorium lecture hall connect with an app on their smartphones and boost their “attendance points.”

      Syracuse students. Issue: automatic attendance.

    1. ProctorU chief executive officer Scott McFarland said that “students should be reassured that all testing data is owned by their schools, not the proctoring provider.” “Schools set the rules about what data is collected, how it’s retained and for how long,” he said in an emailed statement.

      Highly skeptical; where there is access to data, there is an opportunity to profit. Where the opportunity to profit exists, claims of data security and data ownership are highly suspect.

    2. As the pandemic makes remote learning a long-term state of affairs for many schools, Mr. Swauger, in Denver, recommends that schools take some time to do their research and only roll out systems that are proven to work.
    3. “No thank you,” she said. “I’d rather my students not feel like they’re in a police state.
    4. Taking an exam in the comfort of your own home, on your own schedule, is less invasive

      Good: less invasive, more relaxed environment within which to test

    5. damage to the student-teacher relationship

      Bad: damage to student-teacher relationship

    7. And algorithms designed to detect suspicious movement will inevitably flag disabled students and others who do not move in the way the platforms expect, he added.

      Disabled students are at risk of being flagged for suspicious movement.

    8. disabled students

      Disabled students

    9. There’s a big gulf between what this technology promises, and what it actually does on the ground

      Expectation/promise vs. reality of performance

    10. Facial recognition systems – which some proctoring platforms use to confirm the identity of the test taker – are less accurate with dark-skinned people, 
    11. bias in facial recognition but also because of the potential for data collection.

      Data collection. Bad tradeoff: privacy for convenience.

    12. allows students to continue learning. “We believe that many lives have been positively impacted by being able to continue their education and careers

      Good: allows more people to get their education, particularly during events like the COVID-19 pandemic.

    13. “There are so many systematic barriers preventing people like me from obtaining these degrees – and this is just another example of that

      Online exams may exacerbate systemic barriers for people with dark skin.

    14. Mr. Khan began to suspect that it was his dark skin

      Those with dark skin tone

    15. law student Areeb Khan

      law student Areeb Khan

    1. We need a safe way to experiment with these technologies and understand the consequences of their use instead of just continuing a blind march towards surveillance for the purpose of profit-making,” Newman says. “These are sophisticated applications with lifelong consequences for the individuals who are analyzed by them, to ends as yet unknown. We all need to be really judicious and thoughtful here.”

      This is the key area to consider in order to improve moving forward.

    2. There is no promise to the user that their data won’t leave a specific device,” says Shmatikov. “We still don’t really know just how much data voice-skill hosts like Amazon—or third parties that rely on Amazon—are harvesting, or what they’re doing with that information.” Amazon didn’t respond to multiple requests for comment.

      Bad

    3. What if data harvested from students’ conversations affected their chances of getting a mortgage or a job later on? What if it were used against foreign students to have them deported, possibly to home countries where they could be imprisoned for their political views?

      Bad.

    4. “We use Amazon’s platform to make this work. Amazon stores information about usage that can be purged upon request.”

      We assume that Amazon adheres to our requests.

    5. access to financials, course schedules and grades, and outstanding fees via voice devices.

      This sort of information is very sensitive. While perhaps convenient to access, how secure is this data through these new technologies?

    6. “When it comes to deploying listening devices where sensitive conversations occur, we simply have no idea what long-term effect having conversations recorded and kept by Amazon might have on their futures—even, quite possibly, on their health and well-being,
    7. Amazon to advance voice-enabled technology on campuses, part of the tech giant’s Alexa Innovation Fellowship

      Giving one of the richest companies in the world even more access to people's data, while convenient for consumers, may have long-term consequences.

    8. Administrators at some of these schools told me they believe Alexa will bolster enrollment and reduce dropout rates

      Good tradeoff: it possibly improves student performance.

    9. Arizona State University, Lancaster University in the UK, and Ross University School of Medicine in Barbados have adopted voice-skill technology on campus. Some, including Northeastern University

      ASU, Lancaster in UK, RU School of Medicine, NU

    10. Each device was pre-programmed with answers to about 130 SLU-specific questions, ranging from library hours to the location of the registrar’s office (the school dubbed this “AskSLU”). The devices also included the basic voice “skills” available on other Dots, including alarms and reminders, general information, and the ability to stream music.

      Good tradeoff

    11. When Mateo Catano returned for his second year as an undergraduate at Saint Louis University

      Saint Louis University students (Mateo Catano)