7 Matching Annotations
    1. You might not care for, say, history, but understanding geopolitics and the history of a particular region may help you develop business strategies in breaking into new markets and accounting for local preferences. You might not care for, say, biology, but understanding the environmental impact of your products can help not only your business but also the world you and your future children will want to live in. And you might not care for, say, philosophy, but identifying the possible ethical concerns about your products or services in advance can help you avoid stepping into a legal or public-relations landmine, even if you don’t care about doing what’s right.

      While I understand that this excerpt is part of a broader conversation about the ethical and personal risks of AI, I believe it connects more closely to the topic of participatory culture.

      When we expect AI to explain and solve all our problems for us or educate us on important global topics, we may unknowingly avoid conversations or other forms of communication that would more naturally help us understand what we are so eager to learn. Participating in important political conversations with those in our community, for example, supported by books, websites, and other sources, allows us to engage in a process rather than expecting all the answers at our fingertips. I feel that isolating ourselves behind a computer screen, talking to ChatGPT, removes the human element of participatory culture. In the same sense, does everyone engaging with AI begin a new participatory culture of AI use and idea generation?

    2. And cheating can lead to guilt and other conscience-based penalties, which can add up to a real effect on your mindset, just as achievements can lead to greater confidence.

      This excerpt reminds me of the differences between growth and fixed mindsets (specifically in reference to Carol Dweck's [2006] explanation of them). As I am teaching grades 9 and 10 this year, I often find myself in conversations with students about when AI use is and is not appropriate for assessment and idea generation. Students who are upset about not being able to use AI in every situation are often disillusioned about their own skill set and abilities, and align more with a fixed mindset. On the other hand, students who are willing to attempt tasks based on their own knowledge and ability often show a growth mindset. Although using AI is perfectly fine in some scenarios, when students do not know when to step away, I find that this, too, impacts their mindset and confidence. In short, students do not necessarily need to submit work created by AI to feel guilt or shame; they can feel it before the work even begins.

    3. While outlines and bullet points are faster to read and make it easier to see how the discussion flows, stripping down an article to its bare bones can flatten the discussion too much, causing nuance or details to be lost and moving too quickly to be digested.

      In relation to this excerpt, I consider the fact that when we rely on condensed material to support claims and ideas, we also have less of a 'backbone' when considering our own allegiance to those same ideas. For example, if a student writing a paper on a book uses AI to find three specific details to blindly support a claim, they may be less able to defend their case outside of that specific paper. Sure, AI can write a well-rounded argument, but if the topic moves to an oral assessment, or the student is questioned about the reasoning behind their choices, they may be unable to respond. Although AI can be used to help support ideas that students are already thinking about (i.e., after they have read a chapter and identified key themes and ideas), it should not be used as a replacement for actual work. As the author says, you lose the nuance of your argument when you cannot identify your why.

    4. The university is a gym for your mind. For both body and mind, your abilities and skills atrophy or decline when they’re not used, like a dead limb, for efficiency and energy-savings.

      When relating this comment to my students and the conversations I have with my coworkers, I consider the possibility of never even starting before cognitive or skill decline occurs. When considering the development of higher-level thinking, I worry about students who have never experienced a world without AI. Some students, especially those who are not intrinsically motivated in a specific subject area (say, drama), never give the task at hand a chance before they turn to ChatGPT. In relation to the quote I have chosen, I wonder if using AI immediately is like saying you’re going to start going to the gym “next week” but never going. If so, skill atrophy never occurs because the skill was never developed in the first place.

    5. AI can help tackle hard problems facing the world, such as climate change, energy sustainability, diseases, and other serious challenges, even aging and death itself.

      When reading this section, I would like to consider the fact that AI does not immediately address these challenges, specifically those related to environmental sustainability. The widespread use of AI systems globally raises significant ecological concerns. The raw materials needed to build the physical systems, the energy consumed by data centers, and the water required to cool them have a direct impact on the world that we do not immediately see when we are typing away at our computers. Although I agree with the author’s point that the generative ideas AI can produce may help identify solutions and pathways to address climate change, energy sustainability, and other challenges, the systems themselves are not environmentally friendly. It is for this reason that we need to be cognisant of which AI tasks are necessary and which are not.

    6. For school, the benefits of using AI can include higher grades than you would otherwise earn on your own.

      I struggle with this section of the article and wonder what a grade is meant to measure. In a world where we are bombarded with talk of 'grade inflation', I wonder just how much AI is adding to this conversation. I also wonder what the value of a higher grade is when the content was not generated by the student. Am I giving credit to a student for copying and pasting ChatGPT output into their essay? That does not sit well with me. Although I think it is okay to assess a student's ability to use and maneuver AI tools and resources (especially when looking at the application category of a standard rubric), I cannot imagine grading AI output for the knowledge and understanding portion of a student’s grade. I also wonder if I am behind the times, and whether my own inability to think outside my grading procedures hinders me from imagining a world where AI and human intelligence (HI) can be seamlessly assessed.

    7. Yes, learning how to use AI might be important to your future, but we already know that it seriously disrupts education and learning when used inappropriately, by either the teacher or student.

      Reading this, I am drawn to compare the use of AI in education to the use of supplements in sports competition and the ethical implications of both.

      Sports & Supplementation... Although supplementation in sports can be a massive benefit for individual competitors, it then becomes challenging to objectively measure your 'pure' or 'raw' performance. As an example, powerlifters who enter an untested competition (i.e., one that does not test for steroid or PED use) are competing against individuals who may or may not have an unfair advantage over them (those using PEDs and those not using them share the same field). In this world, there is no objective test or measure of strength and ability.

      AI... Similarly, in education, students and teachers who have no explicit training in or guidelines for AI find themselves in tasks and situations where some individuals submit ideas or products that may be partially or wholly generated by AI. Again, these tasks may not objectively display knowledge, understanding, thinking, or application, and some individuals may not gain the skills the assignment was meant to develop.

      In short... Although there is no right or wrong in either of these situations, both worlds require individuals to have appropriate subject knowledge and application abilities to reach a desired end product. When a competitive powerlifter cannot reach desired lifts and switches to PEDs to achieve that goal, they take a shortcut that skips an important stage in the growth and development of their skills. Similarly, in education, when we skip to using AI instead of appropriately challenging skill development, we 'miss the mark'.