63 Matching Annotations
  1. Apr 2023
    1. Why do you think this cause/issue is particularly important

      Just adding that I think this is particularly useful because it serves a reflective educational purpose as well.

    2. “What are the causes and issues that you believe are most important?”

      Just noting that the first three schools (NHS, TPS, MCHS) received the slightly different question, "What do you believe are the most important issues in the world?" This was changed to "What are the causes and issues that you believe are most important?" to avoid prompting wide moral circles.

    3. not worth the time cost

      Do you think we could identify volunteers who would want to do this via some kind of double-blind coding? It could be worth collecting the data and then making a post on the EA Forum to see if it could go anywhere.
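
      As a concrete sketch of what the double-blind coding check could look like: two volunteers code the same responses independently, and we compute chance-corrected agreement between them. This is a minimal sketch; the category labels and data below are entirely hypothetical.

      ```python
      from collections import Counter

      def cohen_kappa(codes_a, codes_b):
          """Chance-corrected agreement (Cohen's kappa) between two independent coders."""
          assert len(codes_a) == len(codes_b)
          n = len(codes_a)
          observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
          freq_a, freq_b = Counter(codes_a), Counter(codes_b)
          labels = set(freq_a) | set(freq_b)
          expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
          return (observed - expected) / (1 - expected)

      # Hypothetical categories assigned to the same five student responses
      coder_1 = ["effectiveness", "empathy", "effectiveness", "other", "empathy"]
      coder_2 = ["effectiveness", "empathy", "other", "other", "empathy"]
      print(round(cohen_kappa(coder_1, coder_2), 2))  # 0.71 here; above ~0.6 is usually read as substantial agreement
      ```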

    4. at baseline

      It is possible that this was influenced by prompting from teachers (who knew about the charities before students submitted the survey), and I think it would be useful to collect control data to determine this.

    5. differential administration

      This is fairly common, as we want the event to be flexible so that schools with particular structures/limitations can run at least some version of the event.

    6. Student-level followup

      Is there a way to do this without collecting contact information from participating students? I am not sure if I understand this possibility.

    7. distinct future event

      Just to name a couple obstacles that would need to be addressed:

      - funding to organize an event distinct from a charity election
      - getting schools interested in doing an event distinct from a charity election

    8. Geo (observational)

      For reference, I am pasting below my initial thoughts on this via Slack.

      The long-term outcome that I believe is most valuable is action taken as a result of an increase in attitudes consistent with giving significantly and giving effectively. I agree that measuring the number of respondents from the EA survey likely to have participated in a charity election could be used to get at one type of action (getting involved in the EA community), but the event was designed to lead to other types of action (e.g. donating more and more effectively as an adult and generally incorporating principles of EA into decision making) without that necessarily happening through involvement with the EA survey/community. Overall, the program is primarily directed toward EA as the project (e.g. getting applied experience deciding where to give with the principle of cost-effectiveness), and I think the most appropriate long-term outcome would be in alignment with attitudes/values/actions consistent with that.

      (On the EAIF page, here are the goals that are most relevant: "Aim to build a global community of people who use principles of effective altruism as a core part of their decision-making process when deciding how they can have a positive impact on the world; Directly increase the number of people who are exposed to principles of effective altruism, or develop, refine or present such principles")

    9. causing (e.g.) students

      Even if the interest in EA sorts of groups had existed prior to the charity election, another possible path to impact is for the event to plant the seed / spark the beginning of a student group that wouldn't have gotten organized without it. We're still working on how to measurably do this given that we cannot collect contact information from students, though.

    10. set of similar comparison schools

      I agree that this would be useful and I want to do what I can within budget to make this work. For reference, the concerns that I had named via Slack a week or so ago are pasted below.

      Yeah, this seems possible but it is not immediately clear to me when it would be achievable. At least a few issues come to mind that would eventually need to get sorted out:

      - being comfortable telling schools they need to wait a year (based on randomization) when they could otherwise run the program
      - getting schools motivated to provide control surveys with no immediate benefit
      - working with the legal team to create separate language around school permissions in accordance with the new primary purpose of data collection (program evaluation rather than voting)
      - having enough success scaling such that there would be sufficient schools for the experimental and control groups
      - having sufficient budget to coordinate this with schools
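
      For concreteness, the randomization step itself would be straightforward; here is a minimal sketch (school names are placeholders, not real registrants):

      ```python
      import random

      # Hypothetical list of registered schools; names are placeholders
      schools = ["School A", "School B", "School C", "School D", "School E", "School F"]

      rng = random.Random(2023)  # fixed seed so the assignment is reproducible/auditable
      shuffled = schools[:]
      rng.shuffle(shuffled)

      half = len(shuffled) // 2
      treatment = shuffled[:half]   # run the charity election this year
      control = shuffled[half:]     # provide control surveys now, run the event next year

      print("Treatment:", sorted(treatment))
      print("Waitlist control:", sorted(control))
      ```

      The hard parts are the operational ones listed above, not the assignment mechanism.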

    11. The email list could be (e.g.) shared with the teachers only.

      Even if we do not have the email data, this would still pose legal issues. If we receive different legal advice that suggests otherwise I would certainly want to explore this, but we do not have the budget for further legal advice (we're currently doing what we can with pro bono) and I think it is unlikely we could get around this.

    12. should also be asked ‘Have you ever taken part in a Charity Election before?’

      Would this be just in the registration? This would be doable as long as we ensure the overall length is no longer than the current survey.

    13. ensure these are timed correctly

      The issue here is that the primary purpose of the surveys for schools is to serve as the registration form and ballot, and voting-related activities surround the surveys (e.g., student leaders campaign about the charities before voter registration and the election results are announced and discussed after students vote).

    14. all participating students are to fill out the registration and ballot forms (pre and post)

      This is made very clear in materials and teachers are given reminders about this a few days before implementation. In my experience in schools, implementation issues are very common; though I want to do everything possible to improve implementation, I do think that some of what we have encountered is par for the course. We can take another look at our systems and try to improve this, perhaps with some external review to offer new suggestions.

    15. search the web (e.g., https://www.charitynavigator.org/) to find a charity

      Unfortunately, I think this would pose implementation issues and potentially result in further discrepancies between registration and votes. If this question can be made more concise and self-contained (and we possibly cut another required question), I think something like this is doable.

    16. Asking

      This is also easily doable. However, I would lean against a sentence expectation due to student/teacher complaints about the length of the forms. We can discuss this further in the working document.

    17. ex-post impression

      After the first analysis of NHS, TPS, and MCHS (so still ex-post, but before the recent analysis with eight schools), I had also indicated concerns about this outcome in regard to whether it is capturing the core objectives of the program. It is discussed further in the annotations of the analysis chapter, but the main idea (in my view) is pasted and simplified below.

      • It's important to get around agreeability bias, but the causes/issues question doesn't capture the core objectives of the program.
      • Accordingly, there was a contradiction between the stated core objectives of the program in the preregistration (increase in "attitudes consistent with the giving significantly and giving effectively portions of the GWWC mission") and focusing on the causes/issues question (not measuring those attitudes) in order to get around agreeability bias.
    1. “what are the causes and issues that you believe are most important?”

      For reference, just noting again that this question was different for the first three schools: "What do you believe are the most important issues in the world?"

    2. indicate interest

      Note that the primary intention for adding this question a few years ago was to plant the seed for starting a student group or otherwise taking action (not to measure impact).

    3. ISMDA

      The registration form cannot be approximated as a pre-event survey (it was taken after students had already done the myth vs. fact activity, learned about the charities, and possibly begun the research process). This is discussed in comments elsewhere.

    4. NHS

      The school has run events each year since 2018 and has a particularly high number of student leaders (who receive some training in EA prior to submitting the registration form).

    5. Pre: top implementations

      By coincidence, the 'top implementations' were the first three schools to run events (NHS, TPS, MCHS). Also by coincidence, these were the three schools with the most participating students.

      This is notable because after these first schools ran events, we adjusted the way the question was phrased from "What do you believe are the most important issues in the world?" to "What are the causes and issues that you believe are most important?" to avoid prompting wide moral circles.

      It does not seem clear to me that this would influence change from the registration form to the ballot in any particular way, but it likely did influence the data, presumably by encouraging thinking about "world" issues on both the registration and ballot.

    6. further involvement

      This is pasted from a comment in the Preface but is also relevant here:

      "Even if the interest in EA sorts of groups had existed prior to the charity election, another possible path to impact is for the event to plant the seed / spark the beginning of a student group that wouldn't have gotten organized without it. We're still working on how to measurably do this given that we cannot collect contact information from students, though."

    7. participating in a student group related to animal advocacy

      I would be curious to see how this looks in comparison to schools with a different set of charities, including e.g. THL or an ACE fund.

    8. equal vote shares overall

      Though we obviously don't try to sway student votes in any particular direction, an approximately equal share of votes is a good outcome from my point of view. This is probably the most balanced distribution I've seen since 2018.

    9. survey

      Afterward, student leaders typically announce the election results to the school. For example, the lead teacher at SHS indicated the below after receiving the election results.

      "We will share these detailed results and insights (including the anonymous data you mention below) with the student leaders, and they will showcase them to the entire school in a presentation. Having this level of detail will help significantly in highlighting how the core values of Sage Hill School are very much in harmony with the core values of the individual students. This will without question strengthen our sense of community even further and encourage future participation in these events."

      This was in response to the 'election results' email format (the email for SHS, to which the lead teacher responded, is pasted below). The intention for this email is to inspire schools to share the results (e.g., at a school assembly) in a way that promotes effective giving more widely.

      "Dear SHS charity elections team,

      Thank you for running a charity election at Sage Hill School! Giving What We Can has reviewed your school's survey data, and we are excited to share the election results below.

      Election Results
      1st place: SCI Foundation -- $712 -- 185 votes
      2nd place: Clean Air Task Force -- $0 -- 100 votes
      3rd place: GiveDirectly -- $0 -- 71 votes
      [Attachment: Results, SHS.png]

      The winning charity in your school's election is the SCI Foundation, and the $712 gift can help protect 1,780 children from schistosomiasis. If student leaders are interested in reflecting on student takeaways from the event, they are welcome to view the anonymous data (password: GWWCSHS23). Students overwhelmingly indicated they "voted for a cause they believe in" and "thought critically about what makes a charity effective," and a selection of anonymous student takeaways are listed below.

      1. It made me want to be conscious about how I help others.
      2. I’ve opened my perspective to the lives others lead and how a few dollars can impact someone enormously.
      3. It’s made me think about the cost effectiveness of charities.
      4. It made me better understand how even a small amount of money can truly make a difference.
      5. This has reminded me that we must think about and help others, not just ourselves.
      6. It made me think more critically on how exactly I could use my money to help others.
      7. I will continue to donate to foundations for a good cause due to the charity elections. I’ve seen a glimpse of where donations go and how they impact others.
      8. I learned to think more specifically about how a charity helps people, and if a charity's money is really being used well.
      9. It made me think about getting the best way to help others and benefit the world.
      10. It makes me want to do more for the world.

      Additionally, there were several students who indicated an interest in the following opportunities:

      - Taking a leadership role in a charity election next year (48 students)
      - Participating in a student group related to world poverty (57 students)
      - Participating in a student group related to improving the long-term future (54 students)
      - Making a donation to one of the charities in the election (50 students)

      Thank you for making a difference for your school community, as well as those directly benefiting from the charities on the ballot! Please let us know if there is anything else that would be useful in bringing closure to the event.

      All our best,
      Charity Elections team
      Giving What We Can"
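
      (For reference, the per-child figure in the email above works out by simple division, a back-of-the-envelope check rather than an official cost-effectiveness estimate: $712 / 1,780 children ≈ $0.40 per child protected.)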

    10. Students

      This was typically preceded by student leaders campaigning for the charities. For example, we provide the posters below for student leaders to hang up in the school.

      https://drive.google.com/file/d/1c7IUV4N_GyufhZxwnLXMx1PnzJo0Nm7k/view?usp=sharing

      This handbook format provides a more complete sense of the process (note that some of the links within the handbook do not work unless it is downloaded as a PDF): https://drive.google.com/file/d/1YhP-2lXMkhxLfkwWpuMogV0V5K1T5plX/view?usp=sharing

    11. over the course of three days

      As noted in the previous comment, there is some variation in the actual implementation relative to the curriculum (staff capacity has been a consistent issue over the years and the flexibility for teachers makes it more likely they will be able to fit the event into their schedules).

    12. exceptions

      Yes, information about the two notable exceptions (IHS and TPS) is pasted below. Generally, schools vary quite a bit in terms of the days of the week and times of day when students register to vote and when they vote, as well as the amount of time between registration and voting. I suspect that some number of teachers skip portions of the curriculum, and my sense is that this varies based on the motivation of the lead teacher (with whom we are in contact) and the cohesiveness of the department running the event.

      IHS, from 'Info for Evaluation' document: "This event was run by a student leader who was not able to garner support from teaching staff due to lack of availability. Instead of implementing the full event, we provided him with a link to the voting ballot and he independently created a short video announcement. Only 36 students participated, and a voter registration form was not used. The primary intention here was to plant the seed for running the full event another time and data from the event are incomplete for purposes of program evaluation."

      TPS, pasted from email correspondence (Nov 28th): "Basically, we ran the elections as follows:

      1. Pre-registration form completed by all pupils in the morning before assembly.
      2. Assembly run by student leaders who explained about each charity and showed videos.
      3. 2 x 20 min lessons (supervised by a teacher)
         a. Students were given links to research the 3 charities on their own to help them make up their mind (this was done independently with no discussion as we didn’t have the staffing)
         b. Student leaders videoed instructions for the worksheet and pupils were split into discussion groups to fill in the worksheet. Student leaders circulated the classrooms to ask and answer questions.
         c. At the end of this they logged on to vote again"

      Note that I neglected to share this information from TPS in the preregistration. I think the following information via email was the only context provided prior to analysis (Nov 26th): "Just a quick follow-up to provide the datasets for the second school. I won't send any additional context until we have a better system for communicating that information, but in the meantime I wanted to mention that this school also ran the event last year. All of the data will be anonymous for each school."

    13. Outcome 1

      The visual below does not refer to Outcome 1. I know that you have gone over paid hours and am assuming this is just a section you haven't gotten to yet.

    14. climate change

      It will be interesting to see results after schools are exposed to a different combination of charities (especially THL or an ACE fund). One of the charities on the ballot this academic year has been the Clean Air Task Force.

    15. disease

      It probably is not worth the time at this stage of analysis, but I would be curious to see the data with cancer and disease as separate categories. It is possible that some students use the word "disease" to refer to schistosomiasis (as Unlimit Health is one of the charities on the ballot for the 2022-23 academic year).
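
      If it does become worth the time, the recode itself should be mechanical. A minimal pandas sketch, where the column name and category labels are assumptions rather than the actual analysis schema:

      ```python
      import pandas as pd

      # Hypothetical raw cause responses; "cause_raw" is an assumed column name
      df = pd.DataFrame({"cause_raw": ["cancer", "disease", "schistosomiasis", "climate change"]})

      # Keep cancer separate instead of folding it into a combined disease bucket
      recode = {
          "cancer": "cancer",
          "disease": "disease (non-cancer)",
          "schistosomiasis": "disease (non-cancer)",  # plausibly prompted by Unlimit Health
      }
      df["cause_cat"] = df["cause_raw"].map(recode).fillna("other")
      print(df["cause_cat"].value_counts())
      ```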

    16. Cause priority 1 (broad categorized)

      Can you help me understand how this table differs from the above table? Was it categorized to require more specificity (i.e. more falling into the 'Other' bucket)?

    17. outcomes of greatest interest

      Note there are lingering concerns regarding this outcome. For reference, here is a summary pasted from Slack.

      In the document with info for analysis (https://docs.google.com/document/d/1Frd_C9f2Eery9Sy1fMlRPGz-tm-0htM9X-mVgmVOutY/edit?usp=sharing), I had indicated concerns about this outcome (especially under "Question not capturing core objectives") that we should discuss when you are able to get to it. Here are my thoughts from the document, as concise as I can make them.

      - It's obviously important to get around agreeability bias, but the causes/issues question doesn't capture the core objectives of the program and is subject to other issues.
      - Accordingly, there was a contradiction between the stated core objectives of the program in the preregistration (increase in "attitudes consistent with the giving significantly and giving effectively portions of the GWWC mission") and focusing on the causes/issues question (not measuring those attitudes) in order to get around agreeability bias. This should certainly be incorporated into discussion in the report, though I regret that this was not identified earlier.
      - For reasons discussed in the document, my view is that control data (for the items measuring giving significantly + giving effectively) are needed to adequately evaluate the intended impact. We of course do not have control data at this stage of evaluation, but we somehow need to both (a) account for agreeability bias and (b) evaluate the core objectives of the program in the meantime. If we do not do (a), then the data have limited interpretability; if we do not do (b), then the intended impact of the program is not being evaluated. I don't know what to do about it, but it's an important area of discussion.

    18. Moderate attrition

      Note that attrition is a hypothesis/assumption that cannot be tested since the collected data must be anonymous for legal reasons. Given the nature of the school setting and the voting context (distinct from other data collection settings), there are other plausible explanations for the discrepancy between votes and registrations.

    19. Note substantial attrition

      As noted in the previous annotation, attrition is a hypothesis/assumption that cannot be tested since the collected data must be anonymous for legal reasons, and there are other plausible explanations for the discrepancy between votes and registrations.

    20. TPS

      Note that TPS also used an adjusted curriculum (e.g. they removed the discussion portion of the curriculum due to lack of staff capacity).

      However, lack of staff capacity is expected in schools, and we plan to continue to offer sponsorship to schools with limited staffing. There is probably a difference in impact between schools that have strong vs. limited implementation adherence, but I would say that TPS would still be included in the range of schools that would be awarded sponsorship going forward. In other words, I would suggest that low implementation adherence reduces expected impact, but we have no plans to stop working with schools that have low implementation adherence due to lack of staff capacity.

    21. other responses

      Where in this report, if at all, would it be appropriate to discuss other potential measures of success that were not selected to evaluate the program (e.g. because they were not falsifiable)? For example, a teacher and two student leaders presented at the National Council for the Social Studies annual conference, which could influence how teachers discuss charitable giving in their classes.

    22. TPS, MCHS, and NHS

      Since TPS and NHS both ran the event in past year(s) and had more students, it might be expected that the effects are understated relative to the expected effect for schools running the event for the first time. Note that TPS used an adjusted curriculum (e.g. they removed the discussion portion of the curriculum due to lack of staff capacity).

    23. students who are less sympathetic to the program

      It also seems plausible that students who are less sympathetic to the program would be more motivated to vote than to register to vote. However, it is difficult to get a sense for whether this effect is happening, especially if it might be happening in conjunction with other effects.

    24. concern

      This could also potentially be explained by a lower expected effect due to TPS having run an event last year or TPS having used an adjusted curriculum (e.g. they removed the discussion portion of the curriculum due to lack of staff capacity).

    25. Maybe at other HS too?

      Yes, it is a standard part of the event for student leaders. For the sake of having a meaningful pre-post comparison, though, the impact of the campaigns is assumed not to be significant. NHS just tends to emphasize the campaigns more than other schools, as the event is run through an AP Government class.

    26. ‘pre’ survey was not administered to everyone

      Just noting that as recently discussed, data from schools indicated other explanations for differences between registrations and votes that seem to be more reasonable (clearly more reasonable for NHS and plausibly/possibly for other schools as well). In the document with info for analysis (https://docs.google.com/document/d/1Frd_C9f2Eery9Sy1fMlRPGz-tm-0htM9X-mVgmVOutY/edit?usp=sharing), they are primarily discussed in the "notable considerations" section of the table for NHS and under the question, "To what extent is it an issue when there is not an identical number of registrations and votes?"

    27. IHS excluded

      ISMDA also should be excluded, as the voter registration form was not a pre-event survey for any students at ISMDA (see below from working doc). Since RHMA was a pilot with 5th grade students and the event is targeted to high schools, I would lean against including RHMA in the overall analysis as well. As noted in the comment above, the "sample" data shouldn't be included as it just originated from trying to encourage a subset of students to vote at NHS.

      "All of the resources, including the questions on the voter registration form and voting ballot, were translated into Italian. The forms were translated based on the same English formats, by a teacher who is involved with the translators team of Italy's EA community. Due to the number of documents exchanged to complete the translations, the teacher had trouble finding the voter registration form during the event. Students did not register to vote prior to learning about or discussing the charities, and the voter registration form cannot be considered a pre-event survey. It seems to have been closer to a mid-event survey, but I do not have an exact sense of this. A few resources have resulted from this event (e.g., this video presentation)."

    1. Generally, the results and the differences are similar across schools, with TPS showing somewhat lower incidence of the top ratings. Note that the attrition rate for TPS is particularly low (under 2%), and it also shows the smallest differences in the key measures. This loosely suggests that the differential attrition bias might indeed be a concern.

      Pasting from email for ease of reference...

      Also, I wanted to flag a couple aspects of the report (pasted below) that mentioned attrition. As discussed recently, data from schools indicated other explanations for differences between registrations and votes that seem to be more reasonable (clearly more reasonable for NHS and plausibly/possibly for other schools as well). In the document with info for analysis (https://docs.google.com/document/d/1Frd_C9f2Eery9Sy1fMlRPGz-tm-0htM9X-mVgmVOutY/edit?usp=sharing), they are primarily discussed in the "notable considerations" section of the table for NHS and under the question, "To what extent is it an issue when there is not an identical number of registrations and votes?"

      On a separate note, I am curious about differences in results between schools running events for the first time (MCHS, IHS, MHS, ISMDA, SHS, RHMA) and schools that had already run an event or events in past years (NHS, TPS). Among the schools planning or thinking of running an event, three have not previously run an event (CAS, HS, GSAL) and three already have (RHS, SEK, WPS). This probably wouldn't fit into the preliminary analysis but I think this could be interesting to look into another time.

      "There is a small amount of attrition – a bit over 5% of the sample. As we are considering the simple before/after differences, this could bias our results. It seems plausible that students who are less sympathetic to the program (less altruistic, etc.) are more likely to ‘drop out’. This would make the above (post - pre) differences overstate the true differences for the full sample, and overstate the impact of the program.4"

      "Generally, the results and the differences are similar across schools, with TPS showing somewhat lower incidence of the top ratings. Note that the attrition rate for TPS is particularly low (under 2%), and it also shows the smallest differences in the key measures. This loosely suggests that the differential attrition bias might indeed be a concern."

    2. There is a small amount of attrition – a bit over 5% of the sample. As we are considering the simple before/after differences, this could bias our results. It seems plausible that students who are less sympathetic to the program (less altruistic, etc.) are more likely to ‘drop out’. This would make the above (post - pre) differences overstate the true differences for the full sample, and overstate the impact of the program.4

      Pasting from email for ease of reference: this is the same passage quoted in the previous annotation. As discussed recently, data from schools indicated other explanations for differences between registrations and votes that seem to be more reasonable (clearly more reasonable for NHS and plausibly/possibly for other schools as well); see the "notable considerations" section of the table for NHS in the document with info for analysis (https://docs.google.com/document/d/1Frd_C9f2Eery9Sy1fMlRPGz-tm-0htM9X-mVgmVOutY/edit?usp=sharing) and the question, "To what extent is it an issue when there is not an identical number of registrations and votes?"