1,046 Matching Annotations
  1. Nov 2022
    1. - brms’ default and my own

      these both seem to allow negative values for sigma. These don't seem right -- aren't you supposed to do something that implies a strictly positive distribution, like letting the log of sigma be normally distributed?

      (I think they are only positive in the plot because you cut off the x axis)

      Maybe the brms procedure below fixes this in a mechanical way because it sees 'class = "sigma"' ... but I'm not sure how
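
      A hedged sketch of both possible fixes (d and height are placeholder names): brms declares sigma with a lower bound of zero in the Stan code it generates, so a normal prior given class = "sigma" is implicitly truncated at zero -- plausibly the "mechanical" fix wondered about above. Alternatively, you can model log(sigma) directly.

        library(brms)
        # Option 1: normal prior on sigma; implicitly truncated at zero
        # because Stan declares sigma with <lower=0>.
        p1 <- set_prior("normal(0, 10)", class = "sigma")
        # fit1 <- brm(height ~ 1, data = d, prior = p1)
        # Option 2: give sigma its own formula so it is modeled on the
        # log scale (the default link), keeping it strictly positive.
        # fit2 <- brm(bf(height ~ 1, sigma ~ 1), data = d)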

    2. we can simulate what we believe the data to be

      I wouldn't say 'believe the data to be'. We have the data (at least the sample). We are simulating what we believe the population this is drawn from looks like
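
      A minimal base-R sketch of that simulation (the prior values are placeholders, loosely in the spirit of the heights example): draw parameters from the priors, then simulate heights -- i.e., the population the priors imply, not "the data".

        set.seed(42)
        n <- 1e4
        mu      <- rnorm(n, 178, 20)     # assumed prior on mean height (cm)
        sigma   <- abs(rnorm(n, 0, 10))  # half-normal keeps sigma positive
        heights <- rnorm(n, mu, sigma)   # implied population of heights
        hist(heights, breaks = 50)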

    3. The sigma prior

      'prior over sigma' -- 'sigma prior' makes it sound like it's a sigma distribution (if that's a thing) rather than a distribution over sigma

    4. We’re not super uncertain about people’s heights, though, so let’s use a normal distribution.

      Uncertainty could also be expressed in terms of a greater sigma (standard deviation). So this isn't exactly about uncertainty, but about the shape of the distribution: the amount of tail uncertainty for a given level of uncertainty closer to the mean

    5. But this is the default prior. brms determines this automatic prior by peeking at the data, which is not what we want to do. Instead, we should create our own.

      But what is its justification for doing so? The people who designed it must have had something in mind.
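
      One way to see what brms has in mind, without fitting anything: get_prior() shows the automatic priors (d and height are placeholder names). If I recall correctly, the default sigma prior is a half Student-t whose scale is chosen from the spread of the outcome, which is where the peeking happens.

        library(brms)
        d <- data.frame(height = rnorm(100, 178, 10))  # placeholder data
        # Inspect brms's automatic priors without fitting the model
        get_prior(height ~ 1, data = d, family = gaussian())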

    6. we should start with defining our beliefs, rather than immediately jumping into running an analysis.

      Slight quibble: It doesn't have to be 'our own beliefs' but just 'what you think a reasonable starting belief would be', or 'what you think others would accept'.

      This relates to the discussion of 'epistemological assumption', I believe.

      It could also be a belief (distribution) based on prior data and evidence

    1. This is what “5 expected events” means! The most likely number of meteors is 5, the rate parameter of the distribution.

      I think that's the mode -- does it coincide with the mean here?
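
      A quick numeric check: for a Poisson with integer rate lambda = 5, the mean is exactly 5, while the mode is tied between 4 and 5.

        all.equal(dpois(4, 5), dpois(5, 5))  # TRUE: 4 and 5 equally likely
        which.max(dpois(0:20, 5)) - 1        # 4, the first of the tied modes
        sum((0:100) * dpois(0:100, 5))       # mean: 5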

    1. Attrition rate = % of fellows who do not complete the fellowship, assuming that completion of a fellowship is defined as attending at least 6 of 8 meetings

      I know they track this for the virtual fellowships

    2. This form will consist of a list of all Fellows who filled out the “How you heard about us” question. Each organizer will be prompted to label each fellow as one of the following:

      super important!

    3. maybe) Other variables which we might be interested in (Group age, # of organizers, existing group size, etc.).

      This seems important -- identifying 'outcomes' and tracking them

    4. Fellowship application, and regularly track fellowship attendance for every Fellow.

      Can we clearly define or link which 'fellowships' we mean here? Can people be in these groups without doing the fellowship?

  2. Oct 2022
    1. Apply the quadratic approximation to the globe tossing data with rethinking::map().

      Here they actually use the Rethinking package instead of brms. Why?
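
      For reference, the book's quadratic approximation to the globe-tossing data (6 waters in 9 tosses) looks roughly like this; map() is the older name for what current versions of rethinking call quap().

        library(rethinking)
        globe.qa <- map(
          alist(
            w ~ dbinom(9, p),  # 6 waters in 9 tosses
            p ~ dunif(0, 1)    # flat prior on the proportion of water
          ),
          data = list(w = 6)
        )
        precis(globe.qa)  # quadratic-approximation posterior for p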

  3. Sep 2022
    1. Why hasn’t such a movement for something like Guided Consumption happened already? Because Guiding Companies, by definition, generate profit for charities instead of traditional investors, a major issue they face is that Guiding Companies cannot access the same investment pool of private equity and angel investors for seed money. One solution to this would be to seek financing from philanthropists, particularly those who are looking to spend their money to advance the same cause area as the Guiding Company. However, the question remains: if Guided Consumption is a more effective means of funding charities than direct donation, why has this not been more fully explored already?   I suspect that the reason stems from a deep-seated psychological separation between the way that people think about the business world, essentially a rather competitive, dog-eat-dog mindset and the kinder, more magnanimous mindset involved in charity work. The notion also seems to violate intuitions about sacrifice involved in charitable contributions, although these intuitions do not hold with the deliberate substitution of traditional stakeholders for charities. I would also note that some further red-teaming can be found in the comments of the longer paper.

      These are good points.

    2. But even if Guiding Companies engage in activities that consumers take issue with regarding traditional firms, such as competitive (i.e., princely) compensation for CEOs, it is not clear why this would cause a consumer to choose a company that enriches shareholders over a company that helps fight global poverty.

      But the pressure not to do this might make the GCs less efficient and thus more expensive

    3. What if selfish motivations make for the best founders/investors/etc.? The efforts of philanthropic investors are cap

      I think this is a big issue and you are not taking it seriously enough. Without profit motives, it may be hard for these companies to stay efficient and, well, profitable. Who is 'holding the CEOs' feet to the fire'? At least the conventional wisdom is that altruistically motivated leaders are less hard-headed, less efficient, etc.

    4. the public

      I feel like this already exists enough with Newman's Own etc. I think we should try to focus on GH&D here and maybe some Global Catastrophic Risk prevention public goods. Animal causes: maybe, but only to some demos/products (like vegetarian stuff).

    5. Which Market Sectors?

      I also suggest market sectors where there is some reluctance/repugnance to buying the product or service. The charity aspect will allow some moral licensing. E.g., I forget which charity allowed people to donate in exchange for cutting in line at some festival.

    6. low-differentiation sectors, it may be easier to construct a “no-brainer” where a consumer is genuinely ambivalent as to two product

      But are there substantial profits to be had by newcomers in such sectors? The profit margins may be low for such commonplace undifferentiated sectors.

    7. Another approach is to capitalize on virtue-signaling, perhaps through products that could enable a consumer to conspicuously show that they bought through a Guiding Company.

      I strongly agree with this. More conspicuous consumption.

    8. A movement that enables everyday people to help charities without sacrificing anything personally should be much easier than one that demands people give significant things up or even mildly inconveniences people.

      But can we really quantify the benefit?

      1. Charities already hold shares of companies

      2. People already do consider the owners of companies (usually through a political lens ... e.g., "Home Depot owner supports right wing causes so people boycott" or some such)

      3. How much more will shopping at a "Guided Consumption owned company" actually lead to more going to the charities?

      4. Will people (over)compensate for this by reducing donations elsewhere?

      5. If the big companies are differentiated in some way (as 'monopolistic competition' suggests), there could be a substantial cost to consumers (and to efficiency) of choosing the 'charity supporting brand'

    9. I am optimistic about the prospects for a movement developing because of what it allows for consumers: they get the same product, at the same price, but profits benefit charities rather than shareholders.

      I think you said this already

    10. to be the most powerful, would likely require a social movement

      why does it 'need a social movement'? That doesn't seem clear to me. It seems like it would benefit from one, but that doesn't make it necessary.

    11. although a Guiding Company would likely enjoy a degree of advantage correspondent with a Guiding Company being able to communicate this feature with its customer base.

      not sure why this is an 'although'?

    12. the identity of the entities that benefit from your purchase, often, owners in some form.

      Not entirely true. A lot of companies (e.g., Big Y) advertise themselves as 'American owned'

    13. would have a competitive advantage

      "Would have" seems too strong. There are reasons to imagine an advantage and other reasons to imagine a disadvantage. I think EA forum prefers 'humble' epistemic statements

  4. Aug 2022
    1. + (1|reader)

      Richard: two reasons: (1) I get this pooling/regularization effect; (2) "I don't really care about reader", so ???

      If reader were orthogonal to everything else I might still put it in because of the unbiasedness 'in a low dimensional setting' (DR: sort of thought it goes the opposite way)

      If I do an idealized RCT with things I changed in exactly the same way I would not get overfitting. I might get error, but not overfitting.
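
      A self-contained sketch of the model under discussion (all names and data are hypothetical): the random intercept per reader partially pools each reader's mean toward the grand mean, the regularization effect mentioned above.

        library(lme4)
        set.seed(1)
        # Hypothetical data standing in for the reader-rating setting
        reviews <- data.frame(
          reader    = factor(rep(1:20, each = 10)),
          treatment = rep(0:1, 100)
        )
        reviews$rating <- 3 + 0.5 * reviews$treatment +
          rnorm(20)[reviews$reader] + rnorm(200, sd = 0.8)
        m <- lmer(rating ~ treatment + (1 | reader), data = reviews)
        summary(m)  # reader intercepts are shrunk toward the grand mean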

    2. Thinking by analogy to a Bayesian approach, what does it mean that we assume the intercept is a “random deviations drawn from a distribution”? Isn’t that what we always assume, for each parameter in a Bayesian model … so then, what would it mean for a Bayesian model to have a fixed (vs random) coefficient?

      With Bayesian mixed models you are putting priors on every coefficient, true.

      But you also have an (additional?) random effect ... somewhat more structure.

      Also in LMER stuff we never update to a posterior

    3. Why wouldn’t we want all our parameters to be random effects? Why include any fixed effects … considering general ideas of overfitting and effects as draws from larger distributions?
      1. analogy to existing examples of fields of wheat
      2. or build a nested model and look for sensitivity
    4. How distinct is this from the ‘regularization with cross-validation’ that we see in Machine learning approaches? E.g., I could do a ridge model where I allow only the coefficient on reader to be regularized; this also leads to the same sort of ‘shrinkage’ … so what’s the difference?

      Richard: The L1/L2 elastic-net approach does something mechanical ... also it can handle a lot of high-dimensional stuff, quick and dirty

      RE requires more thinking and more structure

      How to think about this: "Does this line up with the canonical problems involving fields etc.?"
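
      A rough way to see the comparison in code, reusing the hypothetical reviews data from the sketch above: ridge (alpha = 0 in glmnet) with the penalty applied only to the reader dummies also shrinks reader-specific estimates toward a common value, but it picks the penalty mechanically by cross-validation, whereas the mixed model estimates it as a variance component.

        library(glmnet)
        X  <- model.matrix(~ treatment + reader, data = reviews)[, -1]
        pf <- c(0, rep(1, ncol(X) - 1))  # penalize only the reader dummies
        fit <- cv.glmnet(X, reviews$rating, alpha = 0, penalty.factor = pf)
        coef(fit, s = "lambda.min")      # shrunken reader effects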

    1. Groups   Name        Variance Std.Dev. Corr
       Chick    (Intercept)  103.61  10.179
                Time          10.01   3.165  -0.99
       Residual              163.36  12.781
       Number of obs: 578, groups: Chick, 50

      note the coefficients are not reported, just the dispersion
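
      The per-group coefficients can still be extracted. The output above matches R's built-in ChickWeight data (578 rows, 50 chicks); the exact formula below is an assumption.

        library(lme4)
        m <- lmer(weight ~ Time + (Time | Chick), data = ChickWeight)
        VarCorr(m)            # the dispersion summary quoted above
        head(coef(m)$Chick)   # per-chick intercepts and slopes
        head(ranef(m)$Chick)  # deviations from the fixed effects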

    1. This model assumes that each participant’s individual intercept and slope parameters are deviations from this average, and these random deviations drawn from a distribution of possible intercept and slope parameters.

      presumably normally distributed ... or at least with more mass in the center

    1. and can give rise to subtle biases that require considerable sophistication to avoid.)

      I'm not sure the link refers to the same sort of 'random effects' technique, so the bias discussed there may not apply

    1. I’ll introduce ranks in a minute. For now, notice that the correlation coefficient of the linear model is identical to a “real” Pearson correlation, but p-values are an approximation which is appropriate for samples greater than N=10 and almost perfect when N > 20.

      This paragraph needs clarification: the coefficient on which linear model?
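
      The claim can be checked directly (hypothetical data): the slope from a linear model on standardized variables equals the Pearson correlation.

        set.seed(1)
        x <- rnorm(30)
        y <- 0.5 * x + rnorm(30)
        cor(x, y)                         # Pearson correlation
        coef(lm(scale(y) ~ scale(x)))[2]  # identical slope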

  5. Jul 2022
    1. under the assumption that \(\mathcal{H}_1\) is true, the associated credible interval for the test-relevant parameter provides a range that contains 95% of the posterior mass.

      I don't get the 'under the assumption that H1 is true' in this sentence. Isn't this true of the credible interval in any case?

    2. The Bayes factor (e.g., Etz and Wagenmakers 2017; Haldane 1932; Jeffreys 1939; Kass et al. 1995; Wrinch and Jeffreys 1921) reflects the relative predictive adequacy of two competing models or hypotheses, say \(\mathcal{H}_0\) (which postulates the absence of the test-relevant parameter) and \(\mathcal{H}_1\) (which postulates the presence of the test-relevant parameter).

      Bayes Factor is critiqued (datacolada?) because the 'presence of the parameter' involves an arbitrary choice of distribution of what values the parameter would have 'if it were present'.

      And sometimes the H0 is deemed more likely even when the observed parameter 'estimate' falls in this range.

    3. a \([100 \times (1-\alpha)]\)% confidence interval contains only those parameter values that would not be rejected if they were subjected to a null-hypothesis test with level \(\alpha\).

      With the null hypothesis equal to the point estimate, I think, not a 0 null hypothesis

    1. They found that only 70% of their large (20:1) samples produced correct solutions, leading them to conclude that a 20:1 participant to item ratio produces error rates well above the field standard alpha = .05 level.

      really confused here ... what is the 'gold standard' ... how do they know what is a 'correct solution'? Also, how does this fit into an NHST framework?

    2. Of course, the participant to item ratio is not a good benchmark for the appropriate sample size, so this is not enough to demonstrate that the sample size is insufficient. They did find support that this is not enough by sampling data of various sample sizes from a large data set

      rephrase?

    3. the number of underlying factors and which items load on which factor

      Would be good to link or define what terms like 'factors' and 'load' mean here

  6. Jun 2022
    1. In other words, the goal is to explore the data and reduce the number of variables.

      That's not 'in other words', it's different. "Reduce the number of variables" can be done in many ways and for different reasons. Latent factors are (I think) something with a specific meaning in psychology and this sort of structural analysis in general.

    1. The module also comes with a reviewer reputation system based on the assessment of reviews themselves, both by the community of users and by other peer reviewers. This allows a sophisticated scaling of the importance of each review on the overall assessment of a research work, based on the reputation of the reviewer.

      This seems promising!

    2. By transparent we mean that the identity of the reviewers is disclosed to the authors and to the public

      Not sure this is good. I worry about flattery and avoiding public criticism.

    3. Digital research works hosted in these repositories can thus be evaluated by an unlimited number of peers that offer not only a qualitative assessment in the form of text, but also quantitative measures that are used to build the work’s reputation.

      but who will finance and coordinate this?

    4. One important element still missing from open access repositories, however, is a quantitative assessment of the hosted research items that will facilitate the process of selecting the most relevant and distinguished content.

      What we've been saying

  7. May 2022
    1. EA focuses on two kinds of moral issue. The first is effective action in the here and now — maximising the bang for your charitable buck. The second is the very long run: controlling artificial general intelligence (AGI), or colonizing other planets so that humanity doesn’t keep all its eggs in one basket.

      good summary

    2. this issue in acute form

      wait, which issue? He realized that achieving his moral objectives wouldn't make him feel happy. Did that change what he felt he should do or his sense of moral obligation?

    3. (You barge past me, about my lawful business, on your mission of mercy. “Out of the way! Your utility has already been included in my decision calculus!” Oh yeah, pal? Can I see your working?)

      good analogy

    4. Another reason is just that other people’s concerns, right or wrong, deserve listening to.

      Is this related to the 'moral uncertainty' and 'moral hedging' ... or is this a fairness/justice argument?

    5. Utilitarianism is an outgrowth of Christianity.

      This is a really big claim to make here ... needs more support. It kind of goes against religion in that it sets no 'thou shalt not's ... at least the act utilitarianism

    6. What will motivate you if you don’t change the world? Can you be satisfied doing a little? Cultivating the mental independence to work without appreciation, and the willpower to keep pedalling your bike, might be a valuable investment. Ambition is like an oxidizer. It gets things going, and can create loud explosions, but without fuel to consume, it burns out. It also helps to know what you actually want. “To maximize the welfare of all future generations” may not be the true answer.

      I'm not 100% sure what you are saying/suggesting here. Maybe this ends less strongly than it began? What is the 'fuel to consume' you are getting at here? What should it be?

    7. Just by arithmetic, only few will succeed.

      but if each has an independent probability of succeeding, each may still have a large impact in expected value.

    8. Here’s a more general claim: the more local the issue, the less substitutable people are. Many people are working on the great needs of the world.

      This is possibly true, in some cases, for research work but probably not true for donations. If you donate $4000, lots more children get malaria nets or pills, fewer get severely ill, and on average one fewer child dies ... relative to your not having made that donation.

    9. net contribution

      what do you mean by 'net contribution'? There's a lot of discussion in the donations side of EA about making a counterfactual impact. They focus on the marginal difference you make in the world relative to not having done/donated this. If, absent your donation to Malaria Consortium, just as many people would have gotten ill from malaria (because someone else would have stepped in), this would be counted as a 0. So this is already baked in.

    10. My marginal contribution would be small.

      I think you (DHJ) could possibly make a big contribution. BUT what does this have to do with this essay? What is the point you are making here?

    11. But enough other people are thinking about it already. I trust them to work it out. My marginal contribution would be small.

      Relative to other things and relative to the magnitude of the problem people claim this is ... few people are working on it, it's seen to be neglected.

    12. After a visit to Lesswrong, someone will probably associate EA more with preventing bad AIs than with expanding access to clean drinking water.

      But LW is not EA ... see https://www.lesswrong.com/posts/bJ2haLkcGeLtTWaD5/welcome-to-lesswrong ... doesn't mention EA.

      Also, I think most people who have heard of it still associate EA with the latter, and with the Giving What We Can 10% pledge (we actually have data on this). Even though the most active EAs are in fact prioritizing longtermism these days.

    13. Contributing to the first topic requires discipline.

      Contributing research-wise and through your career, that is. You can always donate to the non-longtermist stuff, and that was at the core of the first-gen EA. And the whole GiveWell thing is about making it easy to know your gift makes the biggest difference per $

    14. They contain contradictions. That makes them rich enough to cope with human life, which is contradictory.

      I think some examples in footnotes or links or a vignette would help here. Because I sort of feel like "no, the old religions really struggle to cope with modern life"

    15. Either someone is maximizing utility, or they’re in the way.

      hmm, who said 'they're in the way'?

      Also, 'max util' is confusing here ... because in economics we think of it as maximizing our "personal" utility function. Maybe a distinction needs to be made at some point to make it clear that this is some weighted adding up.

    16. it imposes extreme moral demands on people. Sell all your possessions and give them to the poor.

      Not sure who this is. EA doesn't really ask this. The push now is ~try to find more effective/impactful careers. And even the well known GWWC was 'only' advocating a 10% donation rate, and even that 'not for everyone'

    17. Utilitarians reduce all concerns to maximising utility. They can’t be swayed by argument, except about how to maximise utility. This makes them slightly like paperclip maximisers themselves.

      My guess is that the way you make your claim here might be seen as not Scout mindset, not fully reasoning transparent, not arguing in a generous spirit. It's not how people want to discourse on the EA forum anyways; not sure if effectiveideas would be ok with it or not.

      Would utilitarians say "we reduce all concerns to maximising utility"? If so, give a link/evidence to this statement.

    18. as preferences,

      I'm not 100% sure that all schools of utilitarianism treat it as choice or preference based. That is familiar from economics, but I think other schools consider things like 'intensity of pleasure and pain'?

    1. Demonstrate the potential for impact via thought experiments like The Drowning Child (although use this sparingly and cautiously, as people can be turned off by obligation framing).

      I think people are also sometimes turned off or disturbed by having to make these difficult Sophie's choices

  8. Apr 2022
    1. Approaches to overcoming the barriers and biases discussed in previous chapters.

      give bolded titles to these categories and link to sections below

    1. The first intervention, surgical treatment, can’t even be seen on this scale, because it has such a small impact relative to other interventions. And the best strategy, educating high-risk groups, is estimated to be 1,400 times better than that. (It’s possible that these estimates might be inaccurate, or might not capture all of the relevant effects. But it seems likely that there are still big differences between interventions.)

      Some rigor might be helpful here

    1. So my functions don’t require access to the raw data

      there's always a workaround to regenerate the 'equivalent raw data'... but it's annoying

    2. while we are more used to think in terms of standardized effect sizes (correlations or Cohen’s d).

      not so much in economics, where we often focus on 'non-unitless' outcomes that have a context-specific interpretation

    1. your data set (e.g., means, standard deviations, correlations

      your expected data set.

      But simulation also allows you to use prior data... maybe worth mentioning?

    1. An example of this is presented in the study by Lichenstein et al. (1978).

      seminal paper; if we want to dig into the evidence we should look at replication, review, post-replication crisis work.

    2. Experiential distance describes one’s proximity to a particular situation or feeling as a result of having seen something or been a part of it. In particular, a greater experiential distance is seen to make it more difficult to imagine a particular situation or feeling, therefore making it harder to empathize with someone going through it. This could create a barrier to giving effectively as most charitable giving is motivated by empathy and sympathy for victims (Lowenstein & Small, 2007). Experiential distance explains why it is easier for an individual to feel empathy for a victim if they have personally experienced the ailment or someone close to them has. As a result, they do not have to imagine the suffering it may have caused because they have directly or indirectly physically experienced it. In other words, the experiential gap is smaller. For example, people living in wealthy nations are more likely to be affected by cancer than malaria, leading to a greater support for that cause.

      This comes up in the context of 'availability bias'

      'Experiential distance' may be our own definition. If so, let's make it clear that this is the case

      "As most charity" ... "as it is claimed that..."

      "Ailment = illness" ... it could be outside of the medical domain .. make it more concise .. And if this is an esxample and not the general point, use a parenthetical "(e.g.)"


  9. Mar 2022
  10. pub-tools-public-publication-data.storage.googleapis.com
    1. predict the precision of an iROAS estimate that a proposed experimental design will deliver.

      'predict the precision' ... maybe that's the Bayesian equivalent of a power analysis

    2. design process featuring a statistical power analysis is a crucial step in planning of an effective experiment.

      this is great -- statistical power!

    3. If it is obvious from this plot that the incremental effects have ceased to increase beyond the end of the intervention period, there is no need to add a cooldown period, since doing so only adds more uncertainty to the estimate of iROAS

      this seems suspect: you are imposing a strong assumption based on casual empiricism, and this probably overstates the precision of the results.

      ideally, some uncertainty over that should be incorporated into the estimation and model, but maybe it's not important in practice?

    4. Due to the marketing intervention, incremental cost of clicks increased during the test period.

      This is, apparently, 'amount spent' not a marginal cost thing

    5. The simplest possible relationship is given by the regression model, \(y_t = \alpha + \beta x_t + \epsilon_t\), \(t\) in the pretest period

      maybe we want some geo-specific error term?
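
      One way to write the geo-specific variant suggested here, with \(g\) indexing geos (a per-geo intercept \(\alpha_g\) would be a natural further step):

      \[ y_{g,t} = \alpha + \beta x_{g,t} + \epsilon_{g,t}, \qquad t \in \text{pretest period} \]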

    6. For TBR, data are aggregated across geos to provide one observation of response metric volume at every time interval for the control and treatment groups:

      the aggregation seems to throw out important variation here ... that might tell us something about how much things 'typically vary by without treatments' ... but maybe I'm missing something

    7. matched market tests, which may compare the behavior of users in a single control region with the behavior of users in a single test region.

      I'm considering a case with one test region and many control regions. Will this paper still apply?

  11. Feb 2022
    1. This appendix provides a brief introduction to the several types of software and processes used to create websites such as Increasing Effective Charitable Giving and Researching and writing for Economics students. We aim to encourage others to participate in this collaborative work, and to spin off their own projects. If you would like to provide feedback or ask a question about these projects then using ‘hypothes.is’ is an easy way to do so (please write directly in the html and contact me at daaronr at gmail dot com to let me know you’ve done so).

      Answering some questions about participation:

      How many hours are student researchers required to allocate for the project? It depends on how much you want me to engage with you and talk you through the processes, onboarding, etc. I think something like a minimum of 40-50 total hours seems about right, but 100+ hours would be better.

      1. Is the project's intended audience the EA community? To an extent, yes. I guess 3 primary audiences.

      i. The EA community interested in learning more about (what they can do to promote effective) charitable giving and relevant attitudes,

      ii. Effective charities and organizations promoting effective giving and action (see the related EA Market testing team)

      iii. Academic researchers (Social science, economists, data scientists, human biology) interested in these issues, coming from a range of perspectives

      1. Which requirements constitute sufficient quality work for student co-authorship?

      It's hard to put this in writing succinctly. If your content is well written and reasoning-transparent, we can probably integrate it into the website and recognize you as the author of a particular section. In terms of peer-reviewed academic output (this project is not itself a 'paper', but I am very pro feedback and evaluation -- see bit.ly/eaunjournal) we have to discuss that more carefully.

    1. 10 Effect of analytical (effectiveness) information on generosity

      todo? maybe incorporate more lab work with clear strong 'analytical thinking' manipulations

    2. Much of this project is being openly presented in the (in-progress) “Impact of impact treatments on giving: field experiments and synthesis” bookdown, a project organised in the dualprocess repo.

      Todo -- integrate these better to remove overlap or make overlap directly synchronized

    1. We consider a range of published work (from a non-systematised literature review) including laboratory and hypothetical experiments, and non-experimental analysis of related questions (such as 'the impact of general charitable ratings').

      Consider: should I focus on the lab and otherwise framed work more?

      See private RP Slack thread here considering Moche et al., 2022

    1. After participants had completed the initial survey, which took most of them at least 10 minutes, we measured their volunteering behaviour by asking them whether they would be willing to fill in a ‘a few extra questions’ for charity (59.3% responded yes) rather than skipping directly to the final questions. The participants were informed that their choice would be completely anonymous and that 5 SEK (around 0.50 euro or 50 US cents) would be donated to a charity organization of their choosing if they would fill out the additional questions.

      This seems like a potentially meaningful measure. We need to read closely to consider the extent to which that 'desire for consistency with questionnaire response' could be driving/biasing this.

  12. Jan 2022
    1. Given the conjugacy of the beta for the binomial,

      Wikipedia

      In Bayesian probability theory, if the posterior distribution p(θ | x) is in the same probability distribution family as the prior probability distribution p(θ), the prior and posterior are then called conjugate distributions, and the prior is called a conjugate prior for the likelihood function p(x | θ).

      So, here, I guess, the combination of a binomial(\(\theta\)) distribution for the data and a Beta prior for \(\theta\), the probability of each positive outcome, implies that the posterior density will also be a Beta distribution.

      However, the posterior density of the difference in the \(\theta\)s is something that would need to be computed.
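
      A minimal simulation sketch of that computation (the counts are made up): with Beta(1, 1) priors, the posteriors are Beta(1 + k, 1 + n - k) by the conjugacy above, and the difference in the \(\theta\)s is handled by Monte Carlo.

        set.seed(1)
        k_m <- 12; n_m <- 100  # hypothetical count of left-handed men
        k_f <- 9;  n_f <- 100  # hypothetical count of left-handed women
        th_m <- rbeta(1e5, 1 + k_m, 1 + n_m - k_m)
        th_f <- rbeta(1e5, 1 + k_f, 1 + n_f - k_f)
        mean(th_m > th_f)                     # posterior Pr(theta_m > theta_f)
        quantile(th_m - th_f, c(.025, .975))  # 95% credible interval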

    2. The statistical hypothesis we wish to investigate is whether the proportion of left-handed men is greater than, equal to, or less than the proportion of left-handed women.

      Note this is not the 'lady tasting tea' case, where true outcome shares are known

    1. Proportion of Funding Available for Program

      The 'user input' here should be something like a mean and a dispersion ... most people won't know what the parameters of the Beta distribution mean.

      if necessary, we could explain what the parameters you input here will do, and have a graph of the distribution of this input to the model
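
      A sketch of that reparameterization (the helper is hypothetical): take a mean \(\mu\) and a concentration \(\kappa\) (an effective "sample size" controlling dispersion) and map them to Beta shapes via \(\alpha = \mu\kappa\) and \(\beta = (1-\mu)\kappa\).

        # Hypothetical helper: mean/concentration -> Beta shape parameters
        beta_params <- function(mu, kappa) {
          c(shape1 = mu * kappa, shape2 = (1 - mu) * kappa)
        }
        beta_params(0.85, 50)  # e.g., mean 85%, moderately concentrated
        curve(dbeta(x, 0.85 * 50, 0.15 * 50), from = 0, to = 1)  # implied input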

    2. Transfers as a percentage of total costs. GiveDirectly has other costs! So how much of our money is going to people in need? The value is derived here: https://docs.google.com/spreadsheets/d/1L03SQuAeRRfjyuxy20QIJByOx6PEzyJ-x4edz2fSiQ4/edit#gid=537899494 This is calculated by finding the average proportion over the years. TODO: Create a predictive model, fitting a normal and a beta distribution to financials. Cell: B5. Units: Unitless (percentage), 0-100%

      I love that this is described, but obviously the display here doesn't work. I asked Causal about code folding.

  13. Dec 2021
    1. The current results are telling us more about the structure of the model than about the world. For real results, go try the Jupyter notebook!

      I would love for this to be made more user friendly and explained better! I was able to run it, but it's hard to wrap your head around all the parameters while you are doing it.

    2. minimally

      what do you mean 'minimally sensitive' here? This is a bit confusing ... you are highlighting what seem to be the LEAST important factors, then.

    1. Probability distributions of value per dollar for GiveWell’s top charities

      the chart is a bit challenging to read. I think it would benefit from some indicators for where, e.g., the 25th percentile, median, and 75th percentile are.

    2. Instead, what I have done is uniformly taken GiveWell’s best guess and added and subtracted 20%. These upper and lower bounds then become the 90% confidence interval of a log-normal distribution. For example, if GiveWell’s best guess for a parameter is 0.1, I used a log-normal with a 90% CI from 0.08 to 0.12.

      this is a bit arbitrary, but I guess he is working to build on this
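
      The construction is easy to reproduce (a sketch using the 0.08-0.12 example as the 5th and 95th percentiles): \(\text{meanlog} = (\log l + \log u)/2\) and \(\text{sdlog} = (\log u - \log l)/(2 z_{0.95})\).

        lo <- 0.08; hi <- 0.12
        meanlog <- (log(lo) + log(hi)) / 2
        sdlog   <- (log(hi) - log(lo)) / (2 * qnorm(0.95))
        qlnorm(c(0.05, 0.5, 0.95), meanlog, sdlog)  # ~0.08, ~0.098, ~0.12

      Note the implied median (~0.098) sits slightly below the 0.1 best guess, one sense in which the choice is a bit arbitrary.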

    1. But how do we usefully express probabilities over rankings

      But I don't think 'ranking' is precisely the target ... e.g., if 2 of the charities are extremely close in impact per dollar, it doesn't matter so much which ranks (trivially) higher

    1. ggplotly(p) [interactive plotly rendering of the ggplot object p: US monthly unemployment ("unemploy") against date, 1967-2015; the chart's embedded data/layout JSON is omitted]

      or 'pipe it to ggplotly()'

    2. p <- ggplot(economics, aes(x = date, y = unemploy)) +

      minor coding style thing: I generally prefer to 'pipe in' the data from the previous line and make ggplot() its own line. Makes the code more 'modular' IMO; something like the sketch below.
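
      A minimal sketch of that style (my example; assumes the ggplot2 and plotly packages, and the `economics` data that ships with ggplot2):

      ```r
      library(ggplot2)
      library(plotly)

      # Pipe the data in; ggplot() gets its own line
      # (|> is the base pipe, R >= 4.1; magrittr's %>% works too)
      p <- economics |>
        ggplot(aes(x = date, y = unemploy)) +
        geom_line()

      # ...then pipe the finished plot to ggplotly() for interactivity
      ggplotly(p)
      ```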

    1. The reason for that is that these rules only involve one of two factors.

      This sentence doesn't make sense to me. Are you saying 'rules of thumb are bad because people haven't come up with good rules of thumb'?

    2. However, if you believe there is some latent construct that defines the interrelationship among items, then factor analysis may be more appropriate.

      I get the idea that 'latent factors' may be behind the relationships between items.

      But what I don't get is 'on what basis do I divide things up into these factors?' What would be a 'reasonable' versus 'unreasonable' way to do it and why? It would be great to have an example.

      Also, I think the choice to use FA rather than PCA is not just driven by whether you 'believe the latent construct is behind things'; it also depends on what you want to do with these measures.

      PCA may be very good for building a predictive model, perhaps, while FA may yield more interpretable insights?

    3. makes up common variance

      maybe you mean 'drives common variance'?

      I think we probably want to give some math here and some examples ... to make this more clear. I have some notes I can try to integrate, but I admit I never fully understood it. (I understood PCA but not EFA or CFA).

    4. assumes that there common variances takes up all of total variance, common factor analysis assumes that total variance can be partitioned into common and unique variance.

      I don't understand what this means. What does it mean that 'common variances take up all of total variance'? What is 'common variance'? (The simulation sketched below is my attempt to pin down the distinction.)
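
      To make that concrete, here's a minimal simulation (my illustration, not from the annotated text): each item is a shared latent factor (the common variance) plus item-specific noise (the unique variance). PCA redistributes all of the variance across components; EFA explicitly partitions it.

      ```r
      set.seed(1)
      n <- 500

      # Two latent factors; each observed item = factor signal (common
      # variance) + item-specific noise (unique variance)
      f1 <- rnorm(n)
      f2 <- rnorm(n)
      noise <- function() rnorm(n, sd = 0.5)
      items <- cbind(
        x1 = f1 + noise(), x2 = f1 + noise(), x3 = f1 + noise(),
        x4 = f2 + noise(), x5 = f2 + noise(), x6 = f2 + noise()
      )

      # PCA: components are built from total variance, common or not
      pca <- prcomp(items, scale. = TRUE)
      summary(pca)

      # EFA: splits each item's variance into common and unique parts
      efa <- factanal(items, factors = 2)
      efa$loadings      # common part: loadings on the two factors
      efa$uniquenesses  # unique part: variance the factors don't explain
      ```

      With these numbers each item's variance is 1 (factor) + 0.25 (noise), so about 80% is common; the uniquenesses should come out near 0.2.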

    5. It conceptualizes constructs as causally determined by the observations (Edwards & Bagozzi, 2000). This is likely not what you want because this means that "principal component scores are 'caused' by their indicators in much the same way that sumscores are 'caused' by item scores

      Who is 'you' here and how do you know what he wants? :)

      By the way, my take on this was that PCA did not assume any causal relationships ... it is simple data reduction and geometric relationships.

      In my reading, I recall a lot of discussion about how PCA 'projects from the data vectors to the latent components' while EFA is meant to do the reverse.

    6. This means that the number of underlying factors is determined mostly empirically, rather than theoretically.

      Maybe this could be made more clear. I don't get a sense of how one could 'determine the number of factors empirically' here. (One common heuristic is sketched below.)
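
      For instance (my example): a standard empirical approach is to inspect the eigenvalues of the item correlation matrix, via a scree plot or Kaiser's 'eigenvalue greater than 1' rule, rather than fixing the number of factors from theory.

      ```r
      # Scree plot on a built-in dataset (mtcars), purely for illustration
      ev <- eigen(cor(mtcars))$values
      plot(ev, type = "b", xlab = "Component", ylab = "Eigenvalue")
      abline(h = 1, lty = 2)  # Kaiser rule: keep factors with eigenvalue > 1
      sum(ev > 1)             # empirically suggested number of factors
      ```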

    1. If the elasticity of substitution between capital and labor is bounded above 1, capital and labor are “gross substitutes”. In this case, any given level of output can be achieved with enough capital or enough labor; neither individual production factor is necessary. If we just have a high enough saving rate and pile up ever more capital—factories producing factories—output will rise indefinitely.

      great idea but what does elasticity mean? (J/K)
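
      (More seriously, for reference; this is the standard CES setup, my notation rather than the author's: with production \(Y = \big(\alpha K^{(\sigma-1)/\sigma} + (1-\alpha) L^{(\sigma-1)/\sigma}\big)^{\sigma/(\sigma-1)}\), \(\sigma\) is the elasticity of substitution. If \(\sigma > 1\) the exponent \((\sigma-1)/\sigma\) is positive, so \(Y \to \infty\) as \(K \to \infty\) even with \(L\) fixed: capital alone suffices, hence 'gross substitutes'. If \(\sigma < 1\) that exponent is negative and output stays bounded no matter how much capital you pile up.)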

  14. Nov 2021
    1. To enable them, you’ll need to first tell Git that your local branch has a remote equivalent: git push --set-upstream origin <branch-name>

      does this create the remote branch too?
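
      (My understanding: yes. If <branch-name> does not yet exist on the remote, this same command creates it there and sets the local branch to track it.)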

    1. subjective scales are best understood as cardinally comparable unless, and until, other evidence suggests otherwise.

      this 'induction' seems too strong. I can see justifying treating these as comparable in modeling. But in assessing policies, especially those where people start from clearly different 'happiness starting points', I would be more careful.

      Consider, would you really recommend a policy that makes a bunch of 6's into 8's over a policy that makes a slightly smaller number of 4's into 6's?

    2. how such tests could be used to ‘correct’ the data if people do not intuitively interpret subjective scales in the way hypothesised

      interesting, but is it practical?

    3. In each case, there is evidence indicating the condition does hold and no strong evidence suggesting it does not.

      This is the most important part. Perhaps worth our digging into.

      I expect the 'evidence suggesting it holds' might often involve very weak and underpowered tests. E.g., something like 'the ordered logit model doesn't predict substantially better than a linear model' ... but maybe both predict poorly, and there are large error bounds on the measures of how well each predicts.

    4. (different individuals use the scale the same way and the scale end-points represent the real limits)

      also seems doubtful ... I expect many differences in how people use this. Ask some people 'how did you choose the happiness number?' and I expect they will state it in different ways ...

      E.g., some people will say "I am a 9 because it would be almost impossible to be a 10, as that is perfect happiness"

    5. (each reported unit change represents the same change in magnitude)

      this seems like a very strong and fairly implausible assumption to me. It demands a lot of the respondent. And I expect that many people would agree with things like

      "It would be much more important for me to have a happiness of 6 rather than 4 ... but I wouldn't care as much about having a happiness of 10 rather than 8" ... which would violate the linearity.

      Or maybe they would say "Happiness 3 is much better than 1, while happiness 8 is pretty close to happiness 6" ... suggesting some people might think proportionally rather than linearly.
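
      (One way to make the 'proportional' reading precise, my formalization: if respondents report \(r = a \log h + b\) for underlying happiness \(h\), then a one-point change in \(r\) always corresponds to the same *ratio* change \(e^{1/a}\) in \(h\), not the same increment, which is exactly the pattern in these two hypothetical quotes.)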

    6. The cardinality thesis could fail to be exactly true, but nevertheless be approximately true, such that it is unproblematic to treat it as true. 

      no need to say this each time ... of course this is the case

    7. However, all the other conditions can fail by degree and what’s important is by how much they deviate.

      this is obvious ... I don't see why he needs to state this

    8. C1: phenomenal cardinality (the underlying subjective state, e.g. happiness, is felt in units)

      OK this is a short summary, but I think 'felt in units' needs to be defined in more detail.

      The Expected Utility framework would offer one way of pinning that down ... i.e.,

      "I respond to the happiness scale in a way such that I would be willing to sacrifice 1 happiness unit in one state-of-the-world to gain 1 happiness unit in another state-of-the-world, for any two equiprobable states of the world, no matter what the happiness starting point is in either."

      (This is with linearity, otherwise it's just that there is 'some exact correspondence')

      This would be one way of pinning it down, and then in fact the stated happiness measure would give us what those 'lottery choice elicitation measures' I referred to were targeting.
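
      (Formally, in my notation: letting \(U(h)\) be the utility attached to reported happiness \(h\), the condition is \(\tfrac{1}{2}U(h_1) + \tfrac{1}{2}U(h_2) = \tfrac{1}{2}U(h_1 - \delta) + \tfrac{1}{2}U(h_2 + \delta)\) for all \(h_1, h_2, \delta\): indifference to shifting a unit of reported happiness between equiprobable states. That can hold at every starting point only if \(U\) is linear in the reported scale.)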

    9. Further, they interpret the scale as linear, so each point represents the same change in quantity.

      this seems too strong. Particularly when dealing with such a fuzzy concept as happiness or life satisfaction.

      Also, some evidence suggests people consider scales in proportional terms.

      I've also heard about some evidence that people do tend to treat Likert scales differently in many contexts. (I should dig that up)

    10. scale interpretation should be understood as a search for a ‘focal point’ (or ‘Schelling point’), a default solution chosen in the absence of communication (Schelling, 1960).

      OK that makes sense ... as a goal

    11. eir decision (Kristoffersen, 2011). Thi

      "the cardinality assumption is reasonable in most research contexts." they say. When we are trying to make a general statement like 'does more income make people happier' maybe this is not very sensitive to the estimation method (e.g., linear vs. ordered logit).

      But for assessing the *welfare impact of income gains from different starting levels*, the reasonableness hurdle is a lot higher IMO.

    12. psychologists less so (Ferrer‐i‐Carbonell and Frijters, 2004). Despite

      A quick read is that the cited paper is just saying 'this doesn't matter much for particular coefficients of a model predicting happiness'.

      But applying this to say 'we can be confident that if policy A moves 100 people from 8 to 10 and policy B moves 100 people from 1 to 2, then A is better' is a stronger statement.

    13. t would not be possible to use subjective scales to say what would increase overall happiness. 

      I think this is too strong ... because we can have cases where some policies yield improvements that strictly dominate others, and we can also have particular criteria and standards.

  15. Oct 2021
    1. this suggests they might be working on this, or have some willingness to pursue it. In particular, we want to know how the impact varies depending on initial incomes, and what this suggests for the coefficient of isoelastic utility (imposing that framework).

    1. Finally, we do not detect a significant spillover effect, that is an effect on non-recipients in other households in the same community. This result is important as a potential concern about cash transfers

      this is important ... but is it 'not significant' or just underpowered?

      OK in the meta-analysis it is fairly tightly bounded

    1. A dollar to AMF is worth 10 times that of a dollar to GiveDirectly,

      which estimate is this? Are we assuming that the people benefiting from AMF are the same as those who benefit from GiveDirectly?

    2. this can be achieved by increasing η above and beyond the experimentally determined rate (e.g. by adding 1 to it).

      that's a confusing workaround. You should change the weighting in the social welfare function.

    3. | Group | Annual Consumption | η = 1 | η = 2 |
       |---|---|---|---|
       | Median US income (fn-4) | $21,000 | 1× | 1× |
       | US poverty line (fn-5) | $6,000 | 3.5× | 12× |
       | Mean income in Kenya (fn-6) | $1,400 | 15× | 230× |
       | World Bank's international poverty line (fn-7) | $230 | 91× | 8,300× |
       | GiveDirectly's average recipients (fn-8) | $180 | 120× | 14,000× |

       Table 1. Some key consumption levels.

      \(\eta\) really matters here!
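
      (Presumably these multipliers are ratios of isoelastic marginal utilities; my reconstruction: \(u'(c) = c^{-\eta}\), so the value of a marginal dollar at consumption \(c_{poor}\) relative to \(c_{US}\) is \((c_{US}/c_{poor})^{\eta}\). E.g. \((21{,}000/180)^1 \approx 117\times\) and \((21{,}000/180)^2 \approx 13{,}600\times\), roughly matching the table's 120× and 14,000×.)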

    4. how much more utility

      I don't like the phrasing 'how much more utility' because utility is not measurable in that way. However, one might phrase it as 'how much more would need to be spent to yield the same utility boost'.

    1. To leverage parochialism, interventions should aim to broaden people's moral circle and

      I would say "to counter parochialism" ... it's not really leveraging it.

    2. To leverage conformity, effective charities should be set as the default,

      This is the sort of thing that Nick and Ari at Momentum are trying to do.

    3. A series of studies by Montealegre and colleagues [36] showed that donors who are driven by effectiveness concerns (rather than emotional concerns) were rated as less moral and less desirable partners.

      although, to my reading, some of these results looked mixed or underpowered. Furthermore, donors driven by effectiveness were rated as better in other ways.

    4. Altruism that is driven by emotions is perceived as more genuine [41].

      A quick read suggests that the paper reports this belief for people who gave emotionally in contrast to people who clearly donated for status or personal gain. Is there other evidence in addition to this?