707 Matching Annotations
  1. Aug 2020
    1. cgs units of millibars (1 mb = 10⁻³ bar)

      how was 1 bar defined?

    2. exponentially with height

      The exponential is the maximum entropy distribution of mass for a given amount of potential energy.
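
The exponential decay of pressure with height can be sketched with the standard isothermal scale-height relation; this is textbook material rather than anything from this page, and the numerical values (T = 273 K, 1013.25 hPa surface pressure) are assumptions:

```python
import math

# Hypsometric sketch for an isothermal atmosphere: p(z) = p0 * exp(-z / H),
# with scale height H = R T / g (assumed values, not from this page).
R = 287.0          # J kg-1 K-1, dry-air gas constant
g = 9.81           # m s-2
T = 273.0          # K, an assumed mean temperature
H = R * T / g      # ~ 7990 m

p0 = 1013.25       # hPa, assumed surface pressure
p = lambda z: p0 * math.exp(-z / H)
print(round(p(5500.0)))   # ~ 509 hPa, near the familiar 500-hPa level
```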

    3. Exercise 1.1 The globally averaged surface pressure is 985 hPa

      within 1.5% of an exact power of 10. Totally a coincidence!

    4. atmospheric pressure is usually expressed in units of hundreds of (i.e., hecto) pascals (hPa)

      hecto. It is a great coincidence that Earth surface pressure is so close to a round power of 10 of Pa, derived from MKS units (from Earth size and water properties). Nothing about the air blanket of the planet guaranteed that !!

    5. including the minus signs in front of them, are referred to as advection terms

      advection has a negative sign

    6. westerly (from the west) and easterly (from the east)

      take note, non-meteorologists

    7. x is distance east of the Greenwich meridian along a latitude circle

      often we use x as distance eastward in an approximate Cartesian coordinate tangent to the Earth at a point, as a local coordinate system for phenomena small enough that the Earth's curvature can be neglected (that is, the Earth's size can be taken as infinite)

    8. Solution

      Problem 7.32 solution

    9. Earth’s atmos-phere

      duh Earth

    10. Atmospheric science is a relatively new, applied discipline

      Still defining questions, not just delivering pat answers!

    11. Instability91viiContentsP732951

      2019 annotations recovered!

    1. In 1960, the CGPM launched the International System of Units (in French the Système international d'unités or SI) which had six "base units": the metre, kilogram, second, ampere, degree Kelvin (subsequently renamed the "kelvin") and candela

      it takes 6 to cover all fields (electricity, radiation)

    1. a⋅b=abcosθ

      cosine of angle, well defined in any dimensional space (2D, 3D)
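
A minimal sketch of this note's point, that the dot product defines an angle in any number of dimensions (the example vectors are arbitrary):

```python
import math

# cos(theta) = (a . b) / (|a| |b|) is well defined in 2D, 3D, or any dimension.
def angle_deg(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.degrees(math.acos(dot / (na * nb)))

print(round(angle_deg([1, 0, 0], [1, 1, 0]), 1))   # 45.0
```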

    2. Vector algebra

      add, negate (--> subtraction), multiply (two kinds)

    3. same object, as seen from different axes. The very fact that we can say “the same object” implies a physical intuition about the reality

      a physical concept, not a mathematical one

    4. All quantities that have a direction, like a step in space, are called vectors. A vector is three numbers. In order to represent a step in space, say from the origin to some particular point P whose location is (x,y,z), we really need three numbers, but we are going to invent a single mathematical symbol, r, which is unlike any other mathematical symbols we have so far used. It is not a single number, it represents three numbers: x, y, and z. It means three numbers, but not really only those three numbers, because if we were to use a different coordinate system, the three numbers would be changed to x′, y′, and z′. However, we want to keep our mathematics simple and so we are going to use the same mark to represent the three numbers (x,y,z) and the three numbers (x′,y′,z′). That is, we use the same mark to represent the first set of three numbers for one coordinate system, but the second set of three numbers if we are using the other coordinate system.

      Direction, a concept. It requires three numbers to specify, but those numbers depend on the coordinate system used. So it's a set of three numbers, but not simply any set.

    5. laws of physics, so far as we know today, have the two properties which we call invariance (or symmetry) under translation of axes and rotation of axes. These properties are so important that a mathematical technique has been developed to take advantage of them in writing and using physical laws

      the heart of vectors

    1. The cumulative distribution function of Y

      cumulative CDF, capital F

    2. differentiating both sides of the above expression with respect to y

      little f for probability density function (derivative of cumulative function F)
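
A quick numerical check of the F-versus-f distinction in these notes, using an exponential distribution as a stand-in (my choice, not the article's):

```python
import math

# CDF  F(y) = 1 - exp(-y)   (capital F, cumulative)
# PDF  f(y) = exp(-y)       (little f, its derivative)
def F(y):
    return 1.0 - math.exp(-y)

def f(y):
    return math.exp(-y)

h, y = 1e-6, 1.3
dF_dy = (F(y + h) - F(y - h)) / (2 * h)   # centered finite difference of F
print(abs(dF_dy - f(y)) < 1e-6)           # True: differentiating F gives f
```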

    3. The term "random variable" in statistics is traditionally limited to the real-valued case (E = ℝ).

      It is a mapping (a function) from outcomes to real numbers; the probabilities of those values (which furthermore integrate or sum to unity) come from the distribution

    4. A random variable has a probability distribution

      Random variable has a PDF, not a value

  2. Jun 2020
  3. May 2020
    1. Increasing the shear depth above 4500 m reduces the precipitation again, with the values for 6000- and 9000-m depths being smaller than those for the intermediate depths.

      too deep decreases precip

    2. The middepth shear layers, with tops at 3000–4500 m, produce the greatest precipitation; in this sense, these intermediate shear-layer depths are “optimal.”

      mid-depth shear increases precip

    3. The shallowest shear layer, with depth 1500 m, produces the least precipitation, about 30% less than the unsheared case, despite a greater degree of convective organization compared to the unsheared flow

      shallow shear makes organization but LESS rain

    4. two distinct regimes

      weak and strong shear, compared to some other relevant time scale? but what?

    5. the strong shear case is moister than the unsheared case by about 0.2 g kg−1 and drier in the boundary layer by about 0.5 g kg−1

      Acts like a shear-proportional mixing perhaps. I wonder if it is numerical-diffusion dependent, or robust?

    1. improvements can make the model better suited for addressing a host of climate problems

      mean state improvements valuable not just for forecast skill

    2. Additional improvements in forecast skill might be possible with a state-dependent correction if the associated statistical sampling issues can be overcome (e.g., Leith 1978; Danforth et al. 2007).

      state-dependent corrections idea

    3. There thus appears to be two windows (one early and one late) during which TBC could induce improved forecasts.

      coupled improvements from bias reductions occur on a second, longer time scale (long past atm-land skill horizons)

    4. the climate drift in the North Pacific waveguide (believed to be a key controlling factor for Rossby waves entering North America) appears to develop too slowly in CNTRL-A (reaching only about one-half the long-term value at 10 days’ lead) to allow its correction in TBC-A to produce more than a modest impact (via more skillful Rossby wave predictions) on week 2 T2m forecasts (when skill is already rather low).

      climate systematic error reductions take too long to develop to help skill very much

    5. No improvement in the precipitation forecasts

      precip just hard

    6. skill improvements were rather modest at best

      better mean state doesn't make skill better

    7. improved (increased) cloudiness in TBC-A (not shown) appears to contribute to the dramatic reduction in the warm bias over the NH summer continents. Here we have a clear case where the TBC impacts are indirect; the model’s parameterizations of moisture processes working with the states directly affected by TBC appear to produce more realistic output

      yes clouds are model-produced but state-sensitive, indirect

    8. dramatic reduction of the SST bias in TBC-C appears to be the result of a combination of direct impacts from the near-surface temperature increments (especially over the Gulf Stream, the SH high latitudes, and equatorial and coastal upwelling regions) and indirect impacts due to the reductions in surface stress biases

      oh surface air T bias corrections become ocean T bias corrections if sustained long enough

    9. improvements in the equatorial surface stress

      and evaporation

    10. excessive subtropical westerlies in both hemispheres (though more so in the NH) and during both seasons. These likely reflect anomalous forcing/heating by the excessively strong and split ITCZ in the coupled model.

      low wind speed bias, hence warm SST? How related to ITCZs?

    11. In discussing the TBC impacts on the model’s climate, it is useful to consider them as being divided into those that are direct and those that are indirect, with the latter including any quantities (such as precipitation and, for the AOGCM, atmospheric moisture) that are not explicitly forced by the TBC, as well as the transients, since the TBC is a constant forcing term.

      direct and indirect

    12. state-independent TBC to the atmosphere can produce considerable improvements to the simulated mean climate as well as to its variability

      eddies better when jet better

    13. return of skill

      Yeah correlations of 0.1 - 0.2 not dazzling but nonzero

    14. bias in the u wind (the waveguide) develops slowly over the course of about two months (Fig. 12, top left; green curves).

      wow, a very slow timescale of polar vortex i guess?

    15. (right) The υ250-mb correlations at 12-day lead for TBC-A, CNTRL-A, and the differences

      time series correlations with analysis at each point i guess, so small and small differences

    16. how the drift in the waveguide evolves (u250 mb) and the extent to which the Rossby waves themselves are predicted more accurately (υ250 mb

      close inference

    17. These two time scales (associated with drift development and predictability) serve to define a window of forecast leads during which TBC can be expected to have an impact on skill

      yes drift and skill a dance

    18. skill assessment is based on a series of hindcasts initialized in late spring and running through August produced with both the CNTRL-A and TBC-A models (see section 2c). Note that in the following we use the terminology hindcasts and forecasts interchangeably, keeping in mind that these simulations are not true forecasts; in these atmosphere-only runs, observed SSTs are prescribed throughout the forecast period

      atmosphere only S2S hindcasts

    19. correlations between the DJF mean Niño-3.4 index a

      ENSO correlations

    20. reductions are likely due to TBC-C-induced changes in the (now reduced) variability of the tropical Pacific SST linked to ENSO

      Oh "transient" includes interannual as well as high-pass?

    21. substantially improves the boreal winter stationary waves (Fig. 9, right

      pretty darned subtle improvement

    22. jet and stationary waves, improvements in the climatologies of those aspects of the flow should have positive impacts on ENSO-related teleconnections

      OK, second-order deduction

    23. ENSO-related teleconnections

      but it shows an unconditional mean, right?

    24. does little to improve the MJO, though the CNTRL-C model already produces a fairly realistic but weaker-than-observed MJO

      mean state doesn't fix MJO

    25. The improvement in the zonal mean specific humidity (Figs. 6, 7, bottom) is also substantial, highlighted by the elimination of the wet biases in CNTRL-C in the tropics and SH during both seasons (it is noteworthy that this occurs despite not correcting the moisture

      oh the moisture TBC is not applied in TBC-C

      How much of high-q bias reduction is due to reduction of warm SST bias?

    26. we do not replay the moisture in the AOGCM

      in coupled model, moisture not replayed (nor is its TBC applied)

    27. used to correct the u, υ, T, and ps tendencies

      No q tendencies in TBC-C

    28. the replay approach (REPLAY-C) is able, for the most part, to reproduce the annual mean observed (Reynolds) SST. In contrast, the free-running CNTRL-C (middle-left panel) shows large positive SST biases over much of the tropics and SH

      wow, atmospheric tendencies fix the SST

    29. improvements are seen in the NH momentum transport, especially in the North Pacific and North Atlantic jet exit regions, where the high-frequency eddies are expected to maintain the mean jet through barotropic decay (e.g., Chang et al. 2002).

      transient eddy fluxes as an evaluation field, from TBC climatological tendencies imposed.

    30. increased wet bias over India

      unfixed by TBC

    1. Model changes possibly responsible for this improvement between these periods are the shift of SSTs from the use of weekly optimally interpolated (OI) SST to the high‐resolution RTG SSTs and the update of the Community Radiative Transfer Model (CRTM).

      Oh, physics updates during the 3 years

    2. JJA averaged AIs for the years (left) 2012, (middle) 2013, and (right) 2014 at approximately 850 mb. The AIs remain quite consistent from 2012 to 2014.

      Analysis increments pattern

    3. Forecast surface pressure is generally too high (cool colors) over the oceans, except near coasts, and too low (warm colors) over the continents

      SLP short-term errors

    1. the general applicability of the TBC approach can only be fully assessed by repeating our analysis with a number of different models (and perhaps several different reanalyses)

      More work to do: other reanalyses to replay to

    2. Additional work is needed to better quantify the statistical sampling errors as a function of season, the sizes/locations of the regions, and the length of the model runs

      Future work statement, but only about the sampling error (S/N)

    3. distinguish between proximate and ultimate causes of the model biases

      CAUSES literature worth a look

    4. the TBC impacts project less strongly on the biases, accounting for between 58% and 66% of the bias for the T2m, and between 50% and 61% of the bias for precipitation. Nevertheless, in a relative sense, considering only the fraction of the bias over North America that is actually corrected by applying TBC in northern midlatitudes, we find that the sources of the biases are again roughly 2/3 remote and 1/3 local.

      No land surface TBCs, just atmosphere

    5. eddy streamfunction response

      eddy streamfunction hides the zonal mean

    6. excessive heating in that region is the main source of the AGCM’s circulation biases (and related biases) that span the NH

      Tibet excessive heating is NH error source

    7. temperature increments at 500

      Wonder if 300 or 350 mb would be clearer, given the top-heavy heating results from my GoM figure about MERRA TDTANA.

    8. reduction to the excessive precipitation bias over the Himalayas, suggesting that the associated cooling anomalies may play some role in producing the hemispheric-wide responses in those experiments

      model-physics heat source, not just T tendency

    9. RPL_Tibet_uv) produces a more locally confined response

      I bet if you replayed uv over the Balkan quadrant the zonal mean component would pop out better (Kelly and Mapes).

    10. The remarkable similarity between the RPL_Tibet_T and RPL_Tibet_All responses indicates that the long-term bias is primarily driven by the temperature increments in that region. Somewhat similar, although with weaker amplitude, results are obtained from only replaying to the moisture (RPL_Tibet_q).

      A zonal mean component for sure, and wavenumber 1

    11. wavenumbers that are much smaller than would be expected for typical NH summer jet speeds

      mean zonal wind looks to be involved, not just the "circum-global" eddy pattern I've heard Branstator talk about

    12. (Note that the calculation of the cosθ values in Fig. 11 does not include the patterns in the local region considered, since these are forced to track the analysis by design)

      a more sensitive test: only the remote part

    13. TBC corrects for a long-term mean error, whereas RPL effectively corrects the specific errors produced at each time step

      RPL makes the correct eddies, which can then impact the time mean flow. Evaluation is still on the time mean.

    14. nonlinear

      nonlinear, or merely of canceling phase?

    15. TBC in TBC_TR substantially reduces the long-term JJA tropical precipitation biases (not shown), but has little impact on the NM region

      tropics -> NH midlatitudes connection weak in summer

    16. 2/3 from remote sources and 1/3 from the local source

      North America 2/3 remote, 1/3 local

    17. local impacts tend to be more focused on the middle and southern Great Plains

      no soil moisture corrections, just atmospheric fields?

    18. remote TBC corrections provide more than twice the impact on T2M over North America as local corrections

      Interesting! Asia and NEPac for the NW US biases.

    19. responses are quantified in terms of a normalized spatial inner product

      This pools offsets (biases) with patterns

  4. Apr 2020
    1. caused by noise

      don't like this phrase very much... unclear

    2. Occam’s razor now tells us

      Important principle, but again with a socially constructed definition of what "simplest" means. Is invoking God simple?

    3. p(r > r0 | H)

      requires an expected model for both signal and noise under the hypothesized state of affairs

    4. The factor that updates the prior odds to the posteriorodds is called the Bayes factor

      There's a nice nutshell! The nice thing about odds (ratios of probability density) rather than probabilities is that they don't have to add up to 1 so the ugly normalizing factors don't clutter the result.
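
The odds arithmetic in this note can be sketched with invented numbers (none of these values come from the paper):

```python
# posterior odds = Bayes factor * prior odds, with no normalizing factors.
prior_odds = 1.0            # equal prior odds: p(H) / p(not H) = 1
p_data_given_H = 0.10       # hypothetical likelihoods
p_data_given_notH = 0.02

bayes_factor = p_data_given_H / p_data_given_notH   # 5.0
posterior_odds = bayes_factor * prior_odds          # 5.0
posterior_prob = posterior_odds / (1.0 + posterior_odds)
print(round(posterior_prob, 3))   # 0.833
```

Converting back to a probability at the end is the only place a normalization appears.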

    5. The presence of alter-native theories also influences prior odds for hypotheses

      There is no escape from the humanity here: history, schools of thought, what is socially considered reasonable or plausible -- the Overton window of discourse which exists yes even in science.

    6. 1/(1 + 1/O(H))

      where does this algebra come from in (2)?

    7. case of low signal-to-noise ratio

      "detection" problems

    8. O(H)

      chased that meaning into the denominator

    9. p(H)

      Learn to see the numerator as the key locus of meaning. The denominator is just a normalizing factor that is what it must be for the result to be a true probability: that is, so that its total or integral is 1

    10. p value = p(r > r0 | H)

      This is the one-tailed version for positive correlation. Two-tailed would use |r| > |r0|

    11. In the limit of very low signal-to-noise ratio, the related series would also show 95% low correlations and 5% high correlations (see Table 2). The probability that our observed r0 with r0 > rp is indicative of an actual relation is then 5/(5+5) = 50%

      These should not be in separate paragraphs! the second sentence is still tightly "in the limit" established by the first.

    12. equal prior odds

      prior is a key concept of Bayesian reasoning

    13. the signal and the noise

      both need to be modelled

    14. sampling

      in other words, is this correlation a property merely of your one sample and not of the population your sample is drawn from

    15. study synthetic time series

      Monte Carlo methods: build synthetic data (or resample your own data) in ways that embody the null hypothesis. Then you can just re-run your analysis, no matter how complicated, and look for "significant" differences.
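
A minimal sketch of this Monte Carlo recipe, with plain white noise as the synthetic null (the series length and trial count are arbitrary choices):

```python
import random

# Build synthetic series that embody the null hypothesis (independent noise),
# re-run the analysis (here, a correlation), and read the "significance"
# threshold off the resulting null distribution.
random.seed(0)

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

n, trials = 50, 2000
null_r = sorted(corr([random.gauss(0, 1) for _ in range(n)],
                     [random.gauss(0, 1) for _ in range(n)])
                for _ in range(trials))
r95 = null_r[int(0.95 * trials)]   # one-tailed 95% threshold
print(0.1 < r95 < 0.4)             # True; roughly 0.23 for n = 50
```

Resampling your own data under the null works the same way, and the analysis being re-run can be arbitrarily complicated.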

    16. the error of the transposed conditional

      so common it has a name

    17. fluke

      "lucky stroke, chance hit," 1857, also flook, said to be originally a lucky shot at billiards, of uncertain origin. Century Dictionary connects it with fluke (n.1) in reference to the whale's use of flukes to get along rapidly (to go a-fluking or some variant of it, "go very fast," is in Dana, Smyth, and other sailors' books of the era)." https://www.etymonline.com/word/fluke

    18. dependent

      non-independent, in the probability theory sense

    19. a t test) provides an answer
    20. time series, and we find that they are correlated
    21. lead

      led (typo)

    22. large fraction of papers in the climate literature includes erroneous uses of significance tests.

      Sharp critiques often have the crispest summaries, if they are fair minded

    1. Tukey (1991), however, "It is foolish to ask 'Are the effects of A and B different?' They are always different—for some decimal place."

      sensible fellow whose retirement salvo is worth a close examination, if only to see how deep the jargon goes

  5. Mar 2020
    1. Trier and Parsons (1993) noted how a trough moving over the Rocky Mountains and into the Great Plains area strengthens the climatological southerly low-level jet that feeds moisture into MCSs forming over the central United States (Fig. 17-45). A similar behavior occurs in South America, where the South American low-level jet (SALLJ) flows southward along the eastern edge of the Andes from the moist Amazon region to feed MCSs in the region centered on Argentina (Nogués-Paegle and Mo 1997; Douglas et al. 1998; Saulo et al. 2000; Marengo et al. 2004; Vera et al. 2006; Salio et al. 2007; Rasmussen and Houze 2016). As shown by Bonner (1968), these low-level jets are stronger at night, which gives nocturnal preference for MCSs over the central United States (as noted by Huckleberry Finn, see introduction). Dai et al. (1999) showed how the diurnal and semidiurnal processes favor large-scale convergence over the Rockies during the day and over the plains to the east at night. These processes assure that the enhanced jet associated with an approaching trough has its maximum effect on MCSs at night in the central United States. Data from the U.S. radar network show that MCSs developing from diurnally triggered convection over the Rockies and propagating eastward maximize at night over the central United States (Carbone et al. 2002), in conjunction with the nocturnal maximum of the low-level jet in that region. Feng et al. (2016) found that an increase in MCS activity over the central United States has been accompanied by strengthening of the low-level jet and its moisture transport over the past 30–40 years.

      Jet, MCSs, nocturnal maximum, moisture transport, all woven together.

    1. low specific (3-6 g/kg) and absolute humidity (4-7 g/m3)

      absolute humidity determines RH at the lung surface after air warms on its way in, and thus parching of tissue which makes it susceptible for flu seasonality; Shaman et al.

    1. suppression will minimally require a combination of social distancing of the entire population, home isolation of cases and household quarantine of their family members.

      here we are

    2. we apply a previously published microsimulation model to two countries: the UK (Great Britain specifically) and the US. We conclude that the effectiveness of any one intervention in isolation is likely to be limited, requiring multiple interventions to be combined to have a substantial impact

      modeling at epidemiological scale

    1. The exergy weighting factor (20) explains the expected behavior for wq(z), which increases with height for decreasing values of r̄υ(z).

      Exergy norm weights water vapor increasing with height

    1. one of the most powerful inventions of modern science.” [1. J. Gleick, Chaos: Making a New Science, Viking, New York (1987).] But who invented it? Who named it? And why

      because our brains have special hardware, and our languages have special nomenclatures

    2. “phase space” has become synonymous with the idea of a large parameter set

      fitness landscapes and optimization, etc. etc.

    1. PROJECT a single eigenvector onto the data and get an amplitude of this eigenvector at each time

      tableau of eigenvector projection

    2. One way to represent their amplitude is to take the time series of principal components for the spatial structure (EOF) of interest, normalize this time series to unit variance, and then regress it against the original data set. This produces a map with the sign and dimensional amplitude of the field of interest that is explained by the EOF in question. The map has the shape of the EOF, but the amplitude actually corresponds to the amplitude in the real data with which this structure is associated. Thus we get structure and amplitude information in a single plot. If we have other variables, we can regress them all on the PC of one EOF and show the structure of several variables with the correct amplitude relationship, for example, SST and surface vector wind fields can both be regressed on PCs of SST

      Regression maps on PC time series
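
A sketch of this regression-map recipe on invented data (the data matrix and its dimensions are placeholders, not anything from the text):

```python
import numpy as np

# Normalize a PC time series to unit variance, then regress the original
# anomalies on it: the result has the EOF's spatial shape but carries the
# dimensional amplitude of the data itself.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))   # time x space anomalies (placeholder)
X -= X.mean(axis=0)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
eof1 = Vt[0]                         # leading spatial pattern, unit norm
pc1 = X @ eof1                       # its principal-component time series

z = pc1 / pc1.std()                  # unit-variance PC
reg_map = X.T @ z / len(z)           # the regression map
print(np.allclose(reg_map, eof1 * pc1.std()))   # True: EOF shape, data units
```

Regressing a second variable (say, surface wind) on the same z gives maps with the correct amplitude relationship between fields, as the note describes.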

    3. σk² = λk N

      Singular values are square root of eigenvalues with a pesky factor of N the sample size
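
A numerical check of this relation, under the convention that the covariance matrix is C = XXᵀ/N for an M × N data matrix (the data here are random placeholders):

```python
import numpy as np

# The singular values of X satisfy sigma_k**2 = N * lambda_k, where lambda_k
# are the eigenvalues of C = X X^T / N -- the "pesky factor of N".
rng = np.random.default_rng(1)
M, N = 8, 500                    # M structure points, N samples
X = rng.standard_normal((M, N))

s = np.linalg.svd(X, compute_uv=False)        # singular values, descending
lam = np.linalg.eigvalsh(X @ X.T / N)[::-1]   # eigenvalues, made descending
print(np.allclose(s**2, N * lam))             # True
```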

    4. You must choose which dimension of the data matrix contains interesting structure, and which contains sampling variability. In practice, sometimes only one dimension has meaningful structure, and the other is noise. At other times both can have meaningful structure, as with wavelike phenomena, and sometimes there is no meaningful structure in either dimension

      Key point of utilizing matrix algebra

    5. Singular Value Decomposition

      Beautiful SVD song

    6. The decision to standardize variables and work with the correlation matrix or, alternatively, to use the covariance matrix depends upon the circumstances. If the components of the state vector are measured in different units (e.g., weight, height, and GPA) then it is mandatory to use standardized variables. If you are working with the same variable at different points (e.g., a geopotential map), then it may be desirable to retain a variance weighting by using unstandardized variables. The results obtained will be different

      Combined EOFs necessarily imply standardization, unless some other relative weighting scheme can be justified such as on physical grounds (energy for instance).

    7. Orthogonality of the Principal Component Time Series

      Z is the PCs

    8. Equation (4.18) shows how to express the original data in terms of the eigenvectors, when the coefficient matrix Zis defined by (4.17)

      Transforming back and forth from/to eigenspace (eigenvector coordinate system).

    9. orthonormal eigenvectors ei

      ortho-normal so E'E = I

    10. eᵀXXᵀe

      transpose rule for products

    11. covariance matrix is diagonal in this new coordinate space

      eigenvectors are orthogonal for square real symmetric matrices

    12. structure and sampling variables

      a good name for the distinction

      unfortunate choice to use N as structure and M as samples! Usually we speak about N samples.

    13. CCA is MCA of a covariance matrix of a truncated set of PC’s

      a two-field approach

  6. Feb 2020
    1. develop guidelines and best practices to educate the future Earth science workforce to be well prepared for innovative, interdisciplinary research

      trying here

    2. collaboration between Earth scientists and AI researchers

      Don't go it alone... but we need these mavens in the middle

    3. constraints in the optimization problem

      "regularization" this is called

    4. a two-step approach. For a given task, first identify all subtasks that can easily and efficiently be addressed by physics-driven methods, and apply those

      Prioritize physical law enforcement

    5. prediction, understanding

      Two big prongs of Science

    6. physics-based and data-driven methods simultaneously

      theory and empiricism, merging

    7. e.g., empirical orthogonal function analysis and spectral analysis

      Our later course topics...

    8. Without best practices, inappropriate use of these methods might lead to “bad science,” which could create a general backlash in the Earth science community against the use of AI methods. Such a backlash would be unfortunate because AI has much to offer

      Opportunity and peril for science. We could end up with decades of work to deconstruct a sprawl of bad literature.

  7. weather.rsmas.miami.edu
    1. Natural selection is the survival of the survivors

      from that flow profound ways of explaining the world we find ourselves in... is that not science?

    1. Machine Learning for Everyone In simple words. With real-world examples. Yes, again

      Great, friendly sketch-style but goes deep

  8. Jan 2020
  9. weather.rsmas.miami.edu
    1. Why Predict? Historical Perspectives on Prediction in Earth Science

      Oreskes essay on logical vs. temporal prediction

    1. 

      wikipedia

    2. 

      Well we could talk about inheritance taxes in capitalism, where luck IS partially passed along, and the distribution of success is NOT necessarily stable...

    3. 

      attention here

    4. 

      a special case!

    5. 

      Students, write a brief reply to this post, with links to your own annotations above or other sources if it helps.

      Were any terms unclear? Share your questions, and/or answers you found online.

      Think of an example in your research area where a causal tree can be drawn that might help you structure a data analysis pertaining to the relationships among different variables.

    6. 

      Beyond profound: a "miracle"!

    7. 

      A profound law: the CLT

    8. 

      A law of statistics, misinterpreted as a law of heredity -- although a lot of statistics is actually repeated application of selection, which in some grand sense is "heredity".

    1. less inclination to misapply evidence

      Students, write a brief reply to this post, with links to your own annotations above or other sources if it helps.

      Were any terms unclear, or wonderfully antique? Share anything you learned about 19th century English. .

      Think of an example in your research area where multiple hypotheses are at play, in the literature or in your own mind. Can you feel the balancing effect of having several rather than one candidates for your mental affection?

    2. The mind lingers with pleasure upon the facts that fall happily into the embrace of the theory, and feels a natural coldness toward those that seem refractory

      Human nature, expressing itself through selection bias

    3. the method of multiple working hypotheses

      Chamberlin read his paper on "The method of multiple working hypotheses" before the Society of Western Naturalists in 1889, and it was published in Science in 1890 and the Journal of Geology in 1897

    1. About the Framework

      Students, write a brief reply to this post, with links to your own annotations above or other sources if it helps.

      1. What are the Three Basic Entities mentioned in the text?

      2. Were any terms unclear? Share your questions, and/or answers you found online.

      3. Think of an example in your research area where a "system" might be hiding within data, waiting to be identified.

    2. Inferring models from observations and studying their properties is really what science is about.

      from System Identification, Lennart Ljung, 1987, Prentice-Hall. I learned how to make a free "sandwich PDF" from a scan, with character recognition on top of the page image.

    1. misregulation hypothesis proposes that individuals may instead prioritize downregulating negative emotions (e.g., anxiety) through procrastination over accomplishing

      rings true

    2. beliefs or confidence regarding one’s capability

      self-efficacy

    1. Our mathematical nomenclature is as follows. Capitalized Latin letters denote random variables, and lowercase versions of the same letter indicate particular values of these variables. Vectors are boldfaced. We define Ỹ to be a climate sensitivity proxy such as the equilibrium climate sensitivity or a climate feedback strength, for which the constraints are derived. A single emergent constraint variable is denoted X̃. A collection of n emergent constraints will be labeled X̃i, i = 1, ..., n. Versions of these random variables which have been normalized to have zero mean and variance of 1 are similarly denoted, but without the tilde. The PDF of any random variable U is p(u), and similarly for multivariate distributions.

      Random variables nomenclature, for class
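The normalization described in the quote (zero mean, unit variance, tilde dropped) is ordinary z-scoring. A minimal numpy sketch with made-up sample data (nothing here comes from the paper):

```python
import numpy as np

# Illustrative raw constraint variable X-tilde: assumed mean 3, spread 2
rng = np.random.default_rng(0)
x_tilde = rng.normal(loc=3.0, scale=2.0, size=1000)

# Normalized version X: zero mean, unit variance, as in the quoted nomenclature
x = (x_tilde - x_tilde.mean()) / x_tilde.std()

print(x.mean(), x.std())  # ~0.0 and 1.0
```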

    1. community

      The humanity is inescapable

    2. Daubert says, the methods should be judged by the following four criteria

      OK, here it is: the essential list

    3. There should be a known rate of error

      Error quantification is the huge mop-up job of generations, after the first glorious discoveries are logged. Sigh, this is what a "mature" science largely consists of.

    4. error is intrinsic to our interaction with nature

      Yes, it is just the finitude of our aspiration to be quantitative with continuous (real) numbers.

    5. force

      Beware of the word "forcing" in our science. It presupposes a causality direction.

    6. a list of such pretenders

      Might be amusing to look up

    7. data analysis and the proper application of research methods

      Especially if the result (claim) is unsurprising, the methods are rarely scrutinized.

    8. two-thirds of all postdoctoral fellows in biology in American universities believe that they are going to make this step, but in fact, only about a quarter of them succeed

      Yes it's a chasm, post-postdoc. I always picture Roadrunner cartoons. Meep meep.

    9. media outlets

      US News and World Report was once a magazine, now it is solely a college rating service!

    10. inconsistent with human nature

      Popper's view seems, like Bacon's, a view based on a very sparse early stage, offering the first tentative notions about a richly detailed world that narrative-deprived observers were humbly awed by. Now we are awash in zany ideas and can hardly see past them to reality sometimes. We need machetes to hack back jungles of assertion, not a shyness about offering tentative propositions.

    11. choose what is and is not worth observing

      Yes conditional sampling bias is infinitely deep in scientific inference, right down to the Anthropic Principle.

    12. we understand

      This is a reductionist's blind spot (physics professor). It says that combinatorics is so straightforward that it doesn't count, once some basis set of interacting units is mapped out.

    13. cleansed of harmful preconceptions

      the less you think the better you observe? interesting

    14. demystify somewhat the business of science

      a good goal, even for scientists.

    1. But such processes have not been well characterized.

      "such processes" are "not well characterized?" What does this mean?

    1. standard deviation σ/√n. Where σ is the standard deviation of the sample and n is the number of observations in the sample

      called the standard error of the mean
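As a quick worked example of σ/√n (the numbers are made up, not from the text):

```python
import numpy as np

sample = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
n = sample.size
sigma = sample.std(ddof=1)   # sample standard deviation (n - 1 in the denominator)
sem = sigma / np.sqrt(n)     # standard error of the mean

print(n, round(sigma, 3), round(sem, 3))  # 8, ~2.138, ~0.756
```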

    1. Identifying Causal Effects from Observations

      Great, a chapter on this topic

    2. Concepts You Should Know

      For data analysis: a page of concepts

    1. Reminders from Basic Probability

      Basic probability: notation, etc.

    1. Scientists should also make their data and all other relevant materials available to the world once they publish their research

      "should" is easy to say -- what about giant model outputs for instance?

  10. Dec 2019
    1. calculated the Jacobian using the numdifftools Python package. The specific method we used from this package is a second-order forward difference method

      nice tool to know about
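For intuition, a dependency-free sketch of what a second-order forward-difference Jacobian does (this illustrates the idea, not numdifftools' actual implementation; the toy function is mine):

```python
import numpy as np

def jacobian_fd2(f, x, h=1e-5):
    """Jacobian of f: R^n -> R^m via the second-order forward difference
    f'(x) ~ (-3 f(x) + 4 f(x + h) - f(x + 2h)) / (2h) along each coordinate."""
    x = np.asarray(x, dtype=float)
    f0 = np.atleast_1d(f(x))
    J = np.empty((f0.size, x.size))
    for j in range(x.size):
        e = np.zeros_like(x)
        e[j] = h
        fj1 = np.atleast_1d(f(x + e))
        fj2 = np.atleast_1d(f(x + 2 * e))
        J[:, j] = (-3.0 * f0 + 4.0 * fj1 - fj2) / (2.0 * h)
    return J

# Toy check against the analytic Jacobian [[2xy, x^2], [5, cos(y)]]
f = lambda v: np.array([v[0] ** 2 * v[1], 5 * v[0] + np.sin(v[1])])
print(jacobian_fd2(f, [1.0, 2.0]))  # close to [[4, 1], [5, -0.416]]
```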

    1. The proposed framework uniquely elucidates the relationship between the IT and statistical perspectives on causality

      sounds worth the slog, and we are lucky to have the leading example be in climate science

    2. distributions over the effect rather than values of the effect and(2) are defined with respect to random variables representing a cause rather than specific values

      Getting used to random-variables calculus

  11. Nov 2019
    1. We conclude that there is a need to more accurately quantify entrainment rates, improve the representation of plume radius, and incorporate the effects of column instability in future versions of 1D volcanic plume models.

      entrainment, radius, and stability dependence

    1. quasi-geostrophi

      Only incidence of this string in the book

    2. vertically pointing lidar.

      amazing new data source since it sees clear air (vapor)

    3. Morning convection is particularly strong in the Gulf of Panama because of its concave coastline

      Yes, but for gravity wave reasons not just a land breeze.

    4. abatic wi

      katabatic = downhill, anabatic = uphill

    5. Fractional cloud coverage tends to be highest around sunrise and lowest during the afternoon. The thinning (and in some cases the breakup) of the overcast during the daytime is due to the absorption of solar radiation just below the cloud tops (see Fig. 4.30)

      Diurnal timing of albedo affects the climatic timescale energy balance. Might it change with climate?

    6. evaporation of the drizzle drops in the subcloud layer absorbs latent heat. The thermodynamic impact of the downward, gravity-driven flux of liquid water is an upward transport of sensible heat, thereby stabilizing the layer near cloud base

      Stabilization by drizzle

    7. In contrast, cooling from above drives closed cell convection

      Albedo of the Earth depends scarily much on this delicate bistable (two-regime) solution for PBL-top clouds!

    8. Heating from below drives open cell convection

      Bottom up vs. top-down convection

    9. horizontal roll vortices

      a momentum instability leading to ALONG-SHEAR rolls -- not KH instability which makes billows ACROSS the shear.

    10. The area indicated by the hatched region represents the total amount of heat input into the bottom of the boundary layer from sunrise until time t1

      Conserved variable with height diagram, subject to a stability limit. Energy flux "fill the area" game.

    11. turbulent sensible heat flux FH across the mixed layer

      Notice the cooling effect at zi, since the upward flux is negative below, and then zero above.
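The sign pattern noted here (warming through the mixed layer, cooling concentrated at zi) follows from a heating rate proportional to minus the vertical derivative of the heat flux FH. A toy numpy check with an assumed piecewise-linear flux profile (all numbers invented for illustration; the rho*cp factor is omitted):

```python
import numpy as np

z = np.linspace(0.0, 2000.0, 201)   # height (m); boundary-layer top zi assumed at 1000 m
zi = 1000.0
F0, Fzi = 100.0, -20.0              # assumed surface and entrainment heat fluxes (W m^-2)

# Linear flux profile from F0 at the surface to Fzi at zi, zero above
F = np.where(z <= zi, F0 + (Fzi - F0) * z / zi, 0.0)

# Heating proportional to minus the flux divergence: uniform warming through the
# mixed layer, a sharp cooling spike where the negative flux jumps to zero at zi
heating = -np.gradient(F, z)
```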

    12. evolution during summer over land

      Classic diurnal cycle

    13. nonlocal air-parcel movement

      Nonlocal effects = convection (penetrative parcel motions). Diffusion idea (flux proportional to gradient) breaks down.

    14. Typical variation of wind speeds with height in the surface layer for different static stabilities

      Wind speed in stable and unstable boundary layers

    15. Karmam

      Karman

    16. logarithmic wind profile is consistent with a K-theory approach

      via a mixing length theory for K

    17. K must increase linearly with height

      K is proportional to eddy size (height above ground)
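These two notes fit together: with the log profile u(z) = (u*/k) ln(z/z0) and K = k u* z, the K-theory momentum flux K du/dz is exactly u*^2 at every height, i.e. a constant-flux surface layer. A quick numpy check (the u* and z0 values are assumed for illustration):

```python
import numpy as np

kappa = 0.4                     # von Karman constant
u_star, z0 = 0.3, 0.01          # friction velocity (m/s) and roughness length (m), assumed

z = np.logspace(-1, 2, 50)              # heights from 0.1 m to 100 m
u = (u_star / kappa) * np.log(z / z0)   # logarithmic wind profile
dudz = u_star / (kappa * z)             # its analytic shear
K = kappa * u_star * z                  # eddy diffusivity increasing linearly with height

print(np.allclose(K * dudz, u_star ** 2))  # True: constant momentum flux with height
```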

    18. logarithmic profile

      Log layer near the surface.

    19. The stable boundary layer near the ground consumes TKE, resulting in weak and sporadic turbulence there

      surface friction consumes it, stability just prevents vertical transports of the TKE downward to refresh the slowed winds

    20. Bowen ratio over the oceans decreases with increasing sea surface temperature. Typical values range from around 1.0–0.5 along the ice edge to less than 0.1 over the tropical oceans where latent heat fluxes dominate

      Sometimes I see "evaporative fraction" EF used. That is clearer in its meaning.
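The two measures are one-to-one: with Bowen ratio B = SH/LE, the evaporative fraction is EF = LE/(SH + LE) = 1/(1 + B). A one-liner check on the quoted values:

```python
def evaporative_fraction(bowen_ratio):
    """EF = LE / (SH + LE) = 1 / (1 + B), where B = SH / LE."""
    return 1.0 / (1.0 + bowen_ratio)

print(evaporative_fraction(1.0))  # 0.5: ice edge, fluxes split evenly
print(evaporative_fraction(0.1))  # ~0.91: tropical oceans, latent heat dominates
```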