- May 2022
I returned to another OER Learning Circle and wrote an ebook version of a Modern World History textbook. As I wrote it, I tested it out on my students. I taught them to use the annotation app Hypothesis and assigned them to highlight and comment on the chapters each week in preparation for class discussions. This had the dual benefits of engaging them with the content and indicating to me which parts of the text were working well and which needed improvement. Since I wasn't telling them what they had to highlight and respond to, I was able to see which elements caught students' attention and interest. And possibly more important, I was able to "mind the gaps" and rework parts that were too confusing or too boring to get the attention I thought they deserved.
This is an intriguing off-label use of Hypothes.is that edges into peer-review territory.
Dan is essentially using the idea of annotation as engagement within a textbook as a means of proactively improving it. He's mentioned it before in Hypothes.is Social (and Private) Annotation.
Because the author can see the gaps without readers necessarily being aware that they are "reviewing," this may be a far better method than explicitly soliciting reviews of materials.
Reviewers asked directly are probably less likely to mark the sections they don't find engaging. Has anyone done research in this space on improving texts? Certainly annotation provides a means of helping to do this.
- Apr 2022
A 2019 study published in the Proceedings of the National Academy of Sciences supports Wieman's hunch. Tracking the intellectual advancement of several hundred graduate students in the sciences over the course of four years, its authors found that the development of crucial skills such as generating hypotheses, designing experiments, and analyzing data was closely related to the students' engagement with their peers in the lab, and not to the guidance they received from their faculty mentors.
Skill development has been shown to be more closely linked to engagement with peers in social situations than to guidance from faculty mentors.
Cross reference: David F. Feldon et al., "Postdocs' Lab Engagement Predicts Trajectories of PhD Students' Skill Development," Proceedings of the National Academy of Sciences 116 (October 2019): 20910–16.
Are there areas where this is not the case? Are there areas where this is more the case than not?
Is it our evolution as social animals that has heightened this effect? How could this be shown? (Link this to prior note about social evolution.)
Is it the ability to scaffold out questions and answers and find their way by slowly building up experience with each other that facilitates this effect?
Could this effect be seen in annotating texts as well? If one's annotations become a conversation with the author, is there a learning benefit even when the author can't respond? By writing out one's understanding of a text, seeing where the gaps are, and then revisiting the text to fill them in, do we gain this same sort of peer engagement? How can we encourage students to ask questions of the author and/or themselves in the margins? How can we encourage them to think further about and explore these questions, and to answer them over time?
A key part of the solution is not just writing the annotations down in the first place, but keeping them, reviewing them, linking them together, and revisiting them — slowly providing answers and building solutions for oneself and, by writing them down, hopefully for others as well.
- Jan 2022
Fischer, O., Jeitziner, L., & Wulff, D. U. (2021). Affect in science communication: A data-driven analysis of TED talks. PsyArXiv. https://doi.org/10.31234/osf.io/28yc5
- TED talk
- general public
- science communication
- sentiment analysis
- scientific content
- social media
- affect valence