321 Matching Annotations
  1. Jul 2019
    1. How you track student progress can make a difference in their learning and your teaching.

      I will have to develop my own assessment strategies - formative and summative.

    1. Performance assessment does not have to be a time-consuming ordeal; it is a great way to assess our students' skills. It is essential to create a rubric that is simple, quick, and objective. This article discusses the process of creating a rubric as well as showing a rubric used by the author in her general music classroom for several years. Differences between assessment and evaluation are also mentioned.

      How to create a rubric for performance assessment?

    1. It is interesting to notice that this article from a decade ago doesn't even mention any online assessment. So much has changed since then! I'm glad to see that from measuring attendance and attitude we are moving toward a more professionally acceptable system where we can teach, assign and assess measurable knowledge in music ed, more specifically in choral programs.

    2. 11% for music knowledge

      Only 11% for knowledge! That is surprising, and the number could be higher if we measured not "talent" but knowledge that is teachable and factual. Again, this is old data (1991), so today the numbers might look different.

    3. attendance and attitude were the most common grading criteria employed by instrumental and choral music teachers.

      Yes. I noticed that in schools.

    4. Some music teachers believe the creative or interpretive nature of music precludes assessment but then readily employ subjective methods of assessment, many of which "are determined haphazardly, ritualistically, and/or with disregard for available objective information" (Boyle & Radocy, 1987, p. 2).

      This is old data (1987) but still true on some levels. What I see in practice now is that music educators have figured out what is measurable and what is not; in the school where I was student teaching, the choral program is treated as an academic subject and is graded.

    1. In prior work, we found that different student choices of learning methods (e.g., doing interactive activities, reading online text, or watching online lecture videos) are associated with learning outcomes [7]. More usage in general is associated with higher outcomes, but especially for doing activities, which has an estimated 6x greater impact on total quiz and final exam performance than reading or video watching.
  2. Apr 2019
    1. Trauma-Informed Screening and Assessment Tools

      Difference between trauma screening and trauma assessment tools: Screening tools are brief, used universally, and designed to detect exposure to traumatic events and symptoms. They help determine whether the child needs a professional, clinical, trauma-focused assessment. Functional assessments are more comprehensive and capture a range of specific information about the child’s symptoms, functioning, and support systems. A trauma assessment can determine strengths as well as clinical symptoms of traumatic stress. It assesses the severity of symptoms, and can determine the impact of trauma (how thoughts, emotions, and behaviors have been changed by trauma) on the child’s functioning in the various well-being domains.

  3. Mar 2019
    1. This article discusses how research on technology-rich classrooms is lacking in the research world. The authors created a scale for evaluating classroom environments, tested it, and determined it was a good starting framework for improving classroom environments. This scale could be useful later in class when evaluating technologies. Rating 9/10 for helpful assessment techniques.

    1. Vision: Preparing Learning Communities to succeed in College and Careers in a global society through technology. Vision and Goals

      This proposal outlines a draft technology plan for adult education in Arizona. It lays out the plan's goals and how Arizona can address them moving forward, identifies future technology trends, and acknowledges challenges that might come up later down the line. It also reviews teaching standards and instruction, as well as operations for the future. Rating 6/10 for being a draft, but with good ideas!

    1. This page is a free resource to download a book about how people learn. The selected chapter provides recommendations for assessments and feedback in learning environments in general, which also apply to adult learning. In addition to these examples, the chapter provides a section on theory and frameworks to better understand the overall topics. Rating: 10/10. Great free, open-source resource with reputable information about learning.

    1. personalized learning: how does it differ from traditional learning? Some of the text here is gray and also small, which does not make it easy to read. Nonetheless, it is an infographic about personalized learning from which a fair amount of information can be learned in a short time. rating 4/5

    1. classroom assessment techniques These are quick ways to complete formative assessment during a class session. The results can help the instructor determine what he or she should address, and they can unearth learner misconceptions. These were designed for college classrooms but can be used in other adult learning contexts. rating 4/5

    1. teachthought This particular page is entitled '20 simple assessment strategies you can use every day' but the reason it is listed here is because the page itself serves as a jumping off point for reviewing or learning about many educational theories and principles. This site may have been designed for K-12 teachers - I am not sure, but it is quite usable for those who teach adults. This is a place to come if you are interested in browsing - not if you have a specific thing you need at the moment. Rating 3/5

    1. This 69-page PDF offers good advice on writing a variety of types of test questions. It is called "Is this a trick question?" Despite its length, the PDF is easy to browse if you are interested in writing a specific type of question. As the length may suggest, this resource is more comprehensive than others. Rating 5/5

    1. NOT READY TO LET GO: A STUDY OF RESISTANCE TO GRADING CONTRACTS

      This article was included in the curriculum for the Open Pedagogy track led by Dave Cormier in the 2019 Digital Pedagogy Lab Toronto.

      In a 19 March 2019 Virtually Connecting session, Dave explained that he uses the contract in this article as a negative example — not to be adopted uncritically, but as a starting place to think about how one might generate a better assessment model.

  4. Nov 2018
    1. Students are entitled to a more educative and user-friendly assessment system. They deserve far more feedback -- and opportunities to use it -- as part of the local assessment process.

      Evaluate and reflect on your assessment systems. Do you have a system in the sense that it is longitudinal and recursive? How do you need to adjust your practices to ensure students get this feedback on their learning?

    2. Our job is to teach to the standards, not the test.

      What would it take to go an entire year thinking this way? What habits in planning do you need to address? How would your assignments change?

    3. Inside the Black Box: Raising Standards Through Classroom Assessment
    4. The point of assessment in education is to advance learning, not to merely audit absorption of facts.

      How do we need to change language among teachers and students to change perception? What kinds of practical habits can we adopt?

    1. Engaging Adult Learners with Technology

      This PDF from the Twin Cities Campus Library provides instruction on incorporating technology into teaching adult students.

      It includes a review of instructional technology, assessment for learning, a framework for teaching adult learners, and a workshop. This 14-page PDF provides the essentials for understanding the basic learning needs of adult learners.

      RATING: 3/5 (rating based upon a score system of 1 to 5, 1 = lowest, 5 = highest, in terms of content, veracity, ease of use, etc.)

  5. Jul 2018
    1. an institutional rather than a user focus

      This is key: the desire to use portfolios in institutional/program assessment practices is part of what has made them cumbersome. Programs whose portfolio use emphasized their value for students and learning have always been the best examples, in my opinion (e.g., Portland State, LaGuardia CC, Clemson), even if they also use them in institutional/program assessment.

    2. for many students owning their own domain and blog remains a better route to establishing a lifelong digital identity

      DoOO is definitely a great goal, especially if it is viewed in part as a portfolio activity, so people use their domains to build up a lifelong portfolio. What seems key is having the right supports in place to help people and institutions reach such goals not only technically, but most importantly, as a set of practices integrated into their personal and institutional workflows.

    1. Can explain concepts, principles, and processes by putting it in their own words, teaching it to others, justifying their answers, and showing their reasoning. • Can interpret by making sense of data, text, and experience through images, analogies, stories, and models. • Can apply by effectively using and adapting what they know in new and complex contexts. • Demonstrate perspective by seeing the big picture and recognizing different points of view. • Display empathy by perceiving sensitively and walking in someone else’s shoes. • Have self-knowledge by showing meta-cognitive awareness, using productive habits of mind, and reflecting on the meaning of the learning and experience.

      Awesome examples! Kind of reminds me of Bloom's taxonomy concept.

    1. "The idea is bananas, as far as I'm concerned," says Kelly Henderson, an English teacher at Newton South High School just outside Boston. "An art form, a form of expression being evaluated by an algorithm is patently ridiculous."
  6. Feb 2018
  7. Jan 2018
    1. There are no audits matching your search

      There are no audits matching your search for Dispensary. There are no audits matching your search for Cannabis. There are no audits matching your search for Marijuana. There are no audits matching your search for nutraceutical.

  8. Nov 2017
    1. The aim is to demonstrate the distance travelled on their journey in the form of tangible, trackable learning outcomes and applications.
    1. At the very least, the tool should allow for robust formative assessment, and should be capable of giving timely, helpful feedback to learners.

      The “at the very least” part makes it sound as though this were the easy part.

  9. Oct 2017
    1. The distinction between assessment and surveillance seems really blurry to me.

      on the fine line between assessment and surveillance

    1. key skills they then can apply to other situations beyond this specific course or assessment

      Collaborative annotation as a way to assess skills rather than content mastery. Or in addition to.

    1. By giving student data to the students themselves, and encouraging active reflection on the relationship between behavior and outcomes, colleges and universities can encourage students to take active responsibility for their education in a way that not only affects their chances of academic success, but also cultivates the kind of mindset that will increase their chances of success in life and career after graduation.
  10. Sep 2017
    1. University-wide 33–39% of faculty said that fewer than half of their undergraduates meet their expectations

      This could mean that students are lacking in info lit skills, or that a minority of faculty have unrealistic expectations

  11. Jun 2017
    1. This is a draft - so please annotate and make suggestions or express concerns as you see fit.

      Please be courteous and respectful.

  12. May 2017
    1. TPS Reflective Exercises

      TPS as metacognition - worth trying out. Would have to budget time for it. Could we combine it with something to capture data? Connect it to Qualtrics or Google Forms?

    1. Summary: I really like this source because it provides a more in-depth analysis of Fake News stories than my first article does. This source, just like the others I am including in my annotated bibliography, is educational. (I think going over this again is not imperative.)

      Assessment: Everything I highlighted in yellow is something I believe might be trickier to teach or talk about with students with disabilities. This does not mean those points are bad (they are actually great ideas to take in); I just have to think about how one can teach that information. What I highlighted in blue are tips from the author that I really appreciated and believe a lot of people do not think about. I think people who are already somewhat educated about the fact that Fake News is out there would like this source. I see people who actively share Fake News every day, and there is no way this source would get them to see that the news they follow is fake; they would just get really angry. That is why educating my students about Fake News is so important! This source also seems less biased because, in "Does the story attack a generic enemy?", it includes both the Liberal and Conservative sides. Being liberal myself, I have mostly been aware of Conservative Fake News that attacks liberals.

      Reflection: This source is a great addition for me because it gives me a more detailed lens through which to examine Fake News. It talks about points that rely on one's emotions as well as on the actual writing. It gets to points that are really important and go beyond the surface of a Fake News article.

  13. Apr 2017
    1. p. 12: Heintz 1987 is not in the bibliography. A search for the quote suggests it is the same as this: Heintz, Lisa. 1992. “Consequences of New Electronic Communications Technologies for Knowledge Transfer in Science: Policy Implications.” Washington, DC: Congress of the United States, Office of Technology Assessment (OTA) Contractor Report.

      I can't find a full text though. Presumably because it is a contractor report, it isn't in either of the OTA archives:

      http://www.princeton.edu/~ota/
      http://ota.fas.org/

  14. Feb 2017
    1. this kind of assessment

      Which assessment? Analytics aren't measures. We need to be more forthcoming with faculty about their role in measuring student learning. See, for example, http://www.sheeo.org/msc

    1. and reflect on its purposes — individually and as a class. 

      Metacognitive work on what the assessment is and how it works. Nice.

    2. Drier’s final grades are based on students’ written self-assessments, which, in turn, are based on their review of items in their portfolios. 

      Really appreciate modalities like this one where students are asked to show us what they've learned and to interact with the instructor and other students.

    3. Extrinsic motivation, which includes a desire to get better grades, is not only different from, but often undermines, intrinsic motivation, a desire to learn for its own sake (Kohn 1999a). 

      Focusing on grades as a / the measure of achievement also seems to undermine the kind of curiosity that is essential to authentic learning.

  15. Jan 2017
    1. No newspaper, no politician, no parent or school administrator should ever assume that a test score is a valid and meaningful indicator without looking carefully at the questions on that test to ascertain that they’re designed to measure something of importance and that they do so effectively.
  16. Sep 2016
    1. There is certainly value in assessing the quality of learning and teaching, but that doesn’t mean it’s always necessary, or even possible, to measure those things — that is, to turn them into numbers. Indeed, “measurable outcomes may be the least significant results of learning”

      Just because you need to measure learning doesn't mean you can.

  17. Jul 2016
    1. How has learning already been changed by the tracking we already do?

      Alfie Kohn probably has a lot to say about this. Already.

    2. ensure that students feel more thoroughly policed

      That ship has sailed.

    1. students were being made to take them several times a year, including “benchmark” tests to prepare them for the other tests.

      Testing has gone sentient. Resistance is futile. At least in the US.

  18. Jun 2016
    1. A further barrier to the use of formative feedback may be that some students increasingly fail to understand the taken-for-granted academic discourses which underpin assessment criteria and the language of feedback (Hounsell, 1987). According to Entwistle (1984, p. 1), ‘effective communication depends on shared assumptions, definitions, and understanding’. But a study at Lancaster University found that 50% of the third-year students in one academic department were unclear what the assessment criteria were (Baldwin, 1993, cited in Brown & Knight, 1994). As one of our students noted: ‘I haven’t got a clue what I’m assessed on’

      The extent to which students do not understand what they are being assessed on, even in higher years.

    1. Assessment and Classroom Learning

      Black, Paul, and Dylan Wiliam. 1998. “Assessment and Classroom Learning.” Assessment in Education: Principles, Policy & Practice 5 (1): 7–74.

      This is the original work in the area.

      Largely a literature review from 1988 through 1998.

  19. Apr 2016
    1. The process of peer review ensures the inviolability of these codes and, in this way, discourages innovative work. What does not conform to the code is deemed unacceptable.

      Jeff Rice: "Assessment loves the good guy."

  20. Mar 2016
    1. I told them you could work 60 hours a week, never take a holiday or weekend off, have internationally regarded publications – lots of them, write textbooks, be a great teacher, and managers will still ask for more. And more. I told them you are measured only by what you have not managed to achieve, not what you have achieved, never mind how valuable or prestigious.

      Unfortunately, this is how academics assess their students, too.

  21. Feb 2016
    1. Zoomerang

      Zoomerang: survey software option

    2. While the display is appealing and easy to read, it is not customizable

      Polldaddy: survey software selection. List of cons.

    3. Polldaddy for iOS was a great option for this type of assessment. The layout and design are optimized for the iPad’s screen, and survey results are loaded offline. Because an Internet connection is not required to administer a survey, the researcher has more flexibility in location of survey administration.

      Polldaddy did not require wireless internet access, making it a more flexible survey-software option

    4. Polldaddy software chosen for second iteration of survey offered at GSU for assessment.

    5. Google Forms

      Chosen survey-taking software for iPad surveys given to users at GSU.

    6. A two-question survey was designed to pilot the iPad as a survey delivery device in GSU Library. The first survey question was “Why did you come to the library today? Please choose the primary reason.” Ten response options were listed in alphabetical order, and survey takers were allowed to select one option. The tenth response option was “other,” with a text field in which survey takers could enter their own explanations. This question was included because the library is extremely busy, with an average daily door count of 10,000 during a typical semester. The door count data show heavy use of the building, but the library has little means of finding out what visitors do while they are in the buildings. The second survey question was “What is your major?,” which was an open-text field.

      Georgia State Library test-survey (two questions).

    7. Bhaskaran (2010) recently weighed in on the benefits of using iPads for survey research, but little has been written about the use of tablet computers as mobile assessment devices. What literature does exist primarily relates to the healthcare professions.
    8. Over the past few years, the market research literature has reflected a concern about the quality of face-to-face market research as compared to online surveys and polls. Manfreda, Bosnjak, Berzelak, Haas, and Vehovar (2008) analyzed other studies that compared response rates from Web-based surveys to response rates of at least one other survey delivery method. Web survey responses were, on average, eleven percent lower than the other methods investigated. In their study of face-to-face survey responses as compared to online survey responses, Heerwegh and Looseveldt (2008) concluded that responses to Web surveys were of poorer quality and, overall, less sufficient than responses to surveys conducted face-to-face.

      Face-to-face surveying produces higher response rates and better-quality responses than web-based surveys.

    1. the personal, social, and emotional transformations that adolescents and adults who are at risk experience as they develop resilience and shift from disengagement to engagement, and/or academic failure to success in schools.

      I think it's important to be able to identify the changes in attitude, relationships and moods that we can see when at-risk teenagers begin to be self-directed learners. If we could see what these changes look like and agree on them, then we might be able to assess students better. Currently none of the ways we move or prevent students from moving through school make sense to me: social promotion (advancement because of age), testing (usually of a small subset of math and reading skills), or even portfolio assessment (because at-risk students usually don't have a body of "mastery" level work).

    1. For GEDIVT: How do the arguments made here (and by Daniel Pink) align with or challenge our assumptions about grading and assessment?

  22. Nov 2015
  23. Jul 2015
    1. I have used the bibliographies to conduct my own research in the area of cataloging assessment, and the social justice bibliography has helped me with a project I’m working on to examine video classification practices.

      A lot of my research involves digital library/digital repository assessment, and the assessment literature in that area also relies heavily on quantitative measurements of assessment. I'm very interested in seeing the cataloging + social justice bibliography and if it can help my digital library assessment research.

  24. Jun 2015
    1. there is a powerful impact on growth and self awareness when students can see their own development in speaking, in writing, in thinking and problem solving.

      So it all comes back to self-directed learning again. As I've begun to think about this competency in our school, I've thought about how it might be intertwined with all other competencies. In plain language, this might mean that students are always pulling back and holding up a mirror to (or taking a snapshot of) their learning/journey.

    2. Using explicit criteria, the student develops the ability to look at her own work and determine the strengths and weaknesses evident in a particular performance or across a set of performances. She begins to set goals to address the areas she needs to develop and to deepen her areas of strength.

      The obvious paradox here is that the more "explicit" and digestible (student friendly) our criteria, the more a student can be independent in assessing her own work. That's a wonderful tension between top-down criteria and bottom-up assessment.

  25. May 2015
    1. “Making creates evidence of learning.” The thing you make—whether it be a robot, rocket, or blinking LED—is evidence that you did something, and there is also an entire process behind making that can be talked about and shared with others. How did you make it? Why? Where did you get the parts? Making is not just about explaining the technical process; it’s also about the communication about what you’ve done.

      This is an important notion, that making something is the beginning of having evidence of learning. AND that embodied in that object is the process and the learning that you went through, which needs to be given time and place to show.

    1. "What connections can I make between what I'm learning in one class with what rm learning in another?" ""What questions do I have about my learning?"

      Versions of these questions would be good for us to consider in our portfolio panels.

    2. making student development visible and accessible to the student, through video portfolios, written portfolios, and multi media collections of work

      What a powerful reason for asking students to keep and develop a portfolio: because we want you to see the progress you will make/are making, or at least see the changes and development of your work.

    3. The challenge for all of us engaged in the design of portfolio assessment is to assist our students to learn how to make their products more "interwoven and complete," weighing "the stress of every chord" to assure that the portfolio becomes an expression

      What a bracing shift it would be to ask students to consider their portfolios as something that is an expression "worthy of their time and effort." To treat the portfolio as another presentation of their work, for a real audience, and one that matters.

      How can we begin to give students experiences of this kind of presentation of self/work in small ways, not just at the end when a portfolio is due.

    4. the portfolio can be a structure to help an individual express meaning. But its quality depends upon what the individual does with it.

      This would suggest that a portfolio is a means of self-expression. Students should be encouraged to show who they truly are through a portfolio.

      So I was just looking at a folder of work that a seventh grader wants to use in her portfolio. She came to me asking me to "approve" of the work. "Is this good enough for my portfolio in Independent Reading?"

      It wasn't easy to get her to understand that I wasn't going to give approval or disapproval. Instead, I asked her in as many ways as I could think of to show me how the work shows us something important about her ability to "have conversations online" (as our competency states) about her reading. Or, more generally, I said, "Okay, so here are three responses to short stories that you have first drafts of. You do need to finish them, and as you do, think about what you want these to show about your unique, thoughtful ways of responding to literature."

      We have work to do. But Mary Diez's metaphor here reminds me of how important it is to return the power of the portfolio to the student. It's not my approval of the work that matters, it's the student's ability to recognize and articulate her own sense of why this work matters, how it shows something important about herself.

    1. the unit of analysis for assessment should be a system of activities and practices and take place on multiple time scales — from hours to months
    2. the scope of valued learning outcomes for informal learning activities should include social, emotional, and developmental outcomes as well as content knowledge and should include learning by groups and whole projects as well as by individuals
  26. Mar 2015
    1. We believe that it should be a system of activities and practices over time; these include the actions of individual learners as well as the roles of other participants, such as mediating tools, semiotic media, and local conditions directly relevant to and supportive of (or obstructing) the learning activities

      Well that sounds scalable. Geez?

    2. Conventionally, an (occasionally naïve) attribution of a valued condition to some specific cause (e.g., to an intervention). Rarely, however, are valued learning goals the outcome of discrete, identifiable causes.

      Getting at the idea that traditional outcomes based assessment is shallow.

    3. Assessment, evaluation and research all build on documentation but may require different modes and foci of documentation. In more traditional terms,

      An important distinction

    4. Know-that matters only insofar as it is mobilized as part of know-how; know-how (cultural capital) matters for career futures and social policy only when effectively combined with know-who (social capital).

      Know-that, know-how, and know-who. Interesting way to define knowledge, the latter two being based on capital. As if knowledge is something we build up to spend?

    5. Learning that matters is learning that lasts and that is mobilized across tasks and domains

      Again you see the idea that learning as action is the major goal.

    6. Second is the improved ability to act collaboratively, coordinating and completing tasks with others, assisting them, and productively using affective sensibilities in doing so

      Here group learning is put ahead of the individual. It goes back to the idea that it isn't learning if it isn't acted upon; acting in a group not only makes learning visible but is also the goal.

    7. First is the personal increase of comfort with, and capacity to participate in, activities that involve inquiry, investigation, and representation of phenomena in a widening range of domains.

      Interesting that comfort with the domain is listed before knowledge of the domain.

    8. emphasize the importance of taking into account in assessment design the incorporation of relevant knowledge about the history of the project, the community, and the participating organizations and knowledge of the current wider institutional contexts

      Interesting take on the importance of historical knowledge influencing assessment of informal spaces.

    9. “Know-who” is as important as know-how in getting things done.

      I am stealing this when discussing social search and networked learning spaces.

    10. As an aspect of human development—at the individual, group, or organizational level—the learning that matters is learning that is used.

      So this line here reveals a lot about the theoretical underpinnings of the authors. Then again so did their names.

    11. that simple declarative knowledge is only one valued outcome of learning and is too often overemphasized in assessment designs to the exclusion or marginalization of other equally or more important outcomes.

      This is often the case when we think in terms of practicality, efficiency, fidelity, and reliability.

    12. Informal learning experiences, in contrast, build on the diverse interests and curiosity of learners and support their self- motivated inquiries.

      Contrasting with formal education. I feel that formal education can sometimes be vilified in the literature as being void of intentional learning.

      That just isn't true. Many students have complex reasons for wanting to succeed or not in school.

    13. Second, to offer program staffs, project funders, and other supporters recommendations of good practices in project assessment and identifiable needs for developing improved assessment techniques.

      So, more future-looking. What do we have to develop?

    14. first, to offer to those who design and assess informal learning programs a model of good assessment practice, a tool kit of methods and approaches, and pointers to the relevant literature

      Point away. This fits well with efforts in Mozilla Learning to try to develop friction-free assessment.

    15. to reviewing the literature, the authors convened three expert meetings involving a total of 25 participants to discuss key issues, identify successful approaches and outstanding challenges,

      This is a very interesting methodology to add to the traditional literature review.

    16. many significant learning outcomes may be unpredictable in advance of the learner’s participation

      And this basically sums up what makes assessment of informal learning so difficult.

    17. learning goals pursued by participants are generally open-ended, dependent in part on available resources and on repurposed ways to use those resources

      I like this idea of repurposing resources as a way to reach open-ended goals, though sometimes informal learning spaces are joined for goals unrelated to learning or for very specific, closed-ended outcomes.

    18. “Informal learning” is both a broad category and shorthand for a more complex combination of organized activities in face- to-face or online settings other than formal schools in which particular features are especially salient.

      Key definition of how the paper defines informal learning.

    19. what works in informal learning and what doesn’t

      We also have to define success before we can start to measure it.

    20. knowledge base of science learning in informal environments (Bell et al. 2009

      I need to go and read this.

  27. Feb 2015
    1. identifying one primary audience

      RK&A says doomed for failure if try to focus on too many communities at once. Choose one primary audience (probably the highest need), and build intended impact/programming/marketing/etc. around them.

    2. Faculty members have the capacity to create their own meaning and feel comfortable and enjoy making sense of the thematic arrangement of objects. Students, on the other hand, usually sought out interpretive devices, like text labels, for an explanation of how they were “supposed to feel.” Further, students did not perceive the thematic organization of the works of art as the interpretive device, whereas faculty did. While most faculty used the text labels to reinforce their own thinking and reassure them they were on the right track, most students used the text labels as their entry point to understanding and experiencing the exhibition.
    3. Faculty and students had distinctly different experiences
    4. RK&A believes that successful exhibitions result when staff acknowledges the perspectives, perceptions, and knowledge of their target audience.

      This should apply to all types of exhibitions.

    1. There are two essential building blocks to creating a successful project: 1) selecting an audience; and 2) articulating clear outcomes.
    1. we promote the lifecycle approach to exhibition evaluation because we know that when evaluation procedures are built into an exhibition’s lifecycle

      RK&A types of eval: front-end evaluation (after concept-development but before design develops), formative evaluation (evaluating prototypes), remedial evaluation (post-installation/troubleshooting), summative evaluation (at end of everything)

    1. The presence of volunteers further added to visitors’ experiences as volunteers’ presence encouraged visitors to ask questions, look for hints when appropriate, and learn about the mathematical principles behind the exhibits.

      Value of volunteers/peers as POCs in the space. Works for dedicated galleries/museums, but not so sure for libraries. Spatial differences - people are not coming to the library as a "destination." Help text, hands-on technology, and automated feedback channels are possibly more useful in our case.

    2. inviting and amusing nature of the exhibition, designed to be interactive, entertaining, provocative, and challenging all at the same time

      Challenging/less "popular" topics (i.e. math in this case) benefit from simpler exhibitions that still address the challenge, but make it interactive/entertaining

    3. follow-up telephone interviews with visitors one month after each festival date in order to identify how visitors’ ideas about math change over time.

      follow-up interviews to measure impact/changing perceptions over time

    4. onsite interviews with visitors
    5. RK&A explored the ways in which science festival visitors used the MM2GO exhibition and how the exhibition affected visitors’ ideas and attitudes about mathematics.
  28. Jan 2015
    1. Conventional wisdom has it that practice makes perfect and that expertise is the natural outcome of years of practice. But few people become good writers, no matter how much they write.

      Tough reality.

    2. But as a society, we are not concerned with novices. Eventually they will quit being novices, without our having to do anything about it. The important question is what they will become. Will they become experts in their lines of work or will they swell the ranks of incompetent or mediocre functionaries?

      A common thread with Gee in The Anti-Education Era: an open disdain for incompetence.