232 Matching Annotations
  1. Last 7 days
  2. Nov 2024
  3. Aug 2024
    1. Inventories of species remain incomplete – mainly due to limited field sampling – to provide an accurate picture of the extent and distribution of all components of biodiversity (Purvis/Hector 2000, MEA 2003).

      for - open source, citizen science biodiversity projects - validation - open source, citizen science climate departure project - validation

      open source, citizen science biodiversity projects - validation - Inventories of species remain incomplete - mainly due to limited field sampling to provide an accurate picture of the extent and distribution of all components of biodiversity - Purvis/Hector 2000, MEA 2003

  4. Jul 2024
    1. Now Mr. Wells independently arrives at the recognition that Science with a capital S not only neglects the psychological problems in the world's disorder, but also carries in its train the dogmatism and uniformity upon which theological hate and persecution are founded.

      What besides work in behavioral economics has focused on the humanist side of the sciences as a means of helping humanity beyond the basic black and white?

      How to create a "religion of science" which helps to displace the psychological problems, theological hate, etc?

  5. Jun 2024
  6. May 2024
    1. Skeptics may hold that religious experience is an evolved feature of the human brain amenable to normal scientific study.

      Can religious experiences be made scientific? That which is beyond thought (and is wholly subjective)?


      See Steven Kotler referencing flow science as making the supernatural ("A gift from gods") into science.

  7. Dec 2023
  8. Oct 2023
  9. Sep 2023
  10. Aug 2023
  11. Jul 2023
    1. Since I think this is such an important issue for science, I have been working to create a system to do this which launched in the summer of 2022, called Octopus.ac with the backing of Research England. In a way, it’s pulling together the attempts to avoid publication bias and incentives for questionable research practices of initiatives such as the Journal for Negative Results, Registered Reports, or F1000, the faster sharing offered by preprint servers, and the breaking up of narrative formats championed by similar platforms such as ResearchEquals. A holistic approach, though, I think is important. Researchers need to know where the version of record is and how their work will be judged in order to know where to write for and how to write.

      [[Octopus.ac]] is a potential alternative to [[ResearchEquals]] for open publishing founded by [[Alexandra Freeman]]. Appears to be attempting to avoid the tension for academic journals to be both "informative" and "persuasive". Seems to fall within the general [[open science]] movement.

    1. In the case of ResearchEquals the author must pay if they want to have their work published using a more restrictive Creative Commons license. Octopus also employs Creative Commons licenses, but requires one which allows derivative works. The publication types in Octopus are based on the eight stages of scientific research: Research Problem, Rationale/Hypothesis, Method, Results, Analysis, Interpretation, Real World Application, Peer Review. For ResearchEquals there are many more publication types and they are called modules, thus enabling the publication of text, data, code and media. With both platforms, each publication is assigned its own DOI.

      Compares the difference between [[Octopus.ac]] and [[ResearchEquals]] platforms in the [[open science]] movement. Looks like Octopus is more strictly matching the [[eight stages of scientific research]], whereas RE allows for more options (including "publication of text, data, code and media.") Notably, each platform gives a [[DOI]] to each publication.
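
      Since both platforms mint a DOI per publication, one way to explore what metadata travels with such a DOI is content negotiation against doi.org. A minimal sketch in Python, using the requests library; the DOI used here is simply the Springer chapter cited under Apr 2023 below, standing in for an Octopus or ResearchEquals record:

      ```python
      # Fetch CSL-JSON metadata for a DOI via doi.org content negotiation.
      # The DOI below is only an example; swap in any Octopus/ResearchEquals DOI.
      import requests

      def fetch_doi_metadata(doi: str) -> dict:
          resp = requests.get(
              f"https://doi.org/{doi}",
              headers={"Accept": "application/vnd.citationstyles.csl+json"},
              timeout=30,
          )
          resp.raise_for_status()
          return resp.json()

      meta = fetch_doi_metadata("10.1007/978-3-319-00026-8_13")
      print(meta.get("title"), "|", meta.get("type"))
      ```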

      Questions:

      Does each module in RE get its own DOI?

      Likewise, does each publication type in Octopus get its own DOI?

      Do either of these address the concern of other academics "scooping" each other's work?

    1. [[Octopus.ac]] is a potential alternative to [[ResearchEquals]], but notably doesn't mention RE anywhere on its site (whereas RE does mention Octopus.ac in its FAQs). Specifically, its comparison is here.

      Each of these platforms seems to be related to the [[open science]] movement.

    1. See also:
      - Got 15 minutes and want to learn Git?
      - git + LaTeX workflow at StackOverflow
      - Writing the PhD thesis: the tools Part I
      - Collaborating with LaTeX and git at ShareLaTeX blog - a great and comprehensive tutorial
      - What are the advantages of using version control (git, CVS etc) in LaTeX documents - TeX.SE
      - https://tex.stackexchange.com/search?q=version+control

      Some links to resources on using [[LaTeX]] and [[Git]] suggested by [[Piotr Migdal]].

      This was part of his top-voted answer to "Why use [[version control system]]s for writing a paper?" on the [[Academia]] [[StackExchange]] site.

      Was looking into the tools available for [[open science]] collaborations.
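
      As a minimal sketch of the workflow those links describe (not taken from Migdal's answer): initialise a git repository for a LaTeX paper, ignore the build artifacts, and commit only the source. Python is used here just to script the steps; it assumes git is installed and configured with a user name and email.

      ```python
      # Sketch: put a LaTeX paper under git version control.
      # Assumes git is on PATH and user.name/user.email are configured.
      import subprocess
      from pathlib import Path

      paper = Path("my-paper")  # hypothetical project directory
      paper.mkdir(exist_ok=True)

      # Track only sources; ignore files LaTeX regenerates on every build.
      (paper / ".gitignore").write_text(
          "*.aux\n*.log\n*.bbl\n*.blg\n*.out\n*.toc\n*.synctex.gz\n*.pdf\n"
      )
      (paper / "main.tex").write_text(
          "\\documentclass{article}\n\\begin{document}\nDraft.\n\\end{document}\n"
      )

      for cmd in (["git", "init"],
                  ["git", "add", ".gitignore", "main.tex"],
                  ["git", "commit", "-m", "Initial draft"]):
          subprocess.run(cmd, cwd=paper, check=True)
      ```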

    1. People absolutely try. I can't name the journals that try these off the top of my head, but as you can see from that Wikipedia section, there are journals that:
      - Do double-blind peer review (authors don't know who the reviewers are, and vice versa)
      - Do triple-blind peer review (authors & editors & reviewers don't know who each other are)
      - Do open peer review (everyone knows who everyone else is)
      - Do open peer reports (reviews are published together with the paper)
      - Do open participation (reviewers self-select to review the paper)
      - Do post-publication peer review (every paper is published, reviews are done after publication)
      - Do results-blind peer review (reviewers receive a manuscript where the results & conclusions are omitted)
      - Do two-stage results-blind peer review (review done in two stages; in the first stage reviewers don't know the results/conclusions, in the second stage they do)
      - Do novelty-blind peer review (reviewers are specifically instructed not to comment on whether the paper is novel, only if it is correct)

      The fact that the traditional model has endured is a sign of how robust it is. Everyone knows it is flawed, but nobody has been able to come up with a better model. Answered Jan 5 at 7:58 by Allure.

      A response by [[Allure]] to an [[Academia]] [[StackExchange]] question about alternative publishing models for scientific experiments that help deal with the [[replication crisis]].

      In the comments, Allure suggests that journals that "Do results-blind peer review (reviewers receive a manuscript where the results & conclusions are omitted)" encourage publishing "non-significant results".

    1. How to protect scientific open research from being patented?

      A helpful question on [[Academia]] [[StackExchange]] about preventing [[open science]] research from being patented.

      The top answer from [[Gilles 'SO- stop being evil']] suggests simply publishing one's research protects it (since the disclosure counts as prior art).

      So even if someone else invents the same thing, your publishing it stops them from being able to file a patent on it (except in countries that have a grace period for inventors, like the US).

      The only remaining risk is that you (or a fellow inventor you worked with) take advantage of the grace period in countries that have one.

      Some research institutions (public or private) have a formal practice of defensive publications: publish potential inventions that they don't intend to patent as soon as possible, in order to block anyone else from patenting them. Technically, any publication is a public disclosure, including an arXiv preprint, a blog post, or even a research seminar if it's legally open to external visitors. However, since it's easier to fight a patent before it's granted, it is advantageous to make it easy for patent examiners to find the defensive publication.

      If you're concerned about someone filing a patent on something you discovered, or for that matter anything that you know about, you can watch patent applications. Patent applications are published for a period of at least a few months, during which time anyone can point the patent examiner to something that they consider to be prior art. Stack Exchange participates in this process through their Patents site where people can coordinate prior art searches.

  12. www.researchequals.com
    1. How is ResearchEquals different from Octopus? Isn't it the same? We get that question a lot! Octopus is indeed similar. The main differences are:
      - ResearchEquals allows you to link any steps together, Octopus has a specific order of events
      - ResearchEquals allows for a wide variety of steps (focus on provenance), Octopus has 8 specific ones (focused on empirical cycles mostly)

      It's a flavor difference mostly. Like onion rings and calamari.

      What [[ResearchEquals]] claims is different between its platform and that of [[Octopus.ac]].

    1. Learned about this because

      [[ResearchEquals]] is a Liberate Science GmbH project, which is funded by the [[Shuttleworth Foundation]] until the end of 2022.

  13. Jun 2023
    1. This analysis will result in the form of a new knowledge-based multilingual terminological resource which is designed in order to meet the FAIR principles for Open Science and will serve, in the future, as a prototype for the development of a new software for the simplified rewriting of international legal texts relating to human rights.

      Software to rewrite international legal texts relating to human rights: a well-written prompt and a few examples, including the FAIR principles, would let OpenAI's ChatGPT do this effectively.
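
      A minimal sketch of that suggestion, assuming the openai Python package and an API key; the model name, system prompt, and the single few-shot example (taken from the Universal Declaration of Human Rights) are illustrative choices, not anything prescribed by the annotated article:

      ```python
      # Sketch: simplified rewriting of a human-rights legal passage via a chat model,
      # steered by a system prompt plus one worked few-shot example.
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      few_shot = [
          {"role": "system",
           "content": "You rewrite international human-rights legal text in plain, "
                      "simplified language while preserving its meaning."},
          # One hypothetical worked example to steer the style.
          {"role": "user",
           "content": "Everyone has the right to seek and to enjoy in other countries "
                      "asylum from persecution."},
          {"role": "assistant",
           "content": "If you are persecuted, you may ask another country to protect you."},
      ]

      def simplify(passage: str, model: str = "gpt-4o-mini") -> str:  # model is an assumption
          resp = client.chat.completions.create(
              model=model,
              messages=few_shot + [{"role": "user", "content": passage}],
          )
          return resp.choices[0].message.content

      print(simplify("No one shall be subjected to arbitrary arrest, detention or exile."))
      ```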

  14. May 2023
  15. Apr 2023
    1. Information Creation as a Process

      Information (or knowledge) creation is a *continuous* process. Scientific publication could (maybe should) be continuously updated, as presented in the following book chapter:

      HELLER, Lambert, THE, Ronald and BARTLING, Sönke, 2014. Dynamic Publication Formats and Collaborative Authoring. In: BARTLING, Sönke and FRIESIKE, Sascha (eds.), Opening Science. Online. Springer International Publishing. pp. 191–211. [Accessed 11 January 2014]. ISBN 978-3-319-00025-1. Retrieved from: http://link.springer.com/chapter/10.1007/978-3-319-00026-8_13

  16. Dec 2022
  17. Nov 2022
    1. In an Open Science context,  “infrastructure” -- the "structures and facilities" -- refers to the scholarly communication resources and services, including software, that we depend upon to enable the scientific and scholarly community to collect, store, organise, access, share, and assess research.
    1. Creating video tutorials has been hard when things are so in flux. We've been reluctant to invest time - and especially volunteer time - in producing videos while our hybrid content and delivery strategy is still changing and developing. The past two years have been a time of experimentation and iteration. We're still prototyping!

      Have you thought about opening the project setting and the remixing to educators or even kids? That could create additional momentum.

      A few related resources you might want to check out for inspiration: Science Buddies, Seesaw, Exploratorium

  18. Oct 2022
  19. Sep 2022
    1. In 1990, 15.1 percent of the poor were residing in high-poverty neighborhoods. That figure dropped to 10.3 percent by 2000, rose to 13.6 percent for 2010, and then fell to 11.9 percent for 2015.

      Is there a long term correlation between these rates and political parties? Is there a potential lag time between the two if there is?

  20. Aug 2022
  21. Jul 2022
    1. It draws together data scientists, experimental and statistical methodologists, and open science activists into a project with both intellectual and policy dimensions.

      open science activists

    1. This perspective has been called an “emblematic worldview”; it is clearly visible in the iconography of medieval and Renaissance art, for example. Plants and animals are not merely specimens, as in modern science; they represent a huge raft of associated things and ideas.

      Medieval culture had imbued its perspective of the natural world with a variety of emblematic associations. Plants and animals were not simply specimens or organisms in the world but were emblematic representations of ideas which were also associated with them.

      example: peacock / pride

      Did this perspective draw from some of the older possibly pagan forms of orality and mnemonics? Or were the potential associations simply natural ones which (re-?)grew either historically or as the result of the use of the art of memory from antiquity?

  22. Jun 2022
    1. But systems of schooling and educational institutions–and much of online learning– are organized in ways that deny their voices matter. My role is to resist those systems and structures to reclaim the spaces of teaching and learning as voice affirming. Voice amplifying.

      Modeling annotation and note taking can allow students to see that their voices matter in conversation with the "greats" of knowledge. We can and should question authority. Even if one's internal voice questions as one reads, that might be enough, but modeling active reading and note taking can better underline and empower these modes of thought.

      There are certainly currents within American culture which hold that we can and should question authority.

      Sadly, some parts of conservative American culture are reverting to paternalistic power structures of "do as I say and not as I do", which leads to hypocrisy and the erosion of society.

      Education can be used as a means of overcoming this, though it requires preventing the conservative right from eroding it from the inside by removing books and certain modes of thought from the education process. Extreme examples of this are Warren Jeffs's control of religion, education, and social life within his Mormon sect.

      Link to: - Lawrence Principe examples of the power establishment in Western classical education being questioned. Aristotle wasn't always right. The entire history of Western science is about questioning the status quo. (How can we center this practice not only in science, but within the humanities?)


      My evolving definition of active reading now explicitly includes the ideas of annotating the text, having a direct written conversation with it, questioning it, and expanding upon it. I'm not sure whether I included some or all of these in it before. This is what "reading with a pen in hand" (or a digital annotation tool) should entail. What other pieces am I missing here which might also be included?

    1. Open Science

      Open science and citizen science are complementary; for citizen science, openness has to be discussed even more, for the benefit of participants.

  23. Apr 2022
  24. Mar 2022
  25. Jan 2022
  26. Dec 2021
    1. AIMOS. (2021, November 30). How can we connect #metascience to established #science fields? Find out at this afternoon’s session at #aimos2021 Remco Heesen @fallonmody Felipe Romeo will discuss. Come join us. #OpenScience #OpenData #reproducibility https://t.co/dEW2MkGNpx [Tweet]. @aimos_inc. https://twitter.com/aimos_inc/status/1465485732206850054

  27. Nov 2021
    1. it builds on the following key pillars: open scientific knowledge, open science infrastructures, science communication, open engagement of societal actors and open dialogue with other knowledge systems.

      Publishing papers in open access journals is clearly only a small part of the five key pillars: open scientific knowledge, open science infrastructures, science communication, open engagement of societal actors.

    2. Deploying appropriate monitoring and evaluation mechanisms

      First recommendation: create monitoring and evaluation instruments for implementing open science at the national level in each country. >> This will also rely on commercial services, and has already been shown to do so.

    1. "The Guide to Social Science Data Preparation and Archiving is aimed at those engaged in the cycle of research, from applying for a research grant, through the data collection phase, and ultimately to preparation of the data for deposit in a public archive: " from tweet

  28. Oct 2021
  29. Sep 2021
  30. Jul 2021
  31. Jun 2021
  32. May 2021
  33. Apr 2021
    1. Robson, S. G., Baum, M. A., Beaudry, J. L., Beitner, J., Brohmer, H., Chin, J., Jasko, K., Kouros, C., Laukkonen, R., Moreau, D., Searston, R. A., Slagter, H. A., Steffens, N. K., & Tangen, J. M. (2021). Nudging Open Science. PsyArXiv. https://doi.org/10.31234/osf.io/zn7vt

  34. Mar 2021
    1. ReconfigBehSci. (2020, November 5). In 4 days: SciBeh workshop ‘Building an online information environment for policy relevant science’ Join us! Topics: Crisis open science, interfacing to policy, online discourse, tools for research curation talks, panels, hackathons https://t.co/SPeD5BVgj3… I https://t.co/kQClhpHKx5 [Tweet]. @SciBeh. https://twitter.com/SciBeh/status/1324286406764744704

  35. Feb 2021
  36. Jan 2021
    1. Ways will be found to make communities sustainable,

      Ways will also be found to legibilize the deliberately inscrutable. With biomed funding so centralized, forces can be applied to increase the adoption of practices like data sharing and open science.

  37. Dec 2020
  38. Nov 2020
  39. Oct 2020
    1. The ideas here make me think that being able to publish on one's own site (and potentially syndicate) and send/receive webmentions may be a very useful tool within open science. We should move toward a model of academic samizdat where researchers can publish their own work for themselves and others. Doing this will give them the credit (and job prospects, etc.) while still allowing movement forward.
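
      A minimal sketch of the send side of that idea, assuming the requests library and placeholder URLs: discover the target page's Webmention endpoint from its HTTP Link header, then notify it that your post links to it. A fuller client would also check for a link rel="webmention" element in the HTML, which this sketch skips.

      ```python
      # Sketch: send a Webmention after publishing a post on one's own site.
      import requests

      source = "https://example.com/my-post-citing-a-paper"   # placeholder
      target = "https://example.org/the-cited-paper"          # placeholder

      resp = requests.get(target, timeout=30)
      endpoint = resp.links.get("webmention", {}).get("url")  # rel="webmention" Link header

      if endpoint:
          notify = requests.post(endpoint, data={"source": source, "target": target}, timeout=30)
          print("Webmention accepted" if notify.ok else f"Rejected: {notify.status_code}")
      else:
          print("No webmention endpoint advertised in the Link header")
      ```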

    1. High-level bodies such as the US National Academies of Sciences, Engineering, and Medicine and the European Commission have called for science to become more open and endorsed a set of data-management standards known as the FAIR (findable, accessible, interoperable and reusable) principles.
    1. The plan is to use the site to share surveys, interviews, and researcher notes.

      Note to self: I need to keep documenting examples of these open labs, open notebooks, etc. in the open science area.


      [also on boffosocko.com]

  40. Sep 2020
    1. Hennessy, E. A., Acabchuk, R., Arnold, P. A., Dunn, A. G., Foo, Y. Z., Johnson, B. T., Geange, S. R., Haddaway, N. R., Nakagawa, S., Mapanga, W., Mengersen, K., Page, M. J., Sánchez-Tójar, A., Welch, V., & McGuinness, L. A. (2020). Ensuring Prevention Science Research is Synthesis-Ready for Immediate and Lasting Scientific Impact [Preprint]. MetaArXiv. https://doi.org/10.31222/osf.io/ptg9j

  41. Aug 2020
  42. Jul 2020
  43. Jun 2020
  44. May 2020
  45. Apr 2020
    1. “Debugging is parallelizable”. Although debugging requires debuggers to communicate with some coordinating developer, it doesn't require significant coordination between debuggers. Thus it doesn't fall prey to the same quadratic complexity and management costs that make adding developers problematic.

      contrast this to physical manufacturing: Gereffi's typology of manufacturing Manufacturing today is rarely evolved to the modular stage for complex projects (such as code), and yet it proceeds across oceans, machinery, and---more frequently---across languages. Programming standardizes the languages of production while allowing the languages of collaboration to be multiple. These multiples are the parallel clusters around the world hacking away at their own thing. They are friends, they are scientists, they are entrepreneurs, they are all of the above.
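
      A back-of-the-envelope illustration of the quoted claim: coordination channels among n co-developers grow roughly quadratically (every pair may need to talk), while adding independent debuggers who each report only to a coordinator adds channels linearly. The numbers below are just examples.

      ```python
      # Quadratic coordination cost vs. linear cost of parallel debugging.
      def coordination_channels(n: int) -> int:
          return n * (n - 1) // 2   # every pair of co-developers may need to talk

      def debugger_channels(n: int) -> int:
          return n                  # each debugger talks only to the coordinator

      for n in (5, 50, 500):
          print(n, coordination_channels(n), debugger_channels(n))
      # 5 -> 10 vs 5; 50 -> 1225 vs 50; 500 -> 124750 vs 500
      ```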

  46. Feb 2020
    1. "We are at a time where some people doubt the validity of science," he says. "And if people feel that they are part of this great adventure that is science, I think they're more inclined to trust it. And that's really great."

      These citizen scientists in Finland helped identify a new type of "northern light". Basically, 2 people were able to take a shot of the same display at the same second, 60 miles apart, allowing for depth resolution.
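
      A rough sketch of why two simultaneous photos roughly 60 miles apart give depth: with a known baseline and the elevation angle of the same feature from each site, triangulation yields its height. The angles below are invented for illustration, and this flat-ground, single-vertical-plane model is only an approximation of a real photogrammetric analysis (which would use reference stars or precise camera orientations rather than hand-picked angles).

      ```python
      # Two-observer triangulation of a feature's height from a known baseline.
      import math

      baseline_km = 96.6          # ~60 miles between the two photographers
      alpha = math.radians(70.0)  # hypothetical elevation angle at site A
      beta = math.radians(55.0)   # hypothetical elevation angle at site B

      # Feature assumed between the two sites in the vertical plane through both:
      # h / tan(alpha) + h / tan(beta) = baseline
      height_km = baseline_km / (1 / math.tan(alpha) + 1 / math.tan(beta))
      print(f"Estimated height: {height_km:.0f} km")
      ```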

  47. Jan 2020
    1. Overall, we received 60 submissions for the Call for Poster Presentations. Among the high amount of excellent abstracts, the programme committee decided to accept 20 abstracts for poster presentations.

      Even a normal conference in the geo-sciences is more open than this "open science" conference. There is a limited amount of time for speakers, but why would anyone deny someone the possibility to present a poster and try to find an audience for their research? There is no scientific need for this gatekeeping.

  48. Dec 2019
    1. Four databases of citizen science and crowdsourcing projects —  SciStarter, the Citizen Science Association (CSA), CitSci.org, and the Woodrow Wilson International Center for Scholars (the Wilson Center Commons Lab) — are working on a common project metadata schema to support data sharing with the goal of maintaining accurate and up to date information about citizen science projects.  The federal government is joining this conversation with a cross-agency effort to promote citizen science and crowdsourcing as a tool to advance agency missions. Specifically, the White House Office of Science and Technology Policy (OSTP), in collaboration with the U.S. Federal Community of Practice for Citizen Science and Crowdsourcing (FCPCCS),is compiling an Open Innovation Toolkit containing resources for federal employees hoping to implement citizen science and crowdsourcing projects. Navigation through this toolkit will be facilitated in part through a system of metadata tags. In addition, the Open Innovation Toolkit will link to the Wilson Center’s database of federal citizen science and crowdsourcing projects.These groups became aware of their complementary efforts and the shared challenge of developing project metadata tags, which gave rise to the need of a workshop.  

      Sense Collective's Climate Tagger API and Pool Party Semantic Web plug-in are perfectly suited to support The Wilson Center's metadata schema project. Creating a common metadata schema that is used across multiple organizations working within the same domain, with similar (and overlapping) data and data types, is an essential step towards realizing collective intelligence. There is significant redundancy that consumes limited resources, as organizations often perform the same type of data structuring. Interoperability issues between organizations, their metadata semantics and serialization methods, prevent cumulative progress as a community. Sense Collective's MetaGrant program is working to provide a shared infrastructure for NGOs, social impact investment funds, and social impact bond programs to help rapidly address the problems being tackled by this awesome project of The Wilson Center. Now let's extend the coordinated metadata semantics to 1000 more organizations and incentivize the citizen science volunteers who make this possible, with a closer connection to the local benefits they produce through their efforts. With integration into social impact bond programs and public/private partnerships, we are able to incentivize collective action in ways that match the scope and scale of the problems we face.
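
      As a sketch of what a shared record under such a common schema might look like (the field names here are illustrative guesses, not the actual schema the workshop produced):

      ```python
      # Hypothetical common project-metadata record shared across citizen-science catalogs.
      import json

      project_record = {
          "name": "Community Aurora Watch",             # hypothetical project
          "description": "Volunteers photograph and report auroral displays.",
          "topics": ["geoscience", "atmosphere"],
          "sponsor": "Example University",
          "region": "Northern Hemisphere",
          "participation": {"mode": "online", "audience": "general public"},
          "contact": {"email": "info@example.org"},
          "source_catalog": "SciStarter",               # which database contributed the record
      }

      print(json.dumps(project_record, indent=2))
      ```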

  49. Sep 2019
    1. Google Translate translation into English:

      Freedom of information. The movement behind Open Science will soften the academic evaluation culture and pull researchers out of the clutches of journals. Interview with one of the movement's front figures, the "free-roaming paleontologist" Jon Tennant.

      All data is born free

      By RASMUS EGMONT FOSS

      More and more researchers are frustrated by the state of science in 2019. The publishers of academic journals have too much power over research, they say. Many experimental results cannot be reproduced. And they are tired of being measured and weighed with a wealth of numbers that quantify the fruits of their labor. In a revolt against the prevailing norms, a growing number of dissatisfied scientists are gathering in these years behind the Open Science movement. People are angry about many things: publishers' profit margins. The time it takes to publish in journals. The way they are evaluated. Open Science is a reaction to all that, a counter-movement that gathers the frustration into a big wave that no one quite knows what it stands for or where it is heading, says the British Jon Tennant, one of the leading proponents of the movement. Tennant has paused a promising career in paleontology and travels around the world as a "free agent" to spread the enthusiasm for an open science. In particular, he has made his mark as the founder of Open Science MOOC, an online community and educational platform in the field. He is currently visiting the University of Southern Denmark. The broad group of supporters ranges from those who simply want to make all academic articles freely available on the web, to those who would outright revolutionize the work of researchers. They strive to involve colleagues in every aspect of their work, for example by exchanging ideas, releasing early data, or crowdsourcing the editing process. Several organizations and scientists are joining the cause in these years. The movement is particularly characterized by initiatives such as Plan S, a project to release all government-funded research from 2021, which is supported by, among others, the European Commission. Also, foundations such as the Gates Foundation have promoted the ideas by forcing all grant recipients to share their data. Common to the followers is that they want to bring modern research closer to the real purpose of science, as they see it: to increase the knowledge base of society by working in groups rather than in silos. Several of them have now started pointing fingers at the universities' growing evaluation culture as the main obstacle to achieving that goal. It distorts researchers' motivation and creates an unhealthy environment, they say. The biggest problem today is how researchers are measured and who has control over that evaluation system, Jon Tennant believes. Researchers are measured more by where and how much they publish than by what they publish. That creates the wrong incentives. At the same time, the evaluation process itself is guided by the commercial interests of a narrow group of publishers who do not always share the researchers' interests. Today, researchers are not in control of the system, and that is a major problem, he elaborates.

      JON Tennant and the Open Science movement want to do away with what the German sociologist Steffen Mau has dubbed the "quantification culture" of science. Over the past few decades, many universities have begun to adapt their culture to live up to the rankings and scoring systems that give prestige in the field. In the researchers' everyday life, factors such as citation rates and h-indices (a measure of a researcher's influence) as well as the impact factors of journals, for example, have gained great importance for their career and reputation among colleagues. The voices behind Open Science want a new model. It must promote quality research and be responsible to the community rather than narrow interests. The first step is to expand access to academic articles. Researchers need to be able to build on everyone's work, and private publishers should not have power over the product, they say. According to advocates like Jon Tennant, we should also open up the entire scientific process by using the Internet better. The journals must still have a place in the system, but today their old-fashioned model stands in the way of communicating our research effectively. We are not taking advantage of the opportunities of network technology well enough, he says. From the moment a new idea arises until the method is developed, data is obtained, and the conclusions are available, everyone should be able to follow along and propose improvements, the invitation goes. For example, researchers should publish their plans for new projects before they begin collecting data (a so-called pre-registration) and should be encouraged to share their results before the article is published (a micro-publication). But as long as publishers such as Elsevier and Springer Nature have power over researchers' careers, researchers lack the incentive to collaborate openly and inspire each other, Jon Tennant believes. A more open and free process could also solve the reproducibility crisis in science by making studies more transparent. At the same time, it has the potential to prevent large amounts of wasted time, as researchers will be able to see other people's failed projects before starting their own. OPEN Science is part of a larger modern movement, which, according to the Israeli historian Yuval Noah Harari, is "the first since 1789 to invent a whole new value: freedom of information". It is the idea that data has the right to be free and that humans should not restrict its movements. This mindset is the philosophy behind projects like Wikipedia, Google, and open source in software programming. Based on that logic, the power must lie with the community and not with a narrow group of editors when the quality of researchers' work needs to be assessed (for it must, after all). We should not discard the peer review model, merely reform it, says Jon Tennant: We still need to evaluate the quality of research, but we should take advantage of the opportunities of online communities and networks. However, a new evaluation culture has its own pitfalls, and the biggest uncertainties about the Open Science agenda stem from here. Prestigious journals such as Nature and Science give scientists and lay people confidence that their articles are trustworthy. Everyone needs these kinds of pointers when navigating the academic world. At the same time, there is no guarantee that the quality of research will increase when the masses decide.
The risk of a democratic evaluation system is that it creates a new and more intense quantification cult, where research articles are instead measured by colleagues' ratings, as we know it from services like Uber and Tripadvisor. Competition for prestige is an inevitable part of any industry, and today's race will simply be replaced by a new one - on other terms. Here, other studies will lose the battle, probably those with a narrower appeal. The established institutions have an ambivalent relationship with the Open Science movement. Leaders at universities and publishers speak positively of it in closed forums, Jon Tennant says, but would rather hold on to their existing advantages as long as they can. They also hesitate because the consequences of the new regime are unpredictable. Everyone is afraid to be the first to move, he says.

    2. Original content in Danish:

      Informationsfrihed. Bevægelsen bag Open Science vil mildne den akademiske evalueringskultur og trække forskerne ud af tidsskrifternes kløer. Interview med en af bevægelsens frontfigurer, den «løsgående palæontolog» Jon Tennant.

      Alle data er født frie

      Af RASMUS EGMONT FOSS

      Flere og flere forskere er frustrerede over videnskabens tilstand anno 2019. Udgiverne af akademiske tidsskrifter har for stor magt over forskningen, siger de. Mange forsøgsresultater kan ikke reproduceres. Og de er trætte af at blive målt og vejet med et væld af tal, som kvantificerer frugten af deres arbejde. I et oprør mod de herskende normer samler et stigende antal utilfredse forskere sig i disse år bag bevægelsen Open Science. Folk er vrede over mange ting: Udgivernes profitmargener. Tiden, der tager at publicere i tidsskrifter. Måden, de bliver evalueret på. Open Science er en reaktion mod alt det, en modbevægelse, der samler frustrationen i en stor bølge, som ingen rigtigt ved, hvad står for, eller hvor bevæger sig hen, fortæller britiske Jon Tennant, en af de førende fortalere for bevægelsen. Tennant har sat en lovende karriere inden for palæonrologien på pause og rejser verden rundt som «løsgænger« for ar udbrede begejstringen for en åben videnskab. Han har især gjort sig bemærket som stifter af Open Science MOOC, et online fællesskab og uddannelsesplatform på området. I disse måneder er han på besøg på Syddansk Universitet. Den brede gruppe af støtter spænder fra dem, der blot ønsker ar gøre alle akademiske artikler frit tilgængelige på nettet, til dem, som ligefrem vil revolucionere forskernes arbejde. De stræber efter at indvie kolleger i alle aspekter af deres arbejde, for eksempel ved at udveksle ideer, frigive tidlige data eller crowdsource redigeringsprocessen. Adskillige organisationer og videnskabsfolk slutter sig til sagen i disse år. Bevægelsen er især kendetegnet ved iniciaciver som Plan S, et projekt om at frigive al statsfinansieret forskning fra 2021, der blandt andet størres af EU-Kommissionen. Også fonde som Gates Foundation har fremmet ideerne ved at tvinge alle støttemodtagere til at dele deres data. Fælles for tilhængerne er, at de vil bringe den moderne forskning tættere på videnskabens ækte formål, som de ser der: at forøge samfundets vidensbase ved at arbejde i flok frem for i siloer. Flere af dem er nu begyndt at pege fingre ad universiteternes voksende evalueringskultur som den vigtigste hindring til at nå det mål. Den forvrænger forskernes motivation og skaber er usundt miljø, siger de. Det største problem i dag er, hvordan forskere bliver målt, og hvem der har kontrollen over det evalueringssystem, mener Jon Tennant. Forskere bliver i højere grad målt på, hvor og hvor meget de publicerer, end hvad de udgiver. Det giver forkerte incitamencer. Samtidig er selve evalueringsprocessen styret af kommercielle interesser hos en snæver gruppe udgivere, som ikke altid deler forskernes interesser. I dag er forskerne ikke i kontrol over systemer, og det er et stort problem, uddyber han.

      JON Tennant og Open Science-bevægelsen vil gøre op med det, som den tyske sociolog Steffen Mau har døbt "kvantificeringskulturen i videnskaben. Over de seneste årtier er mange universiteter begyndt ar tilpasse deres kultur for at leve op til de ranglister og pointsystemer, som giver prestige på feltet. l forskernes hverdag har faktorer som cirationsrater og h-indeks (en målestok for en forskers indflydelse) samt tidsskrifternes impact factors for eksempel opnået stor betydning for deres karriere og anseelse blandt kolleger. Stemmerne bag Open Science ønsker en ny model. Den skal fremme kvalitetsforskning og være ansvarlig over for fællesskabet frem for snævre interesser. Første skridt er ar udbrede adgangen til akademiske artikler. Forskere skal kunne bygge videre på alles arbejde, og private udgivere bør ikke have magten over produktet, siger de. I følge talsmænd som Jon Tennant bør vi også åbne hele den videnskabelige proces op ved at bruge internettet bedre. Tidsskrifterne skal fortsat have en plads i systemet, men idag står deres gammeldags model i vejen for at kommunikere vores forskning effektivt. Vi udnytter slet ikke netværksteknologiens muligheder godt nok, siger han. Fra en ny ide opstår, til metoden udvikles, data indhentes, og konklusionerne foreligger, skal alle kunne følge med og foreslå forbedringer, lyder opfordringen. Forskere bør for eksempel publicere deres planer for nye projekter, inden de går i gang med at indsamle data (en såkalt førregistrering) , og de skal opfordres til at dele deres resultater, før artiklen udkommer (en mikroudgivelse). Men så længe udgivere som Elsevier og Springer Nature har magt over forskernes karrierer, mangler forskerne incitamentet ril at samarbejde åbent og inspirere hinanden, mener Jon Tennant. En mere åben og fri proces vil også kunne løse reproducerbarhedskrisen i videnskaben ved at gøre studier mere transparente. Samtidig har det potentialet til at forhindre store mængder tidsspilde, da forskere vil kunne se andres fejlslagne projekter, før de begynder deres eget. OPEN Science er del af en større moderne bevægelse, som ifølge den israelske historiker Yuval Noah Harari er "den første siden 1789, der har opfundet en helt ny værdiinformationsfrihed. Der er ideen om, ar data har ret til at være frit, og at mennesker ikke bør begrænse dets bevægelser. Tankesættet udgør fllosofien bag projekter som Wikipedia, Google og Open Source inden for softwareprogrammering. Ud fra den logik skal magten ligge hos fællesskabet og ikke en smal gruppe af redaktører, når kvaliteten af forskernes arbejde skal vurderes (for der skal den trods alt). Vi skal ikke kassere peer review-modellen, blot reformere den, siger Jon Tennant: Vi skal stadig evaluere kvaliteten af forskningen, men vi bør udnytte mulighederne i online fællesskab og netværk. En ny evalueringskultur har dog sine egne faldgruber, og de største usikkerheder ved agendaen i Open Science stammer herfra. Prestigefyldte tidsskrifter som Nature og Science giver forskere og lægfolk tillid til, at deres artikler er troværdige. Alle har brug for den slags pejlemærker, når de skal navigere i den akademiske verden. Der er samtidig ingen garanti for, at forskningens kvalitet stiger, når masserne bestemmer. Risikoen ved et demokratisk evalueringssysrem er, at det skaber en ny og mere intens kvantificeringskult, hvor forskningsarcikler istedet måles på kollegernes ratinger, som vi kender det fra tjenester som Uber og Tripadvisor. 
Konkurrencen om prestige er en uundgåelig del af enhver branche, og dagens ræs vil blot erstattes af et nyt – på andre præmisser. Her vil andre studier tabe kampen, formentlig dem med en smallere appel. De etablerede institutioner har et ambivalent forhold til Open Science-bevægelsen. Ledere hos universiteter og udgivere omtaler den positivt i lukkede fora, fortæller Jon Tennant, men vil helst holde fast i deres eksisterende fordele, så længe de kan. De tøver også, fordi konsekvenserne af det nye regime er uforudsigelige. Alle er bange for ar flytte sig som de første, siger han.

  50. Jul 2019
  51. Jun 2019
  52. Apr 2019
    1. A Vision for Scholarly Communication Currently, there is a strong push to address the apparent deficits of the scholarly communication system. Open Science has the potential to change the production and dissemination of scholarly knowledge for the better, but there is no commonly shared vision that describes the system that we want to create.

      A Vision for Scholarly Communication

  53. Mar 2019
    1. The main purpose of the Discovery IN is to provide interfaces and other user-facing services for data discovery across disciplines. We explore new and innovative ways of enabling discovery, including visualizations, recommender systems, semantics, content mining, annotation, and responsible metrics. We apply user involvement and participatory design to increase usability and usefulness of the solutions. We go beyond academia, involving users from all stakeholders of research data. We create FAIR and open infrastructures, following the FAIR principles complemented by the principles of open source, open data, and open content, thus enabling reuse of interfaces and user-facing services and continued innovation. Our main objectives are:
  54. Feb 2019
    1. Research methodologies and methods used must be open for full discussion and review by peers and stakeholders.

      So does this mean totally open? As in publish your protocols open?

    1. every individual has the means to decide how their knowledge is governed and managed to address their needs
    2. knowledge commons

      The idea of a "knowledge commons" was referenced in the book, "Campesino a Campesino: Voices from Latin America’s Farmer to Farmer Movement for Sustainable Agriculture" by Eric Holt-Giménez in the context of agroecological knowledge inherent in agrarian communities in Latin America.