219 Matching Annotations
  1. Last 7 days
    1. Research suggests that COP28 could coincide with the peak of greenhouse gas emissions. To meet the 1.5°C target, however, emissions would have to fall by half by 2030. https://www.theguardian.com/environment/ng-interactive/2023/nov/29/cop28-what-could-climate-conference-achieve

  2. Nov 2023
    1. The expansion plans of the coal-, oil-, and gas-producing states would lead to 460% more coal, 83% more gas, and 29% more oil production in 2030 than is compatible with the Paris Agreement. The United Nations' current Production Gap Report focuses on the 20 biggest polluting states, whose plans are almost uniformly in radical contradiction with the Paris Agreement. https://www.theguardian.com/environment/2023/nov/08/insanity-petrostates-planning-huge-expansion-of-fossil-fuels-says-un-report

      Report: https://productiongap.org/

  3. Sep 2023
    1. “Given the high unemployment rate in South Africa as well … you cannot sell it as a climate change intervention,” says Deborah Ramalope, head of climate policy analysis at the policy institute Climate Analytics in Berlin. “You really need to sell it as a socioeconomic intervention.”
      • for: quote, quote - climate change intervention, Trojan horse, Deborah Ramalope
      • quote
        • Given the high unemployment rate in South Africa as well … you cannot sell it as a climate change intervention, you really need to sell it as a socioeconomic intervention.
      • author: Deborah Ramalope
      • date: Aug. 15, 2023
      • source: https://www.wired.co.uk/article/just-energy-transition-partnerships-south-africa-cop
      • comment
        • A Trojan horse strategy
  4. Jul 2023
    1. Fundamentals for crypto: Token Terminal is a platform that aggregates financial data on the leading blockchains and decentralized applications.

  5. Jun 2023
    1. Debug mode allows you to see only the data generated by your device while validating analytics; it also removes the need for separate data streams for staging and production.

      good to know.

      Seems to contradict their advice on https://www.optimizesmart.com/using-the-ga4-test-property/ to create a test property...

    1. Will not read or write first-party [analytics cookies]. Cookieless pings will be sent to Google Analytics for basic measurement and modeling purposes.
  6. Feb 2023
    1. student outcomes, including learning, persistence, or attitudes.

      I would think that this would be one of the easiest things to measure and also would provide significant and useful data. We should check in with Brian (?) to see what data is currently being tracked.

  7. Jan 2023
    1. 3.1 Guest Lecture: Lauren Klein » Q&A on "What is Feminist Data Science?" https://www.complexityexplorer.org/courses/162-foundations-applications-of-humanities-analytics/segments/15631


      Theories of Power

      Patricia Hill Collins' matrix of domination - no hierarchy, thus the matrix format

      What are other broad theories of power? are there schools?

      Relationship to Mary Parker Follett's work?

      Bright, Liam Kofi, Daniel Malinsky, and Morgan Thompson. “Causally Interpreting Intersectionality Theory.” Philosophy of Science 83, no. 1 (January 2016): 60–81. https://doi.org/10.1086/684173.

      about Bayesian modeling for intersectionality

      Where is Foucault in all this? Klein may have references, as I've not got the context.

      How do words index action? —Lauren Klein

      The power to shape discourse and choose words - relationship to soft power - linguistic memes

      Colored Conventions Project

      20:15 Word embeddings as a method within her research

      General result (outside of the proximal research) discussed: women are more likely to change language... references for this?

      [[academic research skills]]: It's important to be aware of the current discussions within one's field. (LK)

      36:36 quantitative imperialism is not the goal of humanities analytics, lived experiences are incredibly important as well. (DK)

    1. https://www.complexityexplorer.org/courses/162-foundations-applications-of-humanities-analytics/segments/15625


      Looking at three broad ideas, with examples of each to follow:
      • signals
      • patterns
      • pattern making, pattern breaking

      Proceedings of the Old Bailey, 1674-1913

      Jane Kent for witchcraft

      250 years with ~200,000 trial transcripts

      Can be viewed as:
      • storytelling
      • history
      • information process of signals

      All the best trials include the words "Covent Garden".

      Example: 1163. Emma Smith and Corfe indictment for stealing.

      19:45 Norbert Elias. The Civilizing Process. (book)

      Prozhito: large-scale archive of Russian (and Soviet) diaries; 1900s - 2000s

      How do people understand the act of diary-writing?

      Diaries are:

      Leo Tolstoy

      a convenient way to evaluate the self

      Franz Kafka

      a means to see, with reassuring clarity [...] the changes which you constantly suffer.

      Virginia Woolf

      a kindly blankfaced old confidante

      Diary entries in five categories:
      • spirit
      • routine
      • literary
      • material form (talking about the diary itself)
      • interpersonal (people sharing diaries)

      Are there specific periods in which these emerge or how do they fluctuate? How would these change between and over cultures?

      The patterns of talking about diaries in this study are relatively stable over the century.

      pre-print available of DeDeo's work here

      Pattern making, pattern breaking

      Individuals, institutions, and innovation in the debates of the French Revolution

      • transcripts of debates in the constituent assembly

      the idea of revolution through tedium and boredom is fascinating.

      speeches broken into combinations of patterns using topic modeling

      (what would this look like on commonplace book and zettelkasten corpora?)

      emergent patterns from one speech to the next (information theory); the question of novelty - high novelty versus low novelty as predictors of leaders and followers

      Robespierre bringing in novel ideas

      How do you differentiate Robespierre versus a Muppet (like Animal)? What is the level of following after novelty?

      Four parts (2x2 grid):
      • high novelty, high imitation (novelty with ideas that stick)
      • high novelty, low imitation (new ideas ignored)
      • low novelty, high imitation
      • low novelty, low imitation (discussion killers)

      Could one analyze television scripts over time to determine the good/bad, when they'll "jump the shark"?
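      The information-theoretic novelty measure mentioned above can be sketched as a KL divergence between consecutive speeches' topic mixtures — a rough illustration of the idea, not necessarily the study's exact formulation, with made-up numbers:

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) in nats; eps guards against zero topic weights."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Topic mixtures (from a topic model) for two consecutive speeches.
previous_speech = [0.50, 0.30, 0.20]
current_speech = [0.10, 0.30, 0.60]

# High divergence = the speech departs from what came before (novelty);
# the same comparison run forward in time captures imitation/resonance.
novelty = kl_divergence(current_speech, previous_speech)
print(round(novelty, 3))
```

      Plotting each speaker on the novelty/imitation axes is what yields the 2x2 grid of leaders, ignored innovators, and discussion killers.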

  8. Jul 2022
  9. May 2022
    1. facilitate access to data from the MES systems, particularly with regard to the success of targeted groups of students (for example, students with disabilities, Indigenous students, students from immigrant backgrounds, first-generation students, international students)
  10. Apr 2022
    1. Adam Kucharski. (2021, February 6). COVID outlasts another dashboard... Https://t.co/S9kLCva3WQ Illustrates the importance of incentivising sustainable outbreak analytics—If a tool is useful, people will come to rely on it, which creates a dilemma if it can’t be maintained. [Tweet]. @AdamJKucharski. https://twitter.com/AdamJKucharski/status/1357970753199763457

  11. Feb 2022
    1. Contemporary digital learning technologies generate, store, and share terabytes of learner data—which must flow seamlessly and securely across systems. To enable interoperability and ensure systems can perform at-scale, the ADL Initiative is developing the Data and Training Analytics Simulated Input Modeler (DATASIM), a tool for producing simulated learner data that can mimic millions of diverse user interactions. DATASIM is an open-source platform for generating realistic Experience Application Programming Interface (xAPI) data at very large scale. The xAPI statements model realistic behaviors for a cohort of simulated learner/users, producing tailorable streams of data that can be used to benchmark and stress-test systems. DATASIM requires no specialized hardware, and it includes a user-friendly graphical interface that allows precise control over the simulation parameters and learner attributes.
    1. The video profile of the xAPI was created to identify and standardize the common types of interactions that can be tracked in any video player.
  12. Jan 2022
    1. xAPI Wrapper Tutorial: Introduction. This tutorial will demonstrate how to integrate xAPI Wrapper with existing content to capture and dispatch learning records to an LRS.

      roll your own JSON rather than using a service like xapi.ly

    1. Storyline 360 xAPI Updates (Winter 2021): Exciting xAPI update for Storyline users! Articulate has updated Storyline 360 to support custom xAPI statements alongside a few other xAPI-related updates. (These changes will likely come to Storyline 3 soon, though not as of November 30, 2021.)
    1. Making xAPI Easier Use the xapi.ly® Statement Builder to get more and better xAPI data from elearning created in common authoring platforms. xapi.ly helps you create the JavaScript triggers to send a wide variety of rich xAPI statements to the Learning Record Store (LRS) of your choice.

      criteria for use and pricing listed on site

    1. Here you will find a well curated list of activities, activity types, attachments types, extensions, and verbs. You can also add to the registry and we will give you a permanently resolvable URL - one less thing you have to worry about. The registry is a community resource, so that we can build together towards a working Tin Can data ecosystem.

      A participant in the Spring 2022 xAPI cohort suggested that 'Registry is not maintained, and they generally suggest using the Vocab Server (which is also the data source for components in the Profile Server).'

  13. www.json.org
    1. JSON (JavaScript Object Notation) is a lightweight data-interchange format. It is easy for humans to read and write. It is easy for machines to parse and generate. It is based on a subset of the JavaScript Programming Language Standard ECMA-262 3rd Edition - December 1999. JSON is a text format that is completely language independent but uses conventions that are familiar to programmers of the C-family of languages, including C, C++, C#, Java, JavaScript, Perl, Python, and many others. These properties make JSON an ideal data-interchange language.
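      The parse/generate round trip described above is one line in most languages; a quick illustration in Python:

```python
import json

text = '{"format": "JSON", "lightweight": true, "languages": ["C", "Python"]}'

data = json.loads(text)            # parse: text -> native data structures
data["languages"].append("Java")   # manipulate as ordinary objects
regenerated = json.dumps(data)     # generate: data structures -> text

print(regenerated)
```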
    1. The xAPI Vocabulary and Profile Server is a curated list of xAPI vocabulary concepts and profiles maintained by the xAPI community.
    1. xAPI Foundations Leverage xAPI to develop more comprehensive learning experiences. This on-demand e-learning course is available online immediately after purchase. Within the course, you will have the opportunity to personalize your learning by viewing videos, interacting with content, hearing from experts, and planning for your future. You will have access to the course(s) for 12 months from your registration date.
    1. Learning program analytics seek to understand how an overall learning program is performing. A learning program typically encompasses many learners and many learning experiences (although it could easily contain just a few).
    2. Learning experience analytics seek to understand more about a specific learning activity. 
    3. Learner analytics seek to understand more about a specific person or group of people engaged in activities where learning is one of the outputs.
    4. There are many types of learning analytics and things you can measure and analyze. We segment these analytics into three categories: learning experience analytics, learner analytics, and learning program analytics.
    5. Learning analytics is the measurement, collection, analysis, and reporting of data about learners, learning experiences, and learning programs for purposes of understanding and optimizing learning and its impact on an organization’s performance.
    1. Social learning This is a feature the LXP has really expanded. Although some of the more advanced LMSs boast social features, the Learning Experience Platform is better formatted for them and far more likely to provide them.  Firstly, the LXP caters for a broader range of learning options than the LMS. It’s usually not difficult to use your LXP to set up an online class or webinar.  LXPs also provide a chance for learners to share their opinions on content: liking, sharing, or commenting on an article or online class. Users can follow and interact with others, above or below them in the organisation. Sometimes LXPs even provide people curation, matching learners and mentors.  Users also have a chance to make the LXP their own by setting up a personalised profile page. It might seem low-priority, but a sense of ownership usually corresponds with a boost in engagement.  As well prepared as Learning & Development leaders are, there’ll be things that people doing a job every day will know that you won’t. They can use their personal experience to recommend or create learning content in an LXP. This helps on-the-job learning and gives employees a greater chance of picking up the skills they need to progress in their role.
    1. How, exactly, can we design for engagement and conversation? In comparison to content-focused educational technology such as the Learning Management System (LMS), our (not so secret) recipe is this:
      1. Eliminate the noise
      2. Bring people into the same room
      3. Make conversation easy and meaningful
      4. Create modularity and flexibility

      Spring 2022 #xAPICohort resource

    1. To learn more, there are two books I highly recommend. "Digital Body Language," by Steve Woods, and "Big Data: Does Size Matter?" by Timandra Harkness. If you would like a deeper dive into data-driven learning design, there's a free e-book and toolkit you can download from my blog. You can also reach me there at loriniles.com. Remember, start with the data you have readily available. Data does not have to be intimidating Excel spreadsheets. Be prepared with data every single time you meet with your stakeholders. And before you design any strategy, ask what data you have to support every decision. You're on an exciting journey to becoming a more well-rounded HR leader. Get started, and good luck.

      Spring 2022 #xAPICohort resource

    1. LXPs and LMSs accomplish two different objectives. An LMS enables administrators to manage learning, while an LXP enables learners to explore learning. Organizations may have an LXP, an LMS or both. If they have both, they may use the LXP as the delivery platform and the LMS to handle the administrative work.

      Spring 2022 #xAPICohort resource

    2. 4. Highly intuitive interfaces

      Spring 2022 #xAPICohort resource

    3. 3. Supports various types of learning

      Spring 2022 #xAPICohort resource

    4. 2. Rich learning experience through deeper personalization

      Spring 2022 #xAPICohort resource

    5. Here are some other characteristics that set LXPs apart from LMS’s: 1. Extensive integration capabilities

      Spring 2022 #xAPICohort resource

    6. The gradual shift from one-time payment to cloud-based, subscription-based business has led learning platforms to also offer Software-as-a-Service (SaaS) models to their clients. As such content becomes part of digital learning networks, they are integrated into commercial learning solutions and then become part of broader LXPs. Looking back at all these developments, from how new data consumption platforms evolved, to the emergence of newer content development approaches and publishing channels, it’s easy to understand why LXPs naturally evolved as a result of DXPs.

      Spring 2022 #xAPICohort resource

    7. The growth of social learning has also created multiple learning opportunities for people to share their knowledge and expertise. As they socialize on these platforms (Facebook, LinkedIn, YouTube, Instagram and many others), individuals and groups learn from each other through various types of social interactions – sharing content, exchanging mutually-liked links to external content. LXPs leverage similar approaches in corporate learning environments, and scale learning experiences and opportunities with such user-generated content as found in social and community-based learning.

      Spring 2022 #xAPICohort resource

    8. Integrations are also possible with AI. If you integrate LXP and your Human Resource Management (HRM) system, the corporate intranet, your Learning Record Store (LRS) or the enterprise Customer Relationship Management (CRM) system, and collect the data from all of them, you can identify many different trends and patterns. And based on those patterns, all stakeholders can make informed training and learning decisions. Standard LMS’s cannot do any of that. And though LMS developers are trying to get there, they’ve still got a long way to go to bridge the functionality gap with LXPs. As a result, there was an even greater impetus to the emergence of LXPs.

      Spring 2022 #xAPICohort resource

    9. Another driver for the emergence of LXP’s is the standards adopted by modern-day LMS’s – which are SCORM-based. While SCORM does “get results”, it is limited in what it can do. One of the main goals of any corporate learning platform is to connect learning with on-the-job performance. And SCORM makes it very difficult to decide how effective the courses really are, or how learners benefit from these courses. Experience API (xAPI) on the other hand – the standard embraced by LXPs – offers significantly enhanced capabilities to the platform. When you use xAPI, you can follow different parameters both while you learn and perform on the job tasks. And, what’s even better is that you can do that on a variety of digital devices.

      Spring 2022 #xAPICohort resource

    10. LMS’s primarily served as a centralized catalog of corporate digital learning assets. Users of those platforms often found it hard to navigate through vast amounts of content to find an appropriate piece of learning. LMS providers sought to bridge that gap by introducing smart searches and innovative querying features – but that didn’t entirely address the core challenge: LMS’s were still like huge libraries where you should only go to when you have an idea of what you need, and then spend inordinate amounts of time searching for what you specifically want!

      Spring 2022 #xAPICohort resource

    1. Experience API (xAPI) is a tool for gaining insight into how learners are using, navigating, consuming, and completing learning activities. In this course, Anthony Altieri provides an in-depth look at using xAPI for learning projects, including practical examples that show xAPI in action.

      Spring 2022 #xAPICohort resource

    1. The xAPI Learning Cohort is a free, vendor-neutral, 12-week learning-by-doing project-based team learning experience about the Experience API. (Yep, you read that right – free!) It’s an opportunity for those who are brand new to xAPI and those who are looking to experiment with it to learn from each other and from the work itself.

      Spring 2022 #xAPICohort resource

    1. If your current course development tools don't create the activity statements you need, keep in mind that sending xAPI statements requires only simple JavaScript, so many developers are coding their own form of statements from scratch.

      Spring 2022 #xAPICohort resource

    2. An xAPI activity statement records experiences in an "I did this" format. The format specifies the actor, verb, object: the actor (who did it), a verb (what was done), a direct object (what it was done to) and a variety of contextual data, including score, rating, language, and almost anything else you want to track. Some learning experiences are tracked with a single activity statement. In other instances, dozens, if not hundreds, of activity statements can be generated during the course of a learning experience. Activity statements are up to the instructional designer and are driven by the need for granularity in reporting.

      Spring 2022 #xAPICohort resource
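      That actor–verb–object structure is just JSON; a minimal statement might look like the sketch below (the activity ID and email address are hypothetical; the verb URI follows the ADL verb style):

```python
import json

# "I did this": actor (who did it), verb (what was done), object (what it was
# done to), plus optional result/context fields for score, rating, etc.
statement = {
    "actor": {"objectType": "Agent", "mbox": "mailto:learner@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.com/activities/intro-video",
        "definition": {"name": {"en-US": "Intro video"}},
    },
    "result": {"score": {"scaled": 0.9}, "completion": True},
}

print(json.dumps(statement, indent=2))
```

      A single experience may emit one such statement or hundreds, depending on how much reporting granularity the instructional designer wants.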

    3. xAPI is a simple, lightweight way to store and retrieve records about learners and share these data across platforms. These records (known as activity statements) can be captured in a consistent format from any number of sources (known as activity providers) and they are aggregated in a learning record store (LRS). The LRS is analogous to the SCORM database in an LMS. The x in xAPI is short for "experience," and implies that these activity providers are not just limited to traditional AICC- and SCORM-based e-learning. With experience API or xAPI you can track classroom activities, usage of performance support tools, participation in online communities, mentoring discussions, performance assessment, and actual business results. The goal is to create a full picture of an individual's learning experience and how that relates to her performance.

      Spring 2022 #xAPICohort resource

    1. For any xAPI implementation, these five things need to happen:
      1. A person does something (e.g., watches a video).
      2. That interaction is tracked by an application.
      3. Data about the interaction is sent to an LRS.
      4. The data is stored in the LRS and made available for use.
      5. The data is used for reporting and personalizing a learning experience.

      In most implementations, multiple learner actions are tracked by multiple applications, and data may be used in a number of ways. In all cases, there’s an LRS at the center receiving, storing, and returning the data as required.

      Spring 2022 #xAPICohort resource
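      The "send to an LRS" step amounts to an authenticated HTTP POST of statement JSON to the LRS's statements endpoint. A sketch of assembling that request — the endpoint, credentials, and IDs are placeholders, and the actual network call is left out:

```python
import base64
import json

# Hypothetical LRS endpoint and credentials -- substitute your own.
LRS_STATEMENTS_URL = "https://lrs.example.com/xAPI/statements"
BASIC_AUTH = base64.b64encode(b"key:secret").decode()

def build_post(statement):
    """Assemble the POST an activity provider sends to the LRS.

    The version header and Basic auth are what most LRS products expect;
    performing the request (urllib, requests, ...) is omitted from this sketch.
    """
    headers = {
        "Content-Type": "application/json",
        "X-Experience-API-Version": "1.0.3",
        "Authorization": "Basic " + BASIC_AUTH,
    }
    return LRS_STATEMENTS_URL, headers, json.dumps(statement)

# A person completes a video; the player records the interaction.
url, headers, body = build_post({
    "actor": {"mbox": "mailto:learner@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "https://example.com/videos/intro"},
})
print(url)
```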

    2. Experience API (also xAPI or Tin Can API) is a learning technology interoperability specification that makes it easier for learning technology products to communicate and work with one another.

      Spring 2022 #xAPICohort resource

    1. Instructional Designer: When implementing xAPI across an organization, there isn’t usually a need for instructional designers to take on new roles or duties. However, they may experience a learning curve that presents an opportunity to understand how to best package and effectively deploy xAPI in newly created content. Your learning designer(s) is a key partner in getting good data, so keep them in the loop regarding your strategy, goals, and expected outcomes.

      Spring 2022 #xAPICohort resource

  14. Nov 2021
  15. Oct 2021
    1. Analytics is the key to understanding your app's users: Where are they spending the most time in your app? When do they churn? What actions are they taking?
    1. How to Install the DigitalOcean Metrics Agent

      DigitalOcean Monitoring

      DigitalOcean Monitoring is a free, opt-in service that gathers metrics about Droplet-level resource utilization. It provides additional Droplet graphs and supports configurable metrics alert policies with integrated email and Slack notifications to help you track the operational health of your infrastructure.

  16. Aug 2021
    1. The Recorded Future system contains many components, which are summarized in the following diagram: the system is centered around the database, which contains information about all canonical events and entities, together with information about event and entity references, documents containing these references, and the sources from which these documents were obtained.
    2. We have decided on the term “temporal analytics” to describe the time oriented analysis tasks supported by our systems

      RF have decided on the term “temporal analytics” to describe the time oriented analysis tasks supported by our systems

  17. Jul 2021
    1. https://blog.jonudell.net/2021/07/21/a-virtuous-cycle-for-analytics/

      Some basic data patterns and questions occur in almost any business setting and having a toolset to handle them efficiently for both the end users and the programmers is an incredibly important function.

      Too often I see businesses that don't own their own data, or that are contracting out the programming portion (or both).

  18. Mar 2021
    1. Plausible is a lightweight, self-hostable, and open-source website analytics tool. No cookies and fully compliant with GDPR, CCPA and PECR. Made and hosted in the EU 🇪🇺

      Built by

      Introducing https://t.co/mccxgAHIWo 🎉 📊 Simple, privacy-focused web analytics 👨‍💻 Stop big corporations from collecting data on your users 👉 Time to ditch Google Analytics for a more ethical alternative #indiehackers #myelixirstatus #privacy

      — Uku Täht (@ukutaht) April 29, 2019

  19. Feb 2021
  20. Nov 2020
    1. Spotting a gem in it takes something more. Without domain knowledge, business acumen, and strong intuition about the practical value of discoveries—as well as the communication skills to convey them to decision-makers effectively—analysts will struggle to be useful. It takes time for them to learn to judge what’s important in addition to what’s interesting. You can’t expect them to be an instant solution to charting a course through your latest crisis. Instead, see them as an investment in your future nimbleness.

      This is where expectations at today's organizations differ, leading to a big gap in what is expected of analysts and of analytics as a function.

  21. Oct 2020
    1. Conclusions

      Trying to find real utility in HR analytics, the following could be considered:

      1. knowing the average revenue each employee generates, as a measure of an organization's efficiency
      2. knowing the offer acceptance rate, i.e., the number of formal job offers accepted out of the total number of job offers, in order to redefine the company's talent acquisition strategy.
      3. knowing training expenses per employee, in order to re-evaluate training spend per employee.
      4. knowing training effectiveness, by analyzing performance improvement, in order to evaluate a training program.
      5. knowing the voluntary and involuntary turnover rates, in order to identify the employee experiences that lead to voluntary attrition, or to develop a plan to improve the quality of hires and avoid involuntary turnover.
      6. knowing time-to-recruit and time-to-hire, in order to reduce them.
      7. knowing absenteeism, which is a productivity metric, or an indicator of employees' job satisfaction.
      8. knowing human capital risk, in order to identify the absence of a specific skill needed to fill a new type of job, or the lack of qualified employees to fill leadership positions.
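      Several of these metrics are simple ratios. A sketch of the first two, with purely illustrative numbers:

```python
def revenue_per_employee(total_revenue, headcount):
    """Metric 1: average revenue each employee generates."""
    return total_revenue / headcount

def offer_acceptance_rate(offers_accepted, offers_extended):
    """Metric 2: formal offers accepted / total formal offers made."""
    return offers_accepted / offers_extended

print(revenue_per_employee(12_000_000, 150))  # average revenue per employee
print(offer_acceptance_rate(34, 40))          # fraction of offers accepted
```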
    2. HR Analytics Models

      There is a wide variety of software for HR analytics, including Sisense, Domo, ClicData, and ActivTrak. But to use it, an HR department must be specifically trained to do so.

    3. Lessons learned

      Using this tool requires the participation of people who know the company's organizational processes, as well as experts in psychometrics and statistical analytics.

    4. Human Resources Analytics (HR Analytics)

      HR Analytics is a methodology for gathering employee data and analyzing it to find evidence for strategic decision-making.

    1. Om Malik writes about a renewed focus on his own blog: My first decree was to eschew any and all analytics. I don’t want to be driven by “views,” or what Google deems worthy of rank. I write what pleases me, not some algorithm. Walking away from quantification of my creativity was an act of taking back control.

      I love this quote.

    2. What I dwell on the most regarding syndication is the Twitter stuff. I look back at the analytics on this site at the end of every year and look at where the traffic came from — every year, Twitter is a teeny-weeny itty-bitty slice of the pie. Measuring traffic alone, that’s nowhere near the amount of effort we put into making the stuff we’re tweeting there. I always rationalize it to myself in other ways. I feel like Twitter is one of the major ways I stay updated with the industry and it’s a major source of ideas for articles.

      So it sounds like Twitter isn't driving traffic to his website, but it is providing ideas and news. Given this I would syndicate content to Twitter as easily and quickly as possible, use webmentions to deal with the interactions and then just use the Twitter timeline for reading and consuming and nothing else.

  22. Jul 2020
  23. Jun 2020
    1. The bit.ly links that are created are also very diverse. It’s harder to summarise this without offering a list of 100,000 URLs — but suffice it to say that there are a lot of pages from the major web publishers, lots of YouTube links, lots of Amazon and eBay product pages, and lots of maps. And then there is a long, long tail of other URLs. When a pile-up happens in the social web it is invariably triggered by link-sharing, and so bit.ly usually sees it in the seconds before it happens.

      link shortener: rich insight into web activity...

  24. May 2020
    1. You should then also create a new View and apply the following filter so as to be able to tell apart which domain a particular pageview occurred on:

      Filter Type: Custom filter > Advanced
      Field A --> Extract A: Hostname = (.*)
      Field B --> Extract B: Request URI = (.*)
      Output To --> Constructor: Request URI = $A1$B1
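      The effect of that advanced filter is to prepend the hostname to the request URI, so pageviews from different domains become distinguishable in reports. Conceptually (hostname and path here are hypothetical):

```python
import re

def combined_request_uri(hostname, request_uri):
    """Mimic the GA filter: Extract A = Hostname (.*),
    Extract B = Request URI (.*), Output Constructor = $A1$B1."""
    a1 = re.match(r"(.*)", hostname).group(1)
    b1 = re.match(r"(.*)", request_uri).group(1)
    return a1 + b1

print(combined_request_uri("shop.example.com", "/checkout"))
```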
  25. Mar 2020
    1. If you want to disable Google Analytics-tracking for this site, please click here: [delete_cookies]. The cookie which enabled tracking on Google Analytics is immediately removed.

      This is incomplete. The button is missing.

    1. Google Analytics created an option to remove the last octet (the last group of 3 numbers) from your visitor’s IP-address. This is called ‘IP Anonymization’. Although this isn’t complete anonymization, the GDPR demands you use this option if you want to use Analytics without prior consent from your visitors. Some countries (e.g. Germany) demand this setting be enabled at all times.
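      The last-octet removal can be imitated in a few lines — a sketch of the idea for IPv4, not Google's implementation (which also handles IPv6 differently):

```python
def anonymize_ip(ip):
    """Zero out the final octet of an IPv4 address, as GA's IP Anonymization does."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError("not an IPv4 address: " + ip)
    octets[-1] = "0"
    return ".".join(octets)

print(anonymize_ip("203.0.113.42"))  # -> 203.0.113.0
```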
    1. Do you consider visitor interaction with the home page video an important engagement signal? If so, you would want interaction with the video to be included in the bounce rate calculation, so that sessions including only your home page with clicks on the video are not calculated as bounces. On the other hand, you might prefer a more strict calculation of bounce rate for your home page, in which you want to know the percentage of sessions including only your home page regardless of clicks on the video.
    1. Here you need to decide if you want to take a cautious road and put it into an “anonymous” mode or go all out and collect user identifiable data. If you go with anonymous, you have the ability to not need consent.
  26. Feb 2020
    1. One important aspect of critical social media research is the study of not just ideologies of the Internet but also ideologies on the Internet. Critical discourse analysis and ideology critique as research method have only been applied in a limited manner to social media data. Majid KhosraviNik (2013) argues in this context that ‘critical discourse analysis appears to have shied away from new media research in the bulk of its research’ (p. 292). Critical social media discourse analysis is a critical digital method for the study of how ideologies are expressed on social media in light of society’s power structures and contradictions that form the texts’ contexts.
    2. It has, for example, been common to study contemporary revolutions and protests (such as the 2011 Arab Spring) by collecting large amounts of tweets and analysing them. Such analyses can, however, tell us nothing about the degree to which activists use social and other media in protest communication, what their motivations are to use or not use social media, what their experiences have been, what problems they encounter in such uses and so on. If we only analyse big data, then the one-sided conclusion that contemporary rebellions are Facebook and Twitter revolutions is often the logical consequence (see Aouragh, 2016; Gerbaudo, 2012). Digital methods do not outdate but require traditional methods in order to avoid the pitfall of digital positivism. Traditional sociological methods, such as semi-structured interviews, participant observation, surveys, content and critical discourse analysis, focus groups, experiments, creative methods, participatory action research, statistical analysis of secondary data and so on, have not lost importance. We do not just have to understand what people do on the Internet but also why they do it, what the broader implications are, and how power structures frame and shape online activities.
    3. Challenging big data analytics as the mainstream of digital media studies requires us to think about theoretical (ontological), methodological (epistemological) and ethical dimensions of an alternative paradigm

      Making the case for the need for digitally native research methodologies.

    4. Who communicates what to whom on social media with what effects? It forgets users’ subjectivity, experiences, norms, values and interpretations, as well as the embeddedness of the media into society’s power structures and social struggles. We need a paradigm shift from administrative digital positivist big data analytics towards critical social media research. Critical social media research combines critical social media theory, critical digital methods and critical-realist social media research ethics.
    5. de-emphasis of philosophy, theory, critique and qualitative analysis advances what Paul Lazarsfeld (2004 [1941]) termed administrative research, research that is predominantly concerned with how to make technologies and administration more efficient and effective.
    6. Big data analytics’ trouble is that it often does not connect statistical and computational research results to a broader analysis of human meanings, interpretations, experiences, attitudes, moral values, ethical dilemmas, uses, contradictions and macro-sociological implications of social media.
    7. Such funding initiatives privilege quantitative, computational approaches over qualitative, interpretative ones.
  27. Jan 2020
  28. Nov 2019
  29. Sep 2019
    1. “But then again,” a person who used information in this way might say, “it’s not like I would be deliberately discriminating against anyone. It’s just an unfortunate proxy variable for lack of privilege and proximity to state violence.”

      In the current universe, Twitter also makes a number of predictions about users that could be used as proxy variables for economic and cultural characteristics. It can display things like your audience's net worth as well as indicators commonly linked to political orientation. Triangulating some of this data could allow for other forms of intended or unintended discrimination.
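      The triangulation worry can be made concrete in a few lines. A minimal sketch with invented audience records (none of these fields or values come from Twitter's actual analytics): even a single "innocent" field can track a sensitive trait almost perfectly.

      ```python
      import statistics

      # Invented audience records: no field names a protected attribute
      # directly, yet together they can stand in for one.
      audience = [
          {"homeowner": 1, "net_worth_k": 450, "low_income": 0},
          {"homeowner": 1, "net_worth_k": 300, "low_income": 0},
          {"homeowner": 0, "net_worth_k": 40,  "low_income": 1},
          {"homeowner": 0, "net_worth_k": 25,  "low_income": 1},
          {"homeowner": 1, "net_worth_k": 220, "low_income": 0},
          {"homeowner": 0, "net_worth_k": 60,  "low_income": 1},
      ]

      owners = [a["homeowner"] for a in audience]
      income = [a["low_income"] for a in audience]

      # Pearson's r (Python 3.10+): close to -1 on this toy data, i.e.
      # home ownership alone nearly reveals the "low income" label.
      print(statistics.correlation(owners, income))
      ```

      Nothing here requires ill intent: any model trained on such fields inherits the proxy, which is exactly the "unfortunate proxy variable" problem in the quote.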

      I've already been able to view a wide range of (possibly spurious) information about my own reading audience through these analytics. On September 9th, 2019, I started a Twitter account for my 19th Century Open Pedagogy project and began serializing installments of a critical edition, The Woman in White: Grangerized. The @OPP19c Twitter account has 62 followers as of September 17th.

      Having followers means I have access to an audience analytics toolbar. Some of the account's followers are nineteenth-century studies or pedagogy organizations rather than individuals. Twitter tracks each account as an individual, however, and I was surprised to see some of the demographics Twitter broke them down into. (If you're one of these followers: thank you and sorry. I find this data a bit uncomfortable.)

      Within this dashboard, I have a "Consumer Buying Styles" display that identifies categories such as "quick and easy," "ethnic explorers," "value conscious," and "weight conscious." These categories strike me as equal parts confusing and problematic.

      I have a "Marital Status" toolbar alleging that 52% of my audience is married and 49% single.

      I also have a "Home Ownership" chart. (I'm presuming that the Elizabeth Gaskell House Museum's Twitter is counted as an owner...)

      ....and more

  30. Jul 2019
    1. We translate all patient measurements into statistics that are predictive of unsuccessful discharge

      An analytics pipeline, roughly the kind of thing we will need to put together by the end as well.
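      The quoted step — collapsing raw measurements into predictive statistics — can be sketched like this. Everything below is illustrative: the vital-sign names, thresholds, and weights are invented, not taken from the paper.

      ```python
      from statistics import mean, pstdev

      def summarise(measurements):
          """Collapse a raw time series into summary statistics (features)."""
          return {
              "mean": mean(measurements),
              "std": pstdev(measurements),
              "last": measurements[-1],
              "trend": measurements[-1] - measurements[0],
          }

      def discharge_risk(heart_rate, resp_rate):
          """Toy risk score for unsuccessful discharge (illustrative only)."""
          hr, rr = summarise(heart_rate), summarise(resp_rate)
          score = 0.0
          score += 0.5 if hr["mean"] > 100 else 0.0   # sustained tachycardia
          score += 0.3 if hr["trend"] > 10 else 0.0   # worsening over the stay
          score += 0.2 if rr["last"] > 22 else 0.0    # tachypnoea at discharge
          return score

      print(discharge_risk([92, 101, 110, 115], [18, 20, 23, 24]))
      ```

      A real pipeline would feed the `summarise` features into a trained model rather than hand-set thresholds, but the two-stage shape (time series → summary statistics → risk score) is the same.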

  31. Apr 2019
    1. Annotation Profile Follow learners as they bookmark content, highlight selected text, and tag digital resources. Analyze annotations to better assess learner engagement, comprehension and satisfaction with the materials assigned.

      There is already a Caliper profile for "annotation." Do we have any suggestions about the model?
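      For discussion, here is a sketch of what a Caliper-style AnnotationEvent for a highlight might look like, written as a plain Python dict. The `@context` and type names follow the published Caliper 1.1 annotation profile, but all IDs, the selection payload, and the timestamp are invented examples, not a normative message.

      ```python
      import json

      # Hypothetical Caliper-style AnnotationEvent for a highlight action.
      event = {
          "@context": "http://purl.imsglobal.org/ctx/caliper/v1p1",
          "type": "AnnotationEvent",
          "actor": {"id": "urn:example:user:554433", "type": "Person"},
          "action": "Highlighted",
          "object": {"id": "urn:example:document:1", "type": "Document"},
          "generated": {
              "id": "urn:example:annotation:20",
              "type": "HighlightAnnotation",
              "selection": {"start": 455, "end": 489},
          },
          "eventTime": "2019-04-01T06:00:00.000Z",
      }
      print(json.dumps(event, indent=2))
      ```

      A model question for Hypothesis would be how its own anchoring data (selectors, quote, tags) should map onto the `generated` entity.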

  32. Mar 2019
  33. Feb 2019
    1. Which segments of text are being highlighted?

      Do we capture this data? Can we?

    2. What types of annotations are being created?

      How is this defined?

    3. Who is posting most often? Which posts create the most replies?

      These apply to social annotation as well.

    4. Session Profile

      Are we capturing the right data/how can Hypothesis contribute to this profile?

    5. Does overall time spent reading correlate with assessment scores? Are particular viewing patterns/habits predictive of student success? What are the average viewing patterns of students? Do they differ between courses, course sections, instructors, or student demographics?

      Can H itself capture some of this data? Through the LMS?
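      The first of those questions — does time spent reading correlate with scores? — is a one-liner once both series are exported per student. A minimal sketch with invented numbers, using the stdlib (`statistics.correlation` requires Python 3.10+):

      ```python
      import statistics

      # Hypothetical per-student exports: minutes spent reading annotated
      # texts and final assessment scores. All numbers are invented.
      minutes_read = [30, 45, 12, 60, 25, 50, 8, 40]
      scores       = [72, 81, 58, 90, 66, 85, 49, 78]

      r = statistics.correlation(minutes_read, scores)  # Pearson's r
      print(f"reading time vs. score: r = {r:.2f}")
      ```

      The harder parts are upstream (capturing reliable time-on-text at all) and downstream (comparing r across courses, sections, or demographics without over-reading a correlation as causation).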

  34. Dec 2018
    1. And while content analytics tools (e.g., Chartbeat, Parsely, Content Insights) and feedback platforms (e.g., Hearken, GroundSource) have thankfully helped close the gap, the core content management experience remains, for most of us, little improved when it comes to including the audience in the process.
  35. Jul 2018
  36. May 2018
    1. Hi there — check out the SAS training and tutorials for better analysis of data and forecasting methods, and their implications for business analytics.


  37. Mar 2018
  38. Jan 2018
  39. Nov 2017
    1. Mount St. Mary’s use of predictive analytics to encourage at-risk students to drop out to elevate the retention rate reveals how analytics can be abused without student knowledge and consent

      Wow. Not that we need such an extreme case to shed light on the perverse incentives at stake in Learning Analytics, but this surely made readers react. On the other hand, there’s a lot more to be said about retention policies. People often act as though they were essential to learning. Retention is important to the institution but are we treating drop-outs as escapees? One learner in my class (whose major is criminology) was describing the similarities between schools and prisons. It can be hard to dissipate this notion when leaving an institution is perceived as a big failure of that institution. (Plus, Learning Analytics can really feel like the Panopticon.) Some comments about drop-outs make it sound like they got no learning done. Meanwhile, some entrepreneurs are encouraging students to leave institutions or to not enroll in the first place. Going back to that important question by @sarahfr: why do people go to university?

    1. Information from this will be used to develop learning analytics software features, which will have these functions: Description of learning engagement and progress, Diagnosis of learning engagement and progress, Prediction of learning progress, and Prescription (recommendations) for improvement of learning progress.

      As good a summary of Learning Analytics as any.