1,755 Matching Annotations
  1. Sep 2019
    1. Purchase green power certificates. In July 2017, China launched a pilot program that permits voluntary trade of green power certificates from solar and wind power. Each certificate represents 1 MWh of electricity. Buying green power certificates allows companies to claim environmental benefits associated with renewable energy generation, even if electricity from a renewable power plant does not feed directly into a data center facility.

      Oh, wow, they do RECs too now? I wonder if they publish them too

    1. On the other hand, a resource may be generic in that as a concept it is well specified but not so specifically specified that it can only be represented by a single bit stream. In this case, other URIs may exist which identify a resource more specifically. These other URIs identify resources too, and there is a relationship of genericity between the generic and the relatively specific resource.

      I was not aware of this page when the Web Annotations WG was working through its specifications. The word "Specific Resource" used in the Web Annotations Data Model Specification always seemed adequate, but now I see that it was actually quite a good fit.

  2. Aug 2019
    1. And worst of all, we’ve lost sight of the most salient part about computers: their malleability. We’ve acquiesced the creation of our virtual worlds, where we now spend most of our time, to the select few who can spend the millions to hire enough software engineers. So many of the photons that hit our eyes come from purely fungible pixels, yet for most of us, these pixels are all but carved in stone. Smartphone apps, like the kitchen appliances before them, are polished, single-purpose tools with only the meanest amount of customizability and interoperability. They are monstrosities of code, millions of lines, that only an army of programmers could hope to tame. As soon as they can swipe, our children are given magical rectangles that for all their lives will be as inscrutable as if they were truly magic.

      I was a professional web developer for two years and I now have to force myself to even touch CSS or the DOM. Whenever I have to make anything on the web work I know I'm gonna spend 3 hours in pain for something that should take 5 minutes.

    1. If you’re part of an existing climate campaigning group, think of how you can participate in or help organise around September 20 to keep the momentum going. Link your action explicitly to the school strikes (“We’re doing this because we’re answering the call of the striking students, and taking action”).

  3. Jul 2019
    1. I truly wish that this will lead to a service that allows users to transfer their data from one of the big corporations to the decentralised web. If that's even going to be possible, considering the security implications. It's manageable, though: the question is whether the big corporations will let it happen.

    1. If you are an individual working, or planning to work, in this industry, then by signing you declare you won’t work on fossil fuel clients. No one is policing you or checking up, this is a promise to yourself.

      Explicit about the (lack of) monitoring here.

    2. If you lead an agency, by signing you promise to disclose your turnover by sector, and highlight any climate conflicts. Check the Client Disclosure Reports on this site to see what we mean. Your deadline is end 2019 to disclose.

      Simple, explicit. Disclose turnover by sector, they have an example.

    1. The backend data repository is based on MySQL. The repository contains 16 tables that capture the various source information described above. The current size of the repository (excluding the real-time data) is 92 MB.

      Less than 100 MB for all the data, mapping all that infra?

    2. Provider maps often contain additional information about network node resources. This information can range from location (potentially down to Lat/Lon coordinates), to IP addresses, to resource or service types. Our ability to extract network node information from the discovered resources is dependent on an assembly of scripts that include Flash-based extraction and parsing tools [3], optical character recognition parsing tools [7], PDF-based parsing tools [8], in addition to standard text manipulation tools. This library of parsing scripts can extract information and enter it into the database automatically. For instances where none of the tools or scripts are successful on the provider data, we manually parse and enter the data.

      This sounds like a meaningful thing you could use OSM to augment, and the argument for doing it makes sense - "you like the internet right? So help map it, before you lose it"

    3. Visualization-centric representations often reveal no information about link paths other than connectivity (e.g., line-of-sight abstractions are common). For these we enter the network adjacency graph by hand into Atlas. However, some maps provide highly detailed geographic layouts of fiber conduit connectivity (e.g., Level3 [5]). We transcribe these, maintaining geographic accuracy, into the Atlas using a process and scripts that (i) capture high resolution sub-images, (ii) patch sub-images into a composite image, (iii) extract a network link image using color masking techniques, (iv) project the link-only image into ArcGIS using geographic reference points (e.g., cities), and (v) use link vectorization in ArcGIS to enable analysis (e.g., distance estimation) of the links.

      So, it sounds like they're using some kind of computer vision to compare a set of tiles against another image showing the infrastructure, to work out the rough coordinates to project onto a map
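The projection step described in the quote can be sketched as fitting a per-axis linear map from image pixels to lon/lat using known reference points; the city pixel/coordinate pairs below are illustrative assumptions, not values from the paper.

```python
# Fit a per-axis linear map from map-image pixel coordinates to lon/lat
# using two reference points (e.g. cities visible on a provider map),
# then project extracted link pixels into geographic space.

def fit_axis(p0, p1, g0, g1):
    """Linear map along one axis: pixel coordinate -> geographic coordinate."""
    scale = (g1 - g0) / (p1 - p0)
    return scale, g0 - scale * p0

def make_projector(px_a, geo_a, px_b, geo_b):
    """Build a pixel -> (lon, lat) projector from two reference points."""
    sx, ox = fit_axis(px_a[0], px_b[0], geo_a[0], geo_b[0])  # x -> lon
    sy, oy = fit_axis(px_a[1], px_b[1], geo_a[1], geo_b[1])  # y -> lat
    return lambda x, y: (sx * x + ox, sy * y + oy)

# Hypothetical reference points: Chicago and New York as seen on a map image.
project = make_projector((120, 340), (-87.6, 41.9), (880, 310), (-74.0, 40.7))
lon, lat = project(500, 325)  # an extracted link pixel, somewhere in between
```

ArcGIS does this with a proper georeferencing transform; a two-point per-axis fit like this only works for maps that are already roughly north-up and unrotated.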

    4. In addition to Internet search, we appeal to the large number of existing Internet systems and publicly available data they provide. This includes PeeringDB [9], Network Time Protocol (NTP) servers, Domain Name System (DNS) servers, listings of Internet Exchange Points (IXPs), Looking Glass servers, traceroute servers, Network Access Points, etc. Beyond their intrinsic interest, it is important to recognize that NTP servers [6] often publish their Lat/Lon coordinates and are typically co-located with other networking/computing equipment. Similarly, DNS servers routinely publish their location via the LOC record [18]. In total, over 4,700 network resources of various types are annotated in the Internet Atlas database.

      Okay, this is properly smart, and uses pretty much all the data sources I would have thought to look at.
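For reference, the LOC record mentioned in the quote (RFC 1876) publishes a position as degrees/minutes/seconds plus hemisphere. A minimal sketch of decoding the presentation format into decimal degrees, assuming the full deg/min/sec form and using an illustrative record roughly for central London:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Degrees/minutes/seconds + hemisphere -> signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if hemisphere in ("S", "W") else value

def parse_loc(record):
    """Parse the lat/lon fields of a DNS LOC record in presentation format,
    e.g. '51 30 12.748 N 0 7 39.612 W 0.00m'. Simplification: RFC 1876
    also allows minutes and seconds to be omitted; this assumes all six."""
    f = record.split()
    lat = dms_to_decimal(int(f[0]), int(f[1]), float(f[2]), f[3])
    lon = dms_to_decimal(int(f[4]), int(f[5]), float(f[6]), f[7])
    return lat, lon

# Illustrative record, not from the paper:
lat, lon = parse_loc("51 30 12.748 N 0 7 39.612 W 0.00m")
```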

    5. Third, the de facto use of IP addresses gathered from TTL-limited probing campaigns as the basis for inferring structure has inherent difficulties. These include the well known interface disambiguation problem [26], widely varying policies on probe blocking among providers, and difficulties in managing large scale measurement infrastructures [24]. We believe that a different approach to building and maintaining a repository of Internet maps is required.

      So, this basically says traceroute by itself isn't enough

    1. To localize overlap we develop a Coastal Infrastructure Risk (CIR) metric that highlights the concentration of Internet infrastructure per geographic location (e.g., city). The CIR metric will be used to elucidate the impact of sea level rise on Internet assets temporally. Using CIR, we identify the top 10 major geographic locations most at risk, and thus in need of action by municipalities and service providers to secure existing deployments and plan for new deployments.

      Wow, this is equal parts fascinating and horrifying

    2. In this paper we consider the risks to Internet infrastructure in the US due to sea level rise. Our study is based on sea level incursion projections from the National Oceanic and Atmospheric Administration (NOAA) [12] and Internet infrastructure deployment data from Internet Atlas [24]. We align the data formats and assess risks in terms of the amount and type of infrastructure that will be under water in different time intervals over the next 100 years. We find that 4,067 miles of fiber conduit will be under water and 1,101 nodes (e.g., points of presence and colocation centers) will be surrounded by water in the next 15 years.
    1. Mark Parrington, a senior scientist at the European Centre for Medium-Range Weather Forecast, said the amount of CO2 emitted by Arctic wildfires between 1 June and 21 July 2019 is around 100 megatonnes and is approaching the entire 2017 fossil fuel CO2 emissions of Belgium.

      Fuuuuuuuucking hell. Seven weeks of Arctic wildfires emitted about as much CO2 as all of Belgium's fossil fuel use in 2017.

    1. provide a regularly refreshed set of minimum ICT sustainability provisions (including energy/carbon reporting)

      OK. Here's what I would ask to see in an FoI. I imagine any other organisation thinking about the emissions from digital services would also benefit from seeing these, as:

      a) they're likely covered by the binding legal targets set for the UK
      b) I've spoken to a few asking me for some myself

    2. quantify and report on its e-waste and energy and carbon footprint of the digital and technology services used and their sustainability impacts

      So, this looks like a pretty explicit commitment to measure the carbon footprint of digital services to me.

      It seems like it might be FOI-able, as Paul suggested. Anyone?

    1. It is this combination of features that also makes HyperCard a powerful hypermedia system. Users can build backgrounds to suit the needs of some system, say a rolodex, and use simple HyperTalk commands to provide buttons to move from place to place within the stack, or provide the same navigation system within the data elements of the UI, like text fields. Using these features, it is easy to build linked systems similar to hypertext links on the Web.[5] Unlike the Web, programming, placement, and browsing were all the same tool. Similar systems have been created for HTML but traditional Web services are considerably more heavyweight.
  4. Jun 2019

    1. This is especially true for online gaming

      WASM is being used to run many demanding applications directly in the browser. AutoCAD is one important example, where architects can use this application without installing a usually very heavy piece of software on their computers. They can access the AutoCAD suite from almost any computer just by logging in to a website. It is expected that a large part of the gaming industry will shift this way, as will many other services. One of the main advantages of this approach, aside from not needing a local installation, is real-time software updates for any number of users. A new model of software building and execution will be based on WASM. WASM is also very good for blockchains. Search for the WASM section to learn more.

  5. May 2019
    1. Bloggers can bundle, but making Tweets look like Tweets is actually pretty difficult for normal people and even for geeks like me.

      This has been handled pretty well on a couple of platforms (Wordpress, the late lamented Storify, etc). Does anyone know whether it was fixed more on the Twitter API side or more on the bundling tool side? It's interesting to think that forms of information are more or less bundle-able (reactive or inert, in the atomic metaphor) and that this can be controlled as much by the publisher as by the remixer.

  6. Apr 2019
  7. quickthoughts.jgregorymcverry.com quickthoughts.jgregorymcverry.com
    1. XML pioneer and early blogger Tim Bray says that Google may suffer from deliberate memory loss. I may have found more evidence that this is the case.

      Very interesting, as this ties more knots together in allowing people to know that Google is not the end-all of web knowledge.

      People are the power; the corporation is not.

    1. Air pollution contributed to nearly one in every 10 deaths in 2017, making it a bigger killer than malaria and road accidents and comparable to smoking, according to the State of Global Air (SOGA) 2019 study published on Wednesday. In south Asia, children can expect to have their lives cut short by 30 months, and in sub-Saharan Africa by 24 months, because of a combination of outdoor air pollution caused by traffic and industry, and dirty air indoors, largely from cooking fires. In east Asia, air pollution will shorten children’s lives by an estimated 23 months. However, the life expectancy burden is forecast to be less than five months for children in the developed world.

      And we still SUBSIDISE fossil fuels

  8. Mar 2019
  9. eds.a.ebscohost.com.libproxy.nau.edu eds.a.ebscohost.com.libproxy.nau.edu
    1. The purpose of this book is to help learners plan, develop and deliver online training programs for adults in the workplace. This book can be understood as a guide for training managers, instructional designers, course developers and educators who are looking to transition from classroom material to self-paced instructional programs. The main purpose of this book is for people who deliver training programs to be able to design programs for online delivery. Most importantly, the learners' needs are addressed in development. Rating 7/10: material is interesting and relevant but slightly outdated.

    1. Online is clearly where the growth is, especially when it comes to enrolling adults.

      This article is based around the idea that online education increases access for learners but lacks in completion data. This article provides data around the United States from a study conducted over a few years. Generally speaking this article encourages blended learning rather than all online to obtain better outcomes for adult learners. Rating 7/10 for use of graphs and evidence from data.

    1. Reading on the web is a critical skill for engaging content online. They can be viewed as “exploring,” or “navigating the web.” Just as traditional reading requires knowledge of the text and concepts of print, reading online requires a basic understanding of web mechanics. Good online readers know the tools and strategies that can be used to search for and locate people, resources, and information. They then know how to judge the credibility of these sources.1 The web literacy skills and competencies identified under reading on the web are as follows. Search

      Web Literacy 2.0 discusses how people use web literacy in their everyday lives. For example, "navigating the web" needs to be taught just as the concepts of print do. Quality online readers know where to look, what to ignore, and how to locate information. Writing on the web is also a skill that needs to be explicitly taught. A writer must be able to learn through making and creating. They must be able to communicate their ideas in written word, through presentations as well as through well organized and chosen aesthetics. Rating 10/10

    1. The purpose of this paper is to propose an in-structional-design theory that supports a sense of community.

      This article addresses the fact that new instructional design theories and methods are needed to keep up with new technologies and ways of learning. This article reviews instructional design tools for creating a sense of community online for learners. Additionally, this article discusses the differences between design theory and descriptive theory as it pertains to instructional design. Rating 6/10: very specific, and might only be relevant for a particular study or topic.

  10. webstandards.hhs.gov webstandards.hhs.gov
    1. Usability guidelines This site seems a bit dated in its appearance but still provides the user the opportunity to review usability standards in general, together with a rating of the weight of evidence that supports each assertion. It would take some time to go through all the information available on this site. It is also usable enough that a designer can check up on guidelines while in the middle of designing a specific project. Rating 3/5

    1. Despite the existence of these two contradictory trends, even the most optimistic studies express concerns regarding the capacity of technological progress to counter the growth in volumes by 2020. For example, this report from the American Department of Energy and the University of California on the energy consumption of data centers in 2016 in the United States states: "The key levers for optimizing the energy efficiency [of data centers] identified in this report, better PUE, better rate of use of servers and more linear consumption all have theoretical and practical limits and the amount of progress already achieved suggests that these limits will be reached in the relatively near future." (Shehabi, A. et al., 2016)

      Okay it was that same paper they referred to.

    2. India plans to launch a massive program to deploy commercial 5G networks in 2020 to boost the performance and capacity of existing mobile networks, taking into account that the 4G networks (which only took off in 2017 due to a price war over data started by the telecommunications operator Reliance Jio) are making big advances towards general coverage.

      Hello, so they do reference the massive increase in data and data plans

  11. Feb 2019
    1. Last year, Google quietly started an oil, gas, and energy division. It hired Darryl Willis, a 25-year veteran of BP, to head up what the Wall Street Journal described as “part of a new group Google has created to court the oil and gas industry.” As the VP of Google Cloud Oil, Gas, and Energy, Willis spent the year pitching energy companies on partnerships and lucrative deals. “If it has to do with heating, lighting or mobility for human beings on this planet, we’re interested in it,” Mr. Willis told the Journal. “Our plan is to be the partner of choice for the energy industry.”

      Jeez. At what point do we grow a spine and take climate change seriously?

    1. The emerging wave of Avant-Pop artists now arriving on the scene find themselves caught in this struggle to rapidly transform our sick, commodity-infested workaday culture into a more sensual, trippy, exotic and networked Avant-Pop experience. One way to achieve this would be by creating and expanding niche communities. Niche communities, many of which already exist through the zine scene, will become, by virtue of the convergent electronic environments, virtual communities. By actively engaging themselves in the continuous exchange and proliferation of collectively-generated electronic publications, individually- designed creative works, manifestos, live on-line readings, multi- media interactive hypertexts, conferences, etc., Avant-Pop artists and the alternative networks they are part of will eat away at the conventional relics of a bygone era where the individual artist- author creates their beautifully-crafted, original works of art to be consumed primarily by the elitist art-world and their business- cronies who pass judgement on what is appropriate and what is not.
    1. Work with what you have, to support the people around you and together you'll create a community that has a defined shape and form only in hindsight. Instead of worrying about having enough onboarding ramps, I say we make a future space that is so exciting, so fun, that is such a cool party with lights so bright that everyone wants to build their own methods to get here and join in. And I thought: what's the coolest, most party thing in the world? Reading.
    1. Salesforce was the first major internet company that exclusively leased data center space to adopt a 100 percent renewable energy commitment in 2013. Salesforce has multiple data center leases in Data Center Alley, totaling 46 megawatts, including a massive new lease with QTS in its new Manassas data center.

      How to do green DCs when you don't own DCs

    2. But despite recent creative claims of being “100 Percent Renewable Globally” from surplus supply of renewable credits in other markets,[66] Google has not yet taken steps to add renewable energy to meet the demand of its data centers in Virginia

      Ah! So they do the "RECs in other markets" too!

    3. In 2018, five major IT brands with long-term commitments to renewable energy[52] and who operate data centers or have significant colocation leases in Virginia sent a letter to the Virginia State Corporation Commision (SCC) asking that they not be used by Dominion to justify new fossil fuel growth, asking instead for a greater supply of renewable energy.[53] The SCC ultimately rejected Dominion’s Integrated Resource Plan for the first time in December 2018, providing an important opportunity for additional large corporate customers to tell regulators they need a greater supply of renewables, not more investment in fossil fuel generation assets or pipelines like the ACP.[54]

      Wait, so these two things are related? The letter forced the SCC to respond?

    4. The rapid deployment of renewable energy and the stagnation of mandatory renewable energy targets in many states has created a large surplus of “naked” or unbundled renewable credits available at the national level for purchase by the voluntary market, driving their price to record lows, less than $1/megawatt hour.

      So, if you're a huge buyer of electricity, and you are opaque about your offsets, it's easy to imagine that you're just loading up on these.

    5. AWS customers seeking to immediately reduce carbon emissions related to their cloud hosting could request to be hosted in Amazon’s California cloud, which is connected to a grid that is 50[33] to 70[34] percent powered by clean sources of electricity

      Not Oregon?

    6. Dominion’s projected demand for the pipeline ignores the fact that six of its 20 largest customers, five of which are data center operators, have made commitments to run on 100 percent renewable energy.

      How can you publicly audit a commitment like this?

    7. However, neither of these options improves the energy mix of Virginia or influences future direction and is therefore not ideal for those companies concerned with meaningfully reducing their operational carbon emissions. Of the 15 companies measured in this report, only Apple has invested in enough renewable energy procurement to match its demand in the region

      Ok, this makes me think that companies are relying on RECs everywhere else, and crediting Apple with specifically investing directly in RE in Virginia.

    8. If Amazon and other internet companies continue their rapid expansion of data centers in Virginia, but allow Dominion to continue with its strategy to use rising data center demand to justify significant new investment in fossil fuel infrastructure, they will be responsible for driving a massive new investment in fossil fuels that the planet cannot afford.

      So this is interesting. This report seems to be more about Dominion than anything else, basically pressuring Amazon to get Dominion to step away from fossil fuels.

    9. Dominion Energy, Virginia’s largest electricity provider and the primary electric utility for Data Center Alley, has strongly resisted any meaningful transition to renewable sources of electricity, currently representing only 4 percent of its generation mix, with plans to increase to only slightly over 10 percent by 2030.[1]

      Wow, 10% by 2030? That it?

    1. Despite the coal-friendly policies of the central government, a study showed that Australia is currently installing 250 watts of PV or wind for each inhabitant per year. The EU and US are at about one fifth of this. If this rate of growth continues, Australia will reach 50% renewables by 2024 and 100% of electricity demand by 2032. Costs of new large scale PV and wind are now around US$35/MWh, lower than the running costs of older coal stations.

      Wow, go Australia
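As a back-of-envelope check on the quoted rate, 250 W of new PV/wind per inhabitant per year scales to several gigawatts of new capacity annually; the ~25 million population figure is my assumption, not from the article:

```python
# Back-of-envelope: 250 W of new PV/wind per inhabitant per year, with an
# assumed Australian population of ~25 million (assumption, not from the quote).
population = 25_000_000
watts_per_person_per_year = 250
annual_capacity_gw = population * watts_per_person_per_year / 1e9  # ~6.25 GW/yr
```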

    1. Williams and Tang (2013) [8] performed a rigorous and detailed energy consumption analysis of three cloud-based office productivity applications. They analyzed the power consumption of the data center, network, and user devices that access the cloud service. The study also performed an energy consumption analysis on “traditional” noncloud versions of the software to understand the overall impact of cloud services.

      Are the findings accessible publicly?

    2. Average power consumption can be estimated from single points, although this results in increasing uncertainty. Manufacturers publish the maximum measured electricity (MME) value, which is the maximum observed power consumption by a server model. The MME can often be calculated with online tools, which may allow the specification of individual components for a particular server configuration. Based on these estimations of maximum power consumption, the average power consumption is commonly assumed to be 60 percent of MME for high-end servers and 40 percent for volume and mid-range servers.

      okay, this is a useful stat. I think
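A sketch of the rule of thumb quoted above (average power as ~60% of MME for high-end servers, ~40% for volume and mid-range); the 500 W MME value is illustrative:

```python
# Rule-of-thumb average power from a manufacturer's maximum measured
# electricity (MME) value: ~60% of MME for high-end servers, ~40% for
# volume and mid-range servers.

UTILISATION_FACTOR = {"high-end": 0.60, "volume": 0.40, "mid-range": 0.40}

def average_power_watts(mme_watts, server_class="volume"):
    return mme_watts * UTILISATION_FACTOR[server_class]

def annual_energy_kwh(mme_watts, server_class="volume", hours=8760):
    """Annual energy assuming always-on operation (8,760 h/yr)."""
    return average_power_watts(mme_watts, server_class) * hours / 1000.0

# An illustrative volume server with a 500 W MME:
average_power_watts(500)   # 200 W average
annual_energy_kwh(500)     # 1,752 kWh per year
```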

    3. In most cases, the “embodied emissions” (all stages excluding the use stage) of software are not significant compared with the overall emissions of the ICT system, particularly when the embodied emissions caused by development of the software are amortized over a large number of copies. In these cases, it is not necessary to carry out a detailed life cycle assessment of the software as part of a wider system. An exception is where bespoke software has very high emissions associated with its development, and these emissions are all allocated to a small number of software copies.

      Smaller, internal software might count

    4. Currently, input-output (IO) tables are published every five years, a long time in IT product evolution. Consequently, EEIO is good at representing basic commodities / materials industries like plastics or metals manufacturing, but not high-tech industries like microprocessors and fiber optic lasers manufacturing.

      Every 5 years. So, when the iPhone 6 was the brand new hotness, compared to today.

    5. rapidly with the onset of innovations, but lag in being included in EEIO databases available to the practitioner. More detail on EEIO data is provided in the calculation sections below

      Useful point. Because the top-down data is lagging, it'll give worse than expected figures for hardware

    6. It is interesting to note that the figures from GSMA and GeSI show that energy intensity per gigabyte is improving at about 24% per year for mobile networks, and at about 22% per year for fixed line networks. (The study by Aslan et al calculates a figure of 50% reduction in energy intensity every two years for fixed line networks, equivalent to a 29% reduction per year.) Also, the data shows that the energy intensity per gigabyte for mobile networks is about 50 times that for fixed line networks.

      Okay, this isn't that far from the 45x figure before
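The bracketed Aslan et al. figure checks out arithmetically: a 50% reduction every two years corresponds to an annual reduction r with (1 - r)**2 = 0.5, i.e. roughly 29% per year:

```python
import math

# A 50% reduction in energy intensity every two years implies an annual
# reduction r satisfying (1 - r) ** 2 = 0.5.
annual_reduction = 1 - math.sqrt(0.5)   # ~0.293, i.e. ~29% per year
```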

    7. Assuming that the reduction in energy intensity can be fitted to an exponentially decreasing curve (i.e. because it is more and more difficult to achieve the same reductions), the data points can be extrapolated to give energy intensity factors for 2015 of 0.15 for fixed line networks, and 6.5 for mobile networks, with both factors measured in kWh/GB (kilowatt-hours per gigabyte).

      FORTY FIVE TIMES MORE ENERGY INTENSIVE THAN WIRED
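Taking the quoted 2015 factors at face value, the mobile/fixed ratio is about 43x (close to the "45 times" figure), and the extrapolation approach can be sketched as simple exponential decay. The forward-projected improvement rates below reuse the 22%/24% per year figures quoted earlier and are illustrative, not from the report:

```python
def extrapolate_intensity(base_value, base_year, target_year, annual_reduction):
    """Energy intensity (kWh/GB) under a constant fractional reduction per year."""
    return base_value * (1 - annual_reduction) ** (target_year - base_year)

# The 2015 factors quoted above: fixed line 0.15 kWh/GB, mobile 6.5 kWh/GB.
ratio_2015 = 6.5 / 0.15   # ~43x: mobile vs fixed line energy per gigabyte

# Projecting forward with the improvement rates quoted earlier (illustrative):
fixed_2020 = extrapolate_intensity(0.15, 2015, 2020, 0.22)   # ~0.043 kWh/GB
mobile_2020 = extrapolate_intensity(6.5, 2015, 2020, 0.24)   # ~1.65 kWh/GB
```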

    8. A simple energy intensity factor for the use of the internet would make calculating the emissions resulting from ICT simpler and more widely accessible. Whilst this has been attempted in the past, resulting estimates show huge disparities. Coroama and Hilty [6] review 10 studies that have attempted to estimate the average energy intensity of the internet, where estimates varied from 0.0064 kWh/GB to 136 kWh/GB, a difference factor of more than 20,000.

      TWENTY THOUSAND TIMES DIFFERENCE.

      Did I drive across London? Or did I drive to the moon?

    9. For example, these measurements might involve running a series of traffic traces [16] over a period of time to build up statistics on network parameters. The measurements also need to include the energy consumption for the network’s ancillary equipment such as cooling, power conditioning, and back-up power. If this latter data is not attainable, then techniques described in Section 2.8.2, “Calculating GHG emissions for the customer domain use stage” (TPCF and PUE factors), can be used to provide an estimated value for this equipment.

      Validation!
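A PUE factor, as referenced in the quote, is applied as a simple multiplier on measured IT equipment energy (PUE = total facility energy / IT equipment energy); the 1.6 value below is an assumed, illustrative figure:

```python
def facility_energy_kwh(it_equipment_kwh, pue):
    """Gross facility energy from measured IT equipment energy and a PUE
    factor (PUE = total facility energy / IT equipment energy)."""
    return it_equipment_kwh * pue

def overhead_kwh(it_equipment_kwh, pue):
    """Energy attributable to cooling, power conditioning, back-up power."""
    return it_equipment_kwh * (pue - 1)

# 10,000 kWh of measured network equipment energy, assumed PUE of 1.6:
facility_energy_kwh(10_000, 1.6)   # 16,000 kWh total facility energy
overhead_kwh(10_000, 1.6)          # ~6,000 kWh for ancillary equipment
```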

    10. Equipment manufacturers may have estimates of TPCFs for their equipment based on defined operating conditions. In all cases, if a TPCF approach is selected, then the basis for selecting the factor should be fully noted and documented in the GHG inventory report.

      How much do these figures fluctuate for cloud boxen?

    11. Allocation of emissions among independent products that share the same process: for example, multiple products sharing the same transport process (vehicle); multiple telecommunication services sharing the same network; multiple cloud services (email, data storage, database applications) sharing the same data center

      k8s makes this a pain, if it's designed to co-mingle services on the same boxen

    12. This chapter provides software developers and architects guidance to benchmark and report the GHG emissions from software use in a consistent manner and make informed choices to reduce greenhouse gas emissions. The chapter is in two parts. Part A provides guidance on the full life cycle assessment of software, while Part B relates specifically to the energy use of software, and covers the three categories of software: operating systems (OS), applications, and virtualization.

      actual formal guidance!

    13. In 2015 GeSI published the SMARTer 2030 [8] report, extending the analysis out to 2030. This study predicted that the global emissions of the ICT sector will be 1.25 Gt CO2e in 2030 (or 1.97% of global emissions), and emissions avoided through the use of ICT will be 12 Gt CO2e, which is nearly 10 times higher than ICT’s own emissions.

      There's a 2030 report now. I did not know.

    14. The Product Standard defines products to be both goods and services; thus for the ICT sector it covers both physical ICT equipment and delivered ICT services. This Sector Guidance, however, focuses more on the assessment of ICT services. In this Sector Guidance the definition of products includes both networks and software as ICT services.

      this makes me think that services like e-commerce or ride sharing might not count, on the first read-through

  12. Jan 2019
  13. Dec 2018
    1. In the growing trade war between China and the US, it seems the world is unwilling even to think about the entirely legitimate use of consumption-based or border carbon pricing either to encourage cleaner production in China, or to deter the Trump administration from using discriminatory trade measures to re-industrialize drawing partly on older and more carbon-intensive technologies.

      How would border carbon pricing work? You pay a tax on the CO2 emissions 'imported'?
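
      As I understand the usual proposals, yes: importers pay a levy on the emissions embodied in the goods, typically at the gap between the domestic carbon price and whatever carbon price the exporter has already paid. A minimal sketch of that mechanism (all prices and tonnages hypothetical):

```python
# Sketch of a border carbon adjustment: charge imports for embodied
# emissions at the gap between domestic and foreign carbon prices.
# All figures are hypothetical.

def border_levy(embodied_t_co2e, domestic_price, foreign_price=0.0):
    """Levy per shipment; no rebate if the foreign price is higher."""
    price_gap = max(domestic_price - foreign_price, 0.0)
    return embodied_t_co2e * price_gap

# 2 t CO2e embodied, EUR 50/t priced at home, EUR 10/t already paid abroad
print(border_levy(2.0, 50.0, 10.0))  # 80.0
```

      The `max(..., 0.0)` reflects that such schemes level the playing field rather than subsidise imports from higher-priced jurisdictions.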

    2. Averaged over the full 35 years, a constant percentage reduction would require c = - 10%/yr to reach the same end-point – almost impossible at the starting point, but entirely feasible and easily observed in the latter stages of sunset industries.

      Okay, so the argument as I see it so far: treating change as a straight line, even if it averages out to 3.68% per year, is a mistake, as substitution of high-carbon energy by low-carbon looks more like an S-shaped curve
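
      The gap between a constant-percentage path and a straight line is easy to check numerically. A quick sketch of the compounding behind the quote's c = -10%/yr figure, with the starting level normalised to 1.0:

```python
# Compound a constant -10%/yr reduction over 35 years (start = 1.0).
years = 35
c = -0.10  # constant percentage reduction per year

remaining = (1 + c) ** years
print(f"remaining after {years} yr: {remaining:.1%}")  # about 2.5%

# Compounding front-loads the absolute cuts: year 1 removes 10% of the
# starting level, year 35 removes only ~0.3% of it, unlike a straight
# line of equal absolute cuts every year.
```

      That front-loading is why a constant -10%/yr is "almost impossible at the starting point" but easy late in a sunset industry's decline.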

    3. their analysis leads both teams to the – only slightly caveated – conclusion that the emission reductions required to deliver the Paris Aims (“well below 2 deg.C”) are implausible, by almost any standard of macroeconomic evidence – and still more so for the most ambitious “1.5 deg.C” end of the spectrum.

      Ah, so this is a response to the "we're doomed" papers from before

  14. Nov 2018
    1. We need to learn to see the cumulative impact of a multitude of efforts, while simultaneously keeping all those efforts visible on their own. There exist so many initiatives I think that are great examples of how distributed digitalisation leads to transformation, but they are largely invisible outside their own context, and also not widely networked and connected enough to reach their own full potential. They are valuable on their own, but would be even more valuable to themselves and others when federated, but the federation part is mostly missing. We need to find a better way to see the big picture, while also seeing all pixels it consists of. A macroscope, a distributed digital transformation macroscope.

      This seems to be a related problem to the discovery questions that Kicks Condor and Brad Enslen have been thinking about.

    1. Learning needs analysis of collaborative e-classes in semi-formal settings: The REVIT example

      This article explores the importance of analysis in instructional design, which seems to be often downplayed, particularly in distance learning. ADDIE and REVIT were considered when evaluating whether the training was meaningful, and from that a central report was extracted that may prove useful in the development of similar e-learning situations for adult learning.

      RATING: 4/5 (rating based upon a score system 1 to 5, 1= lowest 5=highest in terms of content, veracity, easiness of use etc.)

    1. List of web 2.0 applications

      EDUTECH wiki is a site that contains a variety of links and lists to help educators improve productivity with web 2.0 applications. Caution: some of the links are not active!

      RATING: 4/5 (rating based upon a score system 1 to 5, 1= lowest 5=highest in terms of content, veracity, easiness of use etc.)

    1. This means that software that deals with Internet must be actively maintained. If it is not it will become more and more useless in practice over time, however much it remains theoretically correct, not because it has bugs or security holes as such but because the environment it was designed to work in no longer exists and thus the assumptions it was built on are now incorrect.

      internet software decays

    1. Using Model Strategies forIntegrating Technology into Teaching

      This pdf offers many helpful tips and techniques for creating a foundation for technology integration. Model strategies are laid out with plenty of supporting detail, examples, and weblinks. It includes nearly 400 pages of peer-reviewed lessons, models, and strategies.

      RATING: 5/5 (rating based upon a score system 1 to 5, 1= lowest 5=highest in terms of content, veracity, easiness of use etc.)

  15. Oct 2018
  16. Sep 2018