943 Matching Annotations
  1. Oct 2021
    1. On every execution, the Function node compares the incoming data with the data from the previous execution. If the data got changed, we pass it to the next node in the workflow. We also update the static data with this new data so that the next execution knows what data gets stored in the previous node. If the data did not get changed, you may return a message based on our use-case.

      A-ha! If you can use this to set a 'bookmark' for the latest row in a spreadsheet/CSV for example, you have a nice way to query for "all the things since when I last ran this last Sunday"
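
      The bookmark idea above, sketched generically in Python (n8n Function nodes are JavaScript; `rows_since_bookmark` and the `static_data` dict here are illustrative stand-ins for n8n's workflow static data, not its actual API):

```python
# Generic "bookmark" pattern: persist the last row seen between runs,
# and on each run return only the rows newer than the bookmark.
# `static_data` stands in for n8n's workflow static data.

def rows_since_bookmark(rows, static_data):
    bookmark = static_data.get("last_seen_id", 0)
    new_rows = [r for r in rows if r["id"] > bookmark]
    if new_rows:
        # Advance the bookmark so the next run skips these rows
        static_data["last_seen_id"] = max(r["id"] for r in new_rows)
    return new_rows

state = {}
print(rows_since_bookmark([{"id": 1}, {"id": 2}], state))  # both rows are new
print(rows_since_bookmark([{"id": 1}, {"id": 2}], state))  # [] - nothing new
```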

    1. Ensuring that the regional market is competitive and that there are incentives for companies to buy local cloud infrastructure is a role that only government actors can fulfill. Moreover, it is a responsibility that is clearly within their mandate. Not coincidentally, such an approach clearly aligns with the European Commission’s and some European Union member state governments’ laudable competition and antitrust strategies, echoing attempts to safeguard the European market and uphold strong values throughout the EU.

      This is literally the opposite of how it works right now, with some procurement specifying AWS

    2. Such a subsidy can be aimed effectively, not at the cloud provider itself, but at companies undertaking a digital transformation or startups, giving them the freedom to choose any local provider and receiving cloud credits from their regional government. These credits are then spent locally, lifting and strengthening the local IT and digital ecosystem.

      So, basically providing access to the same capital that the cloud giants have, but raised from taxation?

    1. It’s true! None of these tactics, on their own, will address complex, deep-rooted social problems. But each of them represents a potential pathway that we can ascend when other routes are blocked.

      Useful framing in the syllabus.

      We have some idea of the goals, and talking in terms of methods provides options to suit the context

  2. Sep 2021
    1. With Amazon the sole customer of the substation it will (via Oppidan) pay for the 26 month-long design and construction process, with the exception of the City-owned control building. It is expected to cost $5,388,260 across three payment milestones, one of which has already been paid. After it is built, property rights will transfer over to SVP, which will operate and maintain the substation.

      OK, so it's not so much a substation owned like a black box. But Amazon is the sole customer, and it likely bought the site so:

      a) it would stop others making a datacentre there
      b) it could then make use of the substation, providing extra distribution for the other DCs it wants to operate, so it can expand further

    1. This verticalization will have the great flaw of making the real consumption of these infrastructures invisible. Today we can still retrieve some data from water and energy providers but when Amazon builds its own substations, like in Santa Clara, or Google its own pumping stations then the black box will continue to grow.

      I had no idea Amazon is building its own substations.

    2. At the environmental level, the territorial approach makes it possible to get out of the mystique of relative efficiency values to align consumption in absolute value with a local stock and a precise environment.

      Absolute consumption as a percentage of local resources would be a huge jump forward here

    3. However, the possible unsustainability of the new data center project was outweighed by an $800 million project with various financial benefits to the community, so the construction project was voted 6-1 in the city council.

      It's worth comparing this to other water reservations for context. Comparing it to agriculture in the same area might help, to see the choices people are facing

    4. It also raises the point that data centers could crowd out renewable energy capacity on the grid, slowing down the country's energy transition.

      I think the argument made here is that the load can exceed the generation coming from renewable sources, meaning this would end up leading to more dirty power coming online to meet the demand.

      The alternative might be to adjust demand, with the virtual capacity curves proposed in the Google paper, and supplement that with storage

    5. Energy used in a mine, in freight, in the supply and production chain is much less likely to be renewable.

      It's worth considering how a CBAM (carbon border adjustment mechanism) might affect this, as it's designed specifically to address high-carbon-intensity goods crossing country or trading-bloc borders, like the EU's

    6. The US giant advertises that its data center in Eemshaven in the Netherlands would be 100% powered by RE since its opening in 2016. However, on Google's electricity supply matrices we can clearly see that 69% of the electricity supply was provided by RE. The remaining 31% is offset by RECs or virtual PPAs. Google's statement in the preamble is therefore not factually correct.

      These might still be offset by RECs that are tied to a specific point in time, sometimes referred to as T-EACs.

    7. Google seems to be in the best position to obtain or bring about direct APPs.

      PPAs presumably

    8. In this scientific literature, it is estimated that the manufacturing phase (construction of the building + manufacturing of the IT equipment) represents on average 15% of the energy and GHG footprint of a data center in a country with "medium" carbon electricity (approx. 150-200gCO2/kWh). To get to this figure, it is assumed that the building is new and will last 20 years and that the IT equipment is replaced every 4 to 5 years. Based on GAFAM's Scopes 3, a recent publication by researchers from Facebook, Harvard and Arizona University estimated that the carbon impact of data centers related to IT equipment, construction and infrastructure was higher than imagined. There is therefore a growing interest in better understanding these "omissions".

      This is a good point. Refresh rates can be closer to 1-2 years in some hyperscalers. Good for use-phase carbon, bad for embodied carbon

    1. The Commission found that the arrangement, as currently written, could result in annual revenue shortfalls ranging in the millions of dollars, which other customers would have to cover due to the credits that could completely zero-out Facebook’s bill.“The Commission noted this is not logical— that a customer could reduce its bill by using more resources,” it said.

      As I understand this, structuring the deal to give a low cost for a long-term agreement would mean bills would have to be raised on other rate payers, to make sure the company with the monopoly is able to make the pre-agreed rate of return it is allowed to make each year.

    1. Multiply that by the 80k Server rooms and we get a staggering £4,600,000,000 per annum, (4.6 billion) or 38.54TWh or 11.37 percent of the electricity generated in the UK, a long way from the 1 percent cited earlier [although that was one percent of energy, not electricity - Editor]. The CCA info (2017) revealed that the total electricity consumed by sites taking part in the Agreement was 2.573TWh, which was 0.79 percent of the country's total electricity generation. So, my figures and the CCA (2017) figures total 41.11TWh, representing just over 12 percent of total generation.

      In most cases this is likely a steady load across the whole year.

      41.11TWh seems incredibly high, compared to IEA figures of 200 TWh for the year, but assuming this is correct,

      We'd need to divide this by 365 days and then by 24 hours to get an idea of the likely continuous power draw, as the infra doesn't really get turned off.
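
      That division, done out (taking the 41.11 TWh figure at face value and assuming a flat load across the year):

```python
# Back-of-envelope: convert the 41.11 TWh/year estimate into a
# continuous power draw, assuming the load is roughly flat year-round.
annual_twh = 41.11
hours_per_year = 365 * 24  # 8,760 hours

continuous_gw = annual_twh * 1e12 / hours_per_year / 1e9
print(f"{continuous_gw:.2f} GW")  # 4.69 GW of constant draw
```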

    2. After techUk’s Emma Fryer released the results of the second period of the UK data center sectors climate change agreement (CCA) 2nd Period findings in 2017, I conducted some desk-based research which looked at the issue from a UK PLC perspective and included all those enterprise data centers, server cupboards and machine rooms that are largely hidden.

      John mentioned to me that the CCA notes from 2017 might be a little out. It's worth sanity-checking that.

    3. “In the UK 76.5 percent of the electricity purchased by our commercial data center operators is 100 percent certified renewable”

      This is an annualised figure, so it doesn't match the actual time of use of datacentres.

      For that we'd need to have a rough idea of how well generation matches the load profile in the UK

    1. In building this system we simultaneously solved three high-level challenges: supporting exabyte-scale, isolating performance between tenants, and enabling tenant-specific optimizations. Exabyte-scale clusters are important for operational simplicity and resource sharing. Tectonic disaggregates the file system metadata into independently scalable layers, and hash-partitions each metadata layer into a scalable shared key-value store. Combined with a linearly scalable storage node layer, this disaggregated metadata allows the system to meet the storage needs of an entire data center.

      It seems to add a layer of indirection: instead of everyone needing to read off the same bits of a disk, the data is stored in places indexed by the KV store, which allows reads and writes to be spread across a linearly scaling storage layer.

      Worth reading the paper to check if this guess is close to reality

    1. The combination of Raspberry Pi and LimeSDR Mini brings with it another major advantage: cost. “The network we built in 2014 was using a software defined radio and a compute unit, but I think there, even for the cheap ones, we were looking at $1,500 for each base station. So, in four years we’ve come down probably eightfold in cost, and complexity and power and all the other stuff that goes with that, because stuff’s got physically smaller as well as cheaper.”

      Each base station for about 200 USD?

    1. Earlier this month, cleantech startup Clearloop broke ground on a 1‑megawatt solar project outside Jackson, Tennessee that stakes out a bold new definition of solar’s carbon-reduction value. It’s the first utility-scale solar project in the country to be partially financed by selling the carbon emissions it will displace over its lifetime. These transactions will not take the customary form of renewable energy credits that average out that value over time, but rather of carbon offsets that are directly related to the power grid the project is connected to.  In more specific terms, about $400,000, or roughly one-third of the project’s cost, was raised via the sale of offsets for nearly 60 million pounds of carbon emissions.

      Wow, this is so much more 'additional' than the often tokenistic measures I see with RECs

    1. The aim of the financial declaration clauses is to provide social pressure to make sure donations happen. This clause doesn't require anybody to make a donation to the DSF - there's no mandatory license fee for any use of the Django mark. However, if you want to use the Django name or logo, you do have to publicly declare that you're not giving anything back to the project. The hope is that this will provide enough social pressure to encourage some level of contribution back to the DSF.

      Social pressure tied to the trademark - useful example to refer to

    2. ... I'm not happy with the way an event/group handled my code of conduct complaint? The first line for reporting any code of conduct violation should always be the event or group organizers themselves. However, if you've done this, and you're not happy with the response you've received, contact the DSF by email: foundation@djangoproject.com and the DSF will investigate and respond. In the extreme case, this response may be to revoke the group/events license to use the Django name

      A potential escalation path for events and conferences when the usual denial-and-delay responses come out

    3. You must also adopt a code of conduct (the Ada initiative draft is a good starting point, but you can choose another code if you wish), and agree to run your event in the spirit of the Django Community code of conduct.

      This is a good example of trademark law being used to enforce norms around codes of conduct

    1. Originally, Google was building flying cell towers to beam down the Internet from the sky (over RF), but for balloon-to-balloon backhaul, the company was planning communications via laser beam. Space X just started doing something similar by equipping its Starlink satellites with space lasers for optical intra-satellite communication. One benefit of Sky- and space-based laser communication is that not much can interfere with a point-to-point optical beam. Ground-based lasers have more interference to consider, since they have to deal with nearly everything: rain, fog, birds, and once, according to Alphabet's blog post, "a curious monkey."

      A 'curious monkey' caused an internet outage by blocking the beam?

    1. For example, a 1:1 HD 1080p video meeting of 1 hour between two people would require 3.24GB of bandwidth, consuming 0.0486 kWh of electricity. The 2019 UK electricity emissions factor is 0.25358 kgCO2 per kWh, so the CO2 emissions for this call are 0.012 kgCO2. If this happened in the US between two people in New York, the emissions factor is similar to the UK at 0.28839 kgCO2 per kWh but if it was between two people in Chicago, that would be 0.56191 kgCO2 per kWh. Location matters.

      These figures are for 2019.

      The infra has got more efficient in the two years since 2019, and the grid has also got greener, so the number is likely lower now in 2021.
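
      The article's arithmetic for the UK case, reproduced with the figures quoted above:

```python
# Reproducing the article's arithmetic for a 1-hour, 1:1 HD video call
# between two people in the UK, using the quoted figures.
bandwidth_gb = 3.24   # data transferred for the hour
kwh = 0.0486          # electricity consumed for that transfer
uk_factor = 0.25358   # kgCO2 per kWh, 2019 UK grid

kg_co2 = kwh * uk_factor
print(f"{kg_co2:.3f} kgCO2")  # 0.012 kgCO2, matching the article
```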

    1. My estimate of 36gCO2 per hour is more than 2,100-times lower than Marks et al. (2020) who estimated that 35 hours of HD video emits 2.68tCO2, or 77kgCO2 per hour.

      This is a figure for Netflix, which is likely lower than a Zoom call.

      With a video on Netflix, the infrastructure is designed to stream a file that's already been encoded.

      This is different to something like Zoom, where the infrastructure is encoding the live video streams from the camera on the fly, and if need be re-encoding them to suit the device

    2. As a result, the central IEA estimate for one hour of streaming video in 2019 is now 36gCO2, down from 82gCO2 in the original analysis published in February 2020.

      By comparison, an espresso coffee is around 280g CO2e. So even if we use the high figure from the Shift Project, it's still three hours of video for around the carbon footprint of a cup of coffee.
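
      A quick check of the hours-per-espresso comparison, using the ~280g espresso figure above and the two per-hour streaming figures from the quote (I'm treating the 82g "original analysis" number as the high figure here; the three-hour claim only works at roughly that level):

```python
# Hours of streaming video per espresso's worth of carbon.
espresso_g = 280          # g CO2e for one conventional espresso
revised_g_per_hour = 36   # IEA central estimate for 2019
original_g_per_hour = 82  # original February 2020 analysis

print(f"{espresso_g / revised_g_per_hour:.1f} h per espresso at 36 g/h")   # 7.8 h
print(f"{espresso_g / original_g_per_hour:.1f} h per espresso at 82 g/h")  # 3.4 h
```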

    1. It’s not actually going to be a standard, per se, because you can’t pass regulatory standards through reconciliation. Instead, it’s going to be a system of fines and payments that will incentivize utilities to increase their proportion of renewable energy to meet the targets. It’s called a clean electricity payment program (CEPP). A CEPP actually has some advantages over the traditional CES’s and renewable portfolio standard (RPSs) commonly seen in states. For one thing, it’s more progressive: the money to drive the transition comes from federal coffers (via taxes on corporations and the wealthy) rather than from electricity rates, which are regressive.

      If you pay for the transition from taxation like this, the money largely comes from richer members of society, so it's more progressive than tacking the charge onto every kilowatt-hour used by consumers, which disproportionately affects lower-income groups

    1. With a drink containing approximately 18 g of green coffee (Starbucks Coffee Company, 2019), each kg of green coffee makes approximately 56 espresso beverages. Thus, the carbon footprint found in the LCA is on average 0.28 and 0.06 kg CO2e per espresso beverage for conventional and sustainable coffee, respectively (9.2 and 2.1 g CO2e ml–1). In an LCA of milk production, Hassard et al. (2014) estimated a carbon footprint of 2.26 g CO2e ml–1. Using these values, the carbon footprint of standard coffee beverages was estimated: with the conventional production of coffee beans, the carbon footprints for one serving of caffe latte, flat white, and cappuccino were estimated to be 0.55, 0.34, and 0.41 kg CO2e, respectively. When produced sustainably, these values were reduced to 0.33, 0.13, and 0.20 kg CO2e

      These are the figures cited in Phys.org, and what you might compare to the streaming figures to get an idea of whether losing 3 hours of Netflix matters more to you than skipping that next coffee
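
      Two quick sanity checks on the paper's numbers (the ~30 ml serving size is my inference from the per-ml figure, not something the paper states):

```python
# Sanity-checking the coffee LCA figures quoted above.
g_green_per_espresso = 18
espressos_per_kg = 1000 / g_green_per_espresso
print(round(espressos_per_kg))  # ~56 espressos per kg, as the paper says

# 0.28 kg CO2e per conventional espresso at 9.2 g CO2e per ml implies
# a ~30 ml serving - a standard espresso shot.
serving_ml = 0.28 * 1000 / 9.2
print(round(serving_ml, 1))
```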

    1. Renewable energy surcharge (21%) Finances the feed-in tariffs for renewable power and the market premium paid to larger producers - 6.41 ct/kWh.

      About 20% of every kilowatt hour in Germany goes towards the renewables surcharge. So the greater the percentage of your earnings you spend on electricity, the more you contribute to renewables, compared to someone who earns more and spends a smaller share.

    1. In the 2014 Radio Equipment Directive, EU lawmakers called for a common charger to be developed and gave the Commission powers to pursue this via a delegated act. The Commission's approach of “encouraging” industry to develop common chargers fell short of the co-legislators’ objectives. However, some progress has been made, said the Commission in the plenary debate on 13 January 2020: in 2009, there were more than 30 charging solutions, while today there are three charger types. In its resolution on the European Green Deal, Parliament called for an ambitious new circular economy action plan aiming to reduce the total environmental and resource footprint of EU production and consumption, with resource efficiency, zero pollution and waste prevention as key priorities.

      This is why I say expecting end users to just shop greener is missing a key part of the picture. The ability to compel an industry to standardise on a smaller number of chargers has saved an immense amount of waste, as it decouples the thing you want (shiny phone) from the thing you need so it can be used (charger).

    1. This article derives criteria to identify accurate estimates over time and provides a new estimate of 0.06 kWh/GB for 2015. By retroactively applying our criteria to existing studies, we were able to determine that the electricity intensity of data transmission (core and fixed-line access networks) has decreased by half approximately every 2 years since 2000 (for developed countries), a rate of change comparable to that found in the efficiency of computing more generally.

      This is a figure from 2017, but the halving has been going for 20 years. There are signs of it slowing down, but not by much.
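
      The paper's trend, written as a formula. Values for years after 2015 are extrapolations of the observed halving, not measurements:

```python
# Electricity intensity of data transmission: 0.06 kWh/GB in 2015,
# halving roughly every 2 years per the paper's trend.
def kwh_per_gb(year, base=0.06, base_year=2015, halving_years=2):
    return base * 0.5 ** ((year - base_year) / halving_years)

print(kwh_per_gb(2015))            # 0.06
print(round(kwh_per_gb(2021), 4))  # 0.0075, if the halving held to 2021
```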

    1. As the graph above shows, renewables will need to do the heavy lifting, growing even faster than in recent years. By 2050, says the IEA, around 90 per cent of global electricity supply will need to be low carbon, about 70 per cent from solar and wind power, with the rest mostly from nuclear.

      This is the edited version of the data in the IEA 1.5 C net zero report from May

    1. The average cup of coffee contains about 18g of green coffee, so 1 kg of it can make 56 espressos. Just one espresso has an average carbon footprint of about 0.28 kg, but it could be as little as 0.06 kg if grown sustainably.

      This is the figure I use when comparing the quoted figures for video streaming

    1. Climate protection is a human right and must be included in the constitution

      I have never seen this before - having it included in the constitution itself?

    1. But I'm wondering if VC-backed firms should be excluded from projects that require long-term maintenance because their growth requirements (fiduciary duties to investors, aka "we need a hockey stick") means they can't commit to 10-20 year contracts?

      Actual serious point. We've had 20-odd years of seeing the incentives at work, and they're not always good.

    1. 2020 is the year in which the current Dutch subsidy scheme for renewable energy, the Renewable Energy Production Incentive Scheme (de stimuleringsregeling duurzame energieproductie (SDE+)), will change. From 2020 onwards, the SDE + will be broadened to achieve the target of a 49 percent reduction in CO2 emissions in the Netherlands by 2030 (or at least to keep this goal within reach). The broadened SDE+ focuses on the reduction of greenhouse gas emissions (CO2 and other greenhouse gases). This will change the focus from energy production to energy transition. The broadened subsidy scheme is therefore called the Renewable Energy Transition Incentive Scheme (SDE++).

      So, this is the expanded version that is focussed on a more holistic, systemic approach

  3. Jun 2021
    1. Air pollution caused by the burning of fossil fuels such as coal and oil was responsible for 8.7m deaths globally in 2018, a staggering one in five of all people who died that year, new research has found.

      These are not climate change or heat-related deaths - this is particulate matter (PM2.5) from burning the fuels themselves

  4. Feb 2021
    1. ClimateTech Jobs Fair: CAT members Terra.do, in addition to being one of the first orgs to donate to CAT, are organising a virtual jobs fair on March 5th. Register to have live 1:1 conversations with hiring managers on for jobs in software, data science, product management, hardware and more at top climatetech companies.

      Terra.do's plug for their jobs fair.

    1. While it is critical to enhance the sustainability of digitalisation, the ICT sector’s estimated potential in the reduction of GHGs is ten times higher than its own footprint

      Source?


  5. Jan 2021
    1. Currently, there is a significant lack of transparency of environmental cost, which should be urgently resolved given the vast scale of resource usage. Therefore, NGIatlantic.eu invites EU – US applicants that can provide and experiment with transparency mechanisms on the environmental cost of the Internet. Identification and tagging of most resource consuming elements are also very important and urgent. On both sides of the Atlantic, there has already been some early research and innovation projects and initiatives focussing on alternatives to improving energy efficiency to ensure the greening and sustainability of the Internet and of the economy relying on it. This topic welcomes the results from these EU activities to team up with US teams (or vice versa, with US teams twinning with EU teams) to carry out experiments in this vitally important NGI topic.

      Interesting. I couldn't see many in the link though.

    1. That will allow the utility to match in real-time specific units of renewable energy generation with Microsoft’s usage – and dispatch energy from storage if there’s a shortfall – to ensure Microsoft is continually supplied with renewable energy, Janous said.

      They ARE pairing it with batteries

    1. Typically, a data center is seen as an inflexible load, or a “single block” of power, she said. In other words, if the facility’s total load is 10MW, the conventional approach is to ensure there’s 10MW of backup power. Google has learned to not treat total load as an inflexible block and match backup capacity more tightly with the capacity required by applications that run in the facility and the duration for which it’s required.

      This is a good summary of carbon-aware compute.

    2. Also this year, Microsoft announced a successful test for powering a data center rack with a hydrogen-fueled automotive fuel cell, looking at the technology as one of the potential replacements for diesel generators.

      Bosch sell these now, or are making them.

      https://www.bosch.com/research/know-how/success-stories/high-temperature-fuel-cell-systems/

      There's also a colossal amount of EU money in creating a hydrogen economy, so it feels like the conditions are good for switching.

    3. Google estimates that the total generation capacity of all diesel-fueled data center backup generators deployed worldwide is more than 20 gigawatts, which could spell vast opportunities for renewable energy storage.

      20 gigawatts needed to replace all fossil generators worldwide.

      Not sure how long diesel generators can run for - 12 hrs? 24 hrs?

    4. Maud Texier, Google’s carbon free energy lead

      Maud is the lead person on making DCs an active part of a future, responsive, sustainable grid

    1. But proponents of immersion cooling technologies emphasize their efficiency advantages. These solutions don’t require any fans. Server fans are usually removed altogether. “You can probably get at least a 15-percent energy reduction in a lot of environments by going to liquid cooling,” Brown said.

      I didn't know that liquid cooling was more energy efficient than air cooling. I also didn't really think about it much, but it makes sense. Liquid is a better conductor of heat than air (as in cooking), so…

    2. As Uptime’s Lawrence pointed out, generators are “a problem.” They are expensive to deploy and maintain, they pollute the atmosphere, and make a lot of noise. So far, however, they’ve been an indispensable part of any data center that’s meant to keep running around the clock.

      Wow, I didn't know the direction of travel away from diesel was so pronounced in DCs. Makes sense tho.

    1. Perhaps the most critical feature of a decentralized UPS architecture is it allows data centers to meet increased demand by deploying equipment-specific resources that can scale according to needs. New cabinets can expand capacity as necessary, while additional rectifiers and battery modules can increase power for servers added to open racks. By relying on DC power components that connect directly to the AC utility feed, a decentralized power architecture allows facilities to operationalize stranded white space and maximize infrastructure without placing any additional strain on their existing UPS system.

      Oh wow, so they decentralise INSIDE the datacentre. Would this mean it would be easier to move loads (as in power usage) around in the DC, and power down entire sections more easily?

  6. Nov 2020
    1. 1,966,423

      Total renewable energy used (MWh).

      Roughly 28% of total energy use is from renewable sources.

    2. 6,904,262 MWh

      Total energy use


    1. As a leader in the European data centre industry, we strive to set an example of environmental responsibility. In addition to innovations in engineering and diligent operations for maximising energy efficiency, Interxion supports and consumes energy from sustainable and low carbon sources to the greatest practical extent in our markets of operation. A large proportion of our power comes from sustainable sources, including water, solar and wind.

      A large proportion is a bit vague, and not the same as 100%.

      If they're part of the same group as Digital Realty, they should have these numbers, as DR's report includes total power used vs total power from renewable sources, and there are clear ways to confirm this if true.

    1. We’re now 100% powered by renewable and sustainable energy which is great in further minimizing our impact on the planet. Plausible Analytics script weights less than 1 KB which is more than 45 times smaller than the recommended Google Analytics Global Site Tag implementation.

      After speaking to the folks at Plausible they pointed me to this page on the digital ocean community forums:

      https://www.digitalocean.com/community/questions/what-kind-of-electricity-do-you-run-on

      And this one here:

      https://www.interxion.com/why-interxion/sustainability

      The TLDR version is that the servers they are using are run by Digital Ocean, who lease from Interxion, who source the power for the datacentre from renewables.

      Interxion themselves are owned by Digital Realty, who do release figures, but not at a granularity that confirms this.

      Once there is info from Interxion, it's possible to confirm this.

    1. Hours of session time in each region is multiplied by estimated power consumption of devices and device breakdown (% laptop, % tablet, etc.) used to access Mozilla products. This calculation produces total annual kWh by region. Regional electricity emissions factors are applied to calculate total regional emissions and summed to calculate total emissions by product

      Yup. The more I look at this, the more it doesn't seem to account for transfer.

      I can see why you might not account for this and leave it outside the system boundary, but I can't see the logic for including hardware while excluding network usage, when it's likely to be material.
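
      The methodology Mozilla describes, as a sketch: session hours per region, a weighted average device power draw, then a regional grid factor. Every number below is an illustrative placeholder, not Mozilla's data, and as noted it leaves network transfer out of scope:

```python
# Sketch of the stated methodology: hours x device mix x power draw
# x regional emissions factor. All figures are illustrative.
def product_emissions_kg(session_hours, device_mix, device_watts, grid_kg_per_kwh):
    """Return kg CO2e for one region's product use."""
    # Average draw across the device mix, in watts
    avg_watts = sum(share * device_watts[d] for d, share in device_mix.items())
    kwh = session_hours * avg_watts / 1000
    return kwh * grid_kg_per_kwh

mix = {"laptop": 0.6, "desktop": 0.3, "mobile": 0.1}  # assumed device shares
watts = {"laptop": 30, "desktop": 120, "mobile": 3}   # assumed power draws
regional_kg = product_emissions_kg(1_000_000, mix, watts, 0.3)
print(f"{regional_kg:,.0f} kg CO2e for this region")
```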

    2. Technology energy consumption used to access Mozilla products, broken down by type and power consumption: desktop computer, laptop computer, tablet, mobile, modem/router.

      Looking at this, it seems not to account for emissions from network transfer, which would be a key use for a browser - just end use.

    3. Renewable energy reported at 4 sites. No documentation provided.

      Is this normal?

    4. Server allocation per $ spend: 0.001 server / $ spend

      So, put another way: for every ~$1k of spend, you assume you're using one whole physical server's worth of compute (0.001 servers per dollar is one server per $1,000).

    5. Sum of space-specific refrigerant leak in kg by refrigerant type, divided by 1,000, multiplied by refrigerant type-specific global warming potential (GWP) to calculate organization total GHG emissions in mtCO2e.

      This gives an idea of why fixing refrigeration ranked as such a high intervention in Drawdown. Many of these refrigerant gases are utter carbon bastards, and really bad news: one kilo released has a similar impact to 1.3 tonnes of 'regular' CO2!
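
      The quoted formula, in code. The GWP of 1,300 is an assumed round value in the range of common HFC refrigerants (R-134a is around 1,430 over 100 years), picked to match the "1 kilo ≈ 1.3 tonnes" point:

```python
# kg of refrigerant leaked, divided by 1,000 to get tonnes, times the
# gas's global warming potential (GWP), giving metric tonnes CO2e.
def leak_mtco2e(leak_kg, gwp):
    return leak_kg / 1000 * gwp

print(round(leak_mtco2e(1, 1300), 2))  # 1.3 tCO2e from a single kilogram
```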

    6. Mozilla uses “business services and operations” to refer to the organization’s calculated Scope 1, 2 and 3 emissions with the exception of Scope 3 Use of Products.

      14k tonnes would be the footprint of Mozilla if they didn't account for end user products.

    7. Use of Products = 98% of Scope 3 total

      Wow, this is a huge percentage. It's pretty rare for orgs to report end-user emissions as well, but it highlights the importance of setting sensible defaults.

    8. Scope 2 emissions were calculated under two accounting methodologies: location-based andmarket-based.

      Calculating with both is good. This stops you from hiding emissions by showing only the net figure, which is what you get if you use market-based emissions alone.

      For more, see this commentary on Amazon's reporting: https://greeningdigital.substack.com/p/greening-digital-2-omgclimate-and

    9. Mozilla’s 2019 GHG inventory is comprised of emissions from scope 1, scope 2 and relevant scope 3 categories.

      It's surprising how many companies don't do all three scopes. It's good to see the effort that's gone in.

    10. 2019 GHG Inventory Report

      Hi there.

      I use Mozilla Firefox and, broadly speaking, I'm a fan of the organisation. Here are my notes as I read through the report.


  7. Oct 2020
    1. We have defined procurement principles and standards. These are (in summary):
       - 100% renewable energy and/or carbon neutral suppliers
       - 0% to landfill and an annual increase in reuse and material recycling
       - increased transparency across HMG, suppliers and the supply chain
       - 100% traceability of ICT at end of life
       - a yearly increase in procured ICT and services that is remanufactured/refurbished

      Pretty explicit

    1. I'm really curious whether or how creating more effective car battery recycling technologies could support or underwrite consumer electronics recycling. For years, the line on e-waste has been that the economic model for electronics recycling just doesn't work. As devices get smaller, extraction gets more time-consuming and with increased component miniaturization, recycling companies are getting less and less value from what metal they can recover (although notably, Redwood Materials is refining their battery recycling process through recycling consumer electronics). Recycling car batteries at scale fills a massive need and companies will definitely pay for those materials. If Redwood can get it right, they could potentially make a less-profitable recycling niche more viable with the resources afforded by its primary more profitable niche.

      Basically, ways to reclaim existing materials already in the technosphere, rather than needing to mine new ones

  8. Sep 2020
    1. In addition, the recent need to accelerate deep-learning and artificial intelligence applications has led to the emergence of specialized accelerator hardware, including graphics processing units (GPUs), tensor processing units (TPUs) and field-programmable gate arrays (FPGAs). Owing to its in-memory data model, NumPy is currently unable to directly utilize such storage and specialized hardware. However, both distributed data and also the parallel execution of GPUs, TPUs and FPGAs map well to the paradigm of array programming: therefore leading to a gap between available modern hardware architectures and the tools necessary to leverage their computational power.

      Ah, so while it supports SIMD, it doesn't support this stuff yet.

    2. NumPy operates on in-memory arrays using the central processing unit (CPU). To utilize modern, specialized storage and hardware, there has been a recent proliferation of Python array packages

      So these can drop down to take advantage of SIMD and all that?
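      The array-programming paradigm these two highlights describe can be sketched in a few lines: one expression over whole arrays, which NumPy dispatches to vectorised C loops. The CuPy substitution mentioned in the comment is an assumption about how GPU libraries mirror the NumPy API, not something tested here.

```python
import numpy as np

# Array programming: one expression operates on whole arrays, and
# NumPy dispatches it to vectorised (SIMD-capable) loops in C,
# rather than looping element by element in Python.
a = np.arange(1_000_000, dtype=np.float64)
b = np.sqrt(a) * 2.0 + 1.0   # no Python-level loop

# GPU array libraries such as CuPy mirror this API, so the same
# expression could run on an accelerator by swapping the import
# (assumption: `import cupy as np` in a CUDA environment).
```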

  9. Aug 2020
    1. his dream of it being as easy to “insert facts, data, and models in political discussion as it is to insert emoji” 😉 speaks to a sort of consumerist, on-demand thirst for snippets, rather than a deep understanding of complexity. It’s app-informed, drag-and-drop data for instant government.
  10. Jun 2020
    1. James Hetherington, Directorate of Digital Research Infrastructure, UK Research and Innovation

      Holy shit. AMEE James? Small world.

    Annotators

    1. The set-up of a cloud services marketplace for EU users from the private and public sector will be facilitated by the Commission by Q4 2022. The marketplace will put potential users (in particular the public sector and SMEs) in the position to select cloud processing, software and platform service offerings that comply with a number of requirements in areas like data protection, security, data portability, energy efficiency and market practice.

      Best source of compute, but across the entire EU?

  11. May 2020
    1. In France, as in other Western countries, for various reasons too long to mention in this study, the collection level of waste electrical and electronic equipment is around 45%.

      This is higher than I thought - almost half of all electronics are collected in France?

    2. Had they been implemented as of 2010, these 4 measures would have reduced the global digital footprint over the period observed (2010 to 2025) by between 27% and 52%

      How would you back this claim up, or disprove it? It feels like catnip for journalists, but I don't see how you could interrogate it.

    3. Reducing the number of flat screens by replacing them with other display devices such as augmented / virtual reality glasses, LED video projectors, etc.

      Presumably this would be down to the energy-intensive nature of making a flatscreen?

    4. Their contribution to the impact of the digital world thus goes from less than 1% (all environmental indicators combined) in 2010 to between 18% and 23% in 2025. It is huge!

      These numbers seem comically high

    5. In 2025, user equipment will concentrate from 56% to 69% of the impact. For example, 62% of digital GHG emissions will be user-related, 35% of which comes from equipment manufacturing.

      What kind of energy mix does this assume in 2025?

    6. The hierarchy of impact sources is as follows, in descending order of importance:1. Manufacturing of user equipment;2. Power consumption of user equipment;

      So these guys say it's the end use that's the big problem, not the infrastructure or network usage per se.

    7. In 2019, the mass of this digital world amounts to 223 million tonnes, the equivalent of 179 million cars of 1.3 tonnes (5 times the number of cars in France).

      So this might be a bit like the idea of biomass, but for technology?

    1. We've likely cleared half the biomass of plants on earth, and after that, they make up more than 80% of all the biomass of everything alive today.

      The livestock we farm is likely around 10 times the biomass of all wild mammals and birds

    1. This document gives a good background on where you might choose to use LCA, and where it's not such a useful tool.

      It's from 2012, but what it's saying is mostly in line with my understanding of the subject.

      Things have moved on since it was written and the open source projects seem not to be all that active now.

    2. © Intellect July 2012

      This report is 8 years old.

    3. Source: IBM/Carnegie Mellon University Carbon Footprint Research Study

      What year is this from?

    4. Analysis of a web-search of published LCA study results for ICT devices showing percentage use-stage carbon. Source: Darrel Stickler, Cisco

      This diagram is v handy - at a glance, it gives an idea where the main levers for reducing might be depending on the kind of product

    5. This framework tends to follow a similar pattern which includes most or all of the following steps: set goals and define scope, inventory analysis, impact assessment, interpretation, reporting and critical review.

      This is a really helpful diagram for explaining what the alphabet soup of standards means, and how you might apply them.

    6. Even the OECD definition of ICT includes a whole range of consumer electronic (CE) products that many would not expect to see classed as ICT.

      Where is the official definition in the OECD?

    7. And with carbon, as with calories, it is the pies –or rather their carbon equivalents -that are important. LCA is a poor tool for differentiating between strawberries and raspberries, but it is a wonderful tool for identifying where the pies are, and who is eating them

      This is totally worth using in future

    Annotators

    1. 1,211,22

      This is the final figure for scope 1, 2, and 3, after accounting for market based figures for CO2 from energy.

      They've split out a huge chunk of emissions in scope 3 as *other*, but it's not immediately obvious to me what this includes, and they've chosen not to account for it when it comes to purchasing the offsets anyway.

    2. 15,027,224

      So this figure is actually pretty close to Microsoft now. The figures I was looking at were for just scope 1 and 2, not scope 3.

    1. The action was the digital equivalent of queueing up at McDonalds and ordering the non-existent vegan, zero-waste Happy Meal again and again. Rebels targeted a different polluter each day, including fossil fuel companies Shell and BP, shipping company Maersk, and the Danish Finance Ministry for its recent bailout of Scandinavian Airlines.

      This sounds a lot like DDOSing websites to me.

    1. What exactly has changed?

      Y'know, they could have mentioned this earlier in this long post full of verbiage.

    1. Technological improvements also play a role in driving down battery costs. Frith points out that the term lithium-ion battery is actually an umbrella term for a number of different battery chemistries.

      Did not know this.

    1. Displace dirty energy demand: New renewables supply displaces demand for existing dirty electricity generation.

      How do you demonstrate this at a supplier level?

    2. The Oil Climate Index from the Carnegie Institution gives full lifecycle emissions for a selection of global crude oils. That analysis does not specifically profile Permian crude oils, but does find that the median U.S. crude oil will lead to 0.51 metric tons of CO2-eq over its lifecycle.[

      Are these figures the updated ones where life cycle figures for fossil fuels turn out to be higher than previously thought?

    3. Public commitment to no longer offer machine learning or high performance computing capabilities for the oil and gas sector for the purpose of new exploration or increased production, and to not renew existing contracts. End membership with the Open Subsurface Data Universe Forum.

      If you were a company doing this, what would the wording look like?

    4. In 2018, Google attracted former President and General Manager of BP, Darryl Willis, as VP of Oil, Gas, and Energy at Google Cloud, where he was tasked with developing new products and solutions and building trusted relationships with key leaders and companies in the oil and gas sector

      VP of Oil, Gas and Energy

    5. To realize the climate commitments they have set, Google, Microsoft, and Amazon must continue to reduce carbon emissions throughout their own operations and publicly distance themselves from customers that are making the climate crisis worse.

      It's crazy that this would be seen as controversial, and yet…

    1. She labelled this phenomenon “socially organised denial”.

      This is a useful term

    2. With the backing of Green Alliance and some philanthropic funders, I set up a training programme. We offered parliamentary candidates and new MPs the chance to learn about the science, policy and politics of climate in a series of tailor-made workshops. We worked with small groups of around 10 politicians, all from the same party, to allow them to question and debate freely.

      I wonder how much it cost to design these kinds of workshops?

    1. Strengthening: The team understands its role in the larger organizational system and actively works to make that system more successful.

      Ah, so strengthening refers to an outward use of the term, not the team itself getting stronger per se.

    2. Although you may be tempted to hire new employees to fill the gaps, it’s usually more effective to include employees who already understand your business’s unique priorities and constraints.

      So, basically they propose embedding a domain expert in an existing team, as it's easier to propose than hiring a new role

    3. team chartering

      ?

    4. Each fluency zone brings new benefits, so it may seem that the model should be treated as a maturity model, in which the goal is to reach maximum maturity. That would be a mistake. Unlike maturity models, where more mature is always better, the fluency model describes a collection of choices. Each zone represents a fully-mature choice. Each one brings value.

      Using fluency instead of maturity makes it less of a value judgement and more of a deliberate decision. No one wants to admit to being immature, but admitting to not being fluent is easier.

    5. Although teams develop proficiencies in any order, even from multiple zones simultaneously, we’ve observed that teams tend to gain zone fluency in a predictable order.

      So, there's a fairly clear order of where to start.

    6. Focusing teams produce business value. Delivering teams deliver on the market cadence. Optimizing teams lead their market. Strengthening teams make their organizations stronger

      Focussing, Delivering, Optimising, Strengthening. How

    1. Once done with injecting my performance marks inside my HTML, I switched to the “Performance” tab, made sure I selected a “Fast 3G” network and “4x slowdown” for the CPU

      It's worth checking the profile on sitespeed.io to see how this compares

    2. Since I only wanted to see the CSS coverage, I used the filter “.css” and what I could see was that 92% of the CSS I was loading was not used. (Unused bytes will change in real-time when you start interacting with the page):

      Is this exposed in sitespeed?

    3. Structure of my page after adding the performance marks

      Breaking out the CSS, JS and head tags, to time them all independently

    4. So, I decided to use some custom metrics using the Performance API to get a rough idea of what was time-consuming on the page I was auditing.

      So, using the performance API to translate it into something meaningful for clients

    1. “the tragedy of the horizon.”

      New term to me. It's good.

    2. to avoid mass death.

      This is as good a climate slogan as any, right now.

  12. Apr 2020
    1. For example, the GUI tool for PostgreSQL administration, PGAdmin 3, is used by many people.  (I’m an old-school Unix guy, and thus prefer the textual “psql” client.)  I’ve discovered over the years that while PGAdmin might be a useful and friendly way to manage your databases, it also automatically uses double quotes when creating tables.  This means that if you create a table with PGAdmin, you might find yourself struggling to find or query it afterwards.

      Oh my god. I wish I knew this ten years ago.

    1. Averylarge data centre may consume 30GWh of power in a year, costing its operator around £3,000,000 for electricityalone. A handful of sites in the UK consume even more than this although the majority of sites consume far less. The total power demand of the UK data centre sector is between 2-3TWh per year2. Energy is usually the largest single element of operating costs for data centres, varying from 25-60%.

      This is about one percent of UK electricity usage. And this seems to discount smaller datacentres, which make up a much larger share of power use if the Datacenter Dynamics piece from John Booth is anything to go by.

      https://www.datacenterdynamics.com/en/opinions/data-centers-reaching-net-zero-in-the-uk/
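      Rough arithmetic behind the "about one percent" note, using assumed round figures for total UK consumption (~300 TWh/yr) and working backwards from the quoted £3,000,000 bill for a 30 GWh site:

```python
# Assumed round figures: UK data centre sector 2-3 TWh/yr (midpoint
# 2.5), total UK electricity consumption roughly 300 TWh/yr.
uk_datacentre_twh = 2.5
uk_total_twh = 300
share = uk_datacentre_twh / uk_total_twh   # ~0.008, i.e. "about one percent"

# The quoted £3,000,000 bill for a 30 GWh site implies an electricity
# price of about £0.10/kWh: 30 GWh = 30 million kWh.
cost = 30e6 * 0.10
print(f"{share:.1%}, £{cost:,.0f}")        # prints 0.8%, £3,000,000
```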

    1. In this case, establishing a direct connection between the peers seems to be impossible, and the only remaining option is to use a server with a public IP address to proxy the communication between the peers, e.g. using the TURN protocol [18]

      This is just like our experiences with video again then. Back to TURN.

    2. For ephemeral updates PushPin uses an additional messaging channel, adjacent to the CRDT, which ties arbitrary messages to a device and user context. The current implementation is rudimentary: ephemeral data is not associated with a particular CRDT state and is distributed only over direct P2P connections. Nevertheless, it enables shared contextual awareness in the user experience of PushPin, providing a feeling of presence when other users are online or collaborating

      So THAT's how they do the presence like mouse pointers and the rest.

      If there's already a channel here, presumably you could do video too, if there was a central server

    3. Each URL also includes a contentType parameter, which indicates how that document should be rendered in the user interface. This parameter is part of the URL, not the document content, because the same document content may be rendered differently in different contexts. For example, PushPin could be extended to support flashcards for language learning. In one context, the document containing the database of flashcards could be rendered as a list of entries, while in another context it might be rendered as a quiz interface, presenting one side of one flashcard at a time.

      I had no idea that there was a separate 'viewer' concept. Neat

    1. People make light of the idea that digital should be the most basic of Maslow’s hierarchy of needs — over food, water, shelter, and warmth — but there is evidence that people do, to an extent, prioritise connectivity over food and comfort. Some refugees, for instance, are known to have asked for Wi-Fi or charging services ahead of food or water on arrival in a new country.

      🤯

    1. Globally, Amazon has 86 solar and wind projects that have the capacity to generate over 2,300 MW and deliver more than 6.3 million MWh of energy annually—enough to power more than 580,000 U.S. homes.

      The use of "has the capacity to" here is misleading.

      Capacity factors for renewables range between 15 and 40 percent, so this makes it sound like Amazon has paid for and used 6.3 million MWh of power when it's likely to be a fraction of this.

      By comparison, Microsoft report on how much power they did use each year, and how much of that was renewable. Last year they reported using around 7m MWh of power, and buying almost this figure in renewable energy.
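      The capacity-factor point can be checked with back-of-envelope arithmetic; the only number beyond the quoted figures is the 8,760 hours in a year:

```python
# Quoted figures: 2,300 MW capacity, 6.3 million MWh delivered per year.
capacity_mw = 2300
theoretical_mwh = capacity_mw * 8760        # flat out all year: ~20.1m MWh
delivered_mwh = 6.3e6
capacity_factor = delivered_mwh / theoretical_mwh
print(f"{capacity_factor:.0%}")             # ~31%, inside the 15-40% range
```

      So the quoted 6.3 million MWh is consistent with a ~31% fleet-wide capacity factor, a third of what "2,300 MW of capacity" might suggest to a casual reader.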

  13. www.gitops.tech www.gitops.tech
    1. Additionally the image registry can be monitored to find new versions of images to deploy.

      Ahh, so this is how you might make it possible for a developer to just push changes, and not have to have access to the environment repository.

    1. For now, suffice it to say that Tailscale uses several very advanced techniques, based on the Internet STUN and ICE standards, to make these connections work even though you wouldn’t think it should be possible. This avoids the need for firewall configurations or any public-facing open ports, and thus greatly reduces the potential for human error.

      I wonder how this relates to VOIP and so on?

    2. Here’s what happens:
      1. Each node generates a random public/private keypair for itself, and associates the public key with its identity (see login, below).
      2. The node contacts the coordination server and leaves its public key and a note about where that node can currently be found, and what domain it’s in.
      3. The node downloads a list of public keys and addresses in its domain, which have been left on the coordination server by other nodes.
      4. The node configures its WireGuard instance with the appropriate set of public keys.

      So it's a little bit like a private DNS server?
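      The steps in the quote can be sketched as a toy coordination server: nodes publish a (public key, endpoint) pair and download everyone else's. This is my illustration of the pattern, with made-up names and endpoints, not Tailscale's actual API.

```python
# Toy coordination server: a registry that nodes publish to and read
# from, so each node can configure WireGuard with its peers' keys.
class CoordinationServer:
    def __init__(self):
        self.registry = {}                    # node name -> (pubkey, endpoint)

    def register(self, name, pubkey, endpoint):
        # Step 2 in the quote: leave public key and current address.
        self.registry[name] = (pubkey, endpoint)

    def peers_for(self, name):
        # Step 3: download everyone's keys except your own.
        return {n: v for n, v in self.registry.items() if n != name}

server = CoordinationServer()
server.register("laptop", "pubkey-A", "198.51.100.7:41641")
server.register("phone", "pubkey-B", "203.0.113.9:41641")
print(server.peers_for("laptop"))
```

      Seen this way, the "private DNS server" comparison in the note holds up: the server only maps identities to keys and addresses, while the actual traffic flows peer to peer.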

    3. Sadly, developers have stopped building peer-to-peer apps because the modern Internet’s architecture has evolved, almost by accident, entirely into this kind of hub-and-spoke design, usually with the major cloud providers in the center charging rent.

      This is such a quote.

    1. A breadcrumb in this case is a single pixel that you can place in a precise location on a webpage. Placing a breadcrumb could be as simple as Option + click. While navigating the web, you could leave breadcrumbs on different pages you find interesting over the course of a browsing session. When you're done, that sequential "trail of breadcrumbs" would be saved. You could then jump back into the trail and navigate "forward" and "backward" through the things you found interesting in that browsing session. Or share the trail with a friend, and they could step through your spatial path of navigating the web.

      This isn't a million miles away from how hypothesis allows you to annotate specific sections, although much more lightweight.

      It's plausible to show a set of thumbnails of pages with the highlights ... highlighted for others to see.

    2. When you run a standup at a technology company, you typically go around a circle and each person gives their daily update. But in Zoom, there is no circle. You get confused about who is next. Two people start speaking at the same time. It's awkward and confusing. Eventually, you realize that one person, probably the manager, just has to dictate who goes next, at the risk of seeming bossy.

      Just having a line from left to right would be an improvement, or some consistent, implied order

    1. Since emission reduction projects registered under crediting programmes to date have been mostly developed in the context of cost-saving, rather than ambition-raising mechanisms, we understand that there are very few, if any, examples of existing credited projects that represent those high-hanging fruits, and which could be considered truly additional in the context of the Paris Agreement. Given the difficulty in objectively determining additionality in line with this definition, we consider that only a niche and ever reducing number of activities could count for this, and that this does therefore not represent a viable option for rapidly increasing demand volume of the market.

      This is a really important point, that the current "offsets" framing undermines.

    2. A climate responsibility approach needs to first and foremost incentivise and facilitate the reduction of one’s own emissions.

      I can't help thinking they should have led with this

    3. Emissions from project-specific activities, such as project-related travel, are attributed as cost items to their respective project cost lines.

      Project level carbon budgeting?

    4. We recognise that some of the activities with the highest transformation potential – and therefore with high suitability for supporting the objectives of the Paris Agreement – may be at early stages of development and/or may carry a risk of not delivering attributable emission reductions.

      This addresses some of the issues around needing to get a definite amount of emissions drawn down, when the science makes this very very hard to measure

    1. The High-Level Commission on Carbon Prices surveyed the available scientific literature, concluding that the explicit carbon-price level consistent with the Paris Agreement temperature objectives is at least US$40–80/tCO2 by 2020, provided that a supportive policy environment is in place (High-Level Commission on Carbon Prices, 2017). Informed by this report and allowing for its uncertainties, NewClimate Institute has imposed a price level of EUR 100/tCO2e for the 2014-2019 period. This is also in-line with the central estimate of climate change avoidance costs over the period to 2030 used in the European Commission’s 2019 Handbook on the External Costs of Transport (European Commission, 2019).

      Holy biscuits. They're not fucking about. Microsoft just increased their price for carbon to 14 USD per tonne, by comparison.

    2. This methodology for the estimation of GHG emissions includes the estimated equivalent climate impact of non-carbon climate forcers from aviation, such as condensation trails, ice clouds and ozone generated by nitrogen oxides and results in emission estimates approximately three times greater than if calculating only direct CO2 emissions (Atmosfair, 2016).

      Wow, they use the updated science

    1. For the moment, the main take-away is that there is a good argument that registration fees should not be set at $0, even this year. Rather, organizers should look at their existing budgets, and rework them by eliminating the costs associated with the physical event. Virtual conferences that do choose to set their prices low should be careful not to encourage an expectation that other virtual conferences (or future instances of this one) will always be free or cheap. (For example, one of the suggestions heard by the ASPLOS organizers was “keep it free;” obviously, this may not be financially sustainable.)

      I'm really glad this is mentioned, as it provides a chance to talk about actually paying presenters for their time, increasing the likelihood of new voices speaking at conferences.

  14. Mar 2020
    1. He mentioned a few examples where the AI fallacy is already playing out. One is the idea from the national statistician that we might not need to keep doing a census, because there will be lots of data from other sources. Neil pointed out that we actually need more classical statistics than ever, to verify all this machine learning and data. He calls this the “Big Data Paradox” - that as we measure more about society, we understand less. We need to be able to sanity check our large complex systems - the census is still valuable.

      The census as a calibration tool

    2. In the section on technical debt, the mythical man month, etc, I was amused to note that Neil called out Amazon's fabled "two pizza team" as American cultural imperialism. The problem arising from the separation of concerns and specialisation of teams is that no one is concerned with the whole system.

      This is such a good quote and helps explain so much.

    1. GENEVA (Reuters) - European countries need to invest to prepare their transport infrastructure for the impacts of climate change or face hundreds of millions of dollars in repair costs, a U.N. regional commission said in a study it says is the first of its kind.

      This is for transport. What about networks?

  15. www.fairphone.com www.fairphone.com
    1. GWP [kg CO2e]: 43.85 / 35.98 / -1.11 / 5.98 / 3.00 (100.0% / 82.1% / -2.5% / 13.6% / 6.8%)

      These figures here, for a smart phone from 2016 put the use phase at around 6kg of CO2 over a 3 year life cycle.

      That's fairly close to the Shift Project figures in their Lean ICT report.

    2. The following use pattern is applied for the Fairphone 2:

      This assumes similar energy use figures to the Shift Project's, of 5.9 kWh per year
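      Cross-checking the ~6 kg use-phase figure mentioned above: 5.9 kWh/year over a 3-year life, at an assumed grid intensity of ~0.34 kgCO2e/kWh (my assumption, roughly the European average of the period), lands very close to the 5.98 kg in the Fairphone table.

```python
# Assumption: grid intensity ~0.34 kgCO2e/kWh; the 5.9 kWh/yr and
# 3-year lifetime come from the annotations above.
kwh_per_year = 5.9
years = 3
grid_intensity_kg_per_kwh = 0.34
use_phase_kg = kwh_per_year * years * grid_intensity_kg_per_kwh
print(round(use_phase_kg, 1))   # ~6.0 kg, close to the 5.98 kg reported
```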

    1. The quantification of this unit impact is done in kWh/byte. Three contributions are considered:The electricity consumption associated with using the terminal on which the action is performed;The electricity consumption generated by the activity of the data centers involved in transferring the data;The electricity consumption generated by the activity of the other network infrastructures during the transfer of the data.

      Awright. So, reading this makes me think all the numbers from the 1byte model that are being cited appear to be concerned with the use phase, not the actual production phase.

  16. Feb 2020
    1. Fundamental requirement: Climate-neutral data storage and transfer

      I'm curious about how this is measured and managed.

    1. Data centres and telecommunications will need to become more energy efficient, reuse waste energy, and use more renewable energy sources. They can and should become climate neutral by 2030

      For maddie

    2. Destination Earth, initiative to develop a high precision digital model of Earth (a “Digital Twin of the Earth”) that would improve Europe’s environmental prediction and crisis management capabilities (Timing: from 2021

      Good heavens, a digital twin of Earth

    3. A circular electronics initiative, mobilising existing and new instruments in line with the policy framework for sustainable products of the forthcoming circular economy action plan

      I wonder if the restart project gang have seen this

    1. Climate change poses an unprecedented threat to humanity in the 21st century. In the period up to 2030, an estimated $3.5 trillion is required for developing countries to implement the Paris climate pledges to prevent potentially catastrophic and irreversible effects of climate change.

      Ah, that's where it comes from

    1. Methodology and sources

      The analysis of the carbon intensity of streaming video presented in this piece is based on a range of sources and assumptions, calculated for 2019 or the latest year possible.
      1. Bitrate: global weighted average calculated based on subscriptions by country and average country-level data streaming rates from Netflix in 2019; resolution-specific bitrates from Netflix.
      2. Data centres: low estimate based on Netflix reported direct and indirect electricity consumption in 2019, viewing statistics and global weighted average bitrate (above); high estimate based on 2019 cloud data centre IP traffic from Cisco and energy use estimates for cloud and hyperscale from IEA.
      3. Data transmission networks: calculations based on Aslan et al. (2017), Schien & Priest (2014), Schien et al. (2015), and Andrae & Edler (2015), and weighted based on Netflix viewing data by devices.
      4. Devices: smartphones and tablets: calculations based on Urban et al. (2014) and Urban et al. (2019), iPhone 11 specifications (power consumption and battery capacity), and iPad 10.2 specifications; laptops: Urban et al. (2019); televisions: Urban et al. (2019) and Park et al. (2016), and weighted based on Netflix viewing data by devices.
      5. Carbon intensity of electricity: based on IEA country-level and global data, and 2030 scenario projections.

      Let's roll these into the new model
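      A minimal sketch of how these sources might roll into a model: data volume times per-stage energy intensity, plus device power, all multiplied by the carbon intensity of electricity. Every coefficient below is an illustrative assumption of mine, not a figure from the methodology.

```python
# All default coefficients are illustrative assumptions, not the
# article's own numbers.
def streaming_footprint_g_co2(hours,
                              bitrate_mbps=5.0,           # assumed avg bitrate
                              kwh_per_gb_network=0.1,     # assumed network energy
                              kwh_per_gb_datacentre=0.01, # assumed DC energy
                              device_watts=10.0,          # assumed device draw
                              grid_g_per_kwh=475.0):      # global avg intensity
    gb = bitrate_mbps / 8 * 3600 * hours / 1000           # data transferred, GB
    kwh = (gb * (kwh_per_gb_network + kwh_per_gb_datacentre)
           + device_watts * hours / 1000)                 # plus device energy
    return kwh * grid_g_per_kwh                           # grams CO2e

print(round(streaming_footprint_g_co2(1)))                # grams for one hour
```

      The structure makes the sensitivity obvious: halving the network energy-per-GB or switching to a low-carbon grid moves the result far more than tweaking the device wattage.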

    1. Aristotle’s view of the process: “We are what we repeatedly do. Excellence, then, is not an act but a habit.”

      This is a really handy quote. totally borrowing it.

    2. Buying carbon offsets might still have greater impact in the short run, but you can’t see them, so their purchase is less likely to be contagious.

      I've never thought of them in terms of 'contagion' like this. interesting

    1. Today the average carbon intensity of electricity generated is 475 gCO2/kWh, a 10% improvement on the intensity from 2010.

      There is more recent data, but it's not for the whole world. Should come out around March 2020.

    1. It is relevant to estimate how much data is generated by - and associated electric power used - normal behavior like video streaming several hours every day.

      If we assume video streaming is 'normal behaviour' like web surfing and working online, then these numbers are probably the safest to use for now.

    Annotators

    1. Intensity [kgCO2eq/MWh] by technology: solar 0.00410, geothermal 0.00664, wind 0.141, nuclear 10.3, hydro 16.2, biomass 50.9, gas 583, unknown 927, oil 1033, coal 1167

      The proportions of each are going affect this.

      What would a global average (mean) figure be for renewable energy (i.e. not including nuclear)?

    2. Fig. 1. The 28 areas considered in this case study, and the power flows between them for the first hour of January 1, 2017. The width of the arrows is proportional to the magnitude of the flow on each line. Power flows to and from neighboring countries, e.g. Switzerland, are included when available, and these areas are shown in gray. The cascade of power flows from German wind and Polish coal are highlighted with blue and brown arrows, respectively.

      I had no idea Germany sold so much power, net to other countries. Always assumed it bought loads of France's power

    1. So energy consumption in the oil and gas sector, into which Google Cloud is selling its cost-reducing services, is four orders of magnitude larger than Google’s data center decarbonization efforts. The harm that Google Cloud will do to the planet, if it reduces underlying costs of this industry by even a small percent, completely dwarfs the data center decarbonization work.

      This is the first time I've seen numbers putting these into perspective. I wonder how these compare for Amazon and M$ ?

    1. Therefore, we estimate an advertising share of 50% for the traffic class web, email, and data in 2016, with an uncertainty range of [25%–75%]. The share is the same for both mobile and fixed traffic.

      Holy biscuits, HALF OF WEB TRAFFIC as ads?

    2. According to a Solarwinds company 2018 study, the average load time for the top 50 websites was 9.46 s with trackers and 2.69 s without.

      Trackers have a 4x impact on performance compared to not having them

    3. The Internet's share of the global electricity consumption was 10% in 2014 (Mills, 2013): As a reference, the entire global residential space heating in 2014 consumed the same amount (International Energy Agency, 2017a).

      Heating ALL the homes in all the world is about the same as the internet's carbon footprint according to this paper

  17. Jan 2020
    1. How to Build Data Visualizations in Excel

      Does this exist for Google Spreadsheets or Libreoffice?

    1. In particular, on the host, we propose a first configuration of a software-defined power meter that builds on a new CPU power model that accounts for common power-aware features of multi-core processors to deliver accurate power estimations at the granularity of a software process. In the VM, we introduce a second configuration of a software-defined power meter that connects to the host configuration in order to distribute the power consumption of VM instances between the hosted applications. The proposed configuration can even be extended to consider distributed power monitoring scenarios involving application components spread across several host machines.

      So, this would be the app level metering you would want.

    2. While BITWATTS is a modular framework that can accommodate different power models (including running average power limit (RAPL) probes and power meters), we propose a process-level power model, which is application-agnostic and accounts for virtualization—i.e., for emulated cores within a VM—and for the power-aware extensions of modern processors, notably hardware threading and dynamic voltage and frequency scaling (DVFS).

      Ah, so this is the key difference and the reason for the Smartwatt formula business
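
      The core move here — splitting a host-level power reading across processes in proportion to their CPU time — can be sketched in a few lines. This is a simplification, not the BitWatts implementation: the real model also accounts for DVFS, hardware threading and idle power, and the `attribute_power` function below is a hypothetical illustration.

      ```python
      def attribute_power(host_power_watts, cpu_seconds_by_process):
          # Naive proportional attribution: each process (or VM) is charged
          # a share of the host's measured power equal to its share of total
          # CPU time. BitWatts layers a richer CPU power model on top of
          # this idea; here we ignore DVFS, hardware threads and idle draw.
          total = sum(cpu_seconds_by_process.values())
          if total == 0:
              return {name: 0.0 for name in cpu_seconds_by_process}
          return {name: host_power_watts * secs / total
                  for name, secs in cpu_seconds_by_process.items()}

      # A host drawing 100 W, running two VMs with a 3:1 CPU-time split:
      shares = attribute_power(100.0, {"vm-1": 3.0, "vm-2": 1.0})
      # shares == {"vm-1": 75.0, "vm-2": 25.0}
      ```

      The same division can then be repeated inside each VM to get down to per-application figures, which is what makes the "app level metering" above possible.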

    1. the politics of the armed lifeboat

      I expect to use, and hear this term a lot in 2020, as people realise the implications of changing climates on how we live.

    1. For Apple, this may be a feature rather than a bug: Documents obtained by Motherboard in 2017 revealed that the company requires its recycling partners to shred iPhones and MacBooks so that their components cannot be reused, further reducing the value recyclers can get out.

      not great for a parts market

    2. So, Apple has started collecting that scrap, melting it down and forming new hunks of aluminum that can be used to carve more gadget husks.

      Wait, this is the recycling?

    1. In February 2018, she published a report — titled Mission-Oriented Research & Innovation in the European Union — that defined five criteria missions should obey: they must be bold and inspire citizens; be ambitious and risky; have a clear target and deadline (you have to be able to unambiguously answer whether the mission was accomplished to deadline or not, Mazzucato says); be cross-disciplinary and cross-sectorial (eradicating cancer, for example, would require innovation in healthcare, nutrition, artificial intelligence and pharmaceuticals); and allow for experimentation and multiple attempts at a solution, rather than be micromanaged top-down by a government.
      1. bold and inspiring to citizens
      2. ambitious and risky
      3. clear target and deadline
      4. cross-disciplinary and cross-sectorial
      5. allow for multiple approaches and experiments
    2. Mazzucato traced the provenance of every technology that made the iPhone. The HTTP protocol, of course, had been developed by British scientist Tim Berners-Lee and implemented on the computers at CERN, in Geneva. The internet began as a network of computers called Arpanet, funded by the US Department of Defense (DoD) in the 60s to solve the problem of satellite communication. The DoD was also behind the development of GPS during the 70s, initially to determine the location of military equipment. The hard disk drive, microprocessors, memory chips and LCD display had also been funded by the DoD. Siri was the outcome of a Stanford Research Institute project to develop a virtual assistant for military staff, commissioned by the Defense Advanced Research Projects Agency (DARPA). The touchscreen was the result of graduate research at the University of Delaware, funded by the National Science Foundation and the CIA.

      This paragraph.

  18. Dec 2019
    1. Just this month it was announced that over half of the power plants operated by China’s Big Five state-owned utilities are running at a loss. The government has plans for up to one third of them to shut by 2021, removing 15% of the country’s coal capacity.

      🤯

    2. No matter that there is no single robust methodology: internalizing Carney’s three types of climate risk – physical, liability and transition – will inevitably work through to large-scale asset disposals and new reinvestments; it will without doubt affect the cost of capital of high-carbon businesses.

      This taxonomy of risk sounds useful to talk about.

    3. No single “sneeze” will wipe out fossil fuel use across energy and transport; It will occur sector by sector, country by country. Over the past six years, LED light-bulbs have gone from less than 5% global market share to over 40%; coal power in the U.K. from 40% to a couple of percent; plug-in vehicles in Norway from less than 5% to over 50%. In each case, there was a slow start, an agonizing wait, and then the sneeze. Bless you!

      Good reference for explaining a punctured equilibrium

    4. The average capacity factor of the world’s hydro plants is 42%; gas peaking plants 15%. Even so-called baseload coal plants run on average only 54% of the time. If technology is cheap, and demand or supply are intermittent, we overbuild. Wind and solar are no different.

      Cripes. I didn't know that even baseload coal runs so infrequently. From the public discourse you'd assume it was 24/7

    5. economic drag caused by resource waste

      I've never heard of this, but I assume it's basically shifting from saying "we'd grow faster if only we had another two power stations" to "we'd grow faster if we used energy efficiently enough to create the effect of having two more power stations' worth of useful energy"

    1. THE DESIGN OF BROWSING AND BERRYPICKING TECHNIQUES FOR THE ONLINE SEARCH INTERFACE by Marcia J. Bates

      Seminal paper on how we navigate online with search

    1. Starting with a small cohort, we’ll refine the patterns and practices that sustain learning. We will see case studies of how the organizing model is made locally relevant. Globally, we will be a large peer learning community. And we’ll also have smaller, local peer learning communities connected and experimenting in a place.

      This is very applicable in CAT. Get a small core to make the ground game.

      lol… ground CATs

    2. For our community that teaches the web, we understand ourselves as participating in i) a global network and ii) a local context.

      What is the narrative for CAT like this?

    1. The general argument I see in this paper is largely:

      • most companies end up reporting CO2 emissions from energy use misleadingly, because the standards allow them to claim credit using a financial instrument that's not proven to be very effective.
      • worse, reporting like this causes double counting elsewhere, as other companies rely on an average figure that doesn't take these RECs into account
    2. However, to be accurate and relevant, GHG inventories must reflect the emissions caused by the reporting entity. The fundamental issue with the contractual method is that it does not represent any causal relationship between the reporting entity and the emissions reported.

      OK, so this is a little bit like how a lagging indicator, reported at a low enough resolution, won't provide enough of a signal to actually drive changes in behaviour for an org or team

    3. The distinction between attributional and consequential GHG accounting is well understood in some fields, such as life cycle assessment (LCA), but much less so in the area of corporate GHG accounting, which has traditionally used attributional accounting (Brander et al., 2015).

      Attribution and consequential accounting. Need to research these phrases

    4. However, in reality, contractual emission factors represent a market failure, because the implied goal (actual reduction of emissions to the atmosphere) is in fact not delivered by contractual emission factors (as discussed in Section 2), and therefore the approach has only the appearance, and not the substance, of a market-based solution.

      Eep.

    5. Further, for Company A's use of contractual emission factors not to lead to double-counting of the claimed renewable attributes, the method requires all other reporting entities to also apply a ‘residual grid mix’ emission factor. As this emission factor would be higher than the grid average, due to removing some renewable generation from the calculation, Company B's performance is again made to look worse, although its actual contribution to reducing emissions is greater than Company A's.

      So if they're claiming the market-based figures here, the average figures on the grid would need to look dirtier for everyone else, but they don't.

    6. Electricity generation accounts for approximately 25% of global greenhouse gas (GHG) emissions, with more than two-thirds of this electricity consumed by commercial or industrial users

      More than two thirds of that electricity goes to commercial and industrial users - useful figure when talking about smart meters in the home

    7. This issue is highly topical as recently published reporting guidance from the GHG Protocol (WRI, 2015) has endorsed the market-based approach, while the forthcoming update of ISO 14064-1 for corporate GHG inventories provides an opportunity to establish a more robust approach.

      This was last year. I wonder if the guidance has been watered down, or made stronger.

      Recent reporting suggests not.

    8. The GHG Protocol's Scope 2 Guidance, published in 2015, requires that companies use both the locational grid average method and the market-based method to report scope 2 emissions (i.e. dual reporting). However, the guidance also allows companies to choose a single method for meeting their reduction targets and for reporting their supply chain emissions (WRI, 2015). The same guidance has been adopted by CDP, formerly the Carbon Disclosure Project (CDP, 2016a).

      This doesn't seem clear. This means that for reporting, they need both - if this is the case then companies that publish only one are basically not following the guidance properly.

    9. Moreover, in many countries the amount of renewable generation is increasing due to the other drivers, such as government subsidies (IEA, 2016b), and therefore the point at which additionality might be achieved (i.e. beyond Q1) is continually advancing further beyond the reach of voluntary market demand for contractual emission factors.

      So, they're changing anyway, and the RECs don't have much impact.

    10. ET Index Research data, which includes 2000 of the world's largest listed companies, shows 97 companies using the market-based accounting method to report lower emissions, equating to 22.2 million tCO2e/yr.2 This approximates to ~ 1% of globally available renewable electricity generation in 2015,3 and therefore demand for contractual emission factors would need to increase a hundred-fold to reach the existing supply threshold for renewable attributes (which is continually increasing anyway), and only once above that threshold would demand cause a fractional increase in renewable generation.

      We're massively under reporting corporate CO2 emissions from energy use, and the majority of companies are basically hiding the real figures

    1. New School’s Digital Equity Lab

      I had no idea this existed! Greta sounds like a cool cat

    2. Chris Adams, a web designer and climate activist in Berlin, tells me he thinks a green internet must be free of advertising. “Ninety percent of a web page being ads requires servers, and those servers are taking electricity, and that electricity is generated by burning coal,” he says. Adams has written that the European site for USA Today is a model of efficiency. It removed all of its tracking scripts and ads to be compliant with recent General Data Protection Regulation legislation in the European Union. The site size immediately shrank from 5 megabytes to 500 kilobytes, but it still basically looks the same—there are just no ads.

      I don't think the internet should be free of all advertising, and I have never said this. I'm not sure it CAN be.

      That said, not all advertising is equal - some is really scummy, invasive stuff that can get in the sea, but to say I think this feels like a reach.

    1. Describing the move as an industry first, Repsol said it wanted to lead a wider transition to renewable energy, in line with the goals of the 2015 Paris Agreement to avert catastrophic climate change.

      Oil companies are getting out of oil, and still people frame access to energy in terms of getting cheap oil out of the ground.

    2. Giving extra impetus to its new goals, Repsol will link at least 40% of managers’ long-term variable pay to its emissions reduction targets.

      HOLY BISCUITS

    1. Build on narrative that integrates the two approaches, probably building on circularity (be part of the coming standard) and efficiency of services (e.g. the example of interface not knowing their impact in services besides having environmental product declarations for all their products).

      Key thing. Inward sustainability focus misses the big picture


    1. In short, Nordhaus, who is mentioned both in my 2007 and 2012 pieces, tells us not to worry too much about climate change. It will be cheaper to adapt to it than to prevent it or slow it down.

      Oh jeez, so THIS is where that adaptation vs mitigation meme came from. In the context of what research gets funded, why he got the Nobel prize is making more sense now.

    1. Basically, all services in public administration will move into the cloud, which means that a huge amount of data will be digitalised. This has huge implications for the security of the data (which can quickly translate into national security matters). Most of the services are now provided by companies outside Europe, which raises questions about data sovereignty, ownership, transparency, etc. Also, as recently proposed by Germany (GAIA-X project), the European Union should regard the cloud as a critical infrastructure which should be developed within Europe.

      eeep


    1. Almost a third of UK boards (32%) feel little or no responsibility for the climate crisis, according to a global survey of 640 chairs and non-executive directors by recruitment firm Harvey Nash and London Business School’s Leadership Institute.

      😬

  19. Nov 2019
    1. The emissions from one return ticket from London to New York are roughly equivalent to that of heating a typical home in the EU for a whole year (European Commission, 2019).

      Holy balls. I've never seen it compared in those terms before.

    1. Those instances of decoupling that can be observed (like the UK or Germany in past years) result mainly from deindustrialisation and the outsourcing of energy-intensive industrial production to other countries.

      I haven't found a good rebuttal to this. I'd love to find one.

    1. energyusage.evaluate(exp, 10)

      So, you work out the figures by passing the function and the params for it.
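
      That pattern — wrap the function call and measure around it — can be sketched with the standard library alone. The `evaluate` helper and `watts` constant below are hypothetical stand-ins for illustration; the real `energyusage` package samples actual hardware power rather than assuming a constant draw.

      ```python
      import time

      def evaluate(func, *args, watts=25.0):
          # Hypothetical stand-in for energyusage.evaluate: run func(*args),
          # time it, and estimate energy as elapsed time multiplied by an
          # assumed average power draw. The real library reads hardware
          # sensors instead of using a fixed `watts` value.
          start = time.perf_counter()
          result = func(*args)
          elapsed = time.perf_counter() - start
          kwh = watts * elapsed / 3_600_000  # watt-seconds -> kWh
          return result, kwh

      def exp(n):
          # The workload to be measured, mirroring the `exp` in the quote.
          return 2 ** n

      result, kwh = evaluate(exp, 10)  # result == 1024
      ```

      The point is the calling convention: the meter takes the function and its arguments, so it controls exactly when the workload starts and stops and can attribute the measurement to just that call.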

    1. The questions within this Guide are organized into three relevant impact areas to assist with this evaluation: Environmental Practices and Policies of the Cloud Service Providers; Data Center Facility Management and Equipment; and Data Center Power Sources.

      So, basically:

      • policy
      • how they provision capacity
      • how they source power
    2. In the Fall of 2017, GEC sponsored an Arizona State University (ASU) graduate capstone project that found that Cloud Service Providers’ publicly available sustainability data was limited, and that the sustainability metrics and terminology used by those providers were inconsistent and confusing. The ASU research findings assisted in the development of this Guide.

      So, basically every group say it's a trainwreck right now, transparency-wise.

    3. We meet our mission by supporting institutional purchasers in leveraging their purchasing power forsustainableproductsand servicestoadvance the market for those products.

      Procurement as a lever, then.

    1. easyJet’s aircraft carbon emissions in the 2018 financial year were 7.6 million tonnes, compared to 7.1 million tonnes in the 2017 financial year. easyJet’s calculation of emissions is based on fuel burn measurement, which complies with the EU’s Emissions Trading System requirements.

      7.6m tonnes in the 2018 financial year. That's about 3 and a half Googles.

    1. More figures extracted from the EURECA project, 2018. 40% of servers in public sector data centres were over 5 years old. These performed 7% of compute but used 66% of power.

      Wow, that's a stat.

    2. UK Government departments are required to report energy usage and explain how they are reducing their respective carbon footprints and this obligation to understand and report scope 3 emissions is likely to become more widespread, given UK government commitments to climate change targets and increasing observance of UN Sustainable Development Goals (SDGs). It is also a requirement of the Science Based Targets methodology.

      Ah, it's a requirement of the Science Based Targets methodology. So any company signing this will need to ask the questions.
