149 Matching Annotations
  1. Mar 2024
    1. Given that we historically didn't release many majors, some people have started to colloquially call "Yarn 2" everything using this new codebase, so Yarn 2.x and beyond (including 3.x). This is incorrect though ("Yarn 2" is really just 2.x), and a better term to refer to the new codebase would be Yarn 2+, or Yarn Berry (which is the codename I picked for the new codebase when I started working on it).
  2. Feb 2024
    1. Yes, but to what version? A patch version only, e.g. you released 1.0.0, so the "next" version is 1.0.1? Why not 1.1.0? You don't know ahead of time what version you'll be releasing until it's actually released.
    2. The increment-after-release model makes sense for branching too. Suppose you have a mainline development branch, and you create maintenance branches for releases. The moment you create your release branch, your development branch is no longer linked to that release's version number. The development branch contains code that is part of the next release, so the version should reflect that.
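
      A hedged git sketch of that model (version numbers, branch names, and the VERSION file are illustrative, not from the original discussion):

      ```sh
      # Release 1.4.0 from the mainline, then cut a maintenance branch for it.
      git tag v1.4.0
      git checkout -b release-1.4 v1.4.0

      # Back on the mainline, bump the working version so new commits are
      # clearly part of the next release rather than of 1.4.x.
      git checkout main
      echo "1.5.0-dev" > VERSION
      git add VERSION
      git commit -m "Start 1.5.0 development"
      ```
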
    1. Read [[Martha S. Jones]] in A New Face for an Old Library Catalog

      Discussion on harmful content in library card catalogs and finding aids.

      The methods used to describe archive material can not only be harmful to those using them, but they also provide a useful historical record of what cataloguers may have been thinking contemporaneously as they classified and organized materials.

      This is another potentially useful set of information to have while reading into historical topics from library card catalogs compared to modern-day digital methods.

      Is anyone using version control on their catalogs?

  3. Oct 2023
    1. Take “soul” in the KJV’s Psalm 23: “The Lord is my shepherd […] He restoreth my soul.” Alter, who has by now become famous for taking the soul out of the Hebrew Bible, gives us: “The Lord is my shepherd […] My life He brings back.” Where has the soul gone? The answer is that the Hebrew didn’t really provide it in the first place. The word “nefesh” is more concrete, meaning “breath,” “life-breath,” “essential self,” and also “throat.” It suggests the material, the bodily, or, as the biblical scholar James Barr put it, “is not a separate essence and is more like the principle of life animating the person, acting in his actions, and touched by that which touches him.”
  4. Aug 2023
  5. Jul 2023
    1. ```js
      // Log the full user-agent data
      navigator.userAgentData
        .getHighEntropyValues([
          "architecture", "model", "bitness",
          "platformVersion", "fullVersionList"
        ])
        .then(ua => { console.log(ua) });

      // output
      {
        "architecture": "x86",
        "bitness": "64",
        "brands": [
          { "brand": " Not A;Brand", "version": "99" },
          { "brand": "Chromium", "version": "98" },
          { "brand": "Google Chrome", "version": "98" }
        ],
        "fullVersionList": [
          { "brand": " Not A;Brand", "version": "99.0.0.0" },
          { "brand": "Chromium", "version": "98.0.4738.0" },
          { "brand": "Google Chrome", "version": "98.0.4738.0" }
        ],
        "mobile": false,
        "model": "",
        "platformVersion": "12.0.1"
      }
      ```

    1. ```idl
      dictionary NavigatorUABrandVersion {
        DOMString brand;
        DOMString version;
      };

      dictionary UADataValues {
        DOMString architecture;
        DOMString bitness;
        sequence<NavigatorUABrandVersion> brands;
        DOMString formFactor;
        sequence<NavigatorUABrandVersion> fullVersionList;
        DOMString model;
        boolean mobile;
        DOMString platform;
        DOMString platformVersion;
        DOMString uaFullVersion; // deprecated in favor of fullVersionList
        boolean wow64;
      };

      dictionary UALowEntropyJSON {
        sequence<NavigatorUABrandVersion> brands;
        boolean mobile;
        DOMString platform;
      };

      [Exposed=(Window,Worker)]
      interface NavigatorUAData {
        readonly attribute FrozenArray<NavigatorUABrandVersion> brands;
        readonly attribute boolean mobile;
        readonly attribute DOMString platform;
        Promise<UADataValues> getHighEntropyValues(sequence<DOMString> hints);
        UALowEntropyJSON toJSON();
      };

      interface mixin NavigatorUA {
        [SecureContext] readonly attribute NavigatorUAData userAgentData;
      };

      Navigator includes NavigatorUA;
      WorkerNavigator includes NavigatorUA;
      ```

  6. Jun 2023
    1. There are now about 22,000 contributors to the site, which charges between $1 and $5 per basic image

      This reminds me of the article "Wikipedia and the Death of an Expert" and how many volunteers also run Wikipedia. I inserted an article that mentions how many active editors there are on Wikipedia so we can compare the similarities in contributors.

  7. May 2023
    1. This doesn't make any sense, though. Once you recognize that the two may represent different addresses, you're arbitrarily choosing the first one in your system as the right one, when the second one is just as right. Just give up at that point and lowercase ’em.

      which one should be considered the correct one?

  8. Apr 2023
    1. The Answer to the Original Issue

      This is the pinning solution.

    2. If channels were to be removed, it would be clearer to everybody, including new users, what the real use and power of nix is, and all documentation would go straight to pinning. Then, only after understanding this, can people write higher level abstract tools to provide an auto-updating "channel" like interface to the underlying pinning concept.

      great channels vs pinning summary

    3. One effect of this is that you get reproducibility. Note that this is not binary reproducibility, since it's still possible for the compilation of code to give different resulting binaries. But it is reproducibility within the context of the Nix universe.
    4. No, pinning isn't about which channels you get. Pinning pins to a content-addressed commit hash of nixpkgs, or any nixpkgs, even your own fork. Channels are orthogonal, probably a mistake, and should be removed.

      Wonder what the consensus on channels is nowadays?
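
      A minimal sketch of what pinning looks like in practice (the commit placeholder is hypothetical; a real pin would name an exact nixpkgs revision):

      ```sh
      # Point Nix at an exact nixpkgs commit instead of a channel, so every
      # evaluation sees the same package set regardless of channel updates.
      nix-shell -p hello -I nixpkgs=https://github.com/NixOS/nixpkgs/archive/<commit-hash>.tar.gz
      ```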

    5. What's "pinning"? Is this ever explained to a new user? Where? (Yes, I understand what it is now, but new users probably will not, meaning it doesn't exist to them.) Your packages never get bug-fix updates Your packages never get security updates Broken package expressions never get fixed. If you need to update one package, you must update them all. This can be seen as either consistency, or limitation. It depends on the use case.

      Again, the entire comment is great.

    6. The main thing I want to accomplish is to distinguish between the versions of packages, and the versions of derivations.
    7. What I want is to let a channel have several derivations for a package, each with its own version. This is current behavior. I then want my derivation to depend on a specific package name and version. Since versions are currently put into some package names, this is current behavior, though it would be cleaner to just use separate names. I then want my derivation to pattern match a range of versions. This is not current behavior. The channel would then provide the package with the greatest version number it has defined that matches the provided range.

      Another good one.

    8. To summarize what I have read here (correct me if I am wrong), @mayhewluke described some real drawbacks of how derivations are currently implemented:

      • Derivations cannot use a specific version of a package.

      • Derivations are limited to a very small subset of real versions for dependencies.

      This entire comment is gold

    9. You always have the option of adding missing versions of certain packages to your local database by means of an override as described in http://nixos.org/nixpkgs/manual/#how-to-create-nix-builds-for-your-own-private-haskell-packages. You can register aeson-0.8.1.1 in your copy of Nixpkgs without needing to change the Nixpkgs git repository at all.

      what

    10. So what happens if I have to fix a bug in an old project that was using 0.8.1.1? In that particular case, Nix won't help you and you are better off using a cabal-install sandbox or a stack build.

      Is this still the case?

    11. To be certain, you can just copy those by nix-copy-closure. Note that version numbers of direct dependencies don't contain all the information at all.
    12. We keep multiple versions in nixpkgs only when there's a good reason to. Nix is able to handle any number of versions/configurations, but on the other hand it's much more convenient when all (or most) use just a single one. It leads to better sharing of the effort in many respects: simplified maintenance, testing, sharing of the binaries, etc. It's what most distros do. (Only gentoo diverges from the big ones I know, and they pay a price for it.) When we do create more variants, we just name them (attribute paths), e.g. gcc48 and gcc49 or ffmpeg and ffmpeg-full.

      mull this over

  9. Jan 2023
    1. make dev

      I am getting this error while running make dev. I have installed the required version of Python; still, please update with the required version.

      File "/home/ec2-user/projects/h/h/search/config.py", line 213, in _ensure_icu_plugin

      names = [x.strip() for x in conn.cat.plugins(h="component").split("\n")]

      AttributeError: 'list' object has no attribute 'split'

    1. Hints for Preparing Documents

      Most documents go through several versions (always more than you expected) before they are finally finished. Accordingly, you should do whatever possible to make the job of changing them easy. First, when you do the purely mechanical operations of typing, type so subsequent editing will be easy. Start each sentence on a new line. Make lines short, and break lines at natural places, such as after commas and semicolons, rather than randomly. Since most people change documents by rewriting phrases and adding, deleting and rearranging sentences, these precautions simplify any editing you have to do later. — Brian W. Kernighan, 1974

      —Brian W. Kernighan, "UNIX for Beginners" [PDF], published as Bell Labs Technical Memorandum 74-1273-18 on 29 October 1974.

      For easier editing and reuse of sentences, or even portions of lines of text, one can (and should) write sentences or sentence fragments on their own lines in digital contexts.

      This way, future edits and cutting and pasting will be far easier, in addition to keeping your version control files simpler, easier to read, and easier to track visually. (That is, in many version control systems, instead of a change appearing to affect an entire paragraph, it will only show on the single line that was changed, thereby making the change easier to see.)

      This particular affordance may be especially useful for note takers who expect to regularly reuse their notes in other contexts. Many forms of software (including TeX, LaTeX, and even Markdown) will rewrap lines so that a sentence broken up into clauses on multiple lines flows back into a proper-looking single line when printed. Take care that in many Markdown variants, adding two spaces at the end of a line will create a hard line break in your text.
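
      A hedged shell sketch of the version-control benefit (file name and sentences are hypothetical): with one sentence per line, rewriting a single sentence shows up as a one-line change rather than a rewrapped paragraph.

      ```sh
      # Hypothetical demo repository with one sentence per line.
      mkdir demo && cd demo && git init -q
      printf '%s\n' \
        "Most documents go through several versions before they are finished." \
        "Start each sentence on a new line." \
        "Break lines at natural places, such as after commas and semicolons." > notes.md
      git add notes.md && git commit -qm "first draft"

      # Rewrite one sentence; git reports a single changed line.
      sed -i 's/several versions/more versions than you expected/' notes.md
      git diff notes.md
      ```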

  10. Nov 2022
    1. Fifty years ago, coinciding with the centennial of the release of Darwin’s manuscript, author Morse Peckham collected all six editions into a single “variorum” text. Peckham painstakingly created a reference system that denotes the modifications and changes between editions. The text was created by Peckham’s careful enumeration of every sentence from every edition, copied onto index cards; from these cards, he carefully assembled them into a final text.
    1. We can also cultivate alternative narratives of what technological progress or improvement feels like. We can investigate, for instance, how much inconvenience users will accept in return for getting their digital objects in less carbon intensive ways, which users they are, and how that matters. We can explore how we got here in the first place: we can critique concepts like ‘user’ and ‘convenience’ from an ecocentric standpoint. We can investigate alternative futures of work, in the tech and tools that we develop and study, and in the ways we ourselves work. And we might – and probably should – do other things, and ask other questions, we haven’t yet thought of. Our ambitions cannot be just about how we conduct our research, or practices adjacent to it: our research can play its part. And to do that will require us to shift our points of inquiry.

      It is also important to look critically, through alternative narratives, at what technological progress implies: to investigate whether users and citizens are willing to accept digital objects that are lower in carbon use; to understand the expectations and motivations of those users of digital artifacts, archives, or objects; and to investigate future forms of work, both in the tools and technologies and in the methodologies that we use as digital humanists.

      And we should surely ask ourselves other questions and do other things. Our ambitions or interests cannot be only about how we conduct our research, or about adjacent practices: our research can play its part in this context of climate crisis. And doing so will require us to shift our points of view as a DH community.

    2. As a community we have responded. DH practitioners are advancing knowledge of historic and ongoing environmental oppression (e.g. Schuyler Esprit and Oonya Kempadoo’s Carisealand and Minimal Computing). At the same time, our communities can address the environmental emergency, and do so in ways that are feminist, anti-colonial, and aligned with diverse flourishings of the human and more-than-human world. We can resist the perceived ethereality of the digital, when the reality is stuff, power, and pollution. We can highlight the power consumption of some forms of play (e.g. a GPU attached Colab notebook) and to advocate for mitigations (e.g. energy-efficient programming practices). We might even ask whether our power consumption – set against all power consumption – even matters at all, is even worth worrying about, irrespective of the optics of us declaring that it doesn’t matter (Whitmarsh et al., 2021).

      As communities we have responded. Digital humanists are advancing knowledge of historic and ongoing environmental oppression and colonization (for example, Schuyler Esprit and Oonya Kempadoo's Carisealand and Minimal Computing).

      At the same time, our communities are approaching the environmental emergency from a feminist, anti-colonial perspective aligned with a more humane world. The digital is resisted as merely ephemeral or ethereal and is instead understood as material, power, and pollution. We have to point out both the forms of consumption (for example, a GPU attached to a Colab notebook) and the mitigation strategies (for example, programming practices that are not energy intensive). These practices even lead us to ask whether our energy consumption matters within the framework of overall consumption, despite the logics that argue otherwise (Whitmarsh et al., 2021).

    3. The Research We Do Like all academic disciplines, DH research has a particular environmental impact. Not the kind of impact typical of energy-intensive, high performance computing research elsewhere on campus, but intensive nevertheless: we buy new machines, we build infrastructures that host large digital objects, we run queries over large datasets. DH also works adjacent to fields whose research relates to environmental crises: to science and technology studies scholars concerned with their relations to L/land (Liboiron, 2021); to archivists describing how their approaches to digital preservation are environmentally unsustainable (Pendergrass et al., 2019); to artificial intelligence ethics scholars investigating the intersectional harms of large language models, some of whom have been fired by big tech for speaking out, others of whom only feel able to speak anonymously (Bender et al., 2021); to historians quantitatively analysing the disinformation tactics of big oil (Supran and Oreskes, 2021). Like much of this work, DH is increasingly – or has ambitions to be, or presents itself as – anti-colonial and feminist. At the very least it should be, given that it has been seven years since Bethany Nowviskie asked us, in an anti-colonial and feminist spirit, ‘What is the place of digital humanities (DH) practice in the new social and geological era of the Anthropocene?’ (Nowviskie, 2014).

      What research do we do from within DH?

      Like every academic discipline, DH has a particular environmental impact. Perhaps not one as energy-intensive or high-performance as that of computing, but intensive nonetheless: we buy new machines, and we design digital infrastructures to handle large quantities of data and objects. In parallel, DH works close to research fields related to the environmental crisis: science and technology studies concerned with land and the colonialism of pollution (Liboiron, 2021); archivists concerned about the environmental impact of digital preservation practices (Pendergrass et al., 2019); researchers analysing the intersection of AI and the harms of large models, some of whom have even been fired from their jobs for pointing out these risks (Bender et al., 2021); and historians investigating the disinformation practices of the big oil companies (Supran and Oreskes, 2021). In this context, DH aspires to be, or presents itself as, decolonial and feminist. At the very least it should already be so, since more than seven years have passed since Bethany Nowviskie asked us, 'What is the place of digital humanities (DH) practice in the new social and geological era called the Anthropocene?' (Nowviskie, 2014).

    4. As humanities researchers, it is also our role to probe the values, the power structures, and the future imaginaries that underpin sustainable solutions. Given, especially, the immense and monopolistic power wielded by the global tech sector, and the critiques of this power that are part of DH, our use of their resources should be informed by the ways corporate economic, cultural, and scientific power perpetuates and exacerbates the crisis. Choosing a hardware or hosting provider, for example, should mean considering direct environmental impacts, broader environmental policies and record of the provider, and more broadly still, the kinds of collective future that such a collaborative encounter presupposes. We should be able to candidly explore the complex and sometimes contradictory nature of our ecological impact: we should be able to measure and model where possible, while also creating context around our measurements, flagging uncertainties, and advocating for transforming wider conditions. This would require taking a step back from detailing the environmental impact of a resource, to ask whether the activities supported by that resource really serve ecological and social justice.

      As researchers in the humanities, our role is also to surface the values, the power structures, and the future imaginaries that foster other, more sustainable solutions. Given the immense and monopolistic power of the global tech sector, and the critiques of that power made from within DH, we cannot avoid acknowledging the impact of our use of technological resources. Otherwise we would be exacerbating the environmental crisis by reproducing corporate economic, cultural, and scientific logics. When choosing one piece of hardware or web hosting over another, we must make the effort to consider the direct environmental impact, the broader environmental policies and "reputation" of the provider or manufacturer, and the collective future consequences that these choices bring with them. We must be able to explore the complex and contradictory nature of the environmental impacts of our decisions, while at the same time creating the right conditions for new measures, amid uncertainty, that support transformation. All of this should help us reflect beyond the environmental impact of using a resource and instead ask whether the practices behind the resource are themselves socially just and ecological.

    5. Some Context The digital is material. As digital humanists, every project we create, every software application we use, every piece of hardware we purchase impacts our environment. In this document we aim to surface the ecological impacts of our work while learning with and from our DH community about ways to reduce harm to the environment and to the people most impacted by environmental injustices.

      Context. The digital is material. As humanists, we must consider that every project we create, every piece of software we use, every piece of hardware we buy has an impact on the environment. The aim of this document is to put up for discussion the ecological impacts of our work while we learn with and from our DH community about possible ways to reduce the harm caused to the environment and the environmental injustices suffered by the most affected communities.

    6. Preamble This manifesto emerged from a collective desire to foreground the climate crisis within digital humanities work. We are a group of digital humanists, in varying positions and career stages, from the Caribbean, Europe, and the United States, working within well-resourced academic institutions. We know that as individuals and as a community we contribute to the climate crisis. We believe that with our world in the midst of vast and borderless catastrophe, digital humanists have a responsibility to act.

      Preamble. This manifesto arises from a collective interest in making the climate crisis visible within the framework of the Digital Humanities. We are a group of digital humanists, coming from diverse places and academic paths, from the Caribbean, Europe, and the United States, working at institutions where resources are not scarce. We know that as individuals and as a community we are contributing to this climate crisis. We believe that, in a devastated world, as digital humanists we have a responsibility to act.

    1. For example, if using apt to install the main program for the image, be sure to pin it to a specific version (ex: ... apt-get install -y my-package=0.1.0 ...)
    2. Rebuilding the same Dockerfile should result in the same version of the image being packaged, even if the second build happens several versions later, or the build should fail outright, such that an inadvertent rebuild of a Dockerfile tagged as 0.1.0 doesn't end up containing 0.2.3.
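
      A hedged shell sketch of that practice (image names and version numbers are illustrative): pin the exact tag you build from and tag the result with the version it actually contains, so a later rebuild either reproduces the same image or fails loudly.

      ```sh
      # Pull an exact, immutable tag rather than a moving one like :stable.
      docker pull docker:19.03.8

      # Build from pinned inputs and tag the output with its own version,
      # so a rebuild of "0.1.0" cannot silently turn into "0.2.3".
      docker build -t my-image:0.1.0 .
      ```
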
  11. Aug 2022
  12. Mar 2022
    1. https://en.wiktionary.org/wiki/idle_hands_are_the_devil%27s_workshop

      Proverbs 16:27 "Scoundrels concoct evil, and their speech is like a scorching fire." (Oxford, NSRV, 5th Edition) is translated in the King James version as "An ungodly man diggeth up evil: and in his lips there is as a burning fire." The Living Bible (1971) translates this section as "Idle hands are the devil’s workshop; idle lips are his mouthpiece."

      The verse may have inspired St. Jerome to write "fac et aliquid operis, ut semper te diabolus inveniat occupatum" (translation: "engage in some occupation, so that the devil may always find you busy.”) This was repeated in The Canterbury Tales which may have increased its popularity.

    1. Here you can download despacito indian version song

      despacito indian version song download is a song by Puerto Rican entertainer Luis Fonsi highlighting Puerto Rican rapper Daddy Yankee from Fonsi's 2019 studio grouping Vida. followed through on 12 January, 2017, the song was framed by Fonsi, Erika Ender and Daddy Yankee, and complete by Mauricio Rengifo and Andrés Torres. A remix side highlighting Canadian Justin Bieber was out on April 17, 2017, which helped with getting top the song's chart appearance in changed countries, including an assortment of number-one positions. despacito indian version song download has been generally perceived by music include authors as being instrumental in pushing Spanish-language notable music in the standard market once more.

      It is a reggaeton and Latin pop song made for the most part time with lines about requiring a bond acted in a delicate and adoring manner. despacito indian version song download as a rule get exceptional investigations from music scholarly people, who endorsement the blend among Latin and metropolitan beat, its overwhelming quality, and its substance picture. It has gotten Latin Grammy Grants for Record of the Year, musci of the Year, top Metropolitan Combination/show, and top little Structure Music tape at the eighteenth Latin Grammy Grants. despacito indian version song download has been besides situated among the top songs ever and the top songs of 2017 by different transport, which hinted it as perhaps the most alluring Spanish-discourse tracks in notable tune record.

      despacito indian version song download beat the outline of 47 nations and displayed at the major ten of 6 others. In the US, it changed into the key music successfully in Spanish to top the Board Hot 100 as the time Los del Río's "Macarena" in 1996, coming about to tying the longest-overwhelming first on the Announcement Hot 100 at the time with around four months, comparatively as changing into the best running first on the Hot Latin Songs list with 56 weeks. despacito indian version song download moreover changed into the truly Latin song to get a diamond demand by the Recording creation Relationship of usa. The song video shows the two gifted laborers playing out the song in La Perla district of Old San Juan, Puerto Rico and general bar La Factoría. despacito indian version song download was the top-saw YouTube video ever from August 2017 to November 2020 and changed into the top video on the page to appear at the accomplishment of three, four, five, six, and seven billion perspectives.

      despacito indian version song download

    2. Get despacito indian version

      despacito indian version is a song by Puerto Rican entertainer Luis Fonsi highlighting Puerto Rican rapper Daddy Yankee from Fonsi's 2019 studio collection Vida. A remix side highlighting Canadian Justin Bieber was out on April 17, 2017, which helped with getting top the tune's format appearance in changed countries, including an assortment of number-one positions. despacito indian version has been generally perceived by music scholars as being instrumental in supporting Spanish-language notable music in the standard market once more.

      It is a reggaeton and Latin pop song made as a rule time with lines about requiring a bond acted in a delicate and adoring manner. despacito indian version usually get inconceivable surveys from music intellectuals, who acclaim the mix among Latin and metropolitan beat, its overpowering quality, and its substance picture. It has gotten Latin Grammy Grants for Record of the Year, Song of the Year, top Metropolitan Combination/show, and Best little Structure Music Video at the eighteenth Latin Grammy Grants. despacito indian version has been in like way situated among the top tunes ever and the top tunes of 2017 by different scattering, which suggested it as perhaps the most winning Spanish-language tracks in notable music record.

      despacito indian version beat the outline of 47 nations and displayed at the major ten of 6 others. In the US, it changed into the central music basically in Spanish to best the Board Hot 100 since the time Los del Río's "Macarena" in 1996, coming about to tying the longest-overall first on the Announcement Hot 100 at the time with around four months, also as changing into the best running first on the Hot Latin Songs list with 56 weeks. despacito indian version in addition changed into the standard Latin tune to get a jewel demand by the Recording creation Relationship of usa. The tune video shows the two specialists playing out the song in La Perla region of Old San Juan, Puerto Rico and general bar La Factoría. despacito indian version was the top-saw YouTube video ever from August 2017 to November 2020 and changed into the top video on the site page to appear at the achievement of 3, four, 5, six, and seven billion perspectives.

      despacito indian version https://sufiscore.com/artist/paras-nath/

  13. Jul 2021
  14. Jun 2021
  15. May 2021
    1. This post was originally published on my blog in French and on the Litmus forums in June 2015. It was updated with information about support in the new Outlook Web App in January 2016.
    1. The command nix-shell will build the dependencies of the specified derivation, but not the derivation itself. It will then start an interactive shell in which all environment variables defined by the derivation path have been set to their corresponding values, and the script $stdenv/setup has been sourced. This is useful for reproducing the environment of a derivation for development.

      QUESTION: What exactly does nix-shell execute from the Nix expression (i.e., shell.nix, default.nix, etc.)?

      ANSWER: Based on my current understanding, the answer is everything. It calls $stdenv/setup (see annotation below) to set up the most basic environment variables (TODO: expand on this), and "injects" the most common tools (e.g., gcc, sed) into it.

      It also defines the phases (TODO: verify this) and builder functions, such as genericBuilder. For example, the default builder is just two lines:

      source $stdenv/setup
      genericBuild
      

      TODO: pkgs/stdenv/generic/builder.sh is a mystery though.

      QUESTION: Once dropping into nix-shell, how do I know what phases to execute by looking at a default.nix? (E.g., [..]freeswitch/default.nix)

      ANSWER: As far as I can tell, one can override the phases in their Nix build expression (to build the derivation, see at the bottom), but they won't get executed, as only the $stdenv/setup (see above) will get sourced, and no builders are called that, in turn, invoke the phases (again, see above).

      So if one is using nix-shell

      • to create/hack on a package, the person has to manually invoke the builder or phases (TODO: still fuzzy on this subject; see the sketch after this list)

      • to set up an environment, then one doesn't even have to worry about builders/phases because we just use nix-shell to clear the environment and to inject tools that we need for a given task
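
      A hedged sketch of the first case (manually driving the build from inside nix-shell); the pan attribute is the manual's example package, and the exact phases can vary per package:

      ```sh
      # Enter the build environment of a package without building it.
      nix-shell '<nixpkgs>' -A pan

      # Inside the shell, $stdenv/setup has already been sourced, so the
      # phase functions exist. Either run the whole generic build...
      genericBuild

      # ...or drive it phase by phase.
      unpackPhase
      cd "$sourceRoot"
      patchPhase
      configurePhase
      buildPhase
      ```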

      QUESTION: When dropping into nix-shell, is this Nix expression (i.e., freeswitch/default.nix) executed? Or just parts of it?

      ANSWER: As stated above, all of the input Nix expression is evaluated, but no builders or build phases are called; although nothing prevents one from overriding the phases, in case they are creating/hacking on a package.

      QUESTION:

      The command nix-shell will build the dependencies of the specified derivation, but not the derivation itself.

      What is the "derivation" here exactly? I know that it is a build expression, but does that mean the default.nix (or other Nix expression) nix-shell is invoked with?

      <sup>This statement also seems like a contradiction with how `nix-shell` works (i.e., if one issues `nix-shell -p curl`, then `curl` will be available in that sub-shell), but `-p` acts like a shortcut, as if `curl` had been listed in `buildInputs`, so this is not the case.</sup>

      ANSWER: I have the feeling my confusion comes from the fact that the term "derivation" is used ambiguously in the manuals, sometimes to mean multiple things (see list below).

      TODO: Substantiate this claim, and make sure that it is not coming from my misunderstanding of certain topics.

      • Nix build expression (such as default.nix) whose output is going to become the store derivation itself (see last item at the bottom about the Nix manual's glossary definition)

      • store derivation.

      I've had multiple cracks at unambiguously defining what a derivation is, and here's a list of these:

      QUESTION: What is the difference between nix-shell -p and nix-shell invoked with a Nix expression of mkShell (or other that achieves the similar effect)?

      QUESTION: nix-shell does not create a sub-shell, so what does it do? (clarification: nix-shell does indeed create a sub-shell; I confused it with nix shell)

  16. Apr 2021
    1. “Digital technology allows us to be far more adventurous in the ways we read and view and live in our texts,” she said. “Why aren’t we doing more to explore that?”

      Some of the future of the book may be taking new technologies and looking back at books.

      I wonder if the technology that was employed here could be productized and turned into an app or platform to allow this sort of visual display for more (all?) books?

  17. Mar 2021
    1. This is not a fork. This is a repository of scripts to automatically build Microsoft's vscode repository into freely-licensed binaries with a community-driven default configuration.

      almost without a doubt, inspired by: chromium vs. chrome

    1. this only applies to end products which are actually deployed. For my modules, I try to keep dependency version ranges at defaults, and recommend others do the same. All this pinning and packing is really the responsibility of the last user in the chain, and from experience, you will make their life significantly more difficult if you pin your own module dependencies.
  18. Feb 2021
    1. A Nix expression describes everything that goes into a package build action (a “derivation”)

      Come up with an ultimate definition for what a "derivation" is.

      So round up all the places where it is mentioned across Nix* manuals, and check out these:


      From Nix Pills section 6.1. The derivation function (see annotation):

      A derivation from a Nix language view point is simply a set, with some attributes. Therefore you can pass the derivation around with variables like anything else.

      So there is clearly an ambiguity between what derivations are perceived to be and what is stated in Eelco Dolstra's PhD thesis. Or maybe I'm having issues with reading comprehension again...

    2. For each output declared in outputs, the corresponding environment variable is set to point to the intended path in the Nix store for that output. Each output path is a concatenation of the cryptographic hash of all build inputs, the name attribute and the output name. (The output name is omitted if it’s out.)

      QUESTION: So when I see $out in a builder script, it refers to the default output path because the output attribute in the Nix expression has never been explicitly set, right?

    3. A derivation causes that derivation to be built prior to the present derivation; its default output path is put in the environment variable.

      That is, if an input attribute is a reference to a derivation in the Nix store, then

      1. that derivation is built first (if a binary substitute is not found, I presume), and
      2. the path to the built package (for a better word) is handed to the shell build script.
    4. derivation: A description of a build action. The result of a derivation is a store object. Derivations are typically specified in Nix expressions using the derivation primitive. These are translated into low-level store derivations (implicitly by nix-env and nix-build, or explicitly by nix-instantiate).

      Organically related to the annotation regarding my nix-shell confusion.

      The dissection of this definition to show why I find it lacking:

      A description of a build action.

      The first (couple) time(s) I read the manuals, this description popped up in many places, and I identified it with Nix expression every time, thinking that a derivation is a synonym for Nix expression.

      Maybe it is, because it clearly tries to disambiguate between store derivations and derivation in the last sentence.

      The result of a derivation is a store object.

      Is this store object the same as a store derivation?

      Derivations are typically specified in Nix expressions using the `derivation` primitive. These are translated into low-level store derivations (implicitly by nix-env and nix-build, or explicitly by nix-instantiate).

      QUESTION: So, the part of the Nix build expression (such as default.nix) where the derivation primitive is called (explicitly or implicitly, as in mkDerivation) is the derivation, that will be ultimately be translated into store derivations?

      ANSWER: Start at section 15.4 Derivation.


      QUESTION: Also, why is "typically" used here? Can one define derivations outside of Nix expressions?

      ANSWER(?): One could, I guess, because store derivations are ATerms (see annotation at the top), and the Nix expression language is just a tool to translate parameterized build actions into concrete terms to build a software package. The store derivations could be achieved using different means; e.g., the way Guix uses Guile Scheme to get the same result.


      I believe that, originally, derivation was simply a synonym for store derivation. Maybe it still is, and I'm just having difficulties with reading comprehension, but I think the following would be less misleading (to me, and apart from re-writing the very first sentence):

      Derivations are typically the result of Nix expressions calling the `derivation` primitive explicitly, or implicitly using `mkDerivation`. These are translated into low-level store derivations (implicitly by nix-env and nix-build, or explicitly by nix-instantiate).
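
      A hedged shell sketch of the pipeline being teased apart here (the hello attribute is just an example): the expression is first instantiated into a store derivation, which is then realised into a store output.

      ```sh
      # Evaluate the Nix expression into a low-level store derivation (.drv file).
      drv=$(nix-instantiate '<nixpkgs>' -A hello)

      # Build (or substitute) that store derivation; the result is its output path.
      nix-store --realise "$drv"
      ```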

    5. $stdenv/setup

      QUESTION: Does this refer to pkgs/stdenv/generic/setup.sh? According to 6.5 Phases in the Nixpkgs manual?

      ANSWER: I'm pretty sure it does. It sets up the environment (not sure how yet; I see the env vars, but not the basic commands - sed, awk, etc. - that are listed below) and defines a bunch of functions (such as genericBuilder) but it doesn't call these functions!

    6. The function mkDerivation in the Nixpkgs standard environment is a wrapper around derivation that adds a default value for system and always uses Bash as the builder, to which the supplied builder is passed as a command-line argument. See the Nixpkgs manual for details.

      "Documented" in the Nixpkgs manual under 6.1 Using stdenv.

      Used the double-quotes above because I don't consider it well documented. Will give it a try too; worst case scenario is that I'll fail as well.

    7. C.12. Release 1.6 (2013-09-10): In addition to the usual bug fixes, this release has several new features: The command nix-build --run-env has been renamed to nix-shell.
    8. See annotations with the build-phases tag.


      Why are the build phases not enumerated in the Nix manual? If the instructions on how to create a derivation (and thus, a package) are given here, then why not go all in, instead of spreading the information out across different manuals and making the subject harder to grasp?...

      (By the way, it is documented in the Nixpkgs manual under 6.5 Phases; not sure why it is not called build phases when every page refers to them like that.)

    9. Chapter 14. A Simple Nix Expression

    9. This is such a stupid move, going through a derivation example before introducing the language.

    10. Add the package to the file pkgs/top-level/all-packages.nix. The Nix expression written in the first step is a function; it requires other packages in order to build it. In this step you put it all together, i.e., you call the function with the right arguments to build the actual package.

      In addition to this rant, step 3. should be more generic, instead of tying it to Nixpkgs; at least show how to build your own Nix expression repo, or don't add this step at all, since it is not necessary for writing a derivation. There is a Nixpkgs manual for a reason.

    11. $ nix-env -i firefox --substituters ssh://alice@avalon

      This works similar to the binary cache substituter that Nix usually uses, only using SSH instead of HTTP

      So a substitute is a built binary for a given derivation, and a substituter is a server (or binary cache) that serves pre-built binaries, right?

      Update: in the next line it says that "it will fall back to using the binary cache substituter", so I guess that answers it.

    12. substitute

      this is another key topic. Also:

      • substitute vs. substituter => this (I think)

      See annotations with the substitute tag

    13. When you ask Nix to install a package, it will first try to get it in pre-compiled form from a binary cache. By default, Nix will use the binary cache https://cache.nixos.org; it contains binaries for most packages in Nixpkgs. Only if no binary is available in the binary cache, Nix will build the package from source. So if nix-env -i subversion results in Nix building stuff from source, then either the package is not built for your platform by the Nixpkgs build servers, or your version of Nixpkgs is too old or too new.

      binary caches tie in with substitutes somehow; get to the bottom of it. See annotations with the substitute tag.

      Maybe this?

    14. closure

      Another gem: who knows what a "closure" is.

      [This highlight] (a couple lines below) implicitly explains it though:

      The command nix-copy-closure copies a Nix store path along with all its dependencies to or from another machine via the SSH protocol. It doesn’t copy store paths that are already present on the target machine.

      or this, also just a couple lines below:

      the closure of a store path (that is, the path and all its dependencies)

    15. the closure of a store path (that is, the path and all its dependencies)
    16. The command nix-copy-closure copies a Nix store path along with all its dependencies to or from another machine via the SSH protocol. It doesn’t copy store paths that are already present on the target machine. For example, the following command copies Firefox with all its dependencies:
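
      A hedged sketch of such an invocation (not the manual's elided example; the host and the store path placeholder are hypothetical):

      ```sh
      # List the closure of a store path: the path itself plus all of its dependencies.
      nix-store --query --requisites /nix/store/<hash>-firefox-<version>

      # Copy that closure to another machine over SSH; paths already present there are skipped.
      nix-copy-closure --to alice@avalon /nix/store/<hash>-firefox-<version>
      ```
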
    17. subscribes you to a channel that always contains that latest version of the Nix Packages collection.

      That is a misleading statement. The latest version is where the master branch points, isn't it?

      So a channel points to a Nixpkgs commit (on a branch named after the channel) where all packages inside are deemed stable, and all packages are built to have available binary substitutes by a (hydra) build farm.

    18. A Nix channel is just a URL that points to a place that contains a set of Nix expressions and a manifest.
    19. garbage collector roots

      Definitely avoid this, where a term is used but only introduced formally much later. (There is also a reference to "garbage collector roots" almost at the beginning as well.)

    20. $ nix-env --switch-profile /nix/var/nix/profiles/my-profile $ nix-env --switch-profile /nix/var/nix/profiles/default These commands switch to the my-profile and default profile, respectively. If the profile doesn’t exist, it will be created automatically.

      learn more about profiles; creating new profiles was new info

    21. Chapter 10. Profiles: Profiles and user environments are Nix’s mechanism for implementing the ability to allow different users to have different configurations, and to do atomic upgrades and rollbacks.
    22. user environment
    23. In Nix, different users can have different “views” on the set of installed applications. That is, there might be lots of applications present on the system (possibly in many different versions), but users can have a specific selection of those active — where “active” just means that it appears in a directory in the user’s PATH. Such a view on the set of installed applications is called a user environment, which is just a directory tree consisting of symlinks to the files of the active applications.
    24. nix-env -qas

      ... and it takes AGES to complete

    25. 4.3.1. Change the Nix store path prefix

      There are a lot of places in this manual (and probably in the others as well) where the prefix is referred to (usually with italics, such as "prefix/store"), so in the book:

      • this should be linked to this section (or the one in the book), and

      • establish a clear and well-communicated notation to convey this

    26. At the same time, it is not possible for one user to inject a Trojan horse into a package that might be used by another user.
    27. Chapter 6. Security: Nix has two basic security models. First, it can be used in “single-user mode”, which is similar to what most other package management tools do: there is a single user (typically root) who performs all package management operations. All other users can then use the installed packages, but they cannot perform package management operations themselves. Alternatively, you can configure Nix in “multi-user mode”. In this model, all users can perform package management operations — for instance, every user can install software without requiring root privileges. Nix ensures that this is secure. For instance, it’s not possible for one user to overwrite a package used by another user with a Trojan horse.

      Would have been nice to link these to the install chapter where single- and multi-user modes were mentioned.

      How would this look in topic-based documentation? I would think that this chapter would be listed in the pre-requisites, and it could be used to build different reading paths (or assemblies in DocBook, I believe) such as practical, depth-first (if there are people like me who want to understand everything first), etc.

    28. reentrancy
    29. You can uninstall Nix simply by running: $ rm -rf /nix
    30. $ mkdir /nix
      $ chown alice /nix

      Traditionally, when a command should be invoked with sudo, it is either included in the example, or the shell indicator is # instead of $.

    31. To explicitly select a single-user installation on your system:

      It should also be noted in this section that since Nix 2.1.0, a single-user install is the default.

    32. nix-shell '<nixpkgs>' -A pan

      What is happening here exactly?

      nix-shell's syntax synopsis always bugged me because it looks like this

      SYNOPSIS
      nix-shell [--arg name value] [--argstr name value] [{--attr | -A} attrPath] [--command cmd] [--run cmd] [--exclude regexp] [--pure] [--keep name] {{--packages | -p} packages...  | [path]}
      

      and the canonical example is nix-shell '<nixpkgs>' -A pan; what tripped me up is that path is usually the first in examples, and I thought that the position of arguments is strict. As it turns out, nix-shell -A pan '<nixpkgs>' is just as valid.

      Side note<br> Apparently there is no standard for man pages. See 1, 2.

      The '<nixpkgs>' path is the one specified in the NIX_PATH environment variable, and -A pan looks up the pan attribute in pkgs/top-level/all-packages.nix in the Nixpkgs repo.

    33. since packages aren’t overwritten, the old versions are still there after an upgrade. This means that you can roll back to the old version:

      Wouldn't hurt to tell folks that this is a convenience layer, and one could also just use the old package from the /nix/store, even though that path would be long and obscure; one could use symlinks of course.

      Or, one could just use nix-shell -p that specifies a specific version (that's already in the store), but, of course, it's not that simple...

      https://github.com/NixOS/nixpkgs/issues/9682
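
      A hedged sketch of the rollback convenience layer quoted above (the package name and generation number are illustrative):

      ```sh
      # Upgrade a package, then roll the user environment back to the previous generation.
      nix-env -u firefox
      nix-env --rollback

      # Or inspect and jump to a specific generation.
      nix-env --list-generations
      nix-env --switch-generation 43
      ```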

    1. Your Rails app Gemfile may have a line requiring sass-rails 5.0: gem 'sass-rails', '~> 5.0' # or gem 'sass-rails', '~> 5' These will prevent upgrade to sprockets 4, if you'd like to upgrade to sprockets 4 change to: gem 'sass-rails', '>= 5'
    1. The work put into Trailblazer 2.1 has been tremendous, it could easily have been TRB 3.0, or even TRB III, since Roman version numbering turns out to be quite a fancy thing to do. However, as much as the internals have been improved, as little has changed on the public APIs of Trailblazer, so we decided to go with a minor release.
    1. Specifying a name and a src is the absolute minimum Nix requires.

      Didn't they mean what mkDerivation requires?

      I have been jumping around in this manual, so I'm not sure what arguments derivation requires.

    2. For convenience, you can also use pname and version attributes and mkDerivation will automatically set name to "${pname}-${version}" by default.

      The error messages are not helpful when one messes up the input attribute set of mkDerivation (i.e., either name, or pname and version attributes have to be present); see Nixpkgs issue #113520.

    3. 6.1. Using stdenv
    4. fetchpatch works very similarly to fetchurl with the same arguments expected. It expects patch files as a source and performs normalization on them before computing the checksum. For example it will remove comments or other unstable parts that are sometimes added by version control systems and can change over time.
    5. 19.3. Submitting security fixes

      Security fixes are submitted in the same way as other changes and thus the same guidelines apply. If the security fix comes in the form of a patch and a CVE is available, then the name of the patch should be the CVE identifier, so e.g. CVE-2019-13636.patch in the case of a patch that is included in the Nixpkgs tree. If a patch is fetched the name needs to be set as well, e.g.:

      (fetchpatch {
        name = "CVE-2019-11068.patch";
        url = "https://gitlab.gnome.org/GNOME/libxslt/commit/e03553605b45c88f0b4b2980adfbbb8f6fca2fd6.patch";
        sha256 = "0pkpb4837km15zgg6h57bncp66d5lwrlvkr73h0lanywq7zrwhj8";
      })

      If a security fix applies to both master and a stable release then, similar to regular changes, they are preferably delivered via master first and cherry-picked to the release branch. Critical security fixes may by-pass the staging branches and be delivered directly to release branches such as master and release-*.
    6. 18.6. Patches

      Patches available online should be retrieved using fetchpatch.

      patches = [
        (fetchpatch {
          name = "fix-check-for-using-shared-freetype-lib.patch";
          url = "http://git.ghostscript.com/?p=ghostpdl.git;a=patch;h=8f5d285";
          sha256 = "1f0k043rng7f0rfl9hhb89qzvvksqmkrikmm38p61yfx51l325xr";
        })
      ];

      ... and from Chapter 11:

      fetchpatch works very similarly to fetchurl with the same arguments expected. It expects patch files as a source and performs normalization on them before computing the checksum. For example it will remove comments or other unstable parts that are sometimes added by version control systems and can change over time.

      ... and also adding highlight of 19.3. Submitting security fixes

      because these are the only places I've seen fetchpatch mentioned.

      From the wild in freeswitch/default.nix in Nixpkgs:

      stdenv.mkDerivation rec {
        pname = "freeswitch";
        version = "1.10.5";
        src = fetchFromGitHub {
          owner = "signalwire";
          repo = pname;
          rev = "v${version}";
          sha256 = "18dhyb19k28dcm1i8mhqvvgm2phsrmrwyjmfn79glk8pdlalvcha";
        };
      
        patches = [
          # https://github.com/signalwire/freeswitch/pull/812 fix mod_spandsp, mod_gsmopen build, drop when updating from 1.10.5
          (fetchpatch {
            url = "https://github.com/signalwire/freeswitch/commit/51fba83ed3ed2d9753d8e6b13e13001aca50b493.patch";
            sha256 = "0h2bmifsyyasxjka3pczbmqym1chvz91fmb589njrdbwpkjyvqh3";
          })
        ];
        postPatch = ''
          patchShebangs     libs/libvpx/build/make/rtcd.pl
          substituteInPlace libs/libvpx/build/make/configure.sh \
            --replace AS=\''${AS} AS=yasm
      
          # Disable advertisement banners
          for f in src/include/cc.h libs/esl/src/include/cc.h; do
            {
              echo 'const char *cc = "";'
              echo 'const char *cc_s = "";'
            } > $f
          done
        '';
      
    7. 6.5. Phases

      Not sure why this isn't called build phases... See also.

    1. undermine the integrity of the Version of Record, which is the foundation of the scientific record, and its associated codified mechanisms for corrections, retractions and data disclosure. 

      This misrepresents the situation. Author accepted manuscripts (AAM) have been shared on institutional and subject repositories for around two decades, with greater prevalence in the last decade. Despite this, the version of record (VoR) is still valued and preserves the integrity of the scholarly record. The integrity of the VoR continues to be maintained by the publisher, and where well-run repository managers are made aware, corrections can be reflected in a repository. The solution to this problem is the publisher taking their responsibility for preserving the integrity of the scholarly record seriously and notifying repositories, not asserting that authors should not exercise their right to apply a prior license to their AAM.

    2. the Rights Retention Strategy is not financially sustainable

      So far as I know this is not tested or based on any evidence. If the publishers think an open accepted manuscript would undermine the version of record, it doesn't demonstrate much confidence in their added value to me.

  19. Jan 2021
  20. Dec 2020
    1. Sucrase is an alternative to Babel that allows super-fast development builds. Instead of compiling a large range of JS features to be able to work in Internet Explorer, Sucrase assumes that you're developing with a recent browser or recent Node.js version, so it focuses on compiling non-standard language extensions: JSX, TypeScript, and Flow.
    2. Super-fast alternative to Babel for when you can target modern JS runtimes
    1. Everything Lives in Git: With a Jamstack project, anyone should be able to do a git clone, install any needed dependencies with a standard procedure (like npm install), and be ready to run the full project locally. No databases to clone, no complex installs. This reduces contributor friction, and also simplifies staging and testing workflows.
  21. Nov 2020
    1. Microbundle also outputs a modern bundle specially designed to work in all modern browsers. This bundle preserves most modern JS features when compiling your code, but ensures the result runs in 90% of web browsers without needing to be transpiled. Specifically, it uses preset-modules to target the set of browsers that support <script type="module"> - that allows syntax like async/await, tagged templates, arrow functions, destructured and rest parameters, etc. The result is generally smaller and faster to execute than the esm bundle
  22. Oct 2020
  23. Sep 2020
  24. Aug 2020
  25. Jul 2020
    1. RDFa is intended to solve the problem of marking up machine-readable data in HTML documents. RDFa provides a set of HTML attributes to augment visual data with machine-readable hints. Using RDFa, authors may turn their existing human-visible text and links into machine-readable data without repeating content.
    1. Every AMP document needs to have a link referencing the "canonical" version of that document. We'll learn more about what canonical pages are and different approaches to canonical linking in the Making your page discoverable step of this tutorial.
    1. Canonical linking in regular HTML pages is a common technique for declaring which page should be considered the preferred page when multiple pages include the same content.
  26. May 2020
    1. In the examples below, we are using Docker images tags to specify a specific version, such as docker:19.03.8. If tags like docker:stable are used, you have no control over what version is going to be used and this can lead to unpredictable behavior, especially when new versions are released.
  27. Apr 2020
  28. Mar 2020
    1. Q. What is up with the weird version scheme in Rubinius? A. Rubinius uses a simple epoch.sequence version scheme. For any sequence number N, N+1 will only add new capabilities, or remove something that has been listed as deprecated in <= N.
    2. Q. Why does Rubinius report the Ruby version as 10.0? A. Rubinius is a time machine. When you use it, you travel into the future. Even this README is in the future.
  29. Dec 2019
  30. May 2019
    1. But Jonah rose up to flee to Tarshish from the presence of the Lord. So he went down to Joppa, found a ship which was going to Tarshish, paid the fare and went down into it to go with them to Tarshish from the presence of the Lord.

      This is more than just a travel log. Here Jonah is saying no to God. He is refusing God’s plan for him. He is actually rejecting a direct request from the creator because of his own interests. Maybe he is afraid to prophesy repentance because his life could be at risk. There may be smooth sailing at first, but the wrath of God eventually catches up with him.

  31. Aug 2018
    1. Legislative staff members had finished rewriting AB 375, and a deal seemed imminent. That Friday, as he drank his morning coffee, Mactaggart decided to read the new bill — the fine print — one more time. He noticed a seemingly minor alteration in one section, the kind of thing most people would skip over. Mactaggart realized it would completely gut what remained of the private right of action. Furious, he called Hertzberg and Chau and told them the deal was off. Neither lawmaker could explain who made the change, Mactaggart told me, but Hertzberg scrambled to fix it. “In most negotiations, you are talking to all these different interest groups,” Hertzberg told me recently. “This is a situation where we had to go and reach out to everyone and bring that information to Mr. Mactaggart and ask him what he wanted to do.”

      Here's a case where we ought to consider creating our bills and laws via version control, so we can see exactly who changed what, and when, along the way. It might mean much less gets done, but there'd be a lot more transparency and accountability.

    1. Contrary to paper notes, computer files do not display the traces or versions that led to their final state.

      This seems weirdly overstated. How do paper notes maintain versions and traces? Electronic documents contain rich sources of metadata for trace analysis, as well as various options to explicitly demonstrate temporal order and change through formatting.

  32. Jun 2018
    1. I think there is a need to develop a system to track the draft of a manuscript from the beginning to the end of the process.

      If you're drafting in WordPress you can set the number of revisions of your posts to infinite so that you can keep (archive) all of your prior drafts. see: https://codex.wordpress.org/Revisions

  33. Sep 2017
    1. a lack of version control over the vast majority of the research literature makes actually ‘adapting’ papers to include post-publication comments impossible.

      This is also key.