  1. Feb 2023
    1. After running code to load all of the outages by loading zoomed-in parts of the map, we verify that the number of outages we found matches the summary’s number of total outages. If it doesn’t, we don’t save the data, and we log an error.

      NB: there may be a race condition here? In which case, running into errors should be (one) expected outcome.
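
      Roughly the shape of the check being described (a sketch; all names are made up, not the author's actual code):

      async function scrapeOutages() {
        // Both requests hit a live system, so the counts can legitimately drift
        // between the two calls (the suspected race condition).
        const summary = await fetchSummary();        // reports a totalOutages figure
        const outages = await fetchOutagesByTile();  // collected tile by tile while zoomed in

        if (outages.length !== summary.totalOutages) {
          console.error(`expected ${summary.totalOutages} outages, found ${outages.length}`);
          return null;  // don't save the data
        }
        return outages;
      }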

    1. If HTML was all the things we wanted it to be, we designed it to be, if reality actually matched the fantasies we tell ourselves in working group meetings, then mobile apps wouldn't be written once for iOS and once for Android and then once again for desktop web, they'd be written once, in HTML, and that's what everyone would use. You wouldn't have an app store, you'd have the web.

      This is stated like unanimous agreement is a foregone conclusion.

      The Web is for content. Just because people do build in-browser facsimiles of mobile-style app UIs doesn't mean that the flattening of content and controls into a single stream is something that everyone agrees is a good thing or something that should be happening. They should be doing the opposite—curbing/reining it in.

    2. for all of the work that we've put into HTML, and CSS, and the DOM, it has fundamentally utterly failed to deliver on its promise

      You mean your promise—the position of the Web Hypertext Application Technology Working Group.

      Have you considered that the problem might have been you and what you were trying to do? You're already conceding failure at what you tried. Would it be so much worse to say that it was the wrong thing to have even been trying for?

    3. we will only gain as we unleash the kinds of amazing interfaces that developers can build when you give them the raw bedrock APIs that other platforms already give their developers

      You mean developers will gain.

    4. they're holding developers back

      Fuck developers.

    5. Jesus fucking Christ. Fuck this shit.

    6. Developers are scrambling to get out of the web and into the mobile app stores.

      This isn't new. Also: good—application developers shouldn't be the only ones holding the keys to the kingdom when it comes to making stuff available on the Web. Authors* and content editors should have those keys.

      * in the classic sense; not the post-millennium dilution/corruption where "authors" is synonymous with the tautologically defined "developers" that are spoken of when this topic is at the fore

    1. Checking your own repos on a new computer is one thing… inheriting someone else’s project and running it on your machine in the node ecosystem is very rough.


    1. On HN, the user bitwize (without knowing he or she is doing so) summarizes (the first half of, at least) the situation described here:

      The appeal of JavaScript when it was invented was its immediacy. No longer did you have to go through an edit-compile-debug loop, as with Java, or even an edit-upload-debug loop as with a Perl script, to see the changes you made to your web-based application. You could just mash reload on your browser and Bob's your uncle!

      The JavaScript community, in all its wisdom, reinvented edit-compile-debug loops for its immediate, dynamic language and I'm still assmad about it. So assmad that I, too, forgo all that shit when working on personal projects.

      https://news.ycombinator.com/item?id=34827569

    1. more tips for no-build-system javascript

      Basically, ignore almost everything that Modern JS practitioners tell you that you need to be doing. You're halfway there with this experiment.

      One of the most interesting and high-quality JS codebases that has ever existed is all the JS that powers/-ed Firefox, beginning from its conception through to the first release that was ever called "Firefox", the Firefox 3 milestone release, and beyond. To the extent that there was any build system involved (for all intents and purposes, there basically wasn't), the work it performed was very light. Basically a bunch of script elements, and later Components.utils.import calls for JSMs (NB: not to be confused with NodeJS's embarrassing .mjs debacle). No idea what things are like today, but in the event that there's a lot of heavy, NodeJS-style build work at play, it would be wrong to conclude that it has anything to do with necessity e.g. after finally reaching the limits of what no-build/low-build can give you (rather than just the general degradation of standards across Mozilla as a whole).
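
      For concreteness, the entire "build system" in that style can be as little as this (file names invented):

      <!-- Plain script elements plus native ES modules; the browser does all the work. -->
      <script src="vendor/some-library.js"></script>
      <script type="module">
        import { init } from "./app.js";
        init(document.body);
      </script>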

    2. But my experience with build systems (not just Javascript build systems!), is that if you have a 5-year-old site, often it’s a huge pain to get the site built again.
    1. Together we seek the best outcome for all people who use the web across many devices.

      The best possible outcome for everyone likely includes a world where MS open sourced (at least as much as they could have of) Trident/EdgeHTML—even if the plan still involved abandoning it.

    1. The compiler recognizes the syntax version by the MODULE keyword; if it is written in small caps, the new syntax is assumed.

      Ooh... this might benefit from improvement. I mentioned to Rochus the benefits of direct (no build step) runnability in the vein of Python or JS, and he responded that he has already done this for Oberon+ on the CLI/CLR.

      For the same reasons that direct runnability is attractive, so too might be the ability to copy and paste to mix both syntaxes. (Note that this is entirely a different matter from whether or not it is a good idea to commit files that mix upper and lower case; I'm talking about friction.) Then again, maybe not—how much legacy Oberon is being copied and pasted?

    1. I agree, of course, with the criticism of the price point. As I often say, $9.99/month (or even $4.99/month) is more expensive than premium email—and no matter how cool you think your thing is, it's way less important than email. You should always return something for ~$20, especially if you already have a free tier. (When I say "for $20" here, I'm talking about a one time payment, or on a subscription basis that maxes out at $20/yr.)

      The following musings are highly specific to the market for what's being sold here.

      Paying $20 should get you something that you aren't bothered about again for the next year. Maybe to make it even easier, enable anyone to request a refund of their $20 for any reason within the first 7 days. This gives a similar feel to a free trial, but it curbs abuse and helps target serious buyers in the first place. In the event that 7 days is not enough time even for people to convince themselves that they need it, maybe keep open the ability to use a severely limited version of the service for the remainder of the year. E.g. you can continue to log in and simulate what you'd get with the full version, but the results stay accessible only to you: you can't publish them and/or share links with anyone who doesn't have access to your account.

    1. Despite being a Rust apologist and the fact that this paper makes Rust look better than its competitors, Steve Klabnik says this paper is quite bad and that he wishes people would stop referencing it.

    2. We have CSV files as a kind of open standard

      The W3C actually chartered a CSV working group for CSV on the Web. Their recommendation resolves ambiguities of the various CSV variants, and they even went on to supercharge the format in various well-defined, nice-to-have ways.
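
      For reference, CSVW works by pairing the CSV file with a small JSON metadata document; the file and column names below are just an example:

      {
        "@context": "http://www.w3.org/ns/csvw",
        "url": "outages.csv",
        "tableSchema": {
          "columns": [
            { "name": "region", "titles": "Region", "datatype": "string" },
            { "name": "customers_out", "titles": "Customers out", "datatype": "integer" }
          ]
        }
      }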

    1. Here is a larger example. In this case, the directory structure of the modules corresponds to the import paths.

      Huh? This sounds like it's saying the opposite of what was said two paragraphs earlier.

    1. Another point made by Wirth is that complexity promotes customer dependence on the vendor. So there is an incentive to make things complex in order to create a dependency on the part of the customer, generating a more stable stream of income.
    1. Title

      In order to make it way easier to keep track of things in bookmarklet workspaces, there needs to be an option that adds an incrementing counter (timestamp?) to the bookmarklet title, so when dragging and dropping into your bookmarks library, you don't lose track of what's going on with a sea of bookmarklets all named the same thing.
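
      Something as simple as this would do (a sketch; names invented):

      let bookmarkletCounter = 0;
      function titleWithCounter(baseTitle) {
        bookmarkletCounter += 1;
        const stamp = new Date().toISOString().slice(0, 16);  // e.g. "2023-02-01T12:00"
        return `${baseTitle} #${bookmarkletCounter} (${stamp})`;
      }
      // titleWithCounter("Scratch bookmarklet") -> "Scratch bookmarklet #1 (2023-02-01T12:00)"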

    2. window.history.pushState(null, "unescaped bookmarklet contents");

      This has been giving me problems in the latest Firefox releases. It ends up throwing an error.
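
      A defensive wrapper like this (just a sketch, not a confirmed fix) at least keeps the rest of the bookmarklet running when the call throws:

      function tryPushState(contents) {
        try {
          // Same call shape as above: (state, unused-title[, url]).
          window.history.pushState(null, contents);
        } catch (e) {
          console.warn("pushState failed:", e);
        }
      }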

    1. @34:00

      In theory? RDF, it's awesome. I like it a lot. This is why I'm working on this. But in practice [...] the developer experience it's not great, and when people when they see Turtle and RDF and all these things, they don't like it, you know? My opinion is that it's because of lack of learning materials and the developer experience to get started.

    1. Usability and accessibility can impact where a technology falls on the spectrum: not paying attention to these dimensions makes it harder to move to higher levels of agency, staying more exclusive as "Look at what I/you/we can do, as the capable ones"
    1. Miguel de Icaza (@migueldeicaza), Jun 17, 2022: The foundation should fund, promote and advance a fully open source stack. And the foundation should remove every proprietary bit from the code in http://dotnet.org.

      Microsoft can and should compete on the open marketplace on their own. [...] And we should start with the debugger: we should integrate the Samsung one, it should be the default for OmniSharp and this is how we get contributions and improvements, not by ceding terrain to someone that can change the rules to their advantage at will.

      I tried (although perhaps not valiantly, but as an outsider) to convince Miguel and the then-Director of the .NET Foundation in 2015 that this state of affairs was probably coming and that he/they should reach out to the FSF/GNU to get RMS to lift the .NET fatwa, become a stakeholder/tastemaker in the .NET ecosystem, and encourage free software groupies to take charge so that FSF/GNU would be around as a failsafe for the community and would inevitably benefit greatly esp. from any of MS's future failure on this front. I tried the same in reverse, too. They seemed to expect me to be a liaison, and I couldn't get them to talk to each other directly, even though that's what needed to happen.

    1. Nobody but Eric would have thought of that shortcut to solve the problem.

      I find it odd that this is framed here as an example of an "unusual thinker". The solution seems natural, if underappreciated, for a domain where any tool's output target is one that was specifically crafted to intersect with what is ordinarily a (or in this case, the) "preferred form for modification".

      You can (and we probably all more often should) do the same thing with e.g. HTML+CSS build pipelines that sit untouched for years and in that course become unfashionable and undesirable...

    1. you can't hang useful language features off static types. For example, TypeScript explicitly declares as a design goal that they don't use the type system to generate different code

      This is a good thing. https://bracha.org/pluggableTypesPosition.pdf

      Refer to §6 #3.

    1. It would even use eye contact correction software to make it feel like you were actually looking at each other.

      If this were using professionally installed videoconferencing hardware, there would be no need for "eye contact correcting software" if done right. The use of such software would be an indicator of failure elsewhere.

    1. the NABC model from Stanford. The model starts with defining the Need, followed by Approach, Benefits, and lastly, Competitors. Separating the Need from the Approach is very smart. While writing the need, the authors have to understand it very well. The approach and benefits sections are pretty straightforward, where authors define their strategy and list down the advantages. Since most people focus on them when they talk about ideas, it's also easy to write. Then the competition section comes. It is the part the authors have to consider competitors of their proposal. Thinking about an alternative solution instead of their suggestion requires people to focus on the problem instead of blindly loving and defending their solutions. With these four parts, the NABC is a pretty good model. But it's not the only one.
  2. Jan 2023
    1. Publish content to your website using Indiekit’s own content management system or any application that supports the Micropub API

      "... assuming you rebase your site on top of Indiekit beforehand" (which is a big leap).

    2. I’m formally launching Indiekit, the little Node.js server with all the parts needed to publish content to your personal website and share it on social networks. Think of Indiekit as the missing link between a statically generated website and the social web protocols developed by the IndieWeb community and recommended by the W3C.

      Now just get rid of the server part.

      The real missing link between (conventional) static sites and the W3C's social protocols is still static. This post itself already acknowledges the reasons why.

      See also https://news.ycombinator.com/item?id=30862612

    3. Still, installing Indiekit on a web server can be a bit painful.
    1. Publishing them to the modern web is too hard and there are few purpose-built tools that help
    2. It’s too hard to build these kinds of experiences

      "... with current recommended practices", that is.

    1. On the other hand, it means that you now need to trust that Apple isn’t going to fuck with the podcasts you listen to.

      There really is no substantial increase in trust. You were already trusting their player to do the right thing.

    2. One convenience feature is that if you paste the Apple Podcasts directory listing instead of the feed URL, I’ll look up the feed URL from the listing and treat it as a redirect.

      Some thoughts:

      • This is indeed good UX.

      • What's not good UX—and which I discovered/realized this week—is that there seems to be no easy/straightforward way to map a podcasts.apple.com podcast page to either its feed URL or to the original publisher/podcast's Web site. In reality, this would be as trivial as embedding a <link>.

      Additionally, there's a missed opportunity in the podcasting space to make judicious use of the Link header—every piece of media served as part of a podcast should include in the HTTP response a link back to the canonical feed URL and/or the original site! (And they should also have CORS enabled, while they're at it.) Why isn't this already a thing? Answer: because it's a trivial detail; podcasters could do this, but "what's the point?", they'd say—almost no one is going to attach a network inspector to the requests and check whether they're sending these headers only for the sake of steadfast adherence to hypermedia ideals. Worth noting that this is the exact opposite of Jobs's principle of carrying out good craftsmanship throughout, e.g. when building a chest of drawers or a cabinet, even for the parts that no one will see, in order to "sleep well at night". Maybe this could be used to shame Apple?
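
      Concretely, both halves of that are one-liners (URLs invented):

      <!-- On the podcast's directory/landing page: -->
      <link rel="alternate" type="application/rss+xml" href="https://example.com/feed.xml" title="Canonical feed">

      # And in the HTTP response for each piece of served media:
      Link: <https://example.com/feed.xml>; rel="alternate"; type="application/rss+xml"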

    1. the author complains that an Apple Podcast user has to go through the app (and all its restrictions), but again, not that different from Instagram posts. As a user, you must go through Instagram to see photos.

      And cyclists need to make sure they have wheels attached before riding a bicycle.

      This is one of those things that superficially seems like a relevant retort, but as a reply it's actually total nonsense.

      Or, if you wanted to put it more abrasively: Instagram photos are not podcasts, dumbass.

    1. this is misplaced outrage on the author's part, since Apple has never produced RSS feeds

      Another casualty of the Copenhagen interpretation of ethics (or close enough, at least)?

      https://hn.algolia.com/?query=copenhagen%20interpretation%20of%20ethics%20strikes%20again&type=comment

    1. Premium feeds are rehosted by Apple and it's huge PITA because we have ad-supported public feeds and ad-free premium feeds and need to build them twice.

      The author here makes it sound like they have to reach out and grab content stream chunks, stitch them together with their own hands, and then plonk them down on the assembly line for 14 hours a day or something.

      It's a program. You write a program that does the building.

    1. There is no predetermined correlation between this import path and the file system, and the imported module doesn’t have to know anything about the import path used in an importing module.

      This is not a good approach. It's the opposite of what you want. Module resolution remains easy for computers (because of their speed), but tedious for humans.

      As a writer, maybe there's some benefit for no correlation. As a reader trying to understand a foreign codebase, esp. one who is in the moment trying to figure out, "Where is this thing defined? Where can I read the source code?" when jumping through procedure definitions, not being able to trivially ascertain which file a given thing is in is unnecessary friction. Better to offload a tiny bit of work onto the author who knows the codebase (or at least their own immediate intention) well rather than to stymie the progress of dozens/hundreds of readers trying to work things out.

    2. it quickly became clear that this approach would reach its limits as soon as several people contributed modules

      Worth noting that for many of Rochus's own projects (esp. ones related to Oberon/Oberon+), it's just a bunch of .cpp and .h files in one directory...

    1. igal needs Perl to run and it also relies on a few other programs that come standard with most Linux distributions.
    1. Hyperdocuments may be submitted to a library-like service (an administratively established AUGMENT Journal) that catalogs them, and provides a permanent, linkable address and guaranteed as-published content retrieval. This Journal system handles version and access control, provides notifications of supercessions and generally manages open-ended document collections.

      Imagine an arxiv-like depository that dealt with native hypertext, rather than TeX/PDF.

      Food for thought: PWP as a prereq? What about RASH+SingleFileZ?

    2. Meta-level referencing (addresses on links themselves) enables knowledge workers to comment upon links and otherwise reference them.
    3. Individual application subsystems (graphical editors, program language editors, spreadsheets) work with knowledge products, but do not "own" hyperdocuments in the sense of being responsible for their storage

      The opposite of current Web app norms (contra desktop).

    1. The overriding class Shape has added a slot, color. Since Shape is the superclass of all other classes in ShapeLibrary, they all inherit the new slot.

      This is the one thing so far where the procedural syntactic mechanism isn't doing obvious heavy lifting.

    2. The slot definition of List fills the role of an import statement, as do those of Error and Point.

      ... at some expense to ergonomics.

      It's odd that they didn't introduce special syntax for this. They could have even used import to denote these things...

    3. The factory method is somewhat similar to a traditional constructor. However, it has a significant advantage: its usage is indistinguishable from an ordinary method invocation. This allows us to substitute factory objects for classes (or one class for another) without modifying instance creation code. Instance creation is always performed via a late bound procedural interface.

      The class semantics for new in ES6 really bungled this in an awful way.

    4. Newspeak programs enjoy the property of representation independence

      Is that mechanism or culture?

    5. All names are late bound

      Not all names, I think. Local identifiers, for example...?


    1. Patch based systems are idiotic, that's RCS, that is decades old technology that we know sucks (I've had a cocktail, it's 5pm, so salt away). Do you understand the difference between pass by reference and pass by value?

      Larry makes a similar analogy (pass by value vs pass by reference) to my argument about why patches are actually better at the collaboration phase—pull requests are fragile links. Transmission of patch contents is robust; they're not references to external systems—a soft promise that you will service a request for the content when it comes. A patch is just the proposed change itself.

    1. Literate programming worked beautifully until we got to a stage where we wanted to refactor the program. The program structure was easy to change, but it implied a radical change to the structure of the book. There was no way we could spend a great deal of time on restructuring the book so we ended up with writing appendices and appendices to appendices that explained what we had done. The final book became unreadable and only fit for the dustbin. The lesson was that the textbook metaphor is not applicable to program development. A textbook is written on a stable and well known subject while a program is under constant evolution. We abandoned literate programming as being too rigid for practical programming. Even if we got it right the first time, it would have failed in the subsequent maintenance phases of the program's life cycle.


    1. How do we package software in ways that maximize its reusability while minimizing the level of skill required to achieve reuse?

      Is that really the ultimate, most worthy goal? It seems that "minimizing the level of skill required[...]" is used as a proxy here for what we're really after—minimizing the total cost of producing the thing we want. Neither the minimization of skilled use nor reuse should be held as a priori goals.

    1. if you are running a software business and you aren't at, like, Google-tier scale, just throw it all in a monorepo

      The irony* of this comment is that Google and Google engineers are famously some of the most well-known users/proponents of monorepos.

      * not actual irony; just the faux irony—irony pyrite, or "fool's irony", if you like

    1. I would argue that it’s simply more fun to engage with the digital world in a read-write way, to see a problem and actually consider it fixable by tweaking from the outside

      He doesn't exactly say it here, but many others making the same observations will pair it with the suggestion that this is because of some intrinsic property of the digital medium. If you think about it, that isn't true. If you consider paper, people tend to be/feel more empowered to tweak it for their own use (so long as they own the copy); digital artifacts seem more hands-off, despite their potential, because the powers involved are reserved for wizards, largely thanks to the milieu that those who are the wizards have cultivated to benefit themselves and their livelihood first, rather than empowering the ordinary computer user.

    2. Software should be a malleable medium, where anyone can edit their tools to better fit their personal needs. The laws of physics aren’t relevant here; all we need is to find ways to architect systems in such a way that they can be tweaked at runtime, and give everyone the tools to do so.

      It's clear that gklitt is referring to the ability of extensions to augment the browser, but:

      • it's not clear that he has applied the same thought process to the extension itself (which is also software, after all)

      • the conception of in-browser content as software tooling is likely a large reason why the perspective he endorses here is not more widespread—that content is fundamentally a copy of a particular work, in the parlance of US copyright law (which isn't terribly domain-appropriate here so much as its terminology is useful)

    3. a platform with tremendous potential, but somewhat disorganized and neglected under current management

      This has almost always been the case—at least as far back as 10+ years ago with addons.mozilla.org, too.

    4. CSS classes

      NB: there's no such thing as a "CSS class". They're just classes—which you may use to address things using CSS's selector language, since it was conveniently (and wisely) designed from the beginning to incorporate first-class* support for them.

      * no pun intended

    5. it’s getting harder to engineer browser extensions well as web frontends become compiled artifacts that are ever further removed from their original source code
    6. because it’s building on an unofficial, reverse-engineered foundation, there are no guarantees at all about when things might change underneath

      This is an unfortunate reality about the conventions followed by programmers building applications with Web-based interfaces: no one honors the tradition of the paper-based forms that their digital counterparts are supposed to mimic; they're all building TOSS-style APIs (and calling that REST) instead of actual, TURN-style REST interfaces.

    1. too much focus on the ‘indie’ (building complicated self-hosted everything-machines) and not enough on the ‘web’
    1. a special/reserved GET param could be used in order to specify the version hash of the specific instance of the resource you want
      • MementoWeb
      • WebDAV
    2. we have one of the most powerful languages for manipulating everything in the browser (ecmascript/javascript) at our disposal, except for manipulating the browser itself! Some browsers are trying to address this (e.g. http://conkeror.org/ -- emacs styled tiled windows in feature branch!) and I will be supporting them in whatever ways I can. What we need is the bash/emacs/vim of browsers -- e.g. coding changes to your browser (emacs style) without requiring recompiling and building.

      That was what pre-WebExtensions Firefox was. Mozilla Corp killed it.

      See Yegge's remarks on The Pinocchio Problem:

      The very best plug-in systems are powerful enough to build the entire application in its own plug-in system. This has been the core philosophy behind both Emacs and Eclipse. There's a minimal bootstrap layer, which as we will see functions as the system's hardware, and the rest of the system, to the greatest extent possible (as dictated by performance, usually), is written in the extension language.

      Firefox has a plugin system. It's a real piece of crap, but it has one, and one thing you'll quickly discover if you build a plug-in system is that there will always be a few crazed programmers who learn to use it and push it to its limits. This may fool you into thinking you have a good plug-in system, but in reality it has to be both easy to use and possible to use without rebooting the system; Firefox breaks both of these cardinal rules, so it's in an unstable state: either it'll get fixed, or something better will come along and everyone will switch to that.

      Something better didn't come along, but people switched anyway—because they more or less had to, since Mozilla abandoned what they were switching from.

    1. Sciter. Used for rendering the UI of apps. There's no browser using Sciter to display websites, and the engine is Closed source.

      Worth noting that c-smile, the creator of Sciter, put out an offer during COVID lockdowns to make Sciter open source if someone would fund it for $100k. That funding never came through.

    1. https://michaelkarpeles.com/math.html
    2. https://michaelkarpeles.com/essays/philosophy/what-the-browser-is-missing.html
    3. My central goal is to further Paul Otlet, et al's, vision and head toward an amalgamous World Wide Web (a Universal Knowledge Repository) freed of arbitrary, discrete "document" boundaries.

      My central goal is a universal knowledge repository freed of discrete "document" boundaries

    1. Readers must learn specific reflective strategies. “What questions should I be asking? How should I summarize what I’m reading?” Readers must run their own feedback loops. “Did I understand that? Should I re-read it? Consult another text?”

      I generally don't have to do that when reading except when reading books or academic papers. This suggests that there's not really anything wrong with the form of the book, but rather its content (or the stylistic presentation of that content, really).

      I've said it a bunch: the biggest barrier to accessibility of academic articles specifically is the almost intolerable writing style that almost every non-writer adopts when they're trying to write something to the standards for acceptance in a journal. Every journal article written for joyless robots should be accompanied by a blog post (or several of them) on the author's own Web site that says all the same things but is written for actual human beings.

    2. Readers can’t just read the words. They have to really think about them. Maybe take some notes. Discuss with others. Write an essay in response. Like a lecture, a book is a warmup for the thinking that happens later.

      What if, when you bought a book, included was access to a self-administered test for comprehension? Could this even solve the paying-for-things-limits-access-to-content problem? The idea would be to make the thing free (ebooks, at least), but your dead tree copy comes with access to a 20-minute interactive testing experience (in a vendor-neutral, futureproof format like HTML and inline JS—not necessarily a Web-based learning portal that could disappear at any moment).
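
      A minimal sketch of what I mean by "HTML and inline JS": one self-contained file, one question, graded locally. (The question text is just a placeholder.)

      <p>Q1. According to chapter 2, what is the reader's main job?</p>
      <label><input type="radio" name="q1" value="a"> Memorizing the words</label>
      <label><input type="radio" name="q1" value="b"> Running their own feedback loops</label>
      <button onclick="
        const picked = document.querySelector('input[name=q1]:checked');
        alert(picked && picked.value === 'b' ? 'Correct.' : 'Not quite; re-read the section.');
      ">Check</button>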

    1. I saw this tech talk by Luis Von Ahn (co-creator of recaptcha) and learned about the idea of harnessing human computation

      Consider: a version of the game 20 Questions that helps build up a knowledge base that can be relied upon for Mek's aforementioned Michael Jackson-style answers.

    2. How did it work? GNUAsk (the aspirational, mostly unreleased search engine UI) relied on hundreds of bots, running as daemons, and listening in on conversations within AOL AIM, IRC, Skype, and Yahoo public chat rooms and recording all the textual conversations.
    1. and developers are required to have the Ruby runtime in their environment, which isn’t ideal.
    2. In one of our early conversations with developers working on CLIs outside of Shopify, oclif came up as an excellent framework of tools and APIs to build CLIs in Node. For instance, it was born from Heroku’s CLI to support the development of other CLIs. After we decided on Node, we looked at oclif’s feature set more thoroughly, built small prototypes, and decided to build the Node CLI on their APIs, conventions, and ecosystem. In hindsight, it was an excellent idea.

      Minority(?) viewpoint: oclif-based command-line apps (if they're anything like Heroku's, at least) follow conventions that are alien and make them undesirable.

    3. There’s a caveat that we’re aware of—while Hydrogen and App developers only require one runtime (Node), Theme developers need two now: Ruby and Node.

      Well, you could write standards-compliant JS... Then people could run it on the runtime everyone already has installed, instead of needing to download Node.

    4. Of all the programming languages that are used at Shopify, Ruby is the one that most developers are familiar with, followed by Node, Go, and Rust.

      Node is not a programming language.

    1. And my mom is getting older now and I wish I had all the comments, posts, and photos from the past 14 years to look back on and reminisce. Can’t do that now.

      This reminds me of, during the height of the iPod era, when someone I know was gifted* a non-Apple music player and some iTunes gift cards—their first device for their first music purchases not delivered on physical media. They created an iTunes account, bought a bunch of music on the Music Store, and then set about trying to get it onto their non-Apple device, coming to me when their attempts to get it working themselves weren't going well. I explained how Apple had (at the time) made iTunes Music Store purchases incompatible with non-Apple devices. Their response was baffling to me:

      Rather than rightly getting pissed at Apple for this state of affairs, they did the opposite—they expressed their disdain about the non-Apple MP3 player they were given** and resolved to get it exchanged for credit so they could buy a (pricier, of course) Apple device that would "work". That is, they felt the direct, non-hypothetical effects of Apple's incompatibility ploy, and then still took exactly the wrong approach by caving despite how transparently nefarious it all was.

      Returning to this piece: imagine if all that stuff hadn't been locked up in the social media silo. Imagine if all those "comments, posts, and photos from the past 14 years" hadn't been unwisely handed over for someone else to keep out of reach unless you assimilated. Imagine just having it delivered directly to your own inbox.

      * NB: not by me

      ** NB: not as a consequence of mimetic desire for the trendiest device; they were perfectly happy with the generic player before they understood the playback problem

    2. It’s not feasible to constantly be texting and phone calling Paula from 10th grade geometry, etc.

      This was initially confusing. What makes texting infeasible, but doing it through Facebook is feasible? I realized upon reaching the end of the next paragraph: "I cant make a new Facebook in 2023 and add all these old friends. Literally psychotic behavior."

      When this person talks about "keeping up", they don't mean "interacting". They mean non-interactively keeping tabs on people they once knew but that they don't really have an ongoing relationship with.

    1. It's interesting how few comments are engaging with the substance of the piece. They are encountering for the first time the idea that Rikard is providing a commentary on—that is, giving students their own big-kid Web site, an idea that "belongs" to the "Domain of One's Own" effort—and expressing enthusiasm for it here as comments nominally about this piece, which is really rather intended as a specific, reflective/critical response to the overall idea, and does not pretend to present that idea as a novel suggestion for the first time...

    1. References to "the World Wide Wruntime" is a play on words. It means "someone's Web browser". Viz this extremely salient annotation: https://hypothes.is/a/i0jxaMvMEey_Elv_PlyzGg

    1. the patriotic or religious bumper-stickers

      College graduates in 2005 could understand what this meant. I'm skeptical that college graduates in 2023 can really grok this allusion, even if it were explained.

      See also:

      this previous comment thread with a minority detractor view on Idiocracy [...] argues it’s a little more dated to it’s specific Bush-era cultural milieu than everyone remembers

      https://news.ycombinator.com/item?id=29738799

      E.g.:

      [Idiocracy's] "you talk faggy" [...] sadly was common in real life during the mid-00s [...] but would be completely taboo now

      https://news.ycombinator.com/item?id=18489573

    2. how annoying and rude it is that people are talking loudly on cell phones in the middle of the line. And look at how deeply and personally unfair this is

      That's actually not (just) seemingly "personally unfair"—it's collectively unfair. The folks responsible for these things serve as the better example of self-centeredness...

    3. Because my natural default setting is the certainty that situations like this are really all about me. About MY hungriness and MY fatigue and MY desire to just get home, and it’s going to seem for all the world like everybody else is just in my way.

      The fact that we're not talking about a child here but that it was considered normal for a 43-year-old man in 2005 to have this as his default setting perhaps explains quite a lot about the evident high skew of self-centeredness in folks who are now in their sixties and seventies.

      I didn't notice this in 2005, but maybe I wasn't paying close enough attention.

    4. clichés

      thought-terminating ones, even

    5. there is actually no such thing as atheism. There is no such thing as not worshipping. Everybody worships

      Very sophomoric argument, and it's hard not to point out the irony between this claim and everything preceding it wrt self-assuredness.

      Is it impossible for there to exist people to whom this description doesn't apply, or is it merely annoying and inconvenient to consider the possibility that they might?

    6. none of this is likely, but it’s also not impossible
    7. Everyone here has done this, of course. But it hasn’t yet been part of you graduates’ actual life routine, day after week after month after year.
    8. By way of example, let’s say it’s an average adult day, and you get up in the morning, go to your challenging, white-collar, college-graduate job, and you work hard for eight or ten hours, and at the end of the day you’re tired and somewhat stressed and all you want is to go home and have a good supper and maybe unwind for an hour, and then hit the sack early because, of course, you have to get up the next day and do it all again. But then you remember there’s no food at home. You haven’t had time to shop this week because of your challenging job, and so now after work you have to get in your car and drive to the supermarket. It’s the end of the work day and the traffic is apt to be: very bad. So getting to the store takes way longer than it should, and when you finally get there, the supermarket is very crowded, because of course it’s the time of day when all the other people with jobs also try to squeeze in some grocery shopping. And the store is hideously lit and infused with soul-killing muzak or corporate pop and it’s pretty much the last place you want to be but you can’t just get in and quickly out; you have to wander all over the huge, over-lit store’s confusing aisles to find the stuff you want and you have to manoeuvre your junky cart through all these other tired, hurried people with carts (et cetera, et cetera, cutting stuff out because this is a long ceremony) and eventually you get all your supper supplies, except now it turns out there aren’t enough check-out lanes open even though it’s the end-of-the-day rush. So the checkout line is incredibly long, which is stupid and infuriating. But you can’t take your frustration out on the frantic lady working the register, who is overworked at a job whose daily tedium and meaninglessness surpasses the imagination of any of us here at a prestigious college. But anyway, you finally get to the checkout line’s front, and you pay for your food, and you get told to “Have a nice day” in a voice that is the absolute voice of death. Then you have to take your creepy, flimsy, plastic bags of groceries in your cart with the one crazy wheel that pulls maddeningly to the left, all the way out through the crowded, bumpy, littery parking lot, and then you have to drive all the way home through slow, heavy, SUV-intensive, rush-hour traffic, et cetera et cetera. Everyone here has done this, of course. But it hasn’t yet been part of you graduates’ actual life routine, day after week after month after year.
    9. how to keep from going through your comfortable, prosperous, respectable adult life dead, unconscious, a slave to your head and to your natural default setting of being uniquely, completely, imperially alone day in and day out

      "All of humanity's problems stem from man's inability to sit quietly in a room alone" —Blaise Pascal

    10. It is not the least bit coincidental that adults who commit suicide with firearms almost always shoot themselves in: the head. They shoot the terrible master.
    1. For better or worse, people will continue to run things to inspect the results manually—before grumbling about having to duplicate the effort when they make the actual test. The ergonomics are too tempting, even when they're an obvious false economy.

      What about a test-writing assistant that let you just copy and paste your terminal session into an input field/text file which the assistant would then process and transform into a test?
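
      The core of such an assistant is barely more than a transcript parser. A sketch (the "$ " prefix convention and the generated test style are assumptions; run() is left to the harness):

      function transcriptToTests(transcript) {
        // Treat lines starting with "$ " as commands; everything up to the
        // next command is the expected output.
        const steps = [];
        for (const line of transcript.split("\n")) {
          if (line.startsWith("$ ")) {
            steps.push({ command: line.slice(2), expected: [] });
          } else if (steps.length) {
            steps[steps.length - 1].expected.push(line);
          }
        }
        return steps.map(({ command, expected }) =>
          `test(${JSON.stringify(command)}, () => {\n` +
          `  expect(run(${JSON.stringify(command)}).trim())` +
          `.toBe(${JSON.stringify(expected.join("\n").trim())});\n});`
        ).join("\n\n");
      }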

    1. My first startup, Emu, was a messaging app. My second startup, Miter, was not a messaging app but sometimes acted suspiciously like one. Am I obsessed with communication software? Well…maybe. I’m fascinated by the intersection of people, communities, and technology

      and, apparently, business—which definitely explains the author's overall position, specific recommendations, and this blindspot (the failure to mention how business intersects with their interest in messengers).

    2. anything is better than SMS

      Happy to hear this is the author's position, at least, because delta.chat and/or something like it is really the only reasonable way forward.

      (This isn't to say the current experience with delta.chat doesn't have problems itself. I'm not even using it, in fact.)

    3. perpetuates SMS’s dependence on the mobile carrier and device

      This isn't true: where does the "and device" part come in?

    4. you can message anyone, anywhere, without thinking about it too much

      But we have already heard plenty of evidence about why this isn't true...

    5. Your Mom has an iPhone so she loved it. Your brother has an Android, so he saw a blue-green blob with a couple tan blobs in the middle.

      "I'll blame Apple" is both an acceptable and reasonable response to this.

    6. typing indicators, read receipts, stickers, the ability to edit and delete messages.

      Yeah, I don't want any of those. It's not that I'm merely unimpressed—I am actively opposed to several of them for good reasons.

    7. Messages get lost.

      The only reason why I "switched" to Signal ~5 years ago was that it became clear that some of my messages weren't coming/going through.

      When I switched to Signal, the experience was even worse. Someone would send the message or attempt a voice call, but Signal would not reliably notify me that this happened. I'd open the app to find notifications for things that should've been delivered hours/days earlier.

      This had nothing to do with my app/phone settings. Signal did deliver some notifications, but it would do so unreliably. Eventually, I switched back to SMS in part because I was baffled by how the experience with Signal could be so much worse—as well as a bunch of dodgy decisions by the Signal team (which was actually the main catalyst for the switch back, despite the deliverability problems).

    8. might lose all your messages if you switch phones

      As a point of fact, this has nothing to do with SMS per se....

  3. thecomputersciencebook.com
    1. That’s pretty much it

      The lack of emphasis on the original design motivations for the Web as an analog for someone sitting at the reference desk in e.g. a corporate library who will field your request for materials is something that should be corrected.

    1. The usefulness of JSON is that while both systems still need to agree on a custom protocol, it gives you an implementation for half of that custom protocol - ubiquitous libraries to parse and generate the format, so the application needs only to handle the semantics of a particular field.

      To be clear: when PeterisP says parse the format, they really mean lex the format (and do some minimal checks concerning e.g. balanced parentheses). To "handle the semantics of a particular field" is a parsing concern.
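
      In other words (with a made-up message format): JSON.parse hands you the generic structure, but interpreting the fields is still a custom parsing job:

      const msg = JSON.parse('{"type":"reading","taken_at":"2023-02-01T12:00:00Z","celsius":"21.5"}');

      // None of this comes from "the format"; it's the half of the protocol you still own.
      if (msg.type !== "reading") throw new Error(`unexpected message type: ${msg.type}`);
      const takenAt = new Date(msg.taken_at);  // a string until you decide it's a date
      const celsius = Number(msg.celsius);     // quoted numbers happen in the wild
      if (Number.isNaN(celsius)) throw new Error("celsius is not a number");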

    1. We rebuilt Cloudflare's developer documentation - here's what we learned

      This post is one massive derp. (Or rather, the problem solved by the changes documented here is... The post itself would instead be best described as one massive "duh".)

      Spoiler alert: anti-wikis with heavyweight PR-based workflows that ignore how much friction this entails for user contributions don't beget many user contributions! (And it also sucks for the people getting paid to write and edit the content, too.)

    1. If the differentiator is the ease of putting a new application out into the world, that value prop is competing against the value prop of e.g. today’s full-stack web frameworks, right?
    1. Considerations

      What about chained dotted access? foo.bar.baz is probably okay as bar.baz @ (the Foo) (or even @the Foo), but probably not if it takes the form bar.baz from the Foo. (It just doesn't look reasonable to me.)

      Alternatively, what about @bar.baz for the Foo?

    2. this the

      should be a "to" here

    3. constructs involving a lone ASCII single quote can make the job of the parser more difficult, when single quote is already significant within the language (such as for denoting character or string literals)

      NB: not actually that much harder. In fact, my prescription today would probably be to omit the trailing s and allow only a bare single quote, which altogether would be incredibly easy to parse. (Omitting the s would also solve the it-doesn't-look-contrived-enough problem.)

    1. it’s ambiguous whether x-y is the expression x minus y or the invocation of the x-y function. Seems like a bad tradeoff, though. How often do you use -, and how often do you write multiword functions?
    2. In Lua you can write raw, multiline strings with [[]]: [[ Alice said "Bob said 'hi'". ]]

      This is indeed very good (for the reasons stated here).

    1. a lot of people think they understand the pictures but they don't

      See also: REST (compare: Fielding-style vs Valley-style)

    2. we need to reunite model language and programming languages this was the great vision of Simula of beta and Delta L o Delta was not designed to

      We need to reunite model language and programming languages. This was the great vision of Simula[...] We need to stop believing that we can document programs by some well-written code or "clean code". Clean code is great for small programs. Systems need more than comments and a few diagrams—systems need the voice of the designer in them with multimedia, but they also need more expressive paradigms for putting these in our programs.

    1. Although this episode is listed on Lex's own podcast page https://lexfridman.com/podcast/, it isn't actually available in the podcast RSS feed.

      I guessed the URL.

      And even though the latest episode right now is titled #350, there are only "300 episodes" listed for the show on the Apple Podcasts page https://podcasts.apple.com/us/podcast/lex-fridman-podcast/id1434243584

    1. how important is the concrete syntax of their language in contrast to

      how important is the concrete syntax of their language in contrast to the abstract concepts behind them what I mean they say can someone somewhat awkward concrete syntax be an obstacle when it comes to the acceptance

    1. global.getProcessControl = new ServiceProcurement(page)

      This can be migrated to a utility method (for ServiceProcurement); viz:

      static initialize(slotted, page, key) {
        const { OverrideFailure } = ServiceProcurement;

        // Refuse to install the override unless the global slot named by
        // `slotted.name` currently holds the expected object.
        let override = new ServiceProcurement(page, key);
        if (!(slotted.name in override.global) ||
            override.global[slotted.name] != slotted) {
          throw new OverrideFailure(slotted, page, override);
        }

        // Replace the original with the freshly procured override.
        override.global[slotted.name] = override;
      }
      

      (Alternatively, omit the override failure checking?)

    2. ExportProcessControl

      This can (should) be parameterized—not just here, but in the procurement constructor.

    1. with Xcode you can't resume the download if it fails
    2. The above (image) is the opcodes that it had.

      Hardly small or simple. Huge departure from Oberon...

    3. It is less and less the case now, but for a while you could inspect websites to see how it was put together.

      It should not go unmentioned that a big reason why this is the case is the types of folks from the presenter's social circles being hostile to this and trafficking in dubious maxims about how it somehow has to be this way...

    1. The product of this is data that are more nearly accurate than could be secured with the distractions and many variables of shop conditions.
    2. Success in handling the human element, like success in handling the materials element, depends upon knowledge of the element itself and knowledge as to how it can best be handled.
    3. Laboratory practice has taught that while the immediate results are important, the standardization of the method is more important, since the unexpected ultimate results, sometimes called by-products, are often by far the most valuable outcome of the work.
    4. Scientific management is simply management that is based upon actual measurement

      and yet Moneyball took almost a century, and the human-level processes behind semiconductor fabrication remain astoundingly inefficient (doubly ironic given the task at hand...)

  4. Dec 2022
    1. as a developer, writing against the Win32 APIs allows your software to run on over 90 percent of the computers in the world

      (Something else that has changed in the intervening years; most computers in the world—or a plurality of them, at least—are now running Android, not Windows, but Win32 is useless on Android. It's no help on iOS, either.)

    2. web apps are just so damned easy to use

      Despite the number of times it's been submitted over the years (most recently two months ago), this post has received very little commentary on HN. I think it suffers for its choice of title. This snippet is a better candidate, I think.

    1. It’s fairly clear now that the current catalog process is too heavyweight. I hope we can move to a lighter workflow in the future that feels more like editing a wiki.
    1. web standards are so inscrutably complex and fast-moving that building and maintaining a new browser from scratch requires the resources of a medium-sized nation-state

      no

    1. technology-driven development

      another term for resume-driven development

    2. designers are fickle beasts, and for all their feel-good bloviation about psychology and user experience, most are actually just operating on a combination of trend and whimsy

      the attitude of software designers that gripped the early 2010s described succinctly

    3. you should use this idea to guide your app’s architecture and your class design too. Start from the problem, then work through solving that problem to building your application.
    1. Bun is written in Zig, a low-level programming language with manual memory management.

      Very awkwardly stated.

    1. The battle for convivial software in this sense appears similar to other modern struggles, such as the battle to avert climate disaster. Relying on local, individual rationality alone is a losing game: humans lack the collective consciousness that collective rationality would imply, and much human activity happens as the default result of 'normal behaviour'. To shift this means to shift what is normal. Local incentives will play their part, but social doctrines, whether relatively transactional notions such as intergenerational contract, or quasi-spiritual notions of our evolved bond with the planet, also seem essential if there is to be hope of steering humans away from collective destruction.
    2. Consider how many web applications contain their own embedded 'rich text' editing widget. If linking were truly at the heart of the web's design, a user (not just a developer) could supply their own preferred editor easily, but such a feat is almost always impossible. A convivial system should not contain multitudes; it should permit linking to them.
    1. It feels weird to say this in 2020, when the idea was presented as fait accompli in 1997, but an enabling open source software movement would operate more like a bazaar than a cathedral. There wouldn’t be an “upstream”, there would be different people who all had the version of the software that worked best for them. It would be easy to evaluate, compare, combine and modify versions, so that the version you end up with is the one that works best for you, too.
    2. I just give them the thing, and they use it.
    3. Currently modes of software development, including free and open source software, are predicated on the division of society into three classes: “developers” who make software, “the business” who sponsor software making, and “users” who do whatever it is they do. An enabling free software movement would erase these distinctions, because it would give the ability (not merely the freedom) to study and change the software to anyone who wanted or needed it.
    1. Since all reading at that time occurred out loud rather than inside one’s head, the study rooms were a modern librarian’s nightmare

      The modern library is the quiet reader's nightmare; I've been to many noisy libraries over the last decade.

    1. aren’t really pens at all, in fact, but tributes to pens

      Nice turn of phrase.

      See also: mop handles that flex and bend like rubber when subjected to the downward pressure that is typical during mopping.

    2. the cute yellow mittens my wife picked up at Target which unraveled the second time she wore them

      I've said it before: we focused too much on the dream of a 3D printer in every home when we should have focused on personal electric looms.

    3. For the remains of the Pyrex casserole that shattered when I removed it from the oven, strewing the floor with blade-like shards, some so tiny I probably won’t find them for another couple of months, and only when they lodge in my bare feet.

      Would it be possible to adulterate glassware to glow underneath a blacklight?

    1. People seem to think it's the browser's job to block ads, but my perspective is that if a business owner wants to make their business repulsive, the only sensible response is to stop using the business. Somehow once technology is involved to abstract what's happening, people start talking about how it's their right to unilaterally renegotiate the transaction. Or for another analogy that will likely make you upset: "I hate how this store charges $10 for a banana, so I am just going to pay $2 and take the banana anyway".

      terrible analogy is terrible—and I say this as someone who doesn't even fall in line with the general anti-copyright sentiment that is prevalent on sites like HN and Reddit

    1. programs with type errors must still be specified to have a well-defined semantics

      Use this to explain why Bernhardt's JS wat (or, really, folks' gut reaction to what they're seeing) is misleading.

    1. Fielding’s dissertation (titled “Architectural Styles and the Design of Network-based Software Architectures”) is not about how to build APIs on top of HTTP but rather about HTTP itself.

      I'm a big fan of how REST is explained in jcrites's comment as part of the discussion that arose on HN in response to this piece: https://news.ycombinator.com/item?id=23672561

    2. Many more people know that Fielding’s dissertation is where REST came from than have read the dissertation

      in other words, widely referenced but rarely read

    1. "By definition" is not an argument.

      Yes it is, dumbass.

      Also:

      I am not sure why you think that having an obscure URI format will somehow give you a secure call (whatever that means). Identifiers are public information.

      Fielding. 2008. https://roy.gbiv.com/untangled/2008/rest-apis-must-be-hypertext-driven#comment-806

    1. non-concrete ideas are very hard to falsify

      Maybe this is just a regional thing, but something I really began to notice several years ago is that (a) what Jamie is saying is true, but (b) it's evident that people actually love the unfalsifiability aspect of non-concrete claims—it's safe refuge that they actively seek out.

      Two opposing examples from real life that particularly stuck out:

      • "Slow down. [You're going too fast.]"

      • "[...] since you always take so long"

      (These were two different instances/contexts with, I think, a year+ in between them; it wasn't the contrast between them that made me notice the phenomenon. Rather, I recognized each one at the time as being part of this phenomenon. They just serve as good examples because of how easily they could be made concrete—and therefore falsified in light of the facts: "without defining exactly what 'too fast' means, what is an acceptable speed?", "without defining what it means to take too long, what is an acceptable amount of time to take?"—both arising from wholly disingenuous complaints that were forms of externalized gut reactions rather than anything that would hold up under scrutiny...)

    1. Building programs by reusing generic components will seem strange if you think of programming as the act of assembling the raw statements and expressions of a programming language. The integrated circuit seemed just as strange to designers who built circuits from discrete electronic components. What is truly revolutionary about object-oriented programming is that it helps programmers reuse existing code, just as the silicon chip helps circuit builders reuse the work of chip designers.

      Oh man, this metaphor really fell apart and, if anything, works against itself.

      "If integrated circuits are superior to discrete components, why exactly are we supposed to be recreating the folly of reaching for reusable components in creating software?"

    1. The only difference is that standard data representations (XML schemas) eliminate the need for custom parsers

      They don't, though. Things like JSON, XML, etc. mean you don't need to write a lexer--not that you don't need to write a parser. People land themselves in all sorts of confused thoughts over this (smart people, even).
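
      A minimal sketch of the distinction (the data shape here is made up for illustration): `JSON.parse` does the reading of the generic format for you, but mapping that generic tree onto your domain's structure, and rejecting input that doesn't fit it, is still a parser you write yourself.

      ```js
      // JSON.parse handles tokenizing and building a generic tree...
      const raw = JSON.parse('{"name": "Ada", "born": "1815-12-10"}');

      // ...but turning that tree into a domain object--checking required
      // fields, types, and formats--is still hand-written parsing.
      function parsePerson(value) {
        if (typeof value !== "object" || value === null) throw new Error("expected an object");
        if (typeof value.name !== "string") throw new Error("'name' must be a string");
        const born = new Date(value.born);
        if (Number.isNaN(born.getTime())) throw new Error("'born' must be a valid date");
        return { name: value.name, born };
      }

      const person = parsePerson(raw); // a domain object, not just "whatever JSON.parse returned"
      ```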

    1. From the 1976 edition of Naur, Randell, and Buxton's "Software Engineering: Concepts and Techniques":

      Get some intelligent ignoramus to read through your documentation and try the system; he will find many "holes" where essential information has been omitted. Unfortunately intelligent people don't stay ignorant too long, so ignorance becomes a rather precious resource. Suitable late entrants to the project are sometimes useful here.

      Burkinshaw on correctness and debugging. p. 162. (Only in the 1976 edition.)

    2. Good case for S4/BYFOB+bookmarklets.

    1. as forking Electron to make Min wouldn't make any sense, and the replier knew this, reading it to mean that seems like a mistake to me

      Right. If there are two ways to take a statement, one which is absurd because it's inconsistent with actual fact and another which isn't, it's a bad idea to make an argument that hinges on it being the former—that would mean you're insisting on an absurdity.

    1. A world in which I wouldn’t feel like my app is inadequate because I’m not using the [insert name] technology.

      they keep coming

    2. A world where I wouldn’t need dozens of third-party linters, type checkers and configurations just to make sure that my code is correct

      another you-problem

    3. unified module system and I wouldn’t need to worry about file name extensions

      There is. It was standardized in ES6 aka ES2015. 2015! Direct your ire in the appropriate direction, i.e., at your vendor—and, ultimately, at yourself regarding your own lack of foresight.
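
      A minimal sketch of that standard module system in use (file names are hypothetical); it runs in any current browser via `<script type="module">` and natively in Node, with no bundler or transpiler involved:

      ```js
      // math.js -- a plain ES module
      export function add(a, b) {
        return a + b;
      }
      ```

      ```js
      // main.js -- imports it by relative, extension-qualified path
      import { add } from "./math.js";

      console.log(add(2, 3)); // 5
      ```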

    4. I just can’t stop dreaming about a perfect world where I could go back to any of my old JavaScript projects with an ease of mind and knowing that everything just works. A perfect world where my program is going to stand the test of time.

      That's a you-problem. The pieces are there—the language is stable, and there's a ludicrously backwards compatible World Wide Wruntime that you can trust to be around—it's on you if it fails.

    5. perhaps we should take a few steps back before making decisions to reflect on our long-term vision and to recognize that how every change, even the tiny ones, could affect a whole group of users

      The truest thing I've read in this post so far.

    6. being developed on their own to push the boundaries

      I don't think so. I go back and look at what boundaries were being pushed in historical projects, like some of the stuff Tolmasky was doing, and I see way more being pushed then than I do today.

      The last 10 years are pretty stagnant, in comparison. React sucks up a lot of resources to ultimately do little more than reinvent the wheel (poorly). Same with the adjacent tools mentioned here.

    7. JavaScript has been evolving and growing so fast it’s unlike any other tech I’ve seen.

      This statement betrays some ignorance of history, I think, or some form of intellectual dishonesty.

      JS is objectively and demonstrably slower on the uptake than what happened with Java.

    8. we mean it as a whole and how it gets used by the user, not just the language specifications alone

      Well, "as a whole", JS includes non-Node-affiliated silliness; it does get used by people in ways that doesn't involve any of this stuff. So use it that way, too, if you have a problem with the way you're using it now (and you should have a problem with it—so stop doing that!)

    9. The language is responsible for providing the context and the environment in which things happen and get shaped.

      No. Well, yes, this is true, strictly speaking. But, again, this is a true-in-some-sense-that-isn't-relevant-here sort of way.

      Parents are not responsible for the crimes that their children grow up and commit.

      The NodeJS community is responsible for the NodeJS community's problems. And people outside the NodeJS community who choose to align themselves with the NodeJS community are responsible for their choices.

    10. JavaScript is evolving too rapidly

      JS is evolving too rapidly—the ECMAScript group is putting too much in the language too often—but that's not the problem described here.

      As noted in the comments on HN[1], the complaints here are not about JS, but about NPMJS/NodeJS specifically and how the NPM approach to engineering fails.

      1. https://news.ycombinator.com/item?id=33965914
    11. Six months passes and while you had almost forgotten about your little project, you now have got some new ideas that could make it even better. The project also has a few open issues and feature requests that you can take care of. So you come back to it. “Let’s start”, you whispered to yourself with excitement. You run npm install in your terminal like an innocent man and switch to your browser to scroll on Twitter while you are waiting for the dependencies to be installed. Moments later you return to your terminal and see… an error!
    1. @15:40

      His point is that a lot of software design has failed to be even as good as print design--that software design has focused so much on interaction, treating the computer as a machine we need to manipulate, and too little on displaying information--for people to make decisions, to come to conclusions, to learn something--and that by focusing so much on interaction design, so much on the computer as a mechanical thing, we've really made bad software.

      And so he wants to say part of the problem here is that the people who design can't make this context-sensitive magic ink--right? It's like print design, but now it knows something about your context. And so designers really aren't able to make these rich information displays that are dynamic but not interactive. And so you could really call this "Interaction Considered Harmful", and the idea is that our software should only be interactive when it has to be; really, our software should be context-aware, good, print-like displays of information.

    1. Bellheads believed in "smart" networks. Netheads believed in what David Isenberg called "The Stupid Network," a "dumb pipe" whose only job was to let some people send signals to other people
    1. This brings interesting questions back up like what happens to your online "presence" after you die (for lack of a better turn of phrase)?

      Aaron Swartz famously left instructions (predating, IIRC by years, the decision that ended his life) for how unpublished and in-progress works should be licensed and who should become stewards/executors for the personal infrastructure he managed.

      The chrisseaton.com landing page has three social networking CTAs ("Email me", etc.). Eventually, the chrisseaton.com domain will lapse, I imagine, and the registrar or someone else will snap it up to squat it, as is their wont. And while in theory chrisseaton.github.io will retain all the same potential it had last week for much longer, no one will be able to effect any changes in the absence of an overseer empowered to act.

    1. In our way of delivering orders we emphasise explaining the context two levels up. I may tell my soldiers to raid a compound, but I would also tell them that the reason for this is to create a distraction so that the Colonel can divert the enemy away from a bridge, and that the reason the Brigadier wants the Colonel to divert the enemy is so that the bridge is easier to cross. Not only do the soldiers then know why it’s important to raid the compound (so that others can cross the bridge), but they know that if for some reason they can’t raid the compound, creating any other diversion or distraction will do in a pinch, and if they can’t do that they can still try to do something to make it easier to cross the bridge. It lets everyone adapt to change as it happens without additional instruction if they aren’t able to get in touch with me. Again I think tech could possibly learn from that.

      def

    1. By Brad J. Cox, December 06, 2004

      NB: the footnote at the end indicates that this was originally published in Byte Magazine (October 1990). By a reasonable guess, the 2004 date here is when this online copy was published to drdobbs.com?

    1. McIlroy envisioned a world where we would be constructing software by picking components off a shelf, and snapping them together like Legos. The hard work would be building the right blocks, and then it would be easy to snap them together.

      See also: Brad Cox on software ICs

    2. Briefly, Taylorism has two central tenets: Measurement: associate metrics with all aspects of work. The separation of thinking and doing: An educated class of managers measures and plans all the work, and is responsible for the overall process, while a class of laborers carries out the implementation of those plans.

      I find it difficult to reconcile these two tenets with the claim that "Taylorism is so deeply ingrained in every aspect of not just modern commerce but life in general".

      Many (most?) places—even big engineering orgs like Samsung—are failing on the first principle alone and are doing a lot of wasteful, gut-driven operational stuff.

    1. we believe that in the long term the key to the reuse of software is to reuse analysis and design; not code

      cf akkartik

    1. A paper doesn’t even contain all the information and data required to reproduce a result. Because if it did, it would be the size of a book.
    1. The migration would not be complete without calling out that I was unable to build the Mastodon code base on our new primary Puma HTTP server.
    1. If you write an algorithm in a straightforward way in Node, you can expect it to run about as fast as if you write it in a vectorized way using Numpy, or twenty times as fast as if you write it in a straightforward way in CPython.
    2. it also has always had a binding problem with this

      No, it doesn't. The "binding problem with this" is a problem that (some) people have. JS binds it the right way.
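
      A quick sketch of what I mean: the rule is consistent (the receiver comes from the call site), and when you want a fixed receiver the language hands you the tools (`Function.prototype.bind`, arrow functions, or simply keeping the method call intact):

      ```js
      "use strict";

      // `this` is determined by how a function is called, not where it was defined.
      const counter = {
        n: 0,
        bump() { this.n += 1; }
      };

      counter.bump();                      // method call: `this` is counter
      const bare = counter.bump;
      // bare();                           // would throw: `this` is undefined in strict mode

      const bound = counter.bump.bind(counter);
      bound();                             // `this` is counter regardless of call site

      setTimeout(() => counter.bump(), 0); // or keep the method call behind a closure
      console.log(counter.n);              // 2 (the setTimeout bump lands later)
      ```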

    3. Also those error messages are super confusing, which I guess is one way JS has always been worse than Python.

      That's Node, not "JS".