We haven't heard back after replying last night
Geez.
We haven't heard back after replying last night
Geez.
Compatible
"Interoperable", you mean.
It was a bit of a surprise
... why? Maybe it was only clear in hindsight, but Gruber's lack of response to the request is functionally equivalent to a pocket veto.
In Markdown, we literally built a Tower of Babel.
No.
As a relatively new C++ developer at the time, the number of syntax errors I made was high.
The thing about syntax errors is that even if the build takes a long time (project is huge), syntax errors can be detected quickly.
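The point generalizes beyond C++: parsing is cheap compared to a full build, so a syntax-only pass gives fast feedback. A minimal sketch in Python (the function name is mine; `compile()` parses without executing anything):

```python
import sys

def syntax_ok(path: str) -> bool:
    """Parse a Python source file for syntax errors without executing it
    (and without touching the rest of the project's build)."""
    with open(path, encoding="utf-8") as f:
        source = f.read()
    try:
        compile(source, path, "exec")  # parse/compile only; runs nothing
        return True
    except SyntaxError as err:
        print(f"{path}:{err.lineno}: {err.msg}", file=sys.stderr)
        return False
```

C and C++ toolchains expose the same separation (e.g. a syntax-only mode that skips code generation), which is why the errors surface long before the build would finish.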
One failure mode, particularly common in academia, comes from lacking a serious context of use
The final keystone was when the program that a computer runs was moved to where the data is stored, rather than being represented or input physically. This effectively created what we now know of as software. Obvious in hindsight, yet almost impossible to see from the past’s vantage point.
Good way to describe ANPD.
Our vision is to free the world from technological and legal barriers for all software and cultural works to be free
Social barriers important, too, but underappreciated.
I'd argue it's slightly different--
It is different. However, they're similar enough to draw lessons from.
I use similar, non-specced canaries all the time. E.g. small fixes for things in projects that are nonetheless obvious errors, or determining whether someone is going to try to frame my attempt to contribute by "upperhanding" me, whether they're hostile to messages attached to a name that they don't recognize, etc.
For example, if it takes too much back and forth to get a typo or link fixed in the docs (or any sort of review process for content on what is purported to be a "wiki"), then odds are, things are messed up at a greater level that are going to be a source of frustration later. At that point, I'm out.
A surprisingly large number of projects fail these, in what we are otherwise expected to consider the present Renaissance of FOSS...
Now we update as needed and make good use of the Internet Archive WayBack Machine for legacy or potentially unstable URLs.
Stanford runs their own archive instance (https://swap.stanford.edu/). Why shouldn't the LOC, too?
there's no job that I can't--that I won't do. I like to have--I have this little saying that, "The successful people in the world are the people that do the things that the unsuccessful ones won't." So I've always been this person that, you know, I will build the system, I will fix the bugs, I will fix other people's bugs, I will fix build breaks. It's all part of getting the job done.
And she’s actually still working a customer-facing job, not promoted into a corner office management position where she would never be exposed to a real-world problem like mine.
It absolutely takes some getting used to
Does it? It's pretty much just as easy or easier than doing it the way that everyone else insists is correct. I'm more than half convinced, in fact, that the npm install way being unnatural is the reason why it's sacrosanct. You can't just let things be easy—people dislike any state of affairs where their experience/participation isn't some combination of necessary and valuable.
I've tried making this case. People flip their shit and then for lack of a good argument they start doing that thing where they try to shut you down because of how weird it is—and you don't want anyone thinking you're weird, do you?
At first this struck me as unusual
It is unusual. It's not a bad thing to do, but it is still, in the literal sense of the word, unusual. That doesn't say anything about the practice itself so much as it says something about how bad the "usual" way of doing things is.
If you try to export the document in an internet-compatible format like HTML, you get a mess.
I've noted elsewhere that despite the reputation of WYSIWYG editors' tendencies for handling HTML, modern mainstream Web development practices are so bad today that just typing a bunch of junk into LibreOffice and saving as HTML results in one of the most economical ways to do painless authoring of Web content...
Who does that server really serve?
This gets it right. The similar essay "The JavaScript Trap" is anathema. As Richard was later cajoled into clarifying:
to be philosophically clear, the language JavaScript, as such, is not morally better or worse than any other language. What's bad is running code immediately as it arrives from someone else's server
The clarification is needed. With the existence of "The JavaScript Trap", people are under the (silly) impression that JS or programs written for the browser and executed by its JS engine are inherently bad.
Contributor License Agreement
Broken link. (And it's an antipattern, anyway.)
Use URIs as names for things
In "FactForge: A fast track to the Web of data", Bishop et al. give this criterion as, "Use URIs as identifiers for things".
There may need to be a zeroth step here, which is "don't make the mistake of designing systems in such a way that things can't be identified by name".
Once one has gotten used to the idea of no moving parts, he is ready for the idea of no keyboard at all: Suppose the display panel covers the full extent of the notebook surface. Any keyboard arrangement one might wish can then be displayed anywhere on the surface.
WebKit is way behind the 2 major browser engines
Weird statement. WebKit is an element in the set defined as "the 2 major browser engines".
Two things that still need to be addressed in section 7:
/// import { LineChecker } from "./LineChecker.src"
That's not right... should be "./src/LineChecker.src". (The fact that the compiler isn't throwing an error on this is a bug in and of itself...)
Adding an implementation of the system interface)
Spurious close paren here.
Next, we discuss the implementation strategy.
s/.*/Let's discuss an implementation strategy/
adding a new website should not require someone to go through the cumbersome process of forking the repo and sending a pull request.
Someone should launch a "No regressive GitHub bullshit club".
This loop showcases a UI pattern that I think could be improved. There is an "edit" button visible, which opens the sidebar. The principles should more closely resemble the Hypothesis sidebar. Instead of requiring an explicit edit button which the user clicks, the editor should operate on object selections. Merely clicking any of the displayed values should select it, which should provide a handle to the underlying object, which should reveal the editor sidebar (with, ideally, the relevant field focused).
With that in mind, I'm trying something new, the guided tour for Mu. Ironically, it atomizes my previous docs by linking repeatedly into anchors in the middle of pages. Proceed if you dare.
The current incarnation of the tutorial (https://raw.githubusercontent.com/akkartik/mu/7195a5e88e7657b380c0b410c8701792a5ebad72/tutorial/index.md) starts by describing "Prerequisites[:] You will need[...]", and then goes on to list several things, including various software packages—assuming a Linux system, etc.
This is the idea I'm trying to get across with the self-containedness I've been pursuing (if not with triple scripts then at least with LP docs).
That prerequisites list should be replaceable with two requirements, i.e.:
"You will need: (1) this document, and (2) the ability to read it (assuming you have an appropriate viewer [which in 2021 is nowhere close to the kind of ask of the old world])"
JavaScript is actually surprisingly fast because the demand for fast web browsers is huge
Another way of saying this: the use of V8 means that JS isn't actually an "interpreted language" (not that that's even a real thing).
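A quick illustration that "interpreted language" isn't a crisp category: even CPython compiles every function to bytecode before running anything. A CPython-specific sketch using the stdlib `dis` module:

```python
import dis

def square(x):
    return x * x

# Even "interpreted" CPython compiles source to bytecode before any
# execution happens; dis lets you inspect the compiled form.
opnames = [ins.opname for ins in dis.get_instructions(square)]
print(opnames)
```

The same blurriness applies to V8, which goes further and JIT-compiles hot code paths to machine code.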
I have heard that Oracle's cloud has a free tier that even includes your own virtual private servers, so I may look into that eventually. Planning to use Oracle is something I never thought I'd be doing as a hobbyist, but these are interesting times.
$4.33/month. A little more than I'd like to spend on silly hobby projects
My ideal implementation would be a tool that I unleash on the output HTML files, crawling relative links to ensure they're all valid and that all pages are reachable. It would also ensure that any links to anchors within pages exist. Such a tool probably exists, but I haven't found it yet.
Fielding's MOMspider, one of the very first web tools, does this, albeit not at build time in a static site generator.
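For the record, the core of such a checker is small. A hypothetical Python sketch (names and structure are mine, not from MOMspider or any existing tool): walk the output directory, collect links and anchor ids per page, then validate relative targets and fragments.

```python
import os
from html.parser import HTMLParser
from urllib.parse import urlsplit, unquote

class LinkCollector(HTMLParser):
    """Collects href targets and anchor ids from one HTML document."""
    def __init__(self):
        super().__init__()
        self.links, self.ids = [], set()
    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "a" and d.get("href"):
            self.links.append(d["href"])
        if d.get("id"):
            self.ids.add(d["id"])
        if tag == "a" and d.get("name"):  # legacy-style anchors
            self.ids.add(d["name"])

def check_site(root: str) -> list[str]:
    """Validate relative links and fragment anchors across a static site."""
    pages = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name.endswith(".html"):
                path = os.path.normpath(os.path.join(dirpath, name))
                collector = LinkCollector()
                with open(path, encoding="utf-8") as f:
                    collector.feed(f.read())
                pages[path] = collector
    errors = []
    for path, collector in pages.items():
        for href in collector.links:
            parts = urlsplit(href)
            if parts.scheme or parts.netloc:
                continue  # external link; out of scope for this sketch
            if parts.path:
                target = os.path.normpath(
                    os.path.join(os.path.dirname(path), unquote(parts.path)))
                if target not in pages and not os.path.exists(target):
                    errors.append(f"{path}: broken link {href}")
                    continue
            else:
                target = path  # same-page fragment
            if parts.fragment and target in pages \
                    and parts.fragment not in pages[target].ids:
                errors.append(f"{path}: missing anchor #{parts.fragment}")
    return errors
```

The "all pages reachable" half would be a breadth-first walk over the same `pages` map starting from the index page, flagging anything never visited.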
content\index.md
Just say no to backslash for path separators, even on Windows.
(I can't believe the Powershell people got this right originally and then chose to fuck it up.)
I do wonder if this will eventually become a burden in the future when Node inevitably falls out of favor.
"burden"
since when have I enjoyed webpack
looks like I need a full blown Ruby environment. No thanks!
on Windows, since that's what I'm using
Good example of why leveraging the browser's runtime is better. I wouldn't have guessed that the md2blog creator was using Windows. (And I didn't. I just assume that everyone is using a Mac, even though I'm on neither.)
If a piece of software (or a web site) gets in my way, I usually just give up and move on because that first irritation is usually just the first drip of an approaching cascade of frustration.
First(?) published in JEP Vol. 1 Issue 1: https://doi.org/10.3998/3336451.0001.137
public type registry
Any known previous occurrences to this phrase? It's used today in the Solid Project.
From the WebNet 96 Conference Proceedings (San Francisco, California, October 15-19, 1996). https://eric.ed.gov/?id=ED427649
This paper covers various shades of Semantic Web problems (not yet coined at the time) and several problems being dealt with by the Solid Project folks today.
See Market-like task scheduling in distributed computing environments (Malone, 1987).
open & close a thank-you issue on GitHub if you can't contact them any other way
Fuck this shit. Would you think it was a good idea to show your appreciation to someone by tipping them -$1?
It's stupidly easy to achieve the intended effect without fucking it up. If you want to send your thanks but there is no obvious way to do that, then congratulations, you have found a bug. File a bug report about that instead of subjecting the recipient and everyone else in the project orbit to your public wankery.
Every item filed in a bugtracker should correspond to a defect.
Stop encouraging people to misuse project infrastructure.
Un-derailing
The single best piece of advice in a column filled with bad advice.
Create a rule specifying that the Github issue tracker is only used for bug reports, feature requests, or other discussion - not support inquiries
This should go without saying. The fact that it doesn't is an indicator of a bigger issue that probably calls for a drastically different approach.
If you're using GitHub
"... then consider not".
It's such a big lift for this project and I really appreciate you taking the time to make it.
The problem with lots of these examples is that they read like a robot emulating what they think genuine sentiment sounds like. This example is no different.
!
Just like politics, it's best to exhaust the diplomatic options before considering the rest.
Again: maybe modelling your approach off something that is broken is not the best thing to do.
Maintaining an open source project, like other jobs that are public and often involve a lot of work, can be mentally draining.
If these techniques are supposed to work and this is the result, maybe consider the possibility that all the advice here is not the way to go about doing things.
Thanks for opening this pull request!
Bad execution of mostly acceptable advice, in contrast to much of the earlier stuff, which constitutes bad advice.
Things that you say should be true.
Write in the genial voice using people's first names and friendly introductions
Please, no (unless you're actually writing an email and introducing yourself).
Bugtrackers are not message boards.
The easiest way to say thank you is... simply
I'm going to close this pull request, but I hope you can contribute in the future! If you need this change, feel free to maintain a fork.
On the contrary, this kind of PR speak is a good example of how to piss people off by sounding like you think they're too stupid to recognize this "trick".
Firing users
"Open source" doesn't make this hard. How you're advocating that open source has to be done is what makes that hard. So stop doing that.
All in all, open sourcing a thing means taking responsibility for it. You're making a statement that the thing will be available, updated, and real.
Gross.
You can't suggest they leave your store.
You can.
People have expectations that software will work, that issues with software will quickly be fixed, and that you'll answer their questions.
Gross.
Very successful projects with thousands of users quickly accumulate hundreds of support issues.
If the costs are high, then start charging people for it.
an issue tracker
Please, no. This is a big reason why the relevant problems here exist in the first place.
Gamified indicators tell a simple story
It's almost like the way that GitHub is typically used is an example of how things shouldn't be done!
Why is reporting a bug so hard that it justifies so many words?
Because with the rise of GitHub, the behavior of most other bug reporters makes for a plethora of bad role models and a dearth of good ones.
in certain job roles, there are expectations about another person's emotions and tone. In America, baristas are usually expected to be bright and cheerful.
Yeah, and that could certainly use fixing. I'm getting a dreadful sense of foreshadowing here that making contact with this observation has led the author to an entirely different conclusion—e.g., that this is an argument by analogy where what follows is going to be a bunch of suggestions that ultimately make other things worse.
Unfortunately, people are so conditioned to conflate "feels like it should work" with "actually works"
Having a giant, flat namespace also seems wrong - Wikipedia seems especially strange in this regard, having Thingy_(Star_Wars) where Thingy is both a real thing and also present in some specific context; I frequently think it should be Star_Wars/Thingy
Strong disagree. The tendency to the latter is awful. developer.mozilla.org committed this sin in the early days and then never shook it, despite many opportunities to do so and the many problems it caused.
Guessable URLs for stuff like reference works especially are an amazing affordance on the user end but also a much more economical use of resources on the implementor's end, because it eliminates both the fixed upfront creation cost as well as the recurring cost associated with waffling or bikeshedding about the hierarchy.
Probably one of the best things that any single person or organization can do is to identify stuff that looks like it's productive work but isn't, and then eliminate both it and the conditions that allowed it to occur, made it seem like a good idea, etc.
We need more holistic frameworks that incorporate more facets of the web experience
No. Geez.
server
Should be "serve".
there
Should be "their".
website
Should be "websites".
Memex began and remained as an ambiguous and not too original concept
I think memex is valued as a touchstone rather than an invention, and that's how it should be treated.
Suddenly we’d come full circle. The fastest way to launch programs was to type their name into a box, only a box that looked a bit more stylish than the terminal of old.
"Oh, Joe is taking feature X? He never ships anything reasonable. Looks like we can't depend on it because that's never going to work. Let's do Y instead of Z since that won't require X to actually work". The roadmap creation and review process maintains the polite fiction that people are interchangeable, but everyone knows this isn't true and teams that are effective and want to ship on time can't play along when the rubber hits the road even if they play along with the managers, directors, and VPs, who create roadmaps as if people can be generically abstracted over.
The movie "Moneyball" is a good example of how this can go wrong. People like Philip Seymour Hoffman's character create self-fulfilling prophecies of failure.
Let's go further than this: details matter. It's weird how many people don't get this, despite the fact that we have an aphorism ("the devil is in the details") baked into our language.
Download the file (eg, PDF) that you want to add to your local computer, and calculate the SHA-1 hash of the file using a tool like sha1sum on the command line. If you aren't familiar with command line tools, you can upload to a free online service.
Instead of mentioning sha1sum(1) or linking to some seedy third-party service, adding a button right here to calculate the SHA-1 for a given file would be trivial...
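For anyone landing here: the sha1sum(1) step is also a few lines of stdlib Python, no third-party service needed (the function name is mine):

```python
import hashlib

def sha1_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hex SHA-1 of a file, streamed so a large PDF never has to fit
    in memory all at once."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()
```

The browser-button version would be the same idea via the Web Crypto API, with the file read client-side so nothing ever leaves the user's machine.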
input type="email" placeholder="user@domain.tdl..."
Should be "tld", not "tdl". (But really should be something RFC 2606-compliant like "you@mailhost.example".)
This is a port of Michael Schierl's OberonXref from Java. Instead of requiring a JVM, you can run it directly in the browser.
Mod/ subdirectory and a resources/ subdirectory. The latter should be a copy of the directory of the same name from the original repository https://github.com/schierlm/OberonXref/tree/master/src/main/resources. The former should contain the sources for the modules of interest.

And what happens to that bytecode? First thing that happens is they build a tree out of it, because the bytecode verifier has to go in and make sure you're not doing anything [illegal].
Lars Bak makes this point in a Channel 9 interview. I think it was this one (can't tell; dead link): http://channel9.msdn.com/Shows/Going+Deep/Expert-to-Expert-Erik-Meijer-and-Lars-Bak-Inside-V8-A-Javascript-Virtual-Machine
One of his interlocutors can't wrap his head around it. His "but C# and Java are compiled languages"-type of argument comes off like Limmy's confused protest that "steel is heavier than feathers...".
Instead, [Peter was saying] they do it all probabilistically.
The rise of non-algorithmic "algorithms".
A question of ontology: Does battling the build system and the various tools mentioned count as "debugging"?
Where the Action Is and Was in Information Science
Wrong. Licklider is not an author on this. No one is except for Cawkell. Cawkell mentions the others in his letter.
Missing:
Y. Bar-Hillel, R. Carnap, E. C. Cherry, E. Garfield, D. W. King, F. W. Lancaster, J. C. R. Licklider, D. M. Mackay, J. W. Perry, D. J. De S. Price, G. Salton, C. Shannon, M. Taube, B. C. Vickery — each listed as "Director of Research Institute for Scientific Information 132 High Street Uxbridge, Middlesex England UB8 1DD", with "Search for more papers by this author" boilerplate repeated after every name.
These are all wrong. Cawkell is the only author for this. The other names come from a list of others appearing in Cawkell's letter (a list of the most cited authors, excluding self citations).
Teh
Should be "The".
extention
Should be "extension".
bizzare
Should be "bizarre".
Modern JS development is rampant with this, with the profligate misuse of triple equals (===) almost everywhere where double equals would be appropriate as one example.
Avoid it.
"Resist it" might be a better way of putting it.
From p. 145:
I would hope that it would extend in an atmosphere of professional interchange and frankness which is characteristically American.
As John Dickerson recently put it on Slate, describing his attempt to annotate books on an iPad: “It’s like eating candy through a wrapper.”
The metaphor itself is pretty interesting considering that the premise already involves using an iPad. I remember when in the first few years after (capacitive) touchscreen devices became available to the mainstream, someone quipped that tablets are convenient and all until they're not—that trying to get real work done, especially when it requires typing, is like being forced to "think through a straw".
CONTINUED ON P. 103
wat. We're starting on page 104! This is running backwards.
"It was impossible to explain to people what the Web would be like then, and now when you talk to millennials they can't understand what the problem was."
The state of the world is worse than what is implied here. The implication, as I understand it, is that it's so difficult to conceive of a world that works any differently from the way Tim considers that it works today—that someone could live in a world without the Web and not know what's missing, and just as easy to exist in a world with the Web and not see it as anything other than obvious—because why would it ever have worked any differently, right? But I know empirically (and painfully) that people (whether millennial types of people or not) exist in a world where we do have this stuff and don't even understand what we have. I've had firsthand dealings with knowledge workers who still have the "I don't get it; so what?" reaction, to an exasperating degree. We are way, way short of Engelbart's vision.
Librari Networks: Should TheyDear With Containersor Contents of Knowledge
"Library Networks: Should They Deal With Containers or Contents of Knowledge?"
Searching around, it doesn't seem like the outside world (e.g. Google Scholar, bibliographers) is aware that this piece Licklider wrote even exists, despite it being digitized and sitting here in the open.
This piece presages the Internet Archive and, given the juxtaposition of its lofty goals with the piece's own obscurity, there's a perverse irony here.
Licklider writes in section 2:
It is high time that librarians reach out into computer networks to create order and functionality out of what is now chaos. (The author would estimate that 90 per cent of the computerized information in EDUCOM universities "trickles down" to back-up or dead storage tapes within two years and that less than one per cent ever "perks up" again.)
"Perking up" is more than what I'm talking about here—again, I can't find evidence that this piece is even catalogued anywhere.
Is that failure or is that a bad zip file? The APPNOTE.TXT does not say. I think it should be explicit here and I think it's one of those unstated assumptions.
Pretty sure it's because the ZIP files were expected to be written to multiple disks (floppies), and as alluded to earlier, if you wanted to delete a file, you could just insert the last disk containing the directory and "delete" it, therefore not requiring you to insert the disk(s) containing the actual file record to null it out (or overwrite with some other file record, potentially requiring 3 disk swaps). Thus, the ZIP format constitutes something like a filesystem implemented in userspace. 30 years ago, this was "obvious" and that's why you were expected to know this. There was no assumption that tradition and path dependencies would lead to ZIP still being widespread for cross-platform data interchange among machines capable of fast writes to local disks that have terabytes (although sometimes "merely" gigabytes) available.
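The multi-disk heritage is visible right in the record layout. A sketch of locating and parsing the End of Central Directory record (field layout per APPNOTE.TXT; the function and dict key names are mine): note the "number of this disk" and "disk where the central directory starts" fields, which only make sense in the floppy scenario above.

```python
import struct

EOCD_SIG = b"PK\x05\x06"  # End of Central Directory signature

def read_eocd(data: bytes) -> dict:
    """Locate and parse the End of Central Directory record, which lives
    at the end of the archive -- i.e. on the *last* floppy."""
    # The EOCD is 22 bytes plus an optional trailing comment, so scan
    # backwards for its signature rather than assuming a fixed offset.
    pos = data.rfind(EOCD_SIG)
    if pos < 0:
        raise ValueError("not a ZIP file (no EOCD record)")
    (sig, disk_no, cd_disk, entries_here, entries_total,
     cd_size, cd_offset, comment_len) = struct.unpack_from(
        "<4sHHHHIIH", data, pos)
    assert sig == EOCD_SIG
    return {"disk_no": disk_no, "cd_disk": cd_disk,
            "entries": entries_total, "cd_size": cd_size,
            "cd_offset": cd_offset}
```

On a modern single-"disk" archive both disk fields are simply zero, i.e. vestigial.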
Lots of weird reactions to this post.
In How to Stop Endless Discussion https://candost.blog/how-to-stop-endless-discussions/, the author explains the NABC model—"The model starts with defining the Need, followed by Approach, Benefits, and lastly, Competitors. Separating the Need from the Approach is very smart."
Commenting in support, Simon Willison writes:
We implemented a RFC-style process at a previous employer. [...] One thing that was particularly valuable was ensuring every proposal came with a "alternatives" section (called "competitors" in the NABC model). We also made sure that every proposal included "do nothing" as one of the alternatives
https://news.ycombinator.com/item?id=25623388
I scarcely think that the majority of people who are behaving as if React and/or the NPM-heavy workflow are things that simply must be dealt with have ever done an honest NABC and evaluated the foregone conclusions biased towards their preferred workflow against the do-nothing alternative of just not bringing all the heavyweight NodeJS-centric tooling into the picture.
Until last week, I found this completely mystifying – I had no idea what import was doing and I didn’t understand how to use libraries in that way.
The build tools are irrelevant (or rather, a red herring).
JS has import statements. The build tools are (or at least should be) approximating what you get without the build tools interposing themselves into the development process, but doing it in a way that reinforces The Iron Law of Bureaucracy.
The difference between trying to imagine it working and having it do so is roughly like the difference between a bearskin rug and a bear.
See also: Knuth on Dijkstra in "Coders at Work".
"You have to submit out of technical necessity"
The computer, its accessories, and terminology, can give the semblance of validity to all sorts of procedures or statistics
This is the cybercrud problem: advice and creation of systems, supposedly based on technical requirements, whose categories and rigidities are unnecessary. In the worst cases they are not only unnecessary but wrong.
Still a problem.
After the conference dinner, Theodor Nelson of The Nelson Organization, Inc., described a vision of what the computer's use in instruction might become, if only we could see beyond the "trivial horizons" of most computer people
Still a problem.
Missing:
More annotations (to the HTML version): https://hyp.is/IEwiCEjEEeyw_H-lrHY4-Q/www.theatlantic.com/magazine/archive/1945/07/as-we-may-think/303881/
More annotations (to the PDF): https://hyp.is/IVIKog0bEeinX_so5P4fLA/csclassics.com/papers/Bush%20-%20As%20We%20May%20Think.pdf
Also available at http://csis.pace.edu/~marchese/CS835/Lec3/nelson.pdf (apparently with Ted's blessing; see `https://archive.org/details/HardAndFastThoughts1966`).
Missing:
Fractal Deferment of Responsibility
That's just how the software works.
You're looking for https://doi.org/10.18352/lq.8036 (https://liberquarterly.eu/article/view/10623 as of this writing).
You're looking for https://doi.org/10.18352/lq.8036 (https://liberquarterly.eu/article/view/10623 as of this writing).
Missing:
Missing:
Once we learn how to create abstractions, it is tempting to get high on that ability, and pull abstractions out of thin air whenever we see repetitive code.
DRY (Don't Repeat Yourself) < TRY (Try Repeating Yourself)
I was aghast. The old code was a mess, and mine was clean!
Engelbart has said he did not actually think about Bush's article before he started writing in 1959 about the augmentation system.
it is important to note Engelbart's further statement that, after reading the Life article in the Philippines, the article's impact gradually receded in Engelbart's memory
; http://www.e-papyrus.com/hypertext_review/chapter1.html
This is a porn site today...
As a matter of fact, Bush had written a preliminary version of an article describing the system as early as 1939; moreover, he had a sketch of it as early as 1933.
missing: BURKE, COLIN B. 1994. Information and Secrecy: Vannevar Bush, Ultra, and the Other Memex. Metuchen, NJ: Scarecrow Press; 1994. 466p. ISBN: 0-8108-2783-2.
"Over the past few years I've come to appreciate that freedom of [mental] movement is the key," he said, highlighting the nature of liquidity in putting thoughts to the page. "When you look about the freedom of your own hands moving, you have such incredible freedom of movement."
It was the introduction of the “like” button
I think the early 2000s transition to actor-based indexing from topic-based indexing should be examined in depth. I think stronger identitarianism is at the root of polarization, and actor-based indexing laid the foundation for it. Likes were the fuel.
FIRE
Foundation for Individual Rights in Education http://www.thefire.org/
A lot less nice, that example, isn't it?
No idea what point Troy thinks he's making here, but he certainly sounds very satisfied with it.
Stuff that’s already in standard Markdown works the way it should.
Except, notably, for GitHub Flavored Markdown's terrible handling of ASCII CR and LF.
GitHub’s implementation for code blocks, for example, looks like something a group of programmers would want to have; it looks more like code than text.
Readability, however, is emphasized above all else. A Markdown-formatted document should be publishable as-is, as plain text, without looking like it’s been marked up with tags or formatting instructions.
Most Markdown that gets published to GitHub is clearly written by people who have no problem grinding their muddy shoes all over this design principle.
(Vanilla Markdown's syntax for inline links doesn't do a particularly good job upholding this principle itself.)
the meta information is at the same level in the document as the content itself so that even if the document is printed and scanned back (with OCR), reader software will still be able to use it
my word processor Author could already see extra metainformation on the macOS Clipboard when text copied from the Apple Safari web browser, that the text came from a specific web address etc
the use
When I was about 10 there was nothing I wanted more than to have a program that took minutes to build into an .EXE. In my mind that implied complexity and 'real' programming.
I've long contended that this is one of the main reasons that the NodeJS ecosystem has evolved in the direction that it has and snubbed the original strengths of JS.
This query doesn't behave as expected. Consider by way of comparison to a Google search for anything -site:pinterest.com.
On finding a potential improvement, they are not slowed by the speed bump of switching from a document viewer to a document editor because they work from the editor at the start.
Alternatively: wikis. (The document editor is trivially reachable from the document viewer.)
connection
Unused. Caller can/should be passing this in so we can handle the quota check internally.
The various designs such as WebDAV's PROPFIND which use HTTP methods apart from GET to retrieve information suffer from this same problem. The information does not have a URI: it is not on the web.
Should the content of an HTTP "transcript" itself be addressable?
anything should be able to have a URI
runbook
See https://en.wikipedia.org/wiki/Runbook:
In a computer system or network, a runbook is a compilation of routine procedures and operations that the system administrator or operator carries out. System administrators in IT departments and NOCs use runbooks as a reference.
(Great sounding word, by the way.)
Unfortunately, if the location to which you wish to move the text is off-screen, you may have to use some other mechanism to find the target location
almost all tutorials assume that you already know the writer’s previous programming language, and new concepts are explained in terms of that language
I've noticed that Python programmers are particularly bad about this: believing their language to be some sort of lingua franca, unable to see its dark corners, and unaware that when a program meanders into those corners, it can't be understood on first sight.
None of this is necessary. It is merely customary.
Excellent turn of phrase.
In my method, program segments are embedded in the midst of a word processor document--like raisins in cake--so that the emphasis is on the explanation rather than the code.
In most institutions there are too many layers of "no"
Chesterton’s Fence investigation
Specifically, answering the question "Chesterton's fence or Rhesus ladder?". See https://hypothes.is/a/viXUiPZOEeu5Ty9zljmwFg
shovel-ready solution
Excellent turn of phrase that I'm only just now hearing.
But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn't realize he's looking up.
We have a word for this: "sophomoric".
Any language or system that does not allow full flowing and arbitrarily long comments is seriously behind the times. That we use escape characters to “escape” from code to comment is backwards. Ideally, comment should be the default, with a way to signal the occasional lines of code.
Sounds like literate programming.
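The inverted convention is easy to prototype. Here is a minimal sketch (assuming a Bird-style `> ` marker, borrowed from literate Haskell, as the "way to signal the occasional lines of code"; the marker choice and function name are mine, not anything from the source):

```javascript
// "Tangle" executable code out of a comment-first document:
// every line is prose by default; only lines beginning with
// "> " are code (the Bird-style literate convention).
function tangle(doc) {
  return doc
    .split("\n")
    .filter((line) => line.startsWith("> "))
    .map((line) => line.slice(2))
    .join("\n");
}

const doc = [
  "Here the emphasis is on the explanation, not the code.",
  "> const answer = 6 * 7;",
  "Only flagged lines survive tangling:",
  "> console.log(answer);",
].join("\n");

console.log(tangle(doc));
// prints:
// const answer = 6 * 7;
// console.log(answer);
```

In other words: escape characters to enter *code*, not to enter comments.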
I very much agree with you that the “American” practice of forcing punctuation inside a close quote is absurd.
if you don’t write about an idea, you’ll never have a three-dimensional perspective on it
One of several recurring claims that make me feel like an alien. Do so many people really think this? It sure does end up getting said a lot.
I'm comfortable using text to communicate, but I hardly ever write in the sense people mean when they speak of it.
At Stanford University - Doug's lab notebooks, correspondence, reports, memos, papers - available at the MouseSite Archive page, Stanford Libraries Special Collections, with links to their Annotated Table of Contents
broken links
Program On Human Effectiveness
broken link
I re-discovered your article about three years ago, and was rather startled to realize how much I had aligned my sights along the vector you had described. I wouldn't be surprised at all if the reading of this article sixteen and a half years ago hadn't had a real influence upon the course of my thoughts and actions.
This passage, and blurbs earlier in the letter, go against the common narrative that Engelbart's happening upon the article was life-altering, turning him into an instant convert to Bush's vision.
Doug's got a story and he told the story many times and you know when he tells story many times it becomes realer and realer to you
I don't know if Howard is trying to give us a nudge and a wink here, but if so, the subsequent retelling that Doug was "by his own story very influenced by Vannevar Bush's As We May Think article" is certainly relevant to what Howard is suggesting. In Doug's own letter to Bush in 1962, he claims not to have even had Bush's ideas on his radar (no pun intended) until rediscovering the Atlantic article after his own serious pursuits were already underway at SRI (and getting on at SRI was no small feat, according to Howard's telling in Tools For Thought).
the accusation of Taylorism
Taylorism gets a bum rap. Taylor's fans know it and seem to be at work on Wikipedia. See scientific management.
the institutions for the production of knowledge rewards specialization
DTIC’s public technical reports have migrated to a new cloud environment. The link you used is outdated.
Fuck off.
Full version: https://www.dougengelbart.org/content/view/266/177/
Almost any time you interpret the past as "the present, but cruder", you end up missing the point. But in the case of Engelbart, you miss the point in spectacular fashion.
See also Ted Nelson's opening line from Xanalogical Structures, Needed Now More Than Ever:
Project Xanadu, the original hypertext project, is often misunderstood as an attempt to create the World Wide Web.
you couldn't get an undergraduate degree in computing back then it was so good you had to learn something
a way to transparently discover related blogs that avoids hidden algorithms
Bristly pop-cultural misuse of the term "algorithm" notwithstanding, here's a better solution that I haven't seen anyone else mention: make them explicit, not hidden.
When it looked like Dat might've had enough steam to take off (ca. 2017–2018), I wrote a draft straw proposal for how to solve the "discovery problem" with e.g. Fritter—i.e. the problem that because you only "receive" replies and other messages by checking the feeds of the people you're following, you'll be unable to engage with strangers who appear (or have a stranger engage with you in their thread) unless something happens like a mutual acquaintance alerting you out of band ("hey, did you see @foo's reply dat://foo.example.net/posts/ukifxdgbh.json?").
The idea is that there is a special kind of feed operated by some service provider that specializes in doing exactly that. If you find Facebook valuable, for example, then you are free to opt in and subscribe to the Facebook analog that pays attention to all feeds and works to surface interesting content for you. Under this model, unlike the Facebook regime, "leaving" is as simple as unsubscribing to that feed (and going with a different provider if you want).
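The mechanics reduce to ordinary feed merging. A sketch of the model, with hypothetical feed shapes of my own invention (the proposal itself specified no schema): your timeline is just the union of the feeds you follow plus any opt-in discovery feeds, so "leaving" a provider means deleting one entry from a list.

```javascript
// Merge followed feeds with zero or more opt-in "discovery" feeds.
// A discovery provider (the hypothetical Facebook analog) is just
// another feed; unsubscribing is removing it from the second list.
function timeline(followedFeeds, discoveryFeeds = []) {
  const seen = new Set();
  return [...followedFeeds, ...discoveryFeeds]
    .flatMap((feed) => feed.posts)
    .filter((post) => !seen.has(post.url) && seen.add(post.url))
    .sort((a, b) => b.date.localeCompare(a.date)); // newest first
}

const alice = { url: "dat://alice.example/", posts: [
  { url: "dat://alice.example/posts/1.json", date: "2018-03-01" },
]};
const provider = { url: "dat://surface.example/", posts: [
  // a stranger's reply you'd otherwise never see
  { url: "dat://foo.example.net/posts/ukifxdgbh.json", date: "2018-03-02" },
]};

console.log(timeline([alice], [provider]).map((p) => p.url));
```

The dedupe step matters: a discovery feed will usually overlap with feeds you already follow, and double-surfacing would defeat the point.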
Dear reader: you want https://www.rfc-editor.org/rfc/rfc1630.html.
Around @0:25:52
Krouse: Another subset of "shit just works" would be—
Leung:"No installation required"?
Krouse: Yeah. "No installation required". [...] as I was just telling you, I spent the last, like... I spent 5 hours over the last two days installing... trying to install software to get something to run. And it's just ridiculous to have to spend hours and hours. If you want to get Xcode to run, it takes— first of all you need a Mac, which is crazy, and then second of all it takes, depending on your internet connection, it could take you a whole day just to get up and running. Why isn't it
xcode.com/create?
overly-educated
Is "overly-educated" the right thing here? As a substitute for the traditional euphemism "well-heeled", I think something like "highly credentialed" fits better.
When logged in, the page in question requests (using XHR) the URL https://www.pbs.org/watchlist/page/1/, which returns something with a shape that looks like this:
{"currentPage": 1, "videos": [{"slug": "american-experience-abolitionists-promo", "title": "The Abolitionists", "title_sortable": "The Abolitionists", "url": "/video/american-experience-abolitionists-promo/", "duration": "30s", "description_short": "Premiering January 8 2013. Turning a fringe movement into a force that changed the nation.", "description_long": "The story of how abolitionist allies William Lloyd Garrison, Frederick Douglass, Harriet Beecher Stowe, John Brown and Angelina Grimke turned a despised fringe movement against chattel slavery into a force that literally changed the nation.", "video_type": "Preview", "encore_date": "2013-01-08T00:00:00-05:00", "expire_date": "", "availability": "available", "images": {"asset-mezzanine-16x9": "https://image.pbs.org/video-assets/pbs/american-experience/42958/images/Mezzanine_222.jpg"}, "flags": {"is_new": false, "is_mvod": false, "has_captions": false, "is_expiring_soon": false, "is_fully_watched": false}, "item_type": "video", "parent_type": "episode", "show": {"slug": "american-experience", "title": "American Experience", "season": 25, "episode": 12, "seasons_count": 31, "display_episode_number": true, "url": "/show/american-experience/"}, "summary": "Preview: S25 Ep12 | 30s", "ancestor_title": "American Experience", "ancestor_slug": "american-experience", "ancestor_type": "show", "image": "https://image.pbs.org/video-assets/pbs/american-experience/42958/images/Mezzanine_222.jpg", "legacy_tp_media_id": 2274405136, "cid": "2af4a426-df86-4e2e-ae84-858b737fbd5c", "air_date": "2013-01-08T00:00:00-05:00", "percent_complete": 0, "description": "Premiering January 8 2013. 
Turning a fringe movement into a force that changed the nation.", "timecode": ""}, {"slug": "american-experience-rise-and-fall-penn-station-preview", "title": "The Rise and Fall of Penn Station Preview", "title_sortable": "Rise and Fall of Penn Station Preview", "url": "/video/american-experience-rise-and-fall-penn-station-preview/", "duration": "30s", "description_short": "The engineering feat would last, the architectural masterpiece did not. Premiering Feb 18.", "description_long": "The Pennsylvania Railroad built tunnels under New York City's Hudson and East Rivers, connecting New York with New England, and terminating in one of the greatest architectural achievements of its time, Pennsylvania Station. It covered nearly eight acres, extended two city blocks, and housed one of the largest public spaces in the world. But 53 years later, the monumental building was destroyed.", "video_type": "Preview", "encore_date": "2014-02-18T00:00:00-05:00", "expire_date": "", "availability": "available", "images": {"asset-mezzanine-16x9": "https://image.pbs.org/video-assets/pbs/american-experience/113257/images/mezzanine_388.jpeg"}, "flags": {"is_new": false, "is_mvod": false, "has_captions": false, "is_expiring_soon": false, "is_fully_watched": false}, "item_type": "video", "parent_type": "episode", "show": {"slug": "american-experience", "title": "American Experience", "season": 26, "episode": 5, "seasons_count": 31, "display_episode_number": true, "url": "/show/american-experience/"}, "summary": "Preview: S26 Ep5 | 30s", "ancestor_title": "American Experience", "ancestor_slug": "american-experience", "ancestor_type": "show", "image": "https://image.pbs.org/video-assets/pbs/american-experience/113257/images/mezzanine_388.jpeg", "legacy_tp_media_id": 2365164811, "cid": "5052e26b-2fbd-4b63-af12-711a1b8d28ea", "air_date": "2014-02-18T00:00:00-05:00", "percent_complete": 0, "description": "The engineering feat would last, the architectural masterpiece did not. 
Premiering Feb 18.", "timecode": ""}, {"slug": "american-experience-forgotten-plague-preview", "title": "The Forgotten Plague Preview ", "title_sortable": "The Forgotten Plague Preview ", "url": "/video/american-experience-forgotten-plague-preview/", "duration": "30s", "description_short": "The battle against TB had a lasting impact on America. Premieres February 10 on PBS. ", "description_long": "By the dawn of the 19th century tuberculosis had killed one in seven of all the people who had ever lived. The battle against the deadly bacteria had a profound and lasting impact on the US, shaping medical and scientific pursuits, social habits, economic development, western expansion, and government policy. Premieres February 10 on PBS American Experience.", "video_type": "Preview", "encore_date": "2015-02-10T00:00:00-05:00", "expire_date": "", "availability": "available", "images": {"asset-mezzanine-16x9": "https://image.pbs.org/video-assets/pbs/american-experience/158684/images/mezzanine_700.jpg"}, "flags": {"is_new": false, "is_mvod": false, "has_captions": false, "is_expiring_soon": false, "is_fully_watched": false}, "item_type": "video", "parent_type": "episode", "show": {"slug": "american-experience", "title": "American Experience", "season": 27, "episode": 5, "seasons_count": 31, "display_episode_number": true, "url": "/show/american-experience/"}, "summary": "Preview: S27 Ep5 | 30s", "ancestor_title": "American Experience", "ancestor_slug": "american-experience", "ancestor_type": "show", "image": "https://image.pbs.org/video-assets/pbs/american-experience/158684/images/mezzanine_700.jpg", "legacy_tp_media_id": 2365409222, "cid": "452b093e-8d4e-48be-8a6e-211f837e2160", "air_date": "2015-02-10T00:00:00-05:00", "percent_complete": 0, "description": "The battle against TB had a lasting impact on America. Premieres February 10 on PBS. 
", "timecode": ""}, {"slug": "american-experience-blackout-preview", "title": "Blackout Preview", "title_sortable": "Blackout Preview", "url": "/video/american-experience-blackout-preview/", "duration": "30s", "description_short": "What happened when the lights went out in New York in 1977? Premieres 7/14 at 9/8c on PBS.", "description_long": "On the night of July 13, 1977, lightning strikes took out several critical power lines, causing a catastrophic power failure and plunging some 8 million people into darkness in the New York City area. First responders, journalists, shop owners, Con Edison employees, and other New Yorkers tell about what happened when the lights went out. Premieres 7/14 at 9/8c on PBS American Experience.", "video_type": "Preview", "encore_date": "2015-07-14T00:00:00-04:00", "expire_date": "", "availability": "available", "images": {"asset-mezzanine-16x9": "https://image.pbs.org/video-assets/pbs/american-experience/177488/images/mezzanine_254.jpg"}, "flags": {"is_new": false, "is_mvod": false, "has_captions": false, "is_expiring_soon": false, "is_fully_watched": false}, "item_type": "video", "parent_type": "episode", "show": {"slug": "american-experience", "title": "American Experience", "season": 27, "episode": 7, "seasons_count": 31, "display_episode_number": true, "url": "/show/american-experience/"}, "summary": "Preview: S27 Ep7 | 30s", "ancestor_title": "American Experience", "ancestor_slug": "american-experience", "ancestor_type": "show", "image": "https://image.pbs.org/video-assets/pbs/american-experience/177488/images/mezzanine_254.jpg", "legacy_tp_media_id": 2365511610, "cid": "669efdf1-55bf-47b2-96f8-c21b05627a9a", "air_date": "2015-07-14T00:00:00-04:00", "percent_complete": 0, "description": "What happened when the lights went out in New York in 1977? 
Premieres 7/14 at 9/8c on PBS.", "timecode": ""}, {"slug": "american-experience-edison-preview", "title": "Edison Preview", "title_sortable": "Edison Preview", "url": "/video/american-experience-edison-preview/", "duration": "30s", "description_short": "The story of the Father of Invention. Premieres January 27, 2015 on PBS. ", "description_long": "Premiering January 27, 2015 on PBS American Experience. Thomas Edison achieved glory as the genius behind such revolutionary inventions as sound recording, motion pictures, and electric light. The holder of more patents than any other inventor in history, \"The Wizard of Menlo Park\" was also intensely competitive and was often neglectful in his private life.", "video_type": "Preview", "encore_date": "2015-01-27T00:00:00-05:00", "expire_date": "", "availability": "available", "images": {"asset-mezzanine-16x9": "https://image.pbs.org/video-assets/pbs/american-experience/149280/images/mezzanine_664.jpg"}, "flags": {"is_new": false, "is_mvod": false, "has_captions": false, "is_expiring_soon": false, "is_fully_watched": false}, "item_type": "video", "parent_type": "episode", "show": {"slug": "american-experience", "title": "American Experience", "season": 27, "episode": 3, "seasons_count": 31, "display_episode_number": true, "url": "/show/american-experience/"}, "summary": "Preview: S27 Ep3 | 30s", "ancestor_title": "American Experience", "ancestor_slug": "american-experience", "ancestor_type": "show", "image": "https://image.pbs.org/video-assets/pbs/american-experience/149280/images/mezzanine_664.jpg", "legacy_tp_media_id": 2365357999, "cid": "4a4ebbfd-ca64-4217-97c9-3c7f33b8f075", "air_date": "2015-01-27T00:00:00-05:00", "percent_complete": 0, "description": "The story of the Father of Invention. Premieres January 27, 2015 on PBS. 
", "timecode": ""}, {"slug": "american-experience-mine-wars-20jkwo", "title": "American Experience: The Mine Wars", "title_sortable": "American Experience: The Mine Wars", "url": "/video/american-experience-mine-wars-20jkwo/", "duration": "30s", "description_short": "Monday, September 9 at 8PM", "description_long": "This show tells the overlooked story of the miners in the mountains of southern West Virginia \u2013 native mountaineers, African American migrants, and European immigrants \u2013 who came together in a protracted struggle for their rights.", "video_type": "Preview", "encore_date": "2019-09-09T00:00:00-04:00", "expire_date": "", "availability": "available", "images": {"asset-mezzanine-16x9": "https://image.pbs.org/video-assets/DY9f0vQ-asset-mezzanine-16x9-2fR2YaZ.png"}, "flags": {"is_new": false, "is_mvod": false, "has_captions": false, "is_expiring_soon": false, "is_fully_watched": false}, "item_type": "video", "parent_type": "special", "show": {"slug": "wxel-presents", "title": "WXEL Presents", "season": null, "episode": "", "seasons_count": 0, "display_episode_number": true, "url": "/show/wxel-presents/"}, "summary": "Preview: Special | 30s", "ancestor_title": "WXEL Presents", "ancestor_slug": "wxel-presents", "ancestor_type": "show", "image": "https://image.pbs.org/video-assets/DY9f0vQ-asset-mezzanine-16x9-2fR2YaZ.png", "legacy_tp_media_id": 3032488536, "cid": "588ffe4d-b626-43c1-8cae-61555fa89b4a", "air_date": "2019-09-09T00:00:00-04:00", "percent_complete": 0, "description": "Monday, September 9 at 8PM", "timecode": ""}, {"slug": "part-1-our-game-c6e5nn", "title": "Our Game", "title_sortable": "Our Game", "url": "/video/part-1-our-game-c6e5nn/", "duration": "1h 54m 53s", "description_short": "Inning One: Our Game looks at the origins of baseball in the 1840s and up to 1900.", "description_long": "In New York City, in the 1840s, people need a diversion from the \"railroad pace\" at which they work and live. 
They find it in a game of questionable origins. Inning One, Our Game, looks at the origins of baseball in the 1840s and takes the story up to 1900. Burns refutes the myth that Abner Doubleday invented baseball in Cooperstown and traces its roots instead to the earliest days of the nation.", "video_type": "Episode", "encore_date": "2020-03-13T00:00:00-04:00", "expire_date": "12/31/23", "availability": "available", "images": {"asset-mezzanine-16x9": "https://image.pbs.org/video-assets/ymkNEJ4-asset-mezzanine-16x9-IgUM0b6.jpg"}, "flags": {"is_new": false, "is_mvod": true, "has_captions": true, "is_expiring_soon": false, "is_fully_watched": false}, "item_type": "video", "parent_type": "episode", "show": {"slug": "baseball", "title": "Baseball", "season": 1, "episode": 1, "seasons_count": 1, "display_episode_number": true, "url": "/show/baseball/"}, "summary": "Ep1 | 1h 54m 53s", "franchise": {"title": "Ken Burns", "slug": "ken-burns", "logo": null, "logo_cropped": null, "logo_cropped_white": null, "image": null}, "ancestor_title": "Baseball", "ancestor_slug": "baseball", "ancestor_type": "show", "image": "https://image.pbs.org/video-assets/ymkNEJ4-asset-mezzanine-16x9-IgUM0b6.jpg", "legacy_tp_media_id": 3040088774, "cid": "0f1a8715-8b7f-4f07-9093-6eca06a57cfd", "air_date": "2020-03-13T00:00:00-04:00", "percent_complete": 0, "description": "Inning One: Our Game looks at the origins of baseball in the 1840s and up to 1900.", "timecode": ""}, {"slug": "growing-up-poor-in-america-z4g8k5", "title": "Growing Up Poor in America", "title_sortable": "Growing Up Poor in America", "url": "/video/growing-up-poor-in-america-z4g8k5/", "duration": "54m 22s", "description_short": "The experience of childhood poverty against the backdrop of the pandemic.", "description_long": "The experience of childhood poverty against the backdrop of a pandemic and a national reckoning with racism. 
Set in Ohio, the film follows children and their families navigating issues of poverty, homelessness, race and new challenges due to COVID-19.", "video_type": "Episode", "encore_date": "2020-09-08T00:00:00-04:00", "expire_date": "", "availability": "available", "images": {"asset-mezzanine-16x9": "https://image.pbs.org/video-assets/CvLWNfw-asset-mezzanine-16x9-nxEEDak.jpg"}, "flags": {"is_new": false, "is_mvod": false, "has_captions": true, "is_expiring_soon": false, "is_fully_watched": false}, "item_type": "video", "parent_type": "episode", "show": {"slug": "frontline", "title": "FRONTLINE", "season": 2020, "episode": 18, "seasons_count": 29, "display_episode_number": true, "url": "/show/frontline/"}, "summary": "S2020 Ep18 | 54m 22s", "ancestor_title": "FRONTLINE", "ancestor_slug": "frontline", "ancestor_type": "show", "image": "https://image.pbs.org/video-assets/CvLWNfw-asset-mezzanine-16x9-nxEEDak.jpg", "legacy_tp_media_id": 3046413469, "cid": "b8a389ef-97bb-412a-879c-0b4efbfd5380", "air_date": "2020-09-08T00:00:00-04:00", "percent_complete": 0, "description": "The experience of childhood poverty against the backdrop of the pandemic.", "timecode": ""}]}
There are four JS resources referenced:
The watchlist script is minified, but the production build has a sourceMappingURL comment pointing to watchlist.js.map. I have taken extra steps to make sure that the Wayback Machine has archived this as well. The Firefox devtools are able to use the map to reproduce the original (pre-bundled, unminified) sources for inspection.
First Archive
Lots of stuff to dig through here. (35 items that the Wayback Machine is describing with class iconochive-First.)
How OpenVSCode Server turns VS Code into a web IDE
This news item was submitted only 17 days ago, and yet it's already returning a 404. This is a casualty of the "our code host's presentation of our repo is our website" approach.
As of this writing (i.e. commit fb662ab0), the working link is https://github.com/gitpod-io/openvscode-server/blob/docs/sourcedive.snb.md.
Lots of good examples about why not: https://news.ycombinator.com/item?id=28867878
Good example why it's a good idea in technical discussions to disallow responses trying to refine a proposed analogy. See https://www.colbyrussell.com/2020/12/07/new-rule-for-technical-discussions.html
Ungar, around @1:00:00:
I try to explain to people that the notion of compiler is broken. Of course I learned this from Smalltalk, but what we want to build is experiences--artificial realities that convince you that your source code is real. It's directly executed. There's no lag between editing and running[...] The environment stresses things in your program, not tools--which is another rant I have. It's this whole idea that we want to put you in an artificial reality--I got that from Randy [Smith]--in which it's easy and natural and low-cognitive-burden to get the computer to do what you want it to do, rather than running language translators that turn weird strings of text into bits the machine can run
I no longer know how it works. I don't care to maintain it. It needs big changes to handle something like embedding a Jupyter notebook. And it depends on Python 2.6(!). With hundreds of pages, and its own custom URL layout that I don't want to break, I dread migrating
lense
lense
Lense
Should be "lens".
you data
Should be "your data".
if you have an older version of node/npm I can't guarantee, that the following scripts will work correctly
Make your projects' metatooling accessible
This should be full black.
have are meant
it's a rare project which carries those notifications within the source files themselves
That doesn't affect whether there is merit to the claim.
if you're going to try to claim he's somehow "privileged"
Good case study for the theory of affordances.
More directly about this gilded path through the tool. So, in an initial version of Sketch-n-Sketch and the version that we demoed at Strange Loop, there were many requirements about the syntactic structure of the program that if they weren’t satisfied certain interactions in the output would no longer be available. So a simple example is in that initial milestone, the main expression, the main definition of the program essentially had to be a list literal of shapes. And each of those shapes had to be a top level definition in your program and only then could certain interactions be available to users.
We would prefer: stay within single host language, but make code look as declarative as possible.
Kartik Agaram's essay on "habitability"
Kartik points out that "http://akkartik.name/post/habitability is not written by me. It's a long quote from an essay by Richard Gabriel."
I, as your employer, don't actually know what you will or won't do
This can be totally true, and the problem can still lie completely with you.
I suspect dealing with this type of person would become a huge issue. This work request hints at overblown entitlement.
Working for $5 an hour with the only condition being that you demonstrate some diligence by making sure that the work you're asking for is actually meaningful? This is "entitlement"? Yeah, okay. Look into a reality check, shithead, and in the meantime, go fuck yourself.
Private links One must be able to add one's own private links to and from public information. One must also be able to annotate links, as well as nodes, privately.
The reason is CORS.
A user on HN asks
Two things to observe: people are willing to keep tabs open (esp. for dashboards), and the unreliability of Twitter's website tells us that users will put up with a lot of brokenness.
Here's a thought: use this polling strategy combined with a dashboard combined with a "FCSCORS" (forced client-side CORS). The idea is to embed an iframe from an origin associated with the site to poll, and then use S4/postMessage to communicate with it.
From a neutral, bookmarklet-controlled page (e.g. with about:blank's origin), embed an iframe pointing to some page with the same origin. Force the iframe to open a secondary window (tab) with window.open. A window.opener link now exists between the iframe and the secondary tab. Set the iframe's location to a page from the desired origin, and then do the same with the secondary tab. A second invocation of the bookmarklet in the secondary tab should be able to use window.opener to get unrestricted access to the iframe-loaded document. Use this to install a message handler. The secondary tab can now be destroyed, and the embedding document (dashboard) can merrily communicate with the desired origin, with no cooperation necessary (e.g. to enable CORS) by the site operator, and no need for the user to install an add-on.
The question now is, "How common is it for a website operator to set x-frame-options: deny?"
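The dance above, sketched as comments plus a bookmarklet-building helper. This is a thought experiment, not a tested technique: the opener/postMessage steps only run in a browser, the function names are mine, and browsers' COOP and popup-blocking policies may well defeat it.

```javascript
// "Forced client-side CORS" sketch:
// 1. From a neutral about:blank page, embed an iframe on a same-origin page.
// 2. The iframe calls window.open() -> an opener link now exists.
// 3. Navigate both the iframe and the secondary tab to the target origin.
// 4. Re-run the bookmarklet in the secondary tab: window.opener still
//    points at the iframe's window, and they're now same-origin, so it
//    can reach in and install a message handler.
// 5. Close the tab; the dashboard talks to the iframe via postMessage.

// Wrap a function as a javascript: URL suitable for a bookmark.
function makeBookmarklet(fn) {
  return "javascript:(" + encodeURIComponent(fn.toString()) + ")()";
}

// Step 4, as it would run inside the secondary tab (browser only):
const installHandler = function () {
  window.opener.addEventListener("message", (e) =>
    e.source.postMessage(document.title, e.origin));
};

console.log(makeBookmarklet(installHandler).slice(0, 11)); // "javascript:"
```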
Hover over your username at the top left
Discoverability/accessibility issue, as noted in https://discuss.write.as/t/how-do-i-back-up-my-site/3296/5
Would be cool if you could export it as a static site
Dump out a ZIP with a viewer similar to the Google Takeout archive browser.
You have 3 posts that aren't synced to your account yet. Sync them now.
What does this mean?
There's nothing really here. We should say so.
custom java
More than a few people have made this error. Call it "JS".
I’ll put that on our roadmap.
Where?
The JSON / “prettified” JSON is mostly meant for people who need their posts in a format that’s more easily read by computers. If you’re just creating a backup, the normal “JSON” format should be fine for this.
Polish the UI on this to eliminate the confusion. Communicate that the JSON option is for people who know what JSON is. If you don't know what JSON is, it's not the option you want.
Something related to this would actually work as a good springboard for the convolutions involved with the New Post Emails Not Being Sent to Email Subscribers problem.
Direct manipulation via manual overrides and explicit feedback, etc.
I know I would love the ability to add plausible analytics. They’re a great ethical choice for site analytics.
Looks like it's AGPL, so it would not be possible to use on write.as...
I’ll open a bug report for this
Where?
this is an issue with our Markdown parser
Let's rev it. Let's port Bob Nystrom's Dart package.
pasting in the Markdown directly won’t work – instead, you’ll want to use the “Insert Image” feature
This is an opportunity to delight the user by intercepting their "bad" behavior and steering them in the right direction.