- May 2022
-
theconvivialsociety.substack.com
-
There seem to be two distinct concerns for Illich. The first is that we lose the commons of which silence is an integral part and thus a measure of freedom and agency. The second, concurrent with the first, is that you and I may find it increasingly hard to be heard even as we are given more and more tools with which to speak. Alternatively, we might also distinguish between silence as a space of possibility and silence as itself a good to be defended, something we need for its own sake. Illich is reminding us yet again that what we may need is not more of something, in this case words, but less.
-
-
theconvivialsociety.substack.com
-
We Fray Into the Future
-
Laughter In Dark Times
-
-
theconvivialsociety.substack.com
-
So why spill all these words, then? I suppose it is because I want to read Chalmers against the grain, which is to say as the unwitting emissary of a fantastical cautionary tale, which invites us not to fear some uncertain future but to reflect more critically on existing trends and patterns. For example, I wonder for how many of us the experience of the world is already so attenuated or impoverished that we might be tempted to believe that a virtual simulation could prove richer and more enticing? And how many of us already live as if this were in fact the case? How much of my time do I already devote to digitally mediated images and experiences? How often am I lured away from the world before my eyes by the one present through the screen?
-
the race to take flight from the earth as the material habitat of the human being now takes at least two forms: a literal flight from the planet and a virtual flight into digital realms.
-
But even if we grant, for argument’s sake, that the experience of virtual reality will be indistinguishable from the experience of non-virtual reality and that it may even be somehow richer and more pleasant, as with certain dreams, it will still be the case that the experience will cease. The VR equipment will come off, you will disconnect from the virtual world and come back to the non-virtual world. Unless, of course, you posit an even more disturbing Matrix-like scenario in which human beings can choose to remain hooked up to virtual worlds indefinitely with their biological needs somehow serviced artificially. And this I would offer as the reductio ad absurdum of Chalmers’ argument. I don’t see what grounds he would have to object to such a development given his premises.
-
The good life is envisioned merely as artificially triggered neuronal activity.
-
Interestingly, and suggestively with virtual reality in mind, I think most of us can attest to the fact that a dream can sometimes linger in our imagination, for good or for ill. It may haunt us with its sweetness or with its strangeness. We may wake relieved or disappointed, and through the day we remember it with longing that wrecks our heart or an uneasiness that disquiets our mind. So, while a dream is a different sort of reality from that which I experience outside of the dream state, it can nonetheless permeate the waking world.
-
In other words, I’m suggesting that we take Chalmers’s dubious claims about the future of VR as an unintentional critique of contemporary society.2 We should take his argument, then, not quite at face value, but rather as a symptom of present disorders that we ought to diagnose. Now allow me to proceed in a rather roundabout way.
-
-
theconvivialsociety.substack.com
-
Malik went on to write that “it is time to add an emotional and moral dimension to products,” by which he seems to have meant that tech companies should use data responsibly and make their terms of service more transparent. In my response at the time, I took the opportunity to suggest that we needn’t add an emotional and moral dimension to tech; it was already there. The only question was as to its nature. As Langdon Winner had famously inquired “Do artifacts have politics?” and answered in the affirmative, I likewise argued that artifacts have ethics. I then went on to produce a set of 41 questions that I drafted with a view to helping us draw out the moral or ethical implications of our tools. The post proved popular at the time and I received a few notes from developers and programmers who had found the questions useful enough to print out and post in their workspaces.
-
-
-
In the middle of the turbulent 1930s, with Nazism, Fascism, and Stalinism flourishing, T. S. Eliot wrote of men who dreamed “of systems so perfect that no one will need to be good.”11 The ideology of technology tempts us in a similar manner. In the end we always find that such dreams yield nightmares that are all too real. If America’s ongoing experiment in democracy and economic freedom is to endure, we will need to think again about cultivating the necessary habits of the heart and resisting the allure of the ideology of technology.
-
Progress came to be understood as the advance of technology for technology’s sake.
-
Beyond this, however, there is also the matter of habits and assumptions and how these in turn shape individuals who together comprise the political and economic culture of the nation. What sorts of habits, then, are inculcated by a technological environment ordered around this general tendency? Certainly not the kind of habits that sit well with the venerable notion of delayed gratification. Nor, it would seem, would these habits leave one well suited for the demands of citizenship. The framers of our political order knew that its success would hang on what they understood as classical republican virtues—thrift, hard work, a measure of austerity, moderation, self-sufficiency, and self-government. It is easy to imagine how different our current political and economic circumstance might be if these virtues were in greater supply—perhaps too easy and facile as well.
-
To be precise, Tocqueville titled the tenth chapter of volume two, “Why The Americans Are More Addicted To Practical Than To Theoretical Science.” In Tocqueville’s day, the word technology did not yet carry the expansive and inclusive sense it does today. Instead, quaint sounding phrases like “the mechanical arts,” “the useful arts,” or sometimes merely “invention” did together the semantic work that we assign to the single word technology.1 “Practical science” was one more such phrase available to writers, and, as in Tocqueville’s case, “practical science” was often opposed to “theoretical science.” The two phrases captured the distinction we have in mind when we speak separately of science and technology.
-
-
www.thenewatlantis.com
-
It is possible that the Promethean aspirations that characterized the modern self and modern society may now yield to a more sober assessment of the limits within which genuine human flourishing might occur. It is possible, too, that we may learn once again the necessity of virtues, public and private — that we will no longer, as T. S. Eliot put it, be “dreaming of systems so perfect that no one will need to be good.”
-
The machine-like behavior of people chained to electronics constitutes a degradation of their well-being and of their dignity which, for most people in the long run, becomes intolerable. Observations of the sickening effect of programmed environments show that people in them become indolent, impotent, narcissistic and apolitical. The political process breaks down, because people cease to be able to govern themselves; they demand to be managed.
-
Consider, too, how we present ourselves online. As sociologist Erving Goffman observed in The Presentation of Self in Everyday Life (1959), we have always managed our impressions in keeping with the nature of our social settings. An attentive observer who followed me around from one such setting to another might be able to identify these often subtle modulations of my self-presentation, modulations to which I myself might have become oblivious. But now that social life has been digitized, I become keenly aware of myself engaging in the work of impression management, and I know, or at least I suspect, that everyone else is involved in the same activity. As a result, we experience the digital self as an artificial construct or, worse, as a self-interested manipulation of social relations.
-
Just as what is uncovered by film was unknown but always there, so, too, does digital media reveal aspects of our social experience that were always there but not always perceived. But what digital media reveals about the character and quality of social life, it also transforms.
-
in the age of digital reproducibility, the self no longer appears unique; the romantic idea of some ineffable essence that is me loses its power. The self that is rendered computable, and thus legible to the tools of computation, is an impoverished self, one whose aura has dissipated.
-
But although digital media appears to sustain memory, it is more like oral communication in its evanescence. The feed of our tweets and status updates recedes not as quickly and decisively as the spoken word, but with a similar effect. Under the guise of pervasive documentation, the architecture of digital platforms sanctions forgetting, while preoccupying us with instantaneity. It is not currently this era or this year, but rather this day or even this hour. To live on social media is to be sucked into a hyper-extended present, upon which the past only occasionally impinges.
-
We have ever more access to the past, but we are unable to bring it meaningfully to bear on the present.
-
Not only can written knowledge outlive a particular individual, it can outlive a whole culture. By outsourcing the task of remembrance to written texts, literate societies are relieved of the conservative and traditionalist pressures of orality. Thus, whereas premodern societies tended to look back to a distant and glorious past, many modern societies, with their imaginations liberated, were utopian in their expectations, assuming the best was still to come.
-
Language itself, especially in what we would think of as its more poetic manifestations, was focused on mnemonic efficiency: proverbs, parables, vivid images, rhythmic utterances, rituals, formulaic expressions, repetitions — much of what became unpalatable to the literate sensibility.
-
The triumph of shared time and the demise of shared place in the Digital City changes the experience of social belonging. While the modern state is not going anywhere anytime soon, the relationship of citizens to the nation is evolving. Loyalty to the community that is the nation state, already detached to some degree from local communities, yields to the shifting loyalties of digital attachments.
-
It is not only our experience of the present that digital media refashions but also our relationship, through memory, to the past. “What anthropologists distinguish as ‘cultures,’” Ivan Illich wrote, “the historian of mental spaces might distinguish as different ‘memories.’ The way to recall, to remember, has a history which is, to some degree, distinct from the history of the substance that is remembered.” We are, at least in part, what we remember, both as individuals and as a society, and what we remember is a function of how we remember, of our tools for remembering — texts, images, monuments, social media feeds. A change in tools is also a change of the self and its relation to society. There are few more important effects of digital technologies than their propensity to reorder how and what we remember.
-
The modern Analog City, particularly its print-based ecosystem of knowledge and the growing success of the techno-scientific project of mastery over nature, engendered the ideals of robust, confident, self-sufficient individualism. The Digital City disabuses its citizens of such notions. They know they are dependent and vulnerable, enmeshed in systems beyond their capacity to master.
-
The buffered self is sealed off from the world; its boundaries are less fuzzy; meaning resides neatly within its own mind; and it occupies a world of inert matter. It is autonomous and self-possessed, the ideal type of the modern individual.
-
in the enchanted world things and spirits have the “power of exogenously inducing or imposing meaning,” a meaning that is independent of the perceiver and that we may be forced to reckon with whether we would like to or not. Objects in the enchanted world can also have a causal power. These “charged” objects, Taylor explains, “have what we usually call ‘magic’ powers,” and they can be either benevolent or malevolent. They may bring blessing or trouble, cure or disease, rescue or danger.
-
Nothing external to the human mind bears any meaning in itself.
-
In his account of the nature of secular society, Charles Taylor argues that an important part of the emergence of the modern age was the disenchantment of the world and the rise of what he describes as the “buffered self.” Unlike the old “porous self,” the new buffered self no longer perceives and believes in sources of meaning outside the human mind. This new self feels unperturbed by powers beyond its control. We might say that in the Digital City the self becomes in some ways “porous” once again. It is subject to powers that we perceive as impinging on us, powers now algorithmic rather than spiritual.
-
In the Digital City the word is reanimated, recovering from the written much of the vitality of the spoken word. Digital media reintegrates the word into a dynamic situation. The digital audience is not always visible, but it can be present with a degree of immediacy that is more like a face-to-face encounter than are print writing and reading. Discourse on digital media platforms, from comment boxes to social media, is infamously combative. Words are active, and any negative effects are not easily contained.
-
But free-speech maximalism flourishes in print culture; in the Digital City it appears less desirable, for two reasons. First, print culture sustained the belief that, given a modicum of good sense and education among people, truth would triumph in the marketplace of ideas. Writing and reading are slow and deliberate, encouraging the belief that false ideas will eventually be rejected by anyone trained to think. Second, we experience the written word as an inert reality — it is the “dead letter,” it has lost the force and immediacy of the spoken word. Because writing is less volatile than speech, it makes freedom of expression seem relatively harmless.
-
Online venues, whether social media platforms, messaging apps, or forums, are not simply places we go to express our political opinions; they are places where our political habits and sensibilities are formed.
-
But let us think more about the last of these three alterations — about how new technologies alter the nature of community. The human self, as philosophers have long noted, emerges in relation to others, or to the Other, if you like. The character of the self develops under the gaze of this Other, and is shaped by it. In the Digital City, we are under the gaze of an algorithmically constituted, collective Other. This audience, composed of friends, strangers, and non-human actors, is unlike anything we might have encountered in the Analog City. Like the gaze of God, it is a ubiquitous face looking down upon us, whose smile we dearly desire. We seek its approval, or, failing that, at least its notice, and we subtly bend our self-presentation to fit our expectations of what this audience desires of us.
-
Postman’s second alteration — in the things we think with — can be glossed as follows: It is one thing to think with a pamphlet, another to think with a newspaper, yet another to think with a televisual image, and still another to think with a meme. Writing encourages a heightened precision of expression and a sequential and systematic ordering of ideas and arguments. The television image does both more and less than what writing can accomplish, operating at a different emotional register. It can communicate wordlessly, directly to the heart, as it were, but it cannot easily support nor does it encourage sustained argumentation.
-
The first alteration — in the things we think about — can be described in terms of attention, which has so vexed our public discourse about technology in the last few years. (Although as early as 2008, Nicholas Carr had already raised lonely alarms and even warned against the dangers of micro-targeted advertising.) The structure of a medium of communication guides and directs our attention. Television, for example, directs our attention to the physical characteristics of a political figure in a way that print and radio don’t. The telegraph made it possible to bring far away events to broad public attention on a daily basis, thereby altering the topics and pace of political discourse. Digital media now makes it possible to attend to an even wider array of phenomena, near and far, and often in so-called “real time.” And, insofar as our social media feeds are shaped by opaque algorithmic processes and engineered to maximize engagement, they play an especially obvious role in altering what we think about.
-
New technologies alter the structure of our interests: the things we think about. They alter the character of our symbols: the things we think with. And they alter the nature of community: the arena in which thoughts develop.
-
The invention of writing, especially the invention of the phonetic alphabet, “restructured consciousness,” in Walter Ong’s memorable phrase, and so restructured societies as well. It did so, in part, by making new forms of expression, organization, and remembering possible. Writing, for instance, made it possible to detach the act of communication from the context of the face-to-face encounter, thereby tempering what Ong called the “agonistic” tendencies of such encounters — their resemblance to combat — particularly in the political realm. Writing also “separates ‘administration’ — civil, religious, commercial, and other — from other types of social activities.” And it externalized thought and memory, engendering the novel experience of the mediated self, marked by heightened self-awareness.
-
The anodyne insistence on fact-checking to bridge chasms in worldview misunderstands the nature of our new media environment; it fails to see the difference between the economics of information scarcity and the economics of information abundance.
-
Communication technologies are the material infrastructure on which so much of the work of human society is built. One cannot radically transform that infrastructure without radically altering the character of the culture built upon it. As Neil Postman once put it, “In the year 1500, fifty years after the printing press was invented, we did not have old Europe plus the printing press. We had a different Europe.” So, likewise, we may say that in the year 2020, fifty years after the Internet was invented, we do not have old America plus the Internet. We have a different America.
-
We are caught between two ages, as it were, and we are experiencing all of the attendant confusion, frustration, and exhaustion that such a liminal state involves. To borrow a line from the Marxist thinker Antonio Gramsci, “The crisis consists precisely in the fact that the old is dying and the new cannot be born; in this interregnum a great variety of morbid symptoms appear.”
-
The challenges we are facing are not merely the bad actors, whether they be foreign agents, big tech companies, or political extremists. We are in the middle of a deep transformation of our political culture, as digital technology is reshaping the human experience at both an individual and a social level. The Internet is not simply a tool with which we do politics well or badly; it has created a new environment that yields a different set of assumptions, principles, and habits from those that ordered American politics in the pre-digital age.
-
As late as 2011, journalists and technologists were praising social media’s emancipatory power in light of the role of Facebook and Twitter in the Arab Spring revolts. But, as has been noted many times, after the U.S. presidential election in 2016, such optimism increasingly appeared naïve and misguided.
-
-
www.forbes.com
-
AI started to shift paradigms, from symbolism to connectionism, from defining (and programming) every aspect of learning and thinking, to statistical inference or finding connections or correlations leading to learning based on observations or experience.
-
Llull’s system was based on the belief that only a limited number of undeniable truths exists in all fields of knowledge and by studying all combinations of these elementary truths, humankind could attain the ultimate truth. His art could be used to “banish all erroneous opinions” and to arrive at “true intellectual certitude removed from any doubt.”
-
-
-
In the 17th century, his works Ars magna and Ars brevis gained greater influence through the system of a perfect philosophical, universal language described therein. This system is based on a combination of philosophical basic concepts. Llull’s thoughts were taken up by Gottfried Wilhelm Leibniz, the founder of mathematical logic. In the 19th century William Stanley Jevons tried to realize the idea of a logical machine. Llull studied both syllogism and induction. He was the first to devote himself to the systematic study of material implication, which is one of the fundamental operations of mathematical logic, analyzing logical operations with the copula “and” (conjunction) and the copula “or” (disjunction).
-
-
theconvivialsociety.substack.com
-
So, to summarize this point, as the text detaches from the book, or the image from the photograph, so the self detaches from the community.
-
What Illich is picking up on here is the estrangement of the self from the community that was analogous in his view to the estrangement of the text from the book. “What I want to stress here,” Illich claims at one point, “is a special correspondence between the emergence of selfhood understood as a person and the emergence of ‘the’ text from the page.” Illich goes on at length about how Hugh of St. Victor likened the work of the monk to a kind of intellectual or spiritual pilgrimage through the pages of the book. Notice the metaphor. One did not search a text, but rather walked deliberately through a book. At one point Illich writes, “Modern reading, especially of the academic and professional type, is an activity performed by commuters or tourists; it is no longer that of pedestrians and pilgrims.”
-
There comes a point when our capacity to store information outpaces our ability to actively organize it, no matter how prodigious our effort to do so.
-
What do we imagine we are doing when we are reading? How have our digital tools—the ubiquity of the search function, for example—changed the way we relate to the written word? Is there a relationship between our digital databases and the experience of the world as a hot mess? How has the digital environment transformed not only how we encounter the word, but our experience of the world itself?
-
To be honest, I delight in this kind of encounter with the past for its own sake. But I also find that these encounters illuminate the present by giving us a point of contrast. The meaning and significance of contemporary technologies become clearer, or so it seems to me, when I have some older form of human experience I can hold it up against. This is not to say that one form is necessarily better than the other, of course. Only that the nature of each becomes more evident.
-
The reader’s order is not imposed on the story, but the story puts the reader into its order.
-
“The book has now ceased to be the root-metaphor of the age; the screen has taken its place. The alphabetic text has become but one of many modes of encoding something, now called ‘the message.’”
-
As a result, the book was no longer the window onto nature or god; it was no longer the transparent optical device through which a reader gains access to creatures or the transcendent.
-
This [technical] breakthrough consisted in the combination of more than a dozen technical inventions and arrangements through which the page was transformed from score to text. Not printing, as is frequently assumed, but this bundle of innovations, twelve generations earlier, is the necessary foundation for all stages through which bookish culture has gone since. This collection of techniques and habits made it possible to imagine the ‘text’ as something detached from the physical reality of a page. It both reflected and in turn conditioned a revolution in what learned people did when they read — and what they experienced reading to mean.”
-
Some of these innovations also made it easier to read the book silently—something that was unusual in the scriptoriums of early medieval monasteries, which could be rather noisy places.3 And, of course, this reminds us that the transition from orality to literacy was not accomplished by the flipping of a switch. As Illich puts it, the monkish book was still understood as recorded sound rather than as a record of thought. Just as we thought of the web in terms of older textual technologies and spoke of web pages and scrolling, readers long experienced the act of reading by reference to oral forms of communication.
-
The text is information that has lost its body, i.e. the book. According to Illich, until these textual innovations took hold in the 12th century, it was very hard to imagine a text apart from its particular embodiment in a book.
-
What might not be as well known is that many features that we take for granted when we read a book had not yet been invented. These include, for example, page numbers, chapter headings, paragraph breaks, and alphabetical indexes. These are some of the dozen or so textual innovations that Illich had in mind when he talks about the transformation of the experience of reading in the 12th century. What they provide are multiple paths into a book. If we imagine the book as an information storage technology (something we can do only on the other side of this revolution) then what these new tools do is solve the problems of sorting and access. They help organize the information in such a way that readers can now dip in and out of what now can be imagined as a text independent of the book.
-
It’s commonly known that the invention of printing in the 15th century was a momentous development in the history of European culture. Elizabeth Eisenstein’s work, especially, made the case that the emergence of printing revolutionized European society. Without it, it seems unlikely that we get the Protestant Reformation, modern science, or the modern liberal order. Illich was not interested in challenging this thesis, but he did believe that the print revolution had an important antecedent: the differentiation of the text from the book.
-
In the Vineyard of the Text, which is itself a careful analysis of a medieval guide to the art of reading written by Hugh of St. Victor, sets out to make one principal argument that goes something like this: In the 12th century, a set of textual innovations transformed how reading was experienced by the intellectual class. Illich describes it as a shift from monkish reading focused on the book as a material artifact to scholastic reading focused on the text as a mental construct that floats above its material anchorage in a book. (I’m tempted to say manuscript or codex to emphasize the difference between the artifact we call a book and what Hugh of St. Victor would’ve handled.) A secondary point Illich makes throughout this fascinating book is that this profound shift in the culture of the book that shaped Western societies for the rest of the millennium was also entangled with the emergence of a new experience of the self.
-
Ong, who is best remembered today for his work on orality and literacy, cut his teeth on Ramus, his research focusing on how Ramus, in conjunction with the advent of printing, pushed the culture of Western learning further away from the world of the ear (think of the place of dialog in the Platonic tradition) toward the world of the eye. His search for a universal method and logic, which preceded and may have prepared the way for Descartes, yielded a decidedly visual method and logic, complete with charts and schemas. Perhaps a case could be made, and maybe has been made, that this reorientation of human learning and knowing around sight finds its last iteration in the directory structure of early personal computers, whose logic is fundamentally visual. Your own naming of files and folders may presume another kind of logic, but there is no logic to the structure itself other than the one you visualize, which may be why it was so difficult for these professors to articulate the logic to students. In any case, the informational milieu the students describe is one that is not ordered at all. It is a hot mess navigated exclusively by the search function.
-
Not the media, of course, but media of human communication, from language itself to the alphabet and the whole array of technologies built upon the alphabet. Media do just that: they mediate. A medium of communication mediates our experience of the world and the self at a level that is so foundational to our thinking it is easy to lose sight of it altogether. Thus technologies of communication shape how we come to understand both the world and the self. They shape our perception, they supply root metaphors and symbols, they alter the way we experience our senses, they generate social hierarchies of value, and they structure how we remember. I could go on, but you get the point.
-
“Today’s virtual world is largely a searchable one; people in many modern professions have little need to interact with nested hierarchies.” Similar arguments have been made to explain how some people think about their inboxes. While some are quite adept at using labels, tags, and folders to manage their emails, others will claim that there’s no need to do so because you can easily search for whatever you happen to need. Save it all and search for what you want to find. This is, roughly speaking, the hot mess approach to information management. And it appears to arise both because search makes it a good-enough approach to take and because the scale of information we’re trying to manage makes it feel impossible to do otherwise. Who’s got the time or patience?
-
Mental categories tend to be grounded in embodied experiences in a material world. Tactile facility with files, folders, and filing cabinets grounded the whole array of desktop metaphors that appeared in the 1980s to organize the user’s experience of a computer. And I think we ought to take this as a case in point of a more general pattern: technological change operates on the shared material basis of our mental categories and, yes, the “metaphors we live by.”1 Consequently, technological change not only transforms the texture of everyday life, it also alters the architecture and furniture of our mental spaces.
-
This suggests, of course, the reality that metaphors make sense of things by explaining the unknown (tenor) by comparison to the known (vehicle), but, when the known element itself becomes unknown, then the meaning-making function is lost. Which is to say, that files and folders are metaphors that help users navigate computers by reference to older physical artifacts that would’ve been already familiar to users. But, then, what happens when those older artifacts themselves become unfamiliar?
-
-
theconvivialsociety.substack.com
-
What Do Human Beings Need?
-
-
www.getrevue.co
-
I’d like to offer my own take on user-centric streaming that I’ll call the OC Streaming Model. The basic idea would be to introduce a streaming cap that would function similarly to a marginal tax rate (if you’re wondering how the name was derived). Up to a certain threshold, earnings from an individual song’s streams would be distributed to the songwriters and recording artists involved, while remaining earnings would be distributed to the artist’s respective label and/or publisher. (I’ll admit that I’m flexible when it comes to how the latter pool of revenue might be allocated, but I’ll stick with my initial suggestion going forward.) For example, if Post Malone received 592 million streams in a single month, a cap of 250 million streams would mean the money made from the remaining 342 million streams would be up for redistribution. The excess streaming revenue would then be distributed to other acts signed with the same label/publisher, with a majority going towards smaller acts and less going towards more successful ones. One concern discussed in the book Rockonomics was that music success is often only for the 0.1% of artists. User-centric streaming, as currently suggested, could make that worse, so this is an idea for how to truly help smaller artists.
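To make the arithmetic of the proposed cap concrete, here is a minimal sketch in Python using the Post Malone figures quoted above. The per-stream payout rate and the redistribution details are illustrative assumptions, not part of the proposal or any service’s actual terms.

```python
# Hypothetical sketch of the "OC Streaming Model" cap described above.
# The per-stream rate is an assumed figure for illustration only.

PER_STREAM_RATE = 0.004   # assumed payout per stream, in dollars
STREAM_CAP = 250_000_000  # monthly cap per song/artist, from the example above

def split_streams(total_streams: int, cap: int = STREAM_CAP):
    """Split monthly streams into a capped portion (paid out as usual)
    and an excess portion (whose revenue goes to the redistribution pool)."""
    capped = min(total_streams, cap)
    excess = total_streams - capped
    return capped, excess

# Post Malone example from the passage: 592 million streams against a 250 million cap.
capped, excess = split_streams(592_000_000)
print(capped, excess)  # 250000000 342000000
print(f"redistribution pool: ${excess * PER_STREAM_RATE:,.0f}")
```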
-
-
www.lynalden.com
-
Is there any particular reason why investors’ money should be 3x more concentrated in Japan than Canada? Or 6x more concentrated in Japan than South Korea? Or 11x more concentrated in Japan than Brazil? Just because Japan’s stock market is the biggest? There’s no compelling reason for international stock funds to be weighted strictly by the stock market size of each country. It makes far more sense to broaden and diversify more evenly, so that you’re not so heavily tied to the fate of just one country. Especially a shrinking country.
-
-
www.lynalden.com
-
Many companies charge far higher prices for drugs in the United States than they charge elsewhere, because we lack the same level of price negotiation as many other countries have. In this sense, we subsidize other countries.
-
There are over 44 million Americans with student loans, and the average per-student debt at graduation among students that have loans is over $35,000. And although it dis-proportionally affects people under 40, approximately one third of total student debt is held by people over 40 years of age. Plus, parents are often co-signers for the loans of their children, which exposes them to the risks.
-
-
www.lynalden.com
-
Bonds and stocks do well in goldilocks periods of strong real growth, with low inflation and high productivity gains. Between the two, stocks can do better. Commodities and gold are prone to poor performance in this type of environment.
-
If temporary deflation is caused by a deflationary shock, such as a recession or depression, bonds are the place to be, and outperform everything else. Gold also generally does fine compared to other assets. Stocks and commodities usually perform very poorly.
-
During periods of moderate to high inflation, gold and commodities tend to do extremely well. Equities outperform bonds more often than not, but it depends on the type of equities and their starting valuations, and therefore have a huge variance. Real estate does well, mainly because leverage attached to it gets melted away from inflation. Bonds do poorly in inflationary environments.
-
When interest rates rise, it puts downward pressure on most asset prices, as we saw in the inflationary decade of the 1970s. When interest rates remain low, then monetary inflation remains a decent environment for asset prices, as we saw in the inflationary decade of the 1940s from the market bottom in 1942.
-
Monetary inflation, meaning a rapid increase in the broad money supply, is driven either by an increase in bank lending or large fiscal deficits.
-
However, if you bought a Treasury note leading up to any of the three major inflationary periods in this 150-year history and held until maturity, you lost purchasing power at up to a -5% annualized compounded rate for a decade, which led to approximately a -40% loss in purchasing power by the end. In other words, Treasury buyers in those periods basically made giant donations to Uncle Sam.
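A quick check of that arithmetic (a sketch assuming a constant -5% annualized real return over ten years):

```python
# Purchasing power left after ten years of roughly -5% annualized real returns.
real_return = -0.05
years = 10
remaining = (1 + real_return) ** years
print(f"{remaining:.2f} of original purchasing power")  # ~0.60, i.e. roughly a -40% loss
```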
-
More specifically, the top 1% of households in the United States have $39.4 trillion in assets and less than $0.8 trillion in debts, which gives them a net worth of $38.6 trillion. So, they have a debt/equity ratio of just 2%. Almost their entire balance sheet consists of assets. Meanwhile, the bottom 50% of households have $7.6 trillion in assets and $5.1 trillion in debts, resulting in just $2.5 trillion in net worth. So, they have a debt/equity ratio of 200%. Their balance sheets have a lot of debt relative to equity, and almost as much in debt as in assets.
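The quoted debt/equity ratios follow directly from those balance-sheet figures; a quick recomputation (dollar amounts in trillions, rounded as in the passage):

```python
# Debt/equity ratios implied by the household balance-sheet figures above (in $ trillions).
top1_assets, top1_debt = 39.4, 0.8
bottom50_assets, bottom50_debt = 7.6, 5.1

top1_equity = top1_assets - top1_debt              # 38.6
bottom50_equity = bottom50_assets - bottom50_debt  # 2.5

print(f"top 1% debt/equity:     {top1_debt / top1_equity:.0%}")         # ~2%
print(f"bottom 50% debt/equity: {bottom50_debt / bottom50_equity:.0%}") # ~200%
```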
-
This is something to keep in mind when you are told that inflation is transitory. It’s often transitory in rate of change terms, but not absolute terms. With inflation that is transitory in rate of change terms but not absolute terms, broad prices stop accelerating to the upside, but don’t come back down.
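A tiny numeric illustration of that distinction, with invented figures: if inflation runs 8% one year and then returns to 0%, the rate of change has normalized but the price level stays permanently higher.

```python
# "Transitory" in rate-of-change terms vs. absolute terms (illustrative numbers only).
price_level = 100.0
yearly_inflation = [0.08, 0.00, 0.00]  # a spike, then the rate returns to zero

for rate in yearly_inflation:
    price_level *= 1 + rate
    print(f"inflation {rate:.0%}, price level {price_level:.0f}")
# The inflation rate falls back to 0%, but the level stays at 108; prices don't come back down.
```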
-
The top 10% have increased their share from 60% of the wealth to 70% of the wealth. The top 1% alone hold slightly more household wealth than the bottom 90% combined. So naturally, most people don’t feel like price inflation is as low as CPI says it was. Their actual ability to buy a good living doesn’t match with what it says.
-
One of the biggest price growth outliers was tuition, with a 4x (300%) increase compared to 2x (100%) for the broad CPI from 1990 to 2020. Decades ago, a 4-year or 6-year college education wasn’t necessary to have strong wages. Right out of high school, young adults could start earning money without student debt assigned to them. More prestigious careers of course generally needed the higher degrees. But due to globalization, offshoring, automation, and reduced union participation, there has been considerable downward wage pressure on many types of non-college jobs. The median male income has been virtually flat in inflation-adjusted terms over the past four decades, but he increasingly needs a college degree and student debt in order to earn that median income. And the cost of that degree has gone up at twice the pace of broad CPI.
-
So, if the majority of things that Americans spend money on went up faster than broad CPI, it brings into question the accuracy of broad CPI. That basket of housing, education, food, healthcare, and transportation represents 60% of total average household spending according to the BLS, or about 70% of total ex-pension and ex-charity spending.
-
Based on this funny but actually kind of relevant metric, CPI has fallen short over the past three decades, and particularly within the second half of that period. Back in the late 1990s and early 2000s, commodities were very cheap, and so the Big Mac rose in price more slowly than CPI. Starting in 2003, commodity prices had a big rise up, and Big Mac prices caught back up to CPI and eventually outpaced it by a noticeable margin.
-
Certainly we can allow for a substantial amount of quality adjustment in the new car CPI calculation. The new Camry has power everything, a navigation system, better safety features, and better gas mileage. But is the quality/size adjustment enough to reduce the actual price appreciation from 100% down to just 22% on a quality-adjusted basis during this three-decade period, as official new car CPI says was the case?
-
I dug up an old Chicago Tribune article that had some data to cross-reference that. According to that article, the average price of a new car in 1990 was $15,472. In 2020, the average price of a new car crossed over $40,000. That’s more than a 2.5x or 150%+ increase in price, during which the new vehicle CPI says that it effectively rose only 22%.
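A rough recomputation of that gap from the figures cited (taking the 2020 price as the $40,000 threshold it crossed):

```python
# New-car sticker-price appreciation vs. the official quality-adjusted new-vehicle CPI figure.
price_1990 = 15_472           # average new car price in 1990, per the cited article
price_2020 = 40_000           # average new car price crossed $40,000 in 2020
cpi_adjusted_increase = 0.22  # official new-vehicle CPI increase over the period

raw_increase = price_2020 / price_1990 - 1
print(f"raw sticker-price increase:    {raw_increase:.0%}")  # ~159%
print(f"official quality-adjusted CPI: {cpi_adjusted_increase:.0%}")
print(f"implied quality-adjustment gap: {(1 + raw_increase) / (1 + cpi_adjusted_increase):.1f}x")
```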
-
- Apr 2022
-
-
Another potential direction is for browsers to proactively lower the cost of organizing tabs by tasks by computationally identifying latent task topics and automatically suggesting existing workspaces for newly created tabs
Over-engineering
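For what it’s worth, a toy sketch of the kind of suggestion logic the quoted passage imagines, using plain word overlap rather than any real topic model or browser API (entirely hypothetical names and data):

```python
# Toy sketch: suggest an existing workspace for a newly opened tab by word overlap.
# Hypothetical only; a real browser would use richer topic models and its own tab APIs.

def tokenize(text: str) -> set:
    return set(text.lower().split())

def suggest_workspace(new_tab_title: str, workspaces: dict) -> str | None:
    """Return the workspace whose existing tab titles share the most words with the new tab."""
    new_tokens = tokenize(new_tab_title)
    best, best_score = None, 0
    for name, titles in workspaces.items():
        score = sum(len(new_tokens & tokenize(title)) for title in titles)
        if score > best_score:
            best, best_score = name, score
    return best

workspaces = {
    "apartment hunt": ["2BR listings uptown", "moving company quotes"],
    "reading":        ["spaced repetition research", "attention and memory"],
}
print(suggest_workspace("uptown apartment listings map", workspaces))  # apartment hunt
```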
-
- Feb 2022
-
www.theverge.com
-
What tabs would be particularly useful for is new music discovery, like an album you’ve been wanting to get around to hearing but haven’t yet had the time. I bump into this problem quite a bit. Adding a new album to my “Liked” songs on Spotify shuffles it in with all of my favorite stuff, and decluttering that playlist later is a hassle. Making a new album a playlist almost assures that it’ll be forgotten about. My decrepit, goldfish memory doesn’t have the space to remember to return to a playlist of an album two weeks later. As my colleague Victoria notes, she’s always “forgetting what I’m supposed to listen to next.” You know what would help with that? Tabs.
-
- Jan 2022
-
bravenewcoin.com
-
Although web3 clients don’t yet exist for non-Ethereum chains, they’re being built and will be available soon, and they’ll support multiple chains. In time, browsers will natively implement multi-chain web3 clients.
What is the difference in UX between homogeneous and heterogeneous multi-chain wallets?
-
-
multicoin.capital
-
The design space for competition among these systems is open-ended and almost entirely unexplored.
I would expect that the path to $100t would include investment funds: index funds, ETFs, and DAOs.
-
Many of the most ardent supporters of this hypothesis tend to be economists, or at least those with an economics background. In their eyes, the “soundness” of money as an immutable, secure, censorship resistant, store of value trumps all notions of more practical, day-to-day utility.
What economist would say this? Economists are scratching their heads as to why commodity money should be reproduced in digital form.
-
-
multicoin.capital
-
Having very liquid markets
Need to incentivize. Perhaps with coin age.
-
Offering the lowest trading costs (at the application and protocol level)
Should just be free
-
limited leverage available
pro
-
- Mar 2021
-
stratechery.com
-
Seventeen years on and there is more user-generated content than ever, in part because it is so easy to generate: you can type an update on Facebook, post a photo on Instagram, make a video on TikTok, or have a conversation on Clubhouse. That, though, points to Web 2.0’s failure: interoperability is nowhere to be found. Twitter, which has awoken from its years-long stupor to launch or announce a whole host of new products, provides an excellent lens with which to examine the drivers of this centralization.
-
- Aug 2020
-
searchuserinterfaces.com
-
Research shows that people are highly likely to revisit information they have viewed in the past and to re-issue queries that they have written in the past (Jones et al., 2002, Milic-Frayling et al., 2004). In one large study, 40% of people's search results clicks were on pages that they had clicked on before over the course of a year, with 71% of these using the identical query string as before (Teevan et al., 2006a). In a survey associated with this study, 17% of interviewees reported “not being able to return to a page I once visited” as one of the “biggest problems in using the web.” Therefore, allowing search over recently viewed information can improve a user's productivity (Dumais et al., 2003). Web browsers, as opposed to search engines, can provide much of this functionality. For example, the Chrome Web browser supports information revisiting by showing a grid of thumbnail images representing a user's most frequently visited web pages, and the drop-down menu from the many browser Web address bars shows recently visited pages. Search engines themselves can provide query history, as well as history of previously selected pages if the user agrees to having that information recorded. The PubMed bioscience journal service shows recently issued queries and visited documents in a simple history display (see Figure 1.6). Similarly, many shopping Web site show recently viewed items in a graphical form. Thumbnail images have also been experimented with in search results listing, both for reminding searchers of previously visited pages and for suggesting information about the hit, such as its genre.
-
- Jul 2020
-
global.asc.upenn.edu
-
1.4 Psychological resistances
This is the moat
-
-
www.bloomberg.com
-
Big companies have enjoyed big profits, fattened by widening margins as wages stagnate. That’s allowed them to sustain a huge debt load. But drilling down shows that credit quality, as viewed by ratings companies, has tumbled. According to S&P Global Ratings, the companies rated BBB+, BBB, or BBB- (the three lowest investment grades before they would hit “junk” status and face much higher interest payments) now outnumber all of the companies with some level of A-rated debt. It looks as though companies are “gaming” the ratings companies, borrowing as much as they can get away with.
Precisely what happened with consumers’ credit scores as a function of their borrowing habits. You see a large number of consumers at the 600 threshold. These arbitrary cutoffs create problematic tipping points.
-
-
blog.readwise.io
-
Spaced repetition is a technique for spacing out of reviews of previously learned material according to an algorithm designed to optimize your limited time for review. Each time you review a piece of information, you supply feedback to that algorithm which estimates the optimal time to show you that information again.
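As a concrete illustration of the scheduling loop described here, a minimal SM-2-style interval update in Python. This is a generic sketch of the technique, not Readwise’s actual algorithm.

```python
# Minimal sketch of an SM-2-style spaced-repetition interval update.
# Generic illustration of the technique; not any product's actual algorithm.

def next_interval(prev_interval_days: float, ease: float, grade: int):
    """Return (new_interval, new_ease) from review feedback.
    grade: 0 = forgot, 1 = hard, 2 = good, 3 = easy."""
    if grade == 0:
        return 1.0, max(1.3, ease - 0.2)        # relearn soon and reduce the ease factor
    ease = min(3.0, ease + 0.05 * (grade - 1))  # easier recalls grow the ease factor
    return prev_interval_days * ease, ease

interval, ease = 1.0, 2.5
for grade in [2, 2, 3, 1]:  # feedback from four successive reviews
    interval, ease = next_interval(interval, ease, grade)
    print(f"show again in ~{interval:.0f} days (ease {ease:.2f})")
```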
-
-
blog.readwise.io
-
You'll also want to review your original reaction to those passages. You can capture these reactions, of course, by taking notes.
Note taking requires a little more effort than I would expend during the initial capture process. There's a wide variance in the thoroughness and time that one takes to write a note. As a consequence, note taking, when done most thoroughly, may disrupt the flow of reading. What's more, you do not know ahead of time the relative significance of each passage, and much of the note taking effort can go to waste if too much focus is paid to those passages that are relatively less substantial.
-
Mortimer Adler, the author of the classic manual on reading How To Read a Book
-
With reading, this means highlighting especially salient passages.
Yes we need to highlight salient passages, but more importantly we need to begin our capture of important information, as Mortimer Adler suggests, by "coming to terms" with the author.
-
-
blog.readwise.io
-
All it takes is a swipe of the finger.
Would a press not be better?
-
-
fortelabs.com
-
There’s a natural tension between the two, compression and context.
This is a false dichotomy and tradeoff. You can compress information based on its context.
-
- Jun 2020
-
augmentingcognition.com
-
My somewhat pious belief was that if people focused more on remembering the basics, and worried less about the “difficult” high-level issues, they'd find the high-level issues took care of themselves. But while I held this as a strong conviction about other people, I never realized it also applied to me. And I had no idea at all how strongly it applied to me. Using Anki to read papers in new fields disabused me of this illusion. I found it almost unsettling how much easier Anki made learning such subjects. I now believe memory of the basics is often the single largest barrier to understanding. If you have a system such as Anki for overcoming that barrier, then you will find it much, much easier to read into new fields.
-
- May 2020
-
www.cityrealty.com
-
The northern end of the park has typically seen less affluent neighbors and significantly less attention, but Central Park Conservancy is about to change that. Earlier this fall, the non-profit group announced a $150 million renovation that would improve the parkland, add a new boardwalk along the man-made lake known as Harlem Meer, and build a new recreation facility to replace the Lasker pool and skating rink, both of which date back to the 1960’s. (Side note: The Trump Organization has the concession to run the skating rink through 2021, by which time there may be someone else in the White House.) Construction is set to begin in 2021, and completion is estimated for 2024.
-
-
stratechery.com
-
“paying for the regular delivery of well-defined value” — are so important. I defined every part of that phrase: Paying: A subscription is an ongoing commitment to the production of content, not a one-off payment for one piece of content that catches the eye. Regular Delivery: A subscriber does not need to depend on the random discovery of content; said content can be delivered to the subscriber directly, whether that be email, a bookmark, or an app. Well-defined Value: A subscriber needs to know what they are paying for, and it needs to be worth it.
-
It is very important to clearly define what a subscriptions means. First, it’s not a donation: it is asking a customer to pay money for a product. What, then, is the product? It is not, in fact, any one article (a point that is missed by the misguided focus on micro-transactions). Rather, a subscriber is paying for the regular delivery of well-defined value. The importance of this distinction stems directly from the economics involved: the marginal cost of any one Stratechery article is $0. After all, it is simply text on a screen, a few bits flipped in a costless arrangement. It makes about as much sense to sell those bit-flipping configurations as it does to sell, say, an MP3, costlessly copied. So you need to sell something different. In the case of MP3s, what the music industry finally learned — after years of kicking and screaming about how terribly unfair it was that people “stole” their music, which didn’t actually make sense because digital goods are non-rivalrous — is that they should sell convenience. If streaming music is free on a marginal cost basis, why not deliver all of the music to all of the customers for a monthly fee? This is the same idea behind nearly every large consumer-facing web service: Netflix, YouTube, Facebook, Google, etc. are all predicated on the idea that content is free to deliver, and consumers should have access to as much as possible. Of course how they monetize that convenience differs: Netflix has subscriptions, while Google, YouTube, and Facebook deliver ads (the latter two also leverage the fact that content is free to create). None of them, though, sell discrete digital goods. It just doesn’t make sense.
-
-
blog.readwise.io
-
Cloze deletion is, of course, just a fancy way of saying fill in the blank. This might sound trivial, but the simple act forces you to consider the surrounding context and search your mind for an answer. This, in turn, is scientifically proven to form stronger memories enabling you to remember profoundly more of what you've read.
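A minimal sketch of how a highlight might be turned into a cloze-deletion prompt (illustrative only; not Readwise’s card-generation code):

```python
# Turn a highlighted passage into a simple cloze-deletion ("fill in the blank") card.
# Illustrative sketch only; not any product's actual implementation.

def make_cloze(passage: str, hidden_term: str) -> dict:
    if hidden_term not in passage:
        raise ValueError("term not found in passage")
    return {
        "prompt": passage.replace(hidden_term, "_____"),
        "answer": hidden_term,
    }

card = make_cloze(
    "Spaced repetition spaces out reviews according to an algorithm "
    "designed to optimize your limited time for review.",
    "algorithm",
)
print(card["prompt"])
print("Answer:", card["answer"])
```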
-
-
stratechery.com
-
The music industry, meanwhile, has, at least relative to newspapers, come out of the shift to the Internet in relatively good shape; while piracy drove the music labels into the arms of Apple, which unbundled the album into the song, streaming has rewarded the integration of back catalogs and new music with bundle economics: more and more users are willing to pay $10/month for access to everything, significantly increasing the average revenue per customer. The result is an industry that looks remarkably similar to the pre-Internet era: Notice how little power Spotify and Apple Music have; neither has a sufficient user base to attract suppliers (artists) based on pure economics, in part because they don’t have access to back catalogs. Unlike newspapers, music labels built an integration that transcends distribution.
Tags
Annotators
URL
-
- Apr 2020
-
www.nature.com www.nature.com
-
The team behind Hypothesis, an open-source software tool that allows people to annotate web pages, announced in March that its users had collectively posted more than 5 million comments across the scholarly web since the tool was launched in 2011. That’s up from about 220,000 total comments in 2015 (see ‘Comment counts’). The company has grown from 26,000 registered users to 215,000 over the same period.
Tags
Annotators
URL
-
- Jan 2020
-
web.archive.org web.archive.org
-
"Apple research transferred more stuff into product than any other lab I can think of, including Hewlett-Packard and IBM," the source said, but Jobs wasn't aware enough of the role ARL played in developing current Apple technology before deciding to cut the group's funding, he noted.
-
- Dec 2019
-
en.wikipedia.org en.wikipedia.org
-
Hans Moravec argued in 1976 that computers were still millions of times too weak to exhibit intelligence. He suggested an analogy: artificial intelligence requires computer power in the same way that aircraft require horsepower. Below a certain threshold, it's impossible, but, as power increases, eventually it could become easy.[79] With regard to computer vision, Moravec estimated that simply matching the edge and motion detection capabilities of the human retina in real time would require a general-purpose computer capable of 10^9 operations/second (1000 MIPS).[80] As of 2011, practical computer vision applications require 10,000 to 1,000,000 MIPS. By comparison, the fastest supercomputer in 1976, Cray-1 (retailing at $5 million to $8 million), was only capable of around 80 to 130 MIPS, and a typical desktop computer at the time achieved less than 1 MIPS.
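Just to make the gap concrete, the arithmetic using the figures quoted above (nothing here beyond the passage itself):

```python
# Figures from the passage: Moravec's retina-level vision estimate versus
# the fastest 1976 supercomputer and a typical 1976 desktop machine.
required_mips = 1_000          # ~10^9 operations/second
cray1_mips = (80, 130)         # Cray-1's range
desktop_mips = 1               # typical desktop, < 1 MIPS

print(f"Cray-1 shortfall: {required_mips / cray1_mips[1]:.1f}x to "
      f"{required_mips / cray1_mips[0]:.1f}x")          # roughly 8x to 12x
print(f"Desktop shortfall: at least {required_mips // desktop_mips}x")  # 1000x or more
```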
-
-
www.loper-os.org www.loper-os.org
-
Imagine that every car maker save for Toyota insisted on using the infamous East German Trabant as a standard of quality - yet blindly imitated random elements of Toyota's visual design. How long would it take for the whiners to appear on the scene and start making noises about monopolistic tyranny? How long would it take for Toyota to start living up to these accusations in earnest? And why should it not do so? What is to be gained from corporate sainthood? From a refusal to fleece eagerly willing suckers for all they're worth? Idle threats of defection by outraged iPhone developers [4] are laughable nonsense simply because - in the two categories listed - Apple has no competition. Every commercial product which competes directly with an Apple product (particularly the iPhone) gives me (and many others) the distinct impression that "where it is original, it is not good, and where it is good, it is not original."
-
-
www.dougengelbart.org www.dougengelbart.org
-
He then showed you how he could make a few strokes on the keyset to designate the type of link he wanted established, and pick the two symbol structures that were to be linked by means of the light pen. He said that most links possessed a direction, i.e., they were like an arrow pointing from one substructure to another, so that in setting up a link he must specify the two substructures in a given order.
-
"Most of the structuring forms I'll show you stem from the simple capability of being able to establish arbitrary linkages between different substructures, and of directing the computer subsequently to display a set of linked substructures with any relative positioning we might designate among the different substructures. You can designate as many different kinds of links as you wish, so that you can specify different display or manipulative treatment for the different types."
-
"You usually think of an argument as a serial sequence of steps of reason, beginning with known facts, assumptions, etc., and progressing toward a conclusion. Well, we do have to think through these steps serially, and we usually do list the steps serially when we write them out because that is pretty much the way our papers and books have to present them—they are pretty limiting in the symbol structuring they enable us to use. Have you even seen a 'scrambled-text' programmed instruction book? That is an interesting example of a deviation from straight serial presentation of steps.3b6b "Conceptually speaking, however, an argument is not a serial affair. It is sequential, I grant you, because some statements have to follow others, but this doesn't imply that its nature is necessarily serial. We usually string Statement B after Statement A, with Statements C, D, E, F, and so on following in that order—this is a serial structuring of our symbols. Perhaps each statement logically followed from all those which preceded it on the serial list, and if so, then the conceptual structuring would also be serial in nature, and it would be nicely matched for us by the symbol structuring.3b6c "But a more typical case might find A to be an independent statement, B dependent upon A, C and D independent, E depending upon D and B, E dependent upon C, and F dependent upon A, D, and E. See, sequential but not serial? A conceptual network but not a conceptual chain. The old paper and pencil methods of manipulating symbols just weren't very adaptable to making and using symbol structures to match the ways we make and use conceptual structures. With the new symbol-manipulating methods here, we have terrific flexibility for matching the two, and boy, it really pays off in the way you can tie into your work.3b6d This makes you recall dimly the generalizations you had heard previously about process structuring limiting symbol structuring, symbol structuring limiting concept structuring, and concept structuring limiting mental structuring.
-
Suppose that one wants to link Card B to Card A, to make a trail from A to B.
we should also be able to go from B to A
-
One need arose quite commonly as trains of thought would develop on a growing series of note cards. There was no convenient way to link these cards together so that the train of thought could later be recalled by extracting the ordered series of notecards. An associative-trail scheme similar to that outlined by Bush for his Memex could conceivably be implemented with these cards to meet this need and add a valuable new symbol-structuring process to the system.
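A rough sketch of what such an associative-trail scheme might look like, assuming only the features described here (two-way links between cards plus named, ordered trails). This is an illustration, not Engelbart's or Bush's actual design.

```python
from collections import defaultdict

class CardFile:
    """Note cards with two-way links and named associative trails."""

    def __init__(self):
        self.cards = {}                 # card id -> text
        self.links = defaultdict(set)   # card id -> ids of linked cards
        self.trails = {}                # trail name -> ordered list of card ids

    def add_card(self, card_id, text):
        self.cards[card_id] = text

    def link(self, a, b):
        # Links are stored symmetrically, so going from B back to A is free.
        self.links[a].add(b)
        self.links[b].add(a)

    def save_trail(self, name, card_ids):
        self.trails[name] = list(card_ids)

    def replay(self, name):
        # Recall a train of thought by extracting the ordered series of cards.
        return [self.cards[c] for c in self.trails[name]]

file = CardFile()
file.add_card("A", "Initial observation")
file.add_card("B", "Follow-up idea")
file.link("A", "B")
file.save_trail("draft-argument", ["A", "B"])
print(file.replay("draft-argument"))  # ['Initial observation', 'Follow-up idea']
```

Duplicating a trail for a friend, as Bush describes a little further on, would then amount to handing over the trail's ordered list of card ids together with the cards it references.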
-
Note, too, the implications extending from Bush's mention of one user duplicating a trail (a portion of his structure) and giving it to a friend who can put it into his Memex and integrate it into his own trail (structure).
-
An example of this general sort of thing was given by Bush where he points out that the file index can be called to view at the push of a button, which implicitly provides greater capability to work within more sophisticated and complex indexing systems
-
The associative trails whose establishment and use within the files he describes at some length provide a beautiful example of a new capability in symbol structuring that derives from new artifact-process capability, and that provides new ways to develop and portray concept structures. Any file is a symbol structure whose purpose is to represent a variety of concepts and concept structures in a way that makes them maximally available and useful to the needs of the human's mental-structure development—within the limits imposed by the capability of the artifacts and human for jointly executing processes of symbol-structure manipulation.
-
As we are currently using it, the term includes the organization, study, modification, and execution of processes and process structures. Whereas concept structuring and symbol structuring together represent the language component of our augmentation means, process structuring represents the methodology component (plus a little more, actually). There has been enough previous discussion of process structures that we need not describe the notion here, beyond perhaps an example or two. The individual processes (or actions) of my hands and fingers have to be cooperatively organized if the typewriter is to do my bidding. My successive actions throughout my working day are meant to cooperate toward a certain over-all professional goal.
-
With a computer manipulating our symbols and generating their portrayals to us on a display, we no longer need think of our looking at the symbol structure which is stored—as we think of looking at the symbol structures stored in notebooks, memos, and books. What the computer actually stores need be none of our concern, assuming that it can portray symbol structures to us that are consistent with the form in which we think our information is structured.
Separation of model and view
-
view generation
-
But another kind of view might be obtained by extracting and ordering all statements in the local text that bear upon consideration A of the argument—or by replacing all occurrences of specified esoteric words by one's own definitions.
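A toy illustration of the model/view separation being described: the stored statements are the model, and different views are generated from it on demand. The tags and the glossary substitution are my own stand-ins, not Engelbart's.

```python
# Model: statements stored with metadata, independent of how they are displayed.
statements = [
    {"id": 1, "text": "The ROI hinges on onboarding cost.", "bears_on": "A"},
    {"id": 2, "text": "Latency dominates perceived quality.", "bears_on": "B"},
    {"id": 3, "text": "Onboarding cost falls with better docs.", "bears_on": "A"},
]

glossary = {"ROI": "return on investment"}

def view_by_consideration(statements, consideration):
    """Extract and order all statements bearing on one consideration."""
    return [s["text"] for s in statements if s["bears_on"] == consideration]

def view_with_definitions(statements, glossary):
    """Replace occurrences of esoteric terms with the reader's own definitions."""
    views = []
    for s in statements:
        text = s["text"]
        for term, definition in glossary.items():
            text = text.replace(term, definition)
        views.append(text)
    return views

print(view_by_consideration(statements, "A"))
print(view_with_definitions(statements, glossary))
```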
-
A natural language provides its user with a ready-made structure of concepts that establishes a basic mental structure, and that allows relatively flexible, general-purpose concept structuring. Our concept of language as one of the basic means for augmenting the human intellect embraces all of the concept structuring which the human may make use of.
-
Before we pursue further direct discussion of the H-LAM/T system, let us examine some background material. Consider the following historical progression in the development of our intellectual capabilities:

(1) Concept Manipulation—Humans rose above the lower forms of life by evolving the biological capability for developing abstractions and concepts. They could manipulate these concepts within their minds to a certain extent, and think about situations in the abstract. Their mental capabilities allowed them to develop general concepts from specific instances, predict specific instances from general concepts, associate concepts, remember them, etc. We speak here of concepts in their raw, unverbalized form. For example, a person letting a door swing shut behind him suddenly visualizes the person who follows him carrying a cup of hot coffee and some sticky pastries. Of all the aspects of the pending event, the spilling of the coffee and the squashing of the pastry somehow are abstracted immediately, and associated with a concept of personal responsibility and a dislike for these consequences. But a solution comes to mind immediately as an image of a quick stop and an arm stab back toward the door, with motion and timing that could prevent the collision, and the solution is accepted and enacted. With only non-symbolic concept manipulation, we could probably build primitive shelter, evolve strategies of war and hunt, play games, and make practical jokes. But further powers of intellectual effectiveness are implicit in this stage of biological evolution (the same stage we are in today).

(2) Symbol Manipulation—Humans made another great step forward when they learned to represent particular concepts in their minds with specific symbols. Here we temporarily disregard communicative speech and writing, and consider only the direct value to the individual of being able to do his heavy thinking by mentally manipulating symbols instead of the more unwieldy concepts which they represent. Consider, for instance, the mental difficulty involved in herding twenty-seven sheep if, instead of remembering one cardinal number and occasionally counting, we had to remember what each sheep looked like, so that if the flock seemed too small we could visualize each one and check whether or not it was there.

(3) Manual, External, Symbol Manipulation—Another significant step toward harnessing the biologically evolved mental capabilities in pursuit of comprehension and problem solutions came with the development of the means for externalizing some of the symbol-manipulation activity, particularly in graphical representation. This supplemented the individual's memory and ability to visualize. (We are not concerned here with the value derived from human cooperation made possible by speech and writing, both forms of external symbol manipulation. We speak of the manual means of making graphical representations of symbols—a stick and sand, pencil and paper and eraser, straight edge or compass, and so on.) It is principally this kind of means for external symbol manipulation that has been associated with the evolution of the individual's present way of doing his concept manipulation (thinking).
-
It has been jokingly suggested several times during the course of this study that what we are seeking is an "intelligence amplifier." (The term is attributed originally to W. Ross Ashby[2,3].) At first this term was rejected on the grounds that in our view one's only hope was to make a better match between existing human intelligence and the problems to be tackled, rather than in making man more intelligent. But deriving the concepts brought out in the preceding section has shown us that indeed this term does seem applicable to our objective.

Accepting the term "intelligence amplification" does not imply any attempt to increase native human intelligence. The term "intelligence amplification" seems applicable to our goal of augmenting the human intellect in that the entity to be produced will exhibit more of what can be called intelligence than an unaided human could; we will have amplified the intelligence of the human by organizing his intellectual capabilities into higher levels of synergistic structuring. What possesses the amplified intelligence is the resulting H-LAM/T system, in which the LAM/T augmentation means represent the amplifier of the human's intelligence.

In amplifying our intelligence, we are applying the principle of synergistic structuring that was followed by natural evolution in developing the basic human capabilities. What we have done in the development of our augmentation means is to construct a superstructure that is a synthetic extension of the natural structure upon which it is built. In a very real sense, as represented by the steady evolution of our augmentation means, the development of "artificial intelligence" has been going on for centuries.
-
-
www.nytimes.com www.nytimes.com
-
His answer is that our creative minds are being strengthened rather than atrophied by the ability to interact easily with the Web and Wikipedia. “Not only has transactive memory not hurt us,” he writes, “it’s allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone.”
This is where I disagree with Thompson. The potential for IA is there but we have retrogressed with the advent of the web.
-
Socrates and his prediction that writing would destroy the Greek tradition of dialectic. Socrates’ primary concern was that people would write things down instead of remembering them. “This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories,” Plato quotes him as saying. “They will trust to the external written characters and not remember of themselves.”
The dialectic process is important, particularly in the context of human-to-computer communication and synthesis. Here Socrates articulates the importance of memory to this process and how writing undermines it. If there is an asymmetry between the mind of the writer and the reader, the written work provides a method of diffusing information from one mind to another. This balance of the mind is true of human-to-computer interaction as well. We need to expand our memory capacity if we are to expand the reasoning capacity of computers. But instead we are using computers as a substitute for our memories. We neglect memory, so we can't reason; humans and computers alike.
-
This is not a new idea. It is based on the vision expounded by Vannevar Bush in his 1945 essay “As We May Think,” which conjured up a “memex” machine that would remember and connect information for us mere mortals. The concept was refined in the early 1960s by the Internet pioneer J. C. R. Licklider, who wrote a paper titled “Man-Computer Symbiosis,” and the computer designer Douglas Engelbart, who wrote “Augmenting Human Intellect.” They often found themselves in opposition to their colleagues, like Marvin Minsky and John McCarthy, who stressed the goal of pursuing artificial intelligence machines that left humans out of the loop.
Seymour Papert had an approach that provides a nice synthesis between these two camps, by leveraging early childhood development to provide insights into the creation of AI.
-
Thompson’s point is that “artificial intelligence” — defined as machines that can think on their own just like or better than humans — is not yet (and may never be) as powerful as “intelligence amplification,” the symbiotic smarts that occur when human cognition is augmented by a close interaction with computers.
Intelligence amplification over artificial intelligence. In reality you can't get to AI until you've mastered IA.
-
Like a centaur, the hybrid would have the strength of each of its components: the processing power of a large logic circuit and the intuition of a human brain’s wetware. The result: human-machine teams, even when they didn’t include the best grandmasters or most powerful computers, consistently beat teams composed solely of human grandmasters or superfast machines.
This is what is most needed: the spark of intuition coupled with the indefatigable pursuit of its implications. We handle the former and computers the latter.
-
-
en.wikipedia.org en.wikipedia.org
-
During 1995, a decision was made to (officially) start licensing the Mac OS and Macintosh ROMs to 3rd party manufacturers who started producing Macintosh "clones". This was done in order to achieve deeper market penetration and extra revenue for the company. This decision led to Apple having over a 10% market share until 1997 when Steve Jobs was re-hired as interim CEO to replace Gil Amelio. Jobs promptly found a loophole in the licensing contracts Apple had with the clone manufacturers and terminated the Macintosh OS licensing program, ending the Macintosh clone era. The result of this action was that Macintosh computer market share quickly fell from 10% to around 3%.
-
- Nov 2019
-
www.paulgraham.com www.paulgraham.com
-
In languages, as in so many things, there's not much correlation between popularity and quality. Why does John Grisham (King of Torts sales rank, 44) outsell Jane Austen (Pride and Prejudice sales rank, 6191)? Would even Grisham claim that it's because he's a better writer?
-
-
www.paulgraham.com www.paulgraham.com
-
Which makes them exactly the kind of programmers companies should want to hire. Hence what, for lack of a better name, I'll call the Python paradox: if a company chooses to write its software in a comparatively esoteric language, they'll be able to hire better programmers, because they'll attract only those who cared enough to learn it. And for programmers the paradox is even more pronounced: the language to learn, if you want to get a good job, is a language that people don't learn merely to get a job.
Tags
Annotators
URL
-
-
www.paulgraham.com www.paulgraham.com
-
It would be great if more Americans were trained as programmers, but no amount of training can flip a ratio as overwhelming as 95 to 5. Especially since programmers are being trained in other countries too. Barring some cataclysm, it will always be true that most great programmers are born outside the US. It will always be true that most people who are great at anything are born outside the US.
No amount of training in the current development paradigm can flip this ratio, but if we were to make dev tools simpler and more ubiquitous, it just might flip.
-
-
en.wikipedia.org en.wikipedia.org
-
In his Discourse on the Origins of Inequality, Rousseau, anticipating the language of Darwin, states that as the animal-like human species increased there arose a "formidable struggle for existence" between it and other species for food.[34] It was then, under the pressure of necessity, that le caractère spécifique de l'espèce humaine—the specific quality that distinguished man from the beasts—emerged—intelligence, a power, meager at first but yet capable of an "almost unlimited development". Rousseau calls this power the faculté de se perfectionner—perfectibility.[35] Man invented tools, discovered fire, and in short, began to emerge from the state of nature. Yet at this stage, man also began to compare himself to others: "It is easy to see. ... that all our labors are directed upon two objects only, namely, for oneself, the commodities of life, and consideration on the part of others."
Tags
Annotators
URL
-
-
www.nobelprize.org www.nobelprize.org
-
This brings me to the crucial issue. Unlike the position that exists in the physical sciences, in economics and other disciplines that deal with essentially complex phenomena, the aspects of the events to be accounted for about which we can get quantitative data are necessarily limited and may not include the important ones. While in the physical sciences it is generally assumed, probably with good reason, that any important factor which determines the observed events will itself be directly observable and measurable, in the study of such complex phenomena as the market, which depend on the actions of many individuals, all the circumstances which will determine the outcome of a process, for reasons which I shall explain later, will hardly ever be fully known or measurable. And while in the physical sciences the investigator will be able to measure what, on the basis of a prima facie theory, he thinks important, in the social sciences often that is treated as important which happens to be accessible to measurement. This is sometimes carried to the point where it is demanded that our theories must be formulated in such terms that they refer only to measurable magnitudes.
-
The particular occasion of this lecture, combined with the chief practical problem which economists have to face today, have made the choice of its topic almost inevitable. On the one hand the still recent establishment of the Nobel Memorial Prize in Economic Science marks a significant step in the process by which, in the opinion of the general public, economics has been conceded some of the dignity and prestige of the physical sciences. On the other hand, the economists are at this moment called upon to say how to extricate the free world from the serious threat of accelerating inflation which, it must be admitted, has been brought about by policies which the majority of economists recommended and even urged governments to pursue. We have indeed at the moment little cause for pride: as a profession we have made a mess of things.
-
It seems to me that this failure of the economists to guide policy more successfully is closely connected with their propensity to imitate as closely as possible the procedures of the brilliantly successful physical sciences – an attempt which in our field may lead to outright error. It is an approach which has come to be described as the “scientistic” attitude – an attitude which, as I defined it some thirty years ago, “is decidedly unscientific in the true sense of the word, since it involves a mechanical and uncritical application of habits of thought to fields different from those in which they have been formed.”1
-
-
whitepaper.audius.co whitepaper.audius.co
-
Early in the life of the Audius network, the Audius DAO will control governance. During this bootstrapping phase, the Audius DAO will also have the ability to intervene in catastrophic circumstances to fix critical issues in the Audius blockchain code, such as issues enabling fraud or resulting in unintended loss of Audius or Loud tokens.
-
There will be two groups created at the time of main network launch: Audius DAO (Decentralized Autonomous Organization) and Artist Advisory DAO.
-
To make governance more accessible to users, voting can be delegated by anyone to other users or groups of users, such that if a user places no vote on a specific proposal, their designated delegate's vote will be used in place of their own.
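A sketch of how that delegation rule could resolve a single vote, assuming single-level delegation only; the data layout is my own illustration, not the whitepaper's.

```python
def resolve_vote(user, votes, delegates):
    """Return the vote counted for `user`: their own if cast, else their delegate's."""
    if user in votes:
        return votes[user]            # a direct vote always takes precedence
    delegate = delegates.get(user)
    if delegate is not None:
        return votes.get(delegate)    # fall back to the delegate's vote, if any
    return None                       # no vote counted for this user

votes = {"alice": "yes"}              # alice voted directly
delegates = {"bob": "alice"}          # bob delegated to alice and did not vote

print(resolve_vote("alice", votes, delegates))  # 'yes'
print(resolve_vote("bob", votes, delegates))    # 'yes' (alice's vote used in bob's place)
```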
-
These user classes are not mutually exclusive. Therefore, if a user has earnings and/or holdings that fall into multiple classes, their vote can be counted in multiple classes.
-
To submit a proposal, a user must bond a set number of Audius tokens (denoted B_GP) in the governance system, which remain bonded for the duration of their proposal. Before a proposal's effective date, the original submitter can also choose to withdraw the proposal if they so choose, returning their bonded tokens. This bond is required as an anti-spam measure and to ensure that proposers have a sufficient stake in the Audius protocol to make changes to it. At the proposal's resolution (successful, failed, or withdrawn), the bond is returned to the proposal submitter.
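The bond lifecycle described here is essentially a small state machine: tokens lock on submission and return at resolution or withdrawal. A hedged sketch with placeholder amounts and field names (the actual value of B_GP is not given here):

```python
from dataclasses import dataclass

PROPOSAL_BOND = 100  # placeholder for the bonded amount the whitepaper calls B_GP

@dataclass
class Proposal:
    submitter: str
    bond: int = PROPOSAL_BOND
    status: str = "open"  # open -> successful | failed | withdrawn

def submit(balances, submitter):
    balances[submitter] -= PROPOSAL_BOND           # bond locked as an anti-spam measure
    return Proposal(submitter)

def resolve(balances, proposal, outcome):
    proposal.status = outcome                      # successful, failed, or withdrawn
    balances[proposal.submitter] += proposal.bond  # bond returned in every case

balances = {"artist": 500}
p = submit(balances, "artist")
print(balances["artist"])   # 400 while the proposal is open
resolve(balances, p, "successful")
print(balances["artist"])   # 500 again at resolution
```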
-
Proposals also include a block count at which point they go into effect; this effectiveness date must be at least 1 week in the future at time of proposal submission to give users ample time to review and vote on the proposal.
-
Participation in governance creates value in Audius,and should be rewarded
Voting should not be rewarded. Apathy should be penalized.
-
copy of these guidelines will be included in a contract on the network, and updates to these guidelines flow through the Audius governance protocol. A full fee and bond schedule for arbitration will be published closer to the time of the Audius main network launch, and these fees and bonds can be modified in the Audius governance protocol.
should be done already...
-
On a recurring basis, subscription listens would be tallied and payouts would be made to artists by a transparent, auditable subscription system running on the Audius blockchain.
What are the mechanisms of the system?
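The whitepaper doesn't spell the mechanism out here. Purely as a hypothetical sketch, one tally that would fit the description is to split each subscriber's fee across artists in proportion to that subscriber's listens; everything below is an assumption for illustration, not the Audius design.

```python
def payout_round(subscription_fee, listens):
    """Split each subscriber's fee across artists in proportion to their listens.

    `listens` is a list of (subscriber, artist) events for the period.
    """
    payouts = {}
    by_subscriber = {}
    for subscriber, artist in listens:
        by_subscriber.setdefault(subscriber, []).append(artist)
    for subscriber, artists in by_subscriber.items():
        share = subscription_fee / len(artists)   # each listen earns an equal slice of that fee
        for artist in artists:
            payouts[artist] = payouts.get(artist, 0.0) + share
    return payouts

listens = [("sub1", "artistA"), ("sub1", "artistA"), ("sub1", "artistB"),
           ("sub2", "artistB")]
print(payout_round(9.0, listens))  # {'artistA': 6.0, 'artistB': 12.0}
```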
-
-
en.wikipedia.org en.wikipedia.org
-
In 2001, AI founder Marvin Minsky asked "So the question is why didn't we get HAL in 2001?"[167] Minsky believed that the answer is that the central problems, like commonsense reasoning, were being neglected, while most researchers pursued things like commercial applications of neural nets or genetic algorithms. John McCarthy, on the other hand, still blamed the qualification problem.[168] For Ray Kurzweil, the issue is computer power and, using Moore's Law, he predicted that machines with human-level intelligence will appear by 2029.[169] Jeff Hawkins argued that neural net research ignores the essential properties of the human cortex, preferring simple models that have been successful at solving simple problems.[170] There were many other explanations and for each there was a corresponding research program underway.
-
The first indication of a change in weather was the sudden collapse of the market for specialized AI hardware in 1987. Desktop computers from Apple and IBM had been steadily gaining speed and power and in 1987 they became more powerful than the more expensive Lisp machines made by Symbolics and others. There was no longer a good reason to buy them. An entire industry worth half a billion dollars was demolished overnight.
-
Eventually the earliest successful expert systems, such as XCON, proved too expensive to maintain. They were difficult to update, they could not learn, they were "brittle" (i.e., they could make grotesque mistakes when given unusual inputs), and they fell prey to problems (such as the qualification problem) that had been identified years earlier. Expert systems proved useful, but only in a few special contexts
-
The neats: logic and symbolic reasoning

Logic was introduced into AI research as early as 1958, by John McCarthy in his Advice Taker proposal.[100] In 1963, J. Alan Robinson had discovered a simple method to implement deduction on computers, the resolution and unification algorithm. However, straightforward implementations, like those attempted by McCarthy and his students in the late 1960s, were especially intractable: the programs required astronomical numbers of steps to prove simple theorems.[101] A more fruitful approach to logic was developed in the 1970s by Robert Kowalski at the University of Edinburgh, and soon this led to the collaboration with French researchers Alain Colmerauer and Philippe Roussel who created the successful logic programming language Prolog.[102] Prolog uses a subset of logic (Horn clauses, closely related to "rules" and "production rules") that permit tractable computation. Rules would continue to be influential, providing a foundation for Edward Feigenbaum's expert systems and the continuing work by Allen Newell and Herbert A. Simon that would lead to Soar and their unified theories of cognition.[103]

Critics of the logical approach noted, as Dreyfus had, that human beings rarely used logic when they solved problems. Experiments by psychologists like Peter Wason, Eleanor Rosch, Amos Tversky, Daniel Kahneman and others provided proof.[104] McCarthy responded that what people do is irrelevant. He argued that what is really needed are machines that can solve problems—not machines that think as people do.[105]

The scruffies: frames and scripts

Among the critics of McCarthy's approach were his colleagues across the country at MIT. Marvin Minsky, Seymour Papert and Roger Schank were trying to solve problems like "story understanding" and "object recognition" that required a machine to think like a person. In order to use ordinary concepts like "chair" or "restaurant" they had to make all the same illogical assumptions that people normally made. Unfortunately, imprecise concepts like these are hard to represent in logic. Gerald Sussman observed that "using precise language to describe essentially imprecise concepts doesn't make them any more precise."[106] Schank described their "anti-logic" approaches as "scruffy", as opposed to the "neat" paradigms used by McCarthy, Kowalski, Feigenbaum, Newell and Simon.[107]

In 1975, in a seminal paper, Minsky noted that many of his fellow "scruffy" researchers were using the same kind of tool: a framework that captures all our common sense assumptions about something. For example, if we use the concept of a bird, there is a constellation of facts that immediately come to mind: we might assume that it flies, eats worms and so on. We know these facts are not always true and that deductions using these facts will not be "logical", but these structured sets of assumptions are part of the context of everything we say and think. He called these structures "frames". Schank used a version of frames he called "scripts" to successfully answer questions about short stories in English.[108] Many years later object-oriented programming would adopt the essential idea of "inheritance" from AI research on frames.
-
-
en.wikipedia.org en.wikipedia.org
-
In 1988 Apple sued Microsoft and Hewlett-Packard on the grounds that they infringed Apple's copyrighted GUI, citing (among other things) the use of rectangular, overlapping, and resizable windows. After four years, the case was decided against Apple, as were later appeals. Apple's actions were criticized by some in the software community, including the Free Software Foundation (FSF), who felt Apple was trying to monopolize on GUIs in general, and boycotted GNU software for the Macintosh platform for seven years.
Tags
Annotators
URL
-
-
en.wikipedia.org en.wikipedia.org
-
Bolt, Beranek and Newman (BBN) developed its own Lisp machine, named Jericho,[7] which ran a version of Interlisp. It was never marketed. Frustrated, the whole AI group resigned, and were hired mostly by Xerox. So, Xerox Palo Alto Research Center had, simultaneously with Greenblatt's own development at MIT, developed their own Lisp machines which were designed to run InterLisp (and later Common Lisp). The same hardware was used with different software also as Smalltalk machines and as the Xerox Star office system.
-
In 1979, Russell Noftsker, being convinced that Lisp machines had a bright commercial future due to the strength of the Lisp language and the enabling factor of hardware acceleration, proposed to Greenblatt that they commercialize the technology.[citation needed] In a counter-intuitive move for an AI Lab hacker, Greenblatt acquiesced, hoping perhaps that he could recreate the informal and productive atmosphere of the Lab in a real business. These ideas and goals were considerably different from those of Noftsker. The two negotiated at length, but neither would compromise. As the proposed firm could succeed only with the full and undivided assistance of the AI Lab hackers as a group, Noftsker and Greenblatt decided that the fate of the enterprise was up to them, and so the choice should be left to the hackers. The ensuing discussions of the choice divided the lab into two factions. In February 1979, matters came to a head. The hackers sided with Noftsker, believing that a commercial venture fund-backed firm had a better chance of surviving and commercializing Lisp machines than Greenblatt's proposed self-sustaining start-up. Greenblatt lost the battle.
Tags
Annotators
URL
-
- Oct 2019
-
whitepaper.audius.co whitepaper.audius.co
-
We see a number of specific challenges faced by creators and listeners today:
1. There is little to no transparency around the origins of creator payouts (e.g. number of plays, location, original gross payment before fees)
2. Incomplete rights ownership data often prevents content creators from getting paid; instead, earnings accumulate in digital service providers (DSPs) and rights societies
3. There are layers of middlemen and significant time delay involved in payments to creators
4. Publishing rights are complicated and opaque, with no incentives for the industry to make rights data public and accurate
5. Remixes, covers, and other derivative content are largely censored due to rights management issues
6. Licensing issues prevent DSPs and content from being accessible worldwide
Tags
Annotators
URL
-
-
www-groups.dcs.st-and.ac.uk www-groups.dcs.st-and.ac.uk
-
I do not see him in this light. I do not think that any one who has pored over the contents of that box which he packed up when he finally left Cambridge in 1696 and which, though partly dispersed, have come down to us, can see him like that. Newton was not the first of the age of reason. He was the last of the magicians, the last of the Babylonians and Sumerians, the last great mind which looked out on the visible and intellectual world with the same eyes as those who began to build our intellectual inheritance rather less than 10,000 years ago. Isaac Newton, a posthumous child born with no father on Christmas Day, 1642, was the last wonderchild to whom the Magi could do sincere and appropriate homage.
-
- Sep 2019
-
-
One widely circulated report this summer—which appears to have caught Mr. Trump’s attention—estimates that China shed five million industrial jobs, 1.9 million of them directly because of U.S. tariffs, between the beginning of the trade conflict and the end of May this year.
-
That isn’t insubstantial. But it is still small compared with China’s urban labor force of 570 million. It also represents a slower pace than the 23 million manufacturing jobs shed in China between 2015 and 2017, according to the report, published by China International Capital Corp., an investment bank with Chinese state ownership.
-
-
-
The Fed offered $30 billion of reserves maturing Oct. 8, receiving $62 billion in bids from banks offering collateral in the form of Treasury and mortgage securities. Banks bid for $32 billion more than the amount offered by the Fed. In a second offering, the Fed added $75 billion in overnight reserves, with banks bidding for $80.2 billion, or $5.2 billion more than was available.
-
-
-
A prolonged walkout can quickly take a financial toll on car companies because they book revenue only when a vehicle is shipped to a dealership. An assembly-plant shutdown can cost an auto maker an estimated $1.3 million every hour, according to the Center for Automotive Research in Ann Arbor, Mich.
-
The GM strike would surpass in size the work stoppage by more than 30,000 employees at Stop & Shop groceries in New England earlier this year. But it would be far smaller than one involving 73,000 GM workers in 2007, when the company’s workforce was much larger.
-
-
-
But sustained Saudi outage of several million daily barrels would rattle markets, because of the lack of other players big enough to step in and provide enough supply to cover the shortfall longer term. Even if Saudi officials were successful in restoring all or most of the lost production, the attack demonstrates a new vulnerability to supply lines across the oil-rich Gulf. Tankers have been paying sharply higher insurance premiums, while shipping rates have soared in the region after a series of maritime attacks on oil-laden vessels, which the U.S. has blamed on Iran.
-
-
-
Reflecting those divisions, officials decided not to enlarge significantly the pool of assets the bank can buy—though it did expand the kinds of corporate and mortgage bonds it can purchase. Without changing rules that prohibit the bank from buying more than a third of any government’s debt, Mr. Ducrozet estimated that the ECB can continue its bond purchases for only 9-12 months.
-
-
en.wikipedia.org en.wikipedia.org
-
The Executive [Lincoln] is frequently compelled to affix his signature to bills of the highest importance, much of which he regards as wholly at war with the national interests.
-
- Aug 2019
-
www.usatf.org www.usatf.org
-
The USATF Legend Coach Award is in its sixth year and is selected by the USATF Coaches Advisory Committee. The inaugural award was presented to Hall of Fame Tigerbelle Coach Ed Temple in 2014, followed by Dr. Joe Vigil in 2015, Tom Tellez in 2016, Clyde Hart in 2017 and Brooks Johnson last year.
-
-
-
"But in moving towards flat design we are losing much of the wisdom that was embedded in the old 3D style of UI, for example: a user must be able to glance at a screen and know what is an interactive element (e.g., a button or link) and what is not (e.g., a label or motto); a user must be able to tell at a glance what an interactive element does (does it initiate a process, link to another page, download a document, etc.?); the UI should be explorable, discoverable and self-explanatory. But many apps and websites, in the interest of a clean, spartan visual appearance, leave important UI controls hidden until the mouse hovers over just the right area or the app is in just the right state. This leaves the user in the dark, often frustrated and disempowered."
-
-
www.pri.org www.pri.org
-
“Democrats think it's not progressive enough because it doesn’t put extra burdens on higher-income people, like an income tax does,” Hines says. “And Republicans worry that it's too easy for the government to raise money with one.”
-
-
www.pcmag.com www.pcmag.com
-
No available HMDs support VirtualLink at this writing, nor are we aware of any, but it's something to keep in mind if you're waffling between a GeForce RTX card and a last-generation GeForce GTX or a Radeon card for VR. Nothing is certain, but it's possible a future headset may debut with this as the optional or mandatory interface.
-
-
-
When the resort refinanced its debt in 2017 in a $469 million deal, bankers picked DBRS as one of two firms to rate the debt. DBRS had just loosened its standards for such “single-asset” commercial-mortgage deals. DBRS issued grades as much as three rungs higher on comparable slices rated by Morningstar in 2014.
-
Investor reliance on credit ratings has gone from “high to higher,” says Swedish economist Bo Becker, who co-wrote a study finding that in the $4.4 trillion U.S. bond-mutual-fund industry, 94% of rules governing investments made direct or indirect references to ratings in 2017, versus 90% in 2010.
-
-
-
Negative rates in theory mean the German government can borrow money from investors and get paid for doing so. But Berlin runs a budget surplus and has no desire to increase spending as other slower-growing European countries would like it to do. Olaf Scholz, Germany’s finance minister, has said recently the government doesn’t need to act as if it is in a crisis
-
-
-
At the same time, US companies are deleveraging, which has shrunk the supply of new corporate debt, leading to a dearth of investment-grade issuance. Net supply from municipal borrowers, another vital source of new issuance, has also turned negative so there is not enough available for pension funds and insurers to buy.
-
“Pension funds can’t match their liabilities with where rates are today so they have to hope that equity markets will continue to rally,” he says.
-
- Jul 2019
-
www.loper-os.org www.loper-os.org
-
The Apple of Steve Jobs needed HyperCard-like products like the Monsanto Company needs a $100 home genetic-engineering set.
-
The Lisp Machine (which could just as easily have been, say, a Smalltalk machine) was a computing environment with a coherent, logical design, where the “turtles go all the way down.” An environment which enabled stopping, examining the state of, editing, and resuming a running program, including the kernel. An environment which could actually be fully understood by an experienced developer. One where nearly all source code was not only available but usefully so, at all times, in real time. An environment to which we owe so many of the innovations we take for granted. It is easy for us now to say that such power could not have existed, or is unnecessary. Yet our favorite digital toys (and who knows what other artifacts of civilization) only exist because it was once possible to buy a computer designed specifically for exploring complex ideas. Certainly no such beast exists today – but that is not what saddens me most. Rather, it is the fact that so few are aware that anything has been lost.
-
The reason for this is that HyperCard is an echo of a different world. One where the distinction between the “use” and “programming” of a computer has been weakened and awaits near-total erasure. A world where the personal computer is a mind-amplifier, and not merely an expensive video telephone. A world in which Apple’s walled garden aesthetic has no place. What you may not know is that Steve Jobs killed far greater things than HyperCard. He was almost certainly behind the death of SK8. And the Lisp Machine version of the Newton. And we may never learn what else. And Mr. Jobs had a perfectly logical reason to prune the Apple tree thus. He returned the company to its original vision: the personal computer as a consumer appliance, a black box enforcing a very traditional relationship between the vendor and the purchaser. Jobs supposedly claimed that he intended his personal computer to be a “bicycle for the mind.” But what he really sold us was a (fairly comfortable) train for the mind. A train which goes only where rails have been laid down, like any train, and can travel elsewhere only after rivers of sweat pour forth from armies of laborers. (Preferably in Cupertino.) The Apple of Steve Jobs needed HyperCard-like products like the Monsanto Company needs a $100 home genetic-engineering set. The Apple of today, lacking Steve Jobs — probably needs a stake through the heart.
-
-
en.wikipedia.org en.wikipedia.org
-
Kahle has been critical of Google's book digitization, especially of Google's exclusivity in restricting other search engines' digital access to the books they archive. In a 2011 talk Kahle described Google's 'snippet' feature as a means of tip-toeing around copyright issues, and expressed his frustration with the lack of a decent loaning system for digital materials. He said the digital transition has moved from local control to central control, non-profit to for-profit, diverse to homogeneous, and from "ruled by law" to "ruled by contract". Kahle stated that even public-domain material published before 1923, and not bound by copyright law, is still bound by Google's contracts and requires permission to be distributed or copied. Kahle reasoned that this trend has emerged for a number of reasons: distribution of information favoring centralization, the economic cost of digitizing books, the issue of library staff without the technical knowledge to build these services, and the decision of the administrators to outsource information services
-
-
en.wikipedia.org en.wikipedia.org
-
It is this combination of features that also makes HyperCard a powerful hypermedia system. Users can build backgrounds to suit the needs of some system, say a rolodex, and use simple HyperTalk commands to provide buttons to move from place to place within the stack, or provide the same navigation system within the data elements of the UI, like text fields. Using these features, it is easy to build linked systems similar to hypertext links on the Web.[5] Unlike the Web, programming, placement, and browsing were all the same tool. Similar systems have been created for HTML but traditional Web services are considerably more heavyweight.
Tags
Annotators
URL
-
-
en.wikipedia.org en.wikipedia.org
-
Such are great historical men—whose own particular aims involve those large issues which are the will of the World-Spirit.
-
-
www.quora.com www.quora.com
-
One way to look at this is that when a new powerful medium of expression comes along that was not enough in our genes to be part of traditional cultures, it is something we need to learn how to get fluent with and use. Without the special learning, the new media will be mostly used to automate the old forms of thought. This will also have effects, especially if the new media is more efficient at what the old did: this can result in gluts, that act like legal drugs (as indeed is the industrial revolution's ability to create sugar and fat); it can also overproduce stories, news, status, and new ways for oral discourse.
-
To understand what has happened, we only need to look at the history of writing and printing to note two very different consequences (a) the first, a vast change over the last 450 years in how the physical and social worlds are dealt with via the inventions of modern science and governance, and (b) that most people who read at all still mostly read fiction, self-help and religion books, and cookbooks, etc.* (all topics that would be familiar to any cave-person).
-
-
en.wikipedia.org en.wikipedia.org
-
A practical example of service design thinking can be found at the Myyrmanni shopping mall in Vantaa, Finland. The management attempted to improve the customer flow to the second floor as there were queues at the landscape lifts and the KONE steel car lifts were ignored. To improve customer flow to the second floor of the mall (2010) Kone Lifts implemented their 'People Flow' Service Design Thinking by turning the Elevators into a Hall of Fame for the 'Incredibles' comic strip characters. Making their Elevators more attractive to the public solved the people flow problem. This case of service design thinking by Kone Elevator Company is used in literature as an example of extending products into services.
Tags
Annotators
URL
-
-
en.wikipedia.org en.wikipedia.org
-
In 1996 and 1998, a pair of workshops at the University of Glasgow on information retrieval and human–computer interaction sought to address the overlap between these two fields. Marchionini notes the impact of the World Wide Web and the sudden increase in information literacy – changes that were only embryonic in the late 1990s.
it took half a century for these disciplines to discern their complementarity!
-
-
www.edge.org www.edge.orgEdge.org2
-
I do not actually know of a real findability index, but tools in the field of information retrieval could be applied to develop one. One of the unsolved problems in the field is how to help the searcher to determine if the information simply is not available.
-
Although some have written about information overload, data smog, and the like, my view has always been the more information online, the better, so long as good search tools are available. Sometimes this information is found by directed search using a web search engine, sometimes by serendipity by following links, and sometimes by asking hundreds of people in our social network or hundreds of thousands of people on a question answering website such as Answers.com, Quora, or Yahoo Answers.
Tags
Annotators
URL
-
-
www.edge.org www.edge.orgEdge.org2
-
Unfortunately, misguided views about usability still cause significant damage in today's world. In the 2000 U.S. elections, poor ballot design led thousands of voters in Palm Beach, Florida to vote for the wrong candidate, thus turning the tide of the entire presidential election. At the time, some observers made the ignorant claim that voters who could not understand the Palm Beach butterfly ballot were not bright enough to vote. I wonder if people who made such claims have never made the frustrating "mistake" of trying to pull open a door that requires pushing. Usability experts see this kind of problem as an error in the design of the door, rather than a problem with the person trying to leave the room.
-
The web, in yet another example of its leveling effect, allows nearly everyone to see nearly every interface. Thus designers can learn rapidly from what others have done, and users can see if one web site's experience is substandard compared to others.
-
-
en.wikipedia.org en.wikipedia.org
-
At the start of the 1970s, The New Communes author Ron E. Roberts classified communes as a subclass of a larger category of Utopias.[5] He listed three main characteristics. Communes of this period tended to develop their own characteristics of theory though, so while many strived for variously expressed forms of egalitarianism, Roberts' list should never be read as typical. Roberts' three listed items were: first, egalitarianism – that communes specifically rejected hierarchy or graduations of social status as being necessary to social order. Second, human scale – that members of some communes saw the scale of society as it was then organized as being too industrialized (or factory sized) and therefore unsympathetic to human dimensions. And third, that communes were consciously anti-bureaucratic.
-
-
en.wikipedia.org en.wikipedia.org
-
Another prominent conclusion is that joint asset ownership is suboptimal if investments are in human capital.
Does that have to be the case?
Tags
Annotators
URL
-
-
cbie.gitbook.io cbie.gitbook.io
-
Other examples of complex adaptive systems are:
stock markets: Many traders make decisions on the information known to them and their individual expectations about future movements of the market. They may start selling when they see the prices are going down (because other traders are selling). Such herding behavior can lead to high volatility on stock markets.
immune systems: Immune systems consist of various mechanisms, including a large population of lymphocytes that detect and destroy pathogens and other intruders in the body. The immune system needs to be able to detect new pathogens for the host to survive and therefore needs to be able to adapt.
brains: The neural system in the brain consists of many neurons that are exchanging information. The interactions of many neurons make it possible for me to write this sentence and ponder the meaning of life.
ecosystems: Ecosystems consist of many species that interact by eating other species, distributing nutrients, and pollinating plants. Ecosystems can be seen as complex food webs that are able to cope with changes in the number of certain species, and adapt – to a certain extent – to changes in climate.
human societies: When you buy this new iPhone that is manufactured in China, with materials derived from African soils, and with software developed by programmers from India, you need to realize that those actions are made by autonomous organizations, firms and individuals. These many individual actions are guided by rules and agreements we have developed, but there is no ruler who can control these interactions.
-
Path Formation: Paved paths are not always the most desirable routes going from point A to point B. This may lead pedestrians to take short-cuts. Initially pedestrians walk over green grass. Subsequent people tend to use the stamped grass path instead of the pristine grass, and after many pedestrians an unpaved path is formed without any top-down design.
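A toy agent-based sketch of the same story, with an assumed reinforcement rule (each walker prefers lanes that earlier walkers have already worn down); the parameters are arbitrary and purely illustrative.

```python
import random

random.seed(0)
WIDTH, WALKERS = 5, 200
wear = [0] * WIDTH          # how trampled each lane of grass is between A and B

for _ in range(WALKERS):
    # Walkers prefer already-worn lanes, with a little exploration on fresh grass.
    weights = [w + 1 for w in wear]
    lane = random.choices(range(WIDTH), weights=weights)[0]
    wear[lane] += 1

print(wear)  # wear concentrates unevenly: an unplanned path emerges without top-down design
```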
-
- Jun 2019
-
www.asindexing.org www.asindexing.org
-
However, indexes in the modern sense, giving exact locations of names and subjects in a book, were not compiled in antiquity, and only very few seem to have been made before the age of printing. There are several reasons for this. First, as long as books were written in the form of scrolls, there were neither page nor leaf numbers nor line counts (as we have them now for classical texts). Also, even had there been such numerical indicators, it would have been impractical to append an index giving exact references, because in order for a reader to consult the index, the scroll would have to be unrolled to the very end and then to be rolled back to the relevant page. (Whoever has had to read a book available only on microfilm, the modern successor of the papyrus scroll, will have experienced how difficult and inconvenient it is to go from the index to the text.) Second, even though popular works were written in many copies (sometimes up to several hundreds), no two of them would be exactly the same, so that an index could at best have been made to chapters or paragraphs, but not to exact pages. Yet such a division of texts was rarely done (the one we have now for classical texts is mostly the work of medieval and Renaissance scholars). Only the invention of printing around 1450 made it possible to produce identical copies of books in large numbers, so that soon afterwards the first indexes began to be compiled, especially those to books of reference, such as herbals. (pages 164-166)

Index entries were not always alphabetized by considering every letter in a word from beginning to end, as people are wont to do today. Most early indexes were arranged only by the first letter of the first word, the rest being left in no particular order at all. Gradually, alphabetization advanced to an arrangement by the first syllable, that is, the first two or three letters, the rest of an entry still being left unordered. Only very few indexes compiled in the 16th and early 17th centuries had fully alphabetized entries, but by the 18th century full alphabetization became the rule... (p. 136)

(For more information on the subject of indexes, please see Professor Wellisch's Indexing from A to Z, which contains an account of an indexer being punished by having his ears lopped off, a history of narrative indexing, an essay on the zen of indexing, and much more.)

Indexes go way back beyond the 17th century. The Gerardes Herbal from the 1590s had several fascinating indexes according to Hilary Calvert. Barbara Cohen writes that the alphabetical listing in the earliest ones only went as far as the first letter of the entry... no one thought at first to index each entry in either letter-by-letter or word-by-word order. Maja-Lisa writes that Peter Heylyn's 1652 Cosmographie in Four Bookes includes a series of tables at the end. They are alphabetical indexes and he prefaces them with "Short Tables may not seeme proportionalble to so long a Work, expecially in an Age wherein there are so many that pretend to learning, who study more the Index then they do the Book."
-
Pliny the Elder (died 79 A.D.) wrote a massive work called The Natural History in 37 Books. It was a kind of encyclopedia that comprised information on a wide range of subjects. In order to make it a bit more user friendly, the entire first book of the work is nothing more than a gigantic table of contents in which he lists, book by book, the various subjects discussed. He even appended to each list of items for each book his list of Greek and Roman authors used in compiling the information for that book. He indicates in the very end of his preface to the entire work that this practice was first employed in Latin literature by Valerius Soranus, who lived during the last part of the second century B.C. and the first part of the first century B.C. Pliny's statement that Soranus was the first in Latin literature to do this indicates that it must have already been practiced by Greek writers.
-
-
en.wikipedia.org en.wikipedia.org
-
Smil notes that as of 2018, coal, oil, and natural gas still supply 90% of the world's primary energy. Despite decades of growth of renewable energy, the world uses more fossil fuels in 2018 than in 2000, by percentage.
Tags
Annotators
URL
-
-
en.wikipedia.org en.wikipedia.org
-
Jevons received public recognition for his work on The Coal Question (1865), in which he called attention to the gradual exhaustion of Britain's coal supplies and also put forth the view that increases in energy production efficiency lead to more, not less, consumption.[5]:7f, 161f This view is known today as the Jevons paradox, named after him. Due to this particular work, Jevons is regarded today as the first economist of some standing to develop an 'ecological' perspective on the economy.
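A back-of-the-envelope illustration of the paradox with made-up numbers; the efficiency gain and the demand elasticity are assumptions for the example, not figures from Jevons.

```python
# Jevons paradox, toy numbers: a 25% efficiency gain cuts the coal needed per unit
# of useful work, which lowers its effective price; if demand for the energy
# service responds strongly enough, total coal burned goes up rather than down.
efficiency_gain = 0.25       # 25% more useful work per ton of coal
price_elasticity = -1.5      # assumed (elastic) demand for the energy service

effective_price_drop = efficiency_gain / (1 + efficiency_gain)   # ~20% cheaper per unit of work
demand_increase = -price_elasticity * effective_price_drop       # ~30% more work demanded

work_before = 100.0
coal_before = 100.0
work_after = work_before * (1 + demand_increase)
coal_after = work_after / (1 + efficiency_gain)                  # less coal per unit of work

print(round(coal_after, 1))  # ~104.0 tons: more coal consumed despite higher efficiency
```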
Tags
Annotators
URL
-