- May 2025
-
www.youtube.com
-
In the classroom you want to focus on meaning. We were talking earlier about what you tell students when they first show up: you want to focus on meaning, not on language; focus on what's happening, not on words and phrases and pronunciation.
for - natural language acquisition - teaching - focus on meaning, not words - Latest Annotation
-
Science tells us that kids learn best from birth to five years old; they're the fastest, they're the best at learning. Model them, then just do what they do. You can't get better than that.
for - stats - natural language acquisition - 1 to 2 year old is age of fastest and best learning
comment - ALG philosophy - replicate the experiences that 1 to 2 year olds have
-
Show me any other program that tries to teach you language the way a one-to-two-year-old learns it. That's what we're doing. It doesn't compare to teaching a language to a five-year-old; we're not there yet.
for - natural language acquisition - age - 2 year old is right age to aim to learn at
comment - age 2 is when an infant first learns to hear and speak a spoken language - reading and writing do not happen until about 5 years of age - When we are learning a new second language, it is therefore appropriate to aim for the same goal as a native 2-year-old language user
-
A wrong guess is a hundred times better than a right answer. The reason is that a right answer closes your mind; with a wrong guess you're still open, and that's the vital characteristic.
for - quote - natural language acquisition - wrong guess - right answer - adjacency - natural language acquisition - open mind
-
patience and tolerance for ambiguity
for - natural language acquisition skills - patience - tolerance for ambiguity - constructing good guesses to meaning
-
That was the biggest challenge I think we had, and still have, within ALG: teachers think they've got to explain the language, and they're shortcutting the process, they're short-circuiting the process, and they're cheating the student out of an otherwise good experience.
for - adjacency - Socratic method - ALG - natural language acquisition - explanation - infants learning native language
adjacency - between - Socratic method - natural language acquisition - ALG - explanation - adjacency relationship - When the teacher explains the meaning to the student, it actually robs the student of the active learning experience of guessing the right meaning - Infants learning their native language for the first time are necessarily in the "deep end" and face discomfort - They (we) are constantly forced to guess and actively construct meaning out of the universe of symbols we are exposed to in a multitude of contexts
-
That discomfort is a tough one. That's the first part: you've got to face that, and if you're not facing it, then you've learned to walk with crutches.
for - natural language acquisition - important role of discomfort
-
If I cannot adjust to guessing right about meaning, I will never learn in this way very well at all.
for - key insight - natural language acquisition - guessing
-
for - natural language acquisition - Automatic Language Growth - ALG - youtube - interview - David Long - Automatic Language Growth - from - youtube - The Language School that Teaches Adults like Babies - https://hyp.is/Ls_IbCpbEfCEqEfjBlJ8hw/www.youtube.com/watch?v=984rkMbvp-w
summary - The key takeaway is that even as adults, we have retained our innate language learning skill, which requires simply treating a new language as a novel experience that we can apprehend naturally by experiencing it the way we did when we were exposed to our first, native language - We didn't know what a "language" was theoretically when we were infants; we simply fell into the experience and played with it, and our primary caretakers guided us - We didn't know the grammar and rules of language, we just learned innately
-
Some people are so trained in learning on purpose that they have a hard time relaxing with anything that's unclear.
for - intention vs relaxing - natural language acquisition
-
I can never get past the idea of study, because what we're doing is not study at all.
for - quote - not study at all - David Long - natural language immersion
Tags
- ALG
- adjacency - natural language acquisition - open mind
- quote - natural language acquisition - wrong guess - right answer
- natural language acquisition - important role of discomfort
- intention vs relaxing - natural language acquisition
- adjacency - Socratic method - ALG - natural language acquisition - explanation
- natural language acquisition
- stats - natural language acquisition - 1 to 2 year old is age of fastest and best learning
- natural language immersion
- Latest Annotation
- youtube - interview - David Long - Automatic Language Growth
- natural language acquisition - age - 2 year old is right age to aim to learn at
- natural language acquisition - teaching - focus on meaning, not words
- quote - not study at all - David Long
- natural language acquisition skills - patience - tolerance for ambiguity - constructing good guesses to meaning
- ALG philosophy - replicate the experiences that 1 to 2 year olds have
- question - comparison - human vs artificial intelligence - Can't an AI also consider things we sit on to then generalize its classification algorithm?
- Automatic Language Growth
- from - youtube - The Language School that Teaches Adults like Babies
- adjacency - Socratic method - ALG - natural language acquisition - explanation - infants learning native language
- key insight - natural language acquisition - guessing
Annotators
URL
-
-
www.youtube.com
-
for - natural language acquisition - youtube - The Language School that Teaches Adults like Babies - to - book - From the Outside In - linguist - J. Marvin Brown - https://hyp.is/PjtjBipbEfCr4ieLB5y1Ew/files.eric.ed.gov/fulltext/ED501257.pdf - quote - When I speak in Thai, I think in Thai - J. Marvin Brown
summary - This video summarizes the remarkable life of linguist J. Marvin Brown, who spent a lifetime trying to understand how to learn a second language and use it the way a natural language user does - After a lifetime of research and trying out various teaching and learning methods, he finally realized that adults all have the ability to learn a new language the same way any infant does, naturally, through listening and watching - The key was to not bring in the conscious thinking of an adult, but to immerse oneself in the experience - This seems like a highly relevant clue to language creation and to linguistic BEing journeys - to - youtube - Interview with David Long - Automatic Language Growth - https://hyp.is/GRPUHipvEfCVEaMaLSU-BA/www.youtube.com/watch?v=5yhIM2Vt-Cc
Tags
- youtube - The Language School that Teaches Adults like Babies
- to - book - From the Outside In
- linguist - J. Marvin Brown
- Deep Humanity linguistic BEing journey - natural language acquisition
- to - youtube - Interview with David Long - Automatic Language Growth
- quote - When I speak in Thai, I think in Thai - J. Marvin Brown
- natural language acquisition
Annotators
URL
-
-
files.eric.ed.gov
-
for - natural language acquisition - author - J. Marvin Brown - book - From the Outside In - from - youtube - The Language School that Teaches Adults like Babies - https://hyp.is/Ls_IbCpbEfCEqEfjBlJ8hw/www.youtube.com/watch?v=984rkMbvp-w
-
-
ipfs.indy0.net
-
perceived by oneself “in here.” In this sense, the world consists of objects out there in space (the container that holds them) before me as the perceiving subject.
for - adjacency - Indyweb dev - natural language - timebinding - parallel vs serial processing - comparison - spoken vs written language - What's also interesting is that spoken language is timebinding and sequential, and our written language descended from it - In spite of written language existing in 2D and 3D space, it inherited sequential flow, even though it does not have to - In this sense, the legacy spoken language system constrains written language to be - serial - sequential and - timebound instead of - parallel - Read any written text and you will observe that the pattern is sequential - We constrain our syntax to "flow" sequentially in 2D space, even though there is no other reason to constrain it to do so - This also reveals another implicit rule about language: it assumes we can only focus our attention on one aspect of reality at a time
-
- Mar 2025
-
www.cmarix.com
-
AI adoption is rapidly increasing in all industries for several use cases. In terms of natural language technologies, the question generally is – is it better to use NLP approaches or invest in LLM technologies? LLM vs NLP is an important discussion to identify which technology is most ideal for your specific project requirements.
Explore the key differences between NLP and LLM in this comprehensive comparison. Learn how these technologies shape AI-driven applications, their core functionalities, and their impact on industries like chatbots, sentiment analysis, and content generation.
-
- Feb 2025
-
www.reddit.com
-
Good discussion and outline of research about the natural method of language acquisition versus the grammar-translation method.
-
-
www.catholicculture.org
-
Dreamt of learning Latin? Here’s how you’ll finally do it by [[Thomas V. Mirus]]
A non-specialist look at his own Latin language acquisition, with lots of resources around Hans Ørberg's Lingua Latina text.
-
Some might wonder why I recommend Lingua Latina instead of Fr. William Most’s Latin by the Natural Method series. Though Fr. Most was a friend of Catholic Culture and a brilliant theologian, after having used both books my opinion is that Most’s Latin style is significantly inferior to and less enjoyable than Ørberg’s. For example, Ørberg early on begins to acclimatize the student to the more flexible word order that makes Latin so different from English, exposure to which is essential for true reading fluency. Most’s Latin is, especially at the beginning, clunky and tedious in order to be didactic; the brilliance of Ørberg is that he manages to be didactic for the beginner while also being fluid and clever in his writing. Yet despite his greater didacticism, Fr. Most relies on English explanations of the Latin grammar, whereas Ørberg accomplishes his task entirely in Latin. Ørberg also has illustrations to teach the meaning of words without translation. Fr. Most does not include macrons to indicate vowel length, which is essential to learn correct pronunciation. He does include stress marks, which Ørberg does not, but the rules of stress are more easily learnt without stress marks than syllable length without macrons.
Thomas V. Mirus' comparison of Fr. William Most's Latin text with Hans Ørberg's.
-
Luke Ranieri’s video “Latin by the Ranieri-Dowling Method”.
-
- Jul 2024
-
whoosh.readthedocs.io
-
Whoosh provides methods for computing the “key terms” of a set of documents. For these methods, “key terms” basically means terms that are frequent in the given documents, but relatively infrequent in the indexed collection as a whole.
Very interesting method and way of looking at the signal: what makes a document exceptional is that something is common within itself and uncommon in the collection as a whole.
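A minimal sketch of how this looks in practice, assuming an existing Whoosh index in "indexdir" with a stored "path" field and an indexed "content" field (both names are illustrative):

```python
# Key terms of chosen documents: terms frequent in those documents but
# relatively rare in the indexed collection overall.
from whoosh.index import open_dir

ix = open_dir("indexdir")  # assumed existing index directory
with ix.searcher() as searcher:
    # Look up the internal document number (hypothetical "path" value)
    docnums = [searcher.document_number(path="/docs/a.txt")]
    for term, score in searcher.key_terms(docnums, "content", numterms=10):
        print(term, score)
```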
-
- May 2024
-
media.dltj.org
-
Google translate is generative AI
Google Translate as generative AI
-
- Feb 2024
-
www.cortical.io
-
for - semantic folding - semantic fingerprint - natural language processing - NLP - cortical.io - Numenta
-
- Feb 2023
-
arstechnica.com
-
An AI model that can learn and work with this kind of problem needs to handle order in a very flexible way. The old models—LSTMs and RNNs—had word order implicitly built into the models. Processing an input sequence of words meant feeding them into the model in order. A model knew what word went first because that’s the word it saw first. Transformers instead handled sequence order numerically, with every word assigned a number. This is called "positional encoding." So to the model, the sentence “I love AI; I wish AI loved me” looks something like (I 1) (love 2) (AI 3) (; 4) (I 5) (wish 6) (AI 7) (loved 8) (me 9).
Google’s “the transformer”
One breakthrough was positional encoding, versus having to handle the input in the order it was given. A second was using matrices rather than vectors. This research came out of Google's machine translation work.
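The article's (word, number) pairs are a simplification; the original Transformer paper uses sinusoidal position vectors added to the token embeddings. A minimal NumPy sketch of that variant (dimensions chosen arbitrarily for illustration):

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding from 'Attention Is All You Need'."""
    pos = np.arange(seq_len)[:, None]        # positions 0..seq_len-1
    i = np.arange(d_model)[None, :]          # embedding dimensions
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angle[:, 0::2])    # even dimensions: sine
    enc[:, 1::2] = np.cos(angle[:, 1::2])    # odd dimensions: cosine
    return enc

tokens = "I love AI ; I wish AI loved me".split()
pe = positional_encoding(len(tokens), d_model=16)
# Adding pe[position] to each token embedding makes "AI" at position 2
# distinguishable from "AI" at position 6, without feeding words in order.
```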
-
- Jan 2023
-
www.complexityexplorer.org
-
a common technique in natural language processing is to operationalize certain semantic concepts (e.g., "synonym") in terms of syntactic structure (two words that tend to occur nearby in a sentence are more likely to be synonyms, etc). This is what word2vec does.
Can I use some of these sorts of methods with respect to corpus linguistics over time to better identify calcified words or archaic phrases that stick with the language but are heavily limited to narrower(ing) contexts?
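A hedged gensim sketch of the word2vec idea on a toy corpus (invented sentences; real use needs a large corpus). For the historical question above, one could train a model like this per time slice and compare a word's nearest neighbors across slices:

```python
from gensim.models import Word2Vec

# Toy corpus: "doctor" and "physician" share contexts, so their vectors
# should end up close, operationalizing "synonym" via co-occurrence.
sentences = [
    ["the", "doctor", "treated", "the", "patient"],
    ["the", "physician", "treated", "the", "patient"],
    ["the", "doctor", "examined", "the", "patient"],
    ["the", "physician", "examined", "the", "patient"],
]
model = Word2Vec(sentences, vector_size=32, window=3, min_count=1,
                 epochs=200, seed=1)
print(model.wv.similarity("doctor", "physician"))  # relatively high
```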
-
-
genizalab.princeton.edu
-
Local file
-
Friedberg Judeo-Arabic Project, accessible at http://fjms.genizah.org. This project maintains a digital corpus of Judeo-Arabic texts that can be searched and analyzed.
The Friedberg Judeo-Arabic Project contains a large corpus of Judeo-Arabic text which can be manually searched to help improve translations of texts, but it might also be profitably mined using information-theoretic and corpus-linguistic methods to provide textual translations and suggestions at a grander scale.
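As one hedged example of such an information-theoretic method: pointwise mutual information (PMI) over bigrams, here with NLTK on a stand-in token list rather than the actual corpus, surfaces collocations that a translator might treat as fixed phrases:

```python
from nltk.collocations import BigramAssocMeasures, BigramCollocationFinder

# Stand-in tokens; in real use, tokenize the Judeo-Arabic corpus instead.
tokens = ("in the name of god the merciful the compassionate " * 3).split()
measures = BigramAssocMeasures()
finder = BigramCollocationFinder.from_words(tokens)
finder.apply_freq_filter(2)            # drop one-off pairs
print(finder.nbest(measures.pmi, 5))   # strongest collocations first
```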
-
- Dec 2022
-
inst-fs-iad-prod.inscloudgate.net
-
Emily M. Bender, Timnit Gebru, Angelina McMillan-Major, and Shmargaret Shmitchell. 2021. On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜. In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (FAccT '21). Association for Computing Machinery, New York, NY, USA, 610–623. https://doi.org/10.1145/3442188.3445922
-
- Nov 2022
-
www.researchgate.net
-
Robert Amsler is a retired computational lexicologist, computational linguist, and information scientist. His Ph.D. was from UT-Austin in 1980. His primary work was in the area of understanding how machine-readable dictionaries could be used to create a taxonomy of dictionary word senses (which served as the motivation for the creation of WordNet) and in understanding how a lexicon can be extracted from text corpora. He also invented a new technique in citation analysis that bears his name. His work is mentioned in Wikipedia articles on Machine-Readable dictionary, Computational lexicology, Bibliographic coupling, and Text mining. He currently lives in Vienna, VA and reads email at robert.amsler at utexas.edu. He is currently interested in chronological studies of vocabulary, esp. computer terms.
https://www.researchgate.net/profile/Robert-Amsler
Apparently follows my blog. :)
Makes me wonder how we might better process and semantically parse people's personal notes, particularly when they're atomic and cross-linked.
-
- Oct 2022
-
www.explainpaper.com
-
Another in a growing line of research tools for processing and making sense of research literature including Research Rabbit, Connected Papers, Semantic Scholar, etc.
Functionality includes the ability to highlight sections of research papers, with natural language processing used to explain what those sections mean. There's also a "chat" that lets you ask questions about the paper and attempts to return reasonable answers: an artificial-intelligence means of having an artificial "conversation with the text".
cc: @dwhly @remikalir @jeremydean
-
- Aug 2022
- Dec 2021
-
cacm.acm.org
-
Catala, a programming language developed by Protzenko's graduate student Denis Merigoux, who is working at the National Institute for Research in Digital Science and Technology (INRIA) in Paris, France. It is not often lawyers and programmers find themselves working together, but Catala was designed to capture and execute legal algorithms and to be understood by lawyers and programmers alike in a language "that lets you follow the very specific legal train of thought," Protzenko says.
A domain-specific language for encoding legal interpretations.
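No Catala code appears in the article; as a hedged Python sketch of the underlying idea only (a statutory rule captured as an executable, testable function; names and thresholds are invented):

```python
from dataclasses import dataclass

@dataclass
class Household:
    income: float
    dependents: int

def benefit(h: Household) -> float:
    """Hypothetical rule: a per-dependent base amount, phased out linearly
    with income and cut off entirely above a statutory ceiling."""
    CEILING = 50_000.0
    BASE = 1_000.0
    if h.income > CEILING:
        return 0.0
    return h.dependents * BASE * (1 - h.income / CEILING)

# The point of a DSL like Catala: each clause stays close to the legal
# text and is directly executable, so lawyers can audit the logic.
assert benefit(Household(income=25_000, dependents=2)) == 1_000.0
```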
-
- Nov 2021
-
www.nature.com
- Jun 2021
-
-
Jung, Y., Lee, Y. K., & Hahn, S. (2021). Web-scraping the Expression of Loneliness during COVID-19. PsyArXiv. https://doi.org/10.31234/osf.io/59gwk
-
-
en.wikipedia.org
-
ISO 639-3 extends the ISO 639-2 alpha-3 codes with an aim to cover all known natural languages.
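A small sketch of working with these codes in practice, using the third-party pycountry package (an assumption; it wraps the ISO 639-3 tables):

```python
import pycountry

# Resolve ISO 639-3 alpha-3 codes to language records
print(pycountry.languages.get(alpha_3="deu").name)  # German
print(pycountry.languages.get(alpha_3="yid").name)  # Yiddish
```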
-
-
loc.gov
-
This doesn't seem entirely trustworthy or useful.
The native name seems incorrect or missing for some languages, like German and Hebrew, compared to https://gist.github.com/piraveen/fafd0d984b2236e809d03a0e306c8a4d
-
-
en.wikipedia.org
-
Similarities in dialects
-
- Mar 2021
-
en.wikipedia.org
-
Originally he had used the terms usage scenarios and usage case – the latter a direct translation of his Swedish term användningsfall – but found that neither of these terms sounded natural in English, and eventually he settled on use case.
-
-
psyarxiv.com
-
Lindow, Mike, David DeFranza, Arul Mishra, and Himanshu Mishra. ‘Scared into Action: How Partisanship and Fear Are Associated with Reactions to Public Health Directives’. PsyArXiv, 12 January 2021. https://doi.org/10.31234/osf.io/8me7q.
-
-
arxiv.org
-
Kozlowski, Diego, Jennifer Dusdal, Jun Pang, and Andreas Zilian. ‘Semantic and Relational Spaces in Science of Science: Deep Learning Models for Article Vectorisation’. ArXiv:2011.02887 [Physics], 5 November 2020. http://arxiv.org/abs/2011.02887.
-
- Sep 2020
-
markojs.com
- Aug 2020
-
onlinelibrary.wiley.com
-
Bhatia, S., Walasek, L., Slovic, P., & Kunreuther, H. (2020). The More Who Die, the Less We Care: Evidence from Natural Language Analysis of Online News Articles and Social Media Posts. Risk Analysis, risa.13582. https://doi.org/10.1111/risa.13582
-
-
psyarxiv.com
-
Hull, T., Levine, J., Bantilan, N., Desai, A., & Majumder, M. S. (2020, August 13). Digital phenotyping of complex psychological responses to the COVID-19 pandemic. https://doi.org/10.31234/osf.io/qtrpf
-
- Jul 2020
-
-
-
osf.io
-
Rosati, G., Domenech, L., Chazarreta, A., & Maguire, T. (2020). Capturing and analyzing social representations. A first application of Natural Language Processing techniques to reader’s comments in COVID-19 news. Argentina, 2020 [Preprint]. SocArXiv. https://doi.org/10.31235/osf.io/3pcdu
-
- May 2020
-
arxiv.org
-
Katz, D. M., Coupette, C., Beckedorf, J., & Hartung, D. (2020). Complex Societies and the Growth of the Law. ArXiv:2005.07646 [Physics]. http://arxiv.org/abs/2005.07646
-
-
-
Kennedy, B., Atari, M., Davani, A. M., Hoover, J., Omrani, A., Graham, J., & Dehghani, M. (2020, May 7). Moral Concerns are Differentially Observable in Language. https://doi.org/10.31234/osf.io/uqmty
-
- Apr 2020
-
en.wikipedia.org
-
english.stackexchange.com
-
The question of whether or not it is "proper" is meaningless, unless you define the particular arbiter of manners who you want to defer to. There is no authority for the English language.
-
-
kokociel.blogspot.com
-
Just as with wine-tasting, having a bigger vocabulary for colours allows specific colours to be perceived more readily and remembered more easily, even if not done consciously.
-
- Mar 2020
-
developer.wordpress.org
-
the singular form of the string (note that it can be used for numbers other than one in some languages, so '%s item' should be used instead of 'One item')
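The same rule from WordPress's translation docs carries over to Python's gettext, shown here as a hedged sketch: ngettext() selects the plural form by count, and the "singular" msgid can be reused for other counts in some languages, so both forms keep the placeholder:

```python
from gettext import ngettext

def items_label(n: int) -> str:
    # Never hardcode "One item": the singular form may serve counts
    # other than 1 in some languages' plural rules.
    return ngettext("%d item", "%d items", n) % n

print(items_label(1))  # 1 item
print(items_label(5))  # 5 items
```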
-
-
thepugautomatic.com
-
This will of course depend on your perspective, but: beware Finnish and other highly inflected languages. As a grammar nerd, I actually love this stuff. But judging by my colleagues, you won’t.
-