52 Matching Annotations
  1. May 2025
    1. In the classroom you want to focus, we were talking earlier about what you tell students when they first show up, right? You want to focus on meaning, not on language; focus on what's happening, not on words and phrases and pronunciation.

      for - natural language acquisition - teaching - focus on meaning, not words

    2. Science tells us that kids learn best from zero, from birth to five years old; they're the fastest, they're the best at learning. Model them, then just do what they do. You can't get better than that.

      for - stats - natural language acquisition - 1 to 2 year old is age of fastest and best learning

      comment - ALG philosophy - replicate the experiences that 1 to 2 year olds have

    3. Show me any other program that tries to teach you language like a one- to two-year-old; that's what we're doing. It doesn't compare to teaching a language to a five-year-old; we're not there yet.

      for - natural language acquisition - age - 2 year old is right age to aim to learn at

      comment - age two is when an infant first learns to hear and speak a spoken language - reading and writing do not happen until about 5 years of age - When we are learning a new second language, it is therefore appropriate to aim for the same goal as a native 2 year old language user

    4. A wrong guess is a hundred times better than a right answer. The reason is that a right answer closes your mind; with a wrong guess you're still open, and that's the vital characteristic.

      for - quote - natural language acquisition - wrong guess - right answer - adjacency - natural language acquisition - open mind

    5. Patience and tolerance for ambiguity

      for - natural language acquisition skills - patience - tolerance for ambiguity - constructing good guesses to meaning

    6. That was the biggest challenge I think we had, and still have, within ALG: teachers think they've got to explain the language, and they're shortcutting the process, they're short-circuiting the process, and they're cheating the student out of an otherwise good experience.

      for - adjacency - Socratic method - ALG - natural language acquisition - explanation - infants learning native language

      adjacency - between - Socratic method - natural language acquisition - ALG - explanation - adjacency relationship - When the teacher explains the meaning to the student, it actually robs the student of the active learning experience of guessing the right meaning - Infants learning their native language for the first time are necessarily in the "deep end" and face discomfort - They (we) are constantly forced to guess and actively construct meaning out of the universe of symbols we are exposed to in a multitude of contexts

    7. That discomfort is a tough one; that's the first part. You've got to face that, and if you're not facing it, then you've learned to walk with crutches.

      for - natural language acquisition - important role of discomfort

    8. If I cannot adjust to guessing right about meaning, I will never learn very well in this way at all.

      for - key insight - natural language acquisition - guessing

    9. for - natural language acquisition - Automatic Language Growth - ALG - youtube - interview - David Long - Automatic Language Growth - from - youtube - The Language School that Teaches Adults like Babies - https://hyp.is/Ls_IbCpbEfCEqEfjBlJ8hw/www.youtube.com/watch?v=984rkMbvp-w

      summary - The key takeaway is that even as adults, we retain our innate language learning skill, which requires simply treating a new language as a novel experience that we can apprehend naturally by experiencing it the way we did when we were exposed to our first, native language - We didn't know what a "language" was theoretically when we were infants; we simply fell into the experience and played with it while our primary caretakers guided us - We didn't know grammar and rules of language, we just learned innately

    10. Some people are so trained in learning on purpose that they have a hard time relaxing with anything that's unclear.

      for - intention vs relaxing - natural language acquisition

    11. I can never get past the idea of study, because what we're doing is not study at all.

      for - quote - not study at all - David Long - natural language immersion

    1. for - natural language acquisition - youtube - The Language School that Teaches Adults like Babies - to - book - From the Outside In - linguist - J. Marvin Brown - https://hyp.is/PjtjBipbEfCr4ieLB5y1Ew/files.eric.ed.gov/fulltext/ED501257.pdf - quote - When I speak in Thai, I think in Thai - J. Marvin Brown

      summary - This video summarizes the remarkable life of linguist J. Marvin Brown, who spent a lifetime trying to understand how to learn a second language and use it the way a natural language user does - After a lifetime of research and trying out various teaching and learning methods, he finally realized that adults all have the ability to learn a new language the same way any infant does, naturally, through listening and watching - The key was to not bring in the conscious thinking of an adult and to immerse oneself in the experience - This seems like a highly relevant clue to language creation and to linguistic BEing journeys - to - youtube - Interview with David Long - Automatic Language Growth - https://hyp.is/GRPUHipvEfCVEaMaLSU-BA/www.youtube.com/watch?v=5yhIM2Vt-Cc

    1. perceived by oneself “in here.” In this sense, the world consists of objects out there in space (the container that holds them) before me as the perceiving subject.

      for - adjacency - Indyweb dev - natural language - timebinding - parallel vs serial processing - comparison - spoken vs written language - What's also interesting is that spoken language is timebinding and sequential, and our written language descended from that - In spite of written language existing in 2D and 3D space, it inherited sequential flow, even though it does not have to - In this sense, the legacy spoken language system constrains written language to be - serial - sequential and - timebound instead of - parallel - Read any written text and you will observe that the pattern is sequential - We constrain our syntax to "flow" sequentially in 2D space, even though there is absolutely no other reason to constrain it to do so - This also reveals another implicit rule about language: that it assumes we can only focus our attention on one aspect of reality at a time

  2. Mar 2025
    1. AI adoption is rapidly increasing across all industries and for many use cases. For natural language technologies, the question is generally whether it is better to use NLP approaches or to invest in LLM technologies. LLM vs. NLP is an important comparison for identifying which technology best fits your specific project requirements.

      Explore the key differences between NLP and LLM in this comparison. Learn how these technologies shape AI-driven applications, their core functionalities, and their impact on applications like chatbots, sentiment analysis, and content generation.

  3. Feb 2025
    1. Dreamt of learning Latin? Here’s how you’ll finally do it by [[Thomas V. Mirus]]

      A non-specialist look at his Latin language acquisition with lots of resources around Hans Ørberg's Lingua Latina text.

    2. Some might wonder why I recommend Lingua Latina instead of Fr. William Most’s Latin by the Natural Method series. Though Fr. Most was a friend of Catholic Culture and a brilliant theologian, after having used both books my opinion is that Most’s Latin style is significantly inferior to and less enjoyable than Ørberg’s. For example, Ørberg early on begins to acclimatize the student to the more flexible word order that makes Latin so different from English, exposure to which is essential for true reading fluency. Most’s Latin is, especially at the beginning, clunky and tedious in order to be didactic; the brilliance of Ørberg is that he manages to be didactic for the beginner while also being fluid and clever in his writing. Yet despite his greater didacticism, Fr. Most relies on English explanations of the Latin grammar, whereas Ørberg accomplishes his task entirely in Latin. Ørberg also has illustrations to teach the meaning of words without translation. Fr. Most does not include macrons to indicate vowel length, which is essential to learn correct pronunciation. He does include stress marks, which Ørberg does not, but the rules of stress are more easily learnt without stress marks than syllable length without macrons.

      Thomas V. Mirus' comparison of Fr. William Most's Latin text with Hans Ørberg's.

    3. Luke Ranieri’s video “Latin by the Ranieri-Dowling Method”.
  4. Jul 2024
    1. Whoosh provides methods for computing the “key terms” of a set of documents. For these methods, “key terms” basically means terms that are frequent in the given documents, but relatively infrequent in the indexed collection as a whole.

      Very interesting method and way of looking at the signal: what makes a document exceptional is that something is common within it and uncommon without.
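
      To make that idea concrete, here is a minimal Python sketch of the underlying weighting (not Whoosh's actual API; the function name and toy figures are illustrative assumptions): score each term by how frequent it is in the document, discounted by how many documents in the whole index contain it.

      ```python
      import math
      from collections import Counter

      def key_terms(doc_tokens, doc_freq, n_docs, k=3):
          """Rank terms that are frequent in this document but rare in the
          indexed collection as a whole (a TF-IDF-style weighting)."""
          tf = Counter(doc_tokens)
          scores = {
              term: count * math.log(n_docs / (1 + doc_freq.get(term, 0)))
              for term, count in tf.items()
          }
          return sorted(scores, key=scores.get, reverse=True)[:k]

      # Toy document frequencies for a hypothetical 1000-document index.
      df = {"the": 1000, "search": 40, "whoosh": 3}
      print(key_terms(["the", "whoosh", "whoosh", "search", "the"], df, 1000))
      # "whoosh" ranks first: common within the document, uncommon without.
      ```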

  5. May 2024
  6. Feb 2024
  7. Feb 2023
    1. An AI model that can learn and work with this kind of problem needs to handle order in a very flexible way. The old models—LSTMs and RNNs—had word order implicitly built into the models. Processing an input sequence of words meant feeding them into the model in order. A model knew what word went first because that’s the word it saw first. Transformers instead handled sequence order numerically, with every word assigned a number. This is called "positional encoding." So to the model, the sentence “I love AI; I wish AI loved me” looks something like (I 1) (love 2) (AI 3) (; 4) (I 5) (wish 6) (AI 7) (loved 8) (me 9).

      Google’s “the transformer”

      One breakthrough was positional encoding versus having to handle the input in the order it was given. Second, using a matrix rather than vectors. This research came from Google Translate.
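
      The (word, number) pairs in the highlight are a simplification: in the original transformer paper the position number is expanded into a vector of sines and cosines that is added to the word embedding. A minimal NumPy sketch of that sinusoidal scheme (a sketch of the published technique, not any particular library's implementation):

      ```python
      import numpy as np

      def positional_encoding(seq_len, d_model):
          """Sinusoidal positional encoding ('Attention Is All You Need'):
          each position gets a unique vector added to the word embedding,
          so order lives in the input values, not the processing order."""
          pos = np.arange(seq_len)[:, None]        # (seq_len, 1) positions
          i = np.arange(d_model)[None, :]          # (1, d_model) dimensions
          angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
          enc = np.zeros((seq_len, d_model))
          enc[:, 0::2] = np.sin(angles[:, 0::2])   # even dimensions: sine
          enc[:, 1::2] = np.cos(angles[:, 1::2])   # odd dimensions: cosine
          return enc

      tokens = "I love AI ; I wish AI loved me".split()
      enc = positional_encoding(len(tokens), d_model=8)
      # The two "AI" tokens (positions 2 and 6) get different position
      # vectors, so the model can tell them apart despite identical words.
      print(enc.shape)  # (9, 8)
      ```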

  8. Jan 2023
    1. a common technique in natural language processing is to operationalize certain semantic concepts (e.g., "synonym") in terms of syntactic structure (two words that tend to occur nearby in a sentence are more likely to be synonyms, etc). This is what word2vec does.

      Can I use some of these sorts of methods with respect to corpus linguistics over time to better identify calcified words or archaic phrases that stick with the language, but are heavily limited to narrower(ing) contexts?
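
      A minimal gensim sketch of the syntactic-context idea (the toy corpus and parameters are illustrative): words that occur in similar contexts, like "doctor" and "physician" below, land near each other in vector space. For the diachronic question above, one common approach is to train a separate model per time slice and compare a word's nearest neighbors across slices.

      ```python
      from gensim.models import Word2Vec

      # Toy corpus; a real run would use a large collection of sentences.
      sentences = [
          ["the", "physician", "treated", "the", "patient"],
          ["the", "doctor", "treated", "the", "patient"],
          ["the", "doctor", "examined", "the", "patient"],
          ["the", "physician", "examined", "the", "patient"],
      ]
      model = Word2Vec(sentences, vector_size=50, window=2,
                       min_count=1, epochs=200)

      # Words appearing in similar syntactic contexts end up close in
      # vector space; that proximity operationalizes "synonym".
      print(model.wv.most_similar("doctor", topn=3))
      ```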

    1. Friedberg Judeo-Arabic Project, accessible at http://fjms.genizah.org. This project maintains a digital corpus of Judeo-Arabic texts that can be searched and analyzed.

      The Friedberg Judeo-Arabic Project contains a large corpus of Judeo-Arabic text which can be manually searched to help improve translations of texts, but it might also be profitably mined using information theoretic and corpus linguistic methods to provide larger group textual translations and suggestions at a grander scale.

  9. Dec 2022
    1. Emily M. Bender, Timnit Gebru, Angelina McMillan-Major, and Shmargaret Shmitchell. 2021. On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜. In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (FAccT '21). Association for Computing Machinery, New York, NY, USA, 610–623. https://doi.org/10.1145/3442188.3445922

  10. Nov 2022
    1. Robert Amsler is a retired computational lexicologist, computational linguist, and information scientist. His Ph.D. was from UT-Austin in 1980. His primary work was in the area of understanding how machine-readable dictionaries could be used to create a taxonomy of dictionary word senses (which served as the motivation for the creation of WordNet) and in understanding how a lexicon can be extracted from text corpora. He also invented a new technique in citation analysis that bears his name. His work is mentioned in Wikipedia articles on Machine-Readable dictionary, Computational lexicology, Bibliographic coupling, and Text mining. He currently lives in Vienna, VA and reads email at robert.amsler at utexas. edu. He is currently interested in chronological studies of vocabulary, esp. computer terms.

      https://www.researchgate.net/profile/Robert-Amsler

      Apparently he follows my blog. :)

      Makes me wonder how we might better process and semantically parse peoples' personal notes, particularly when they're atomic and cross-linked?

  11. Oct 2022
    1. https://www.explainpaper.com/

      Another in a growing line of research tools for processing and making sense of research literature including Research Rabbit, Connected Papers, Semantic Scholar, etc.

      Functionality includes the ability to highlight sections of research papers and have natural language processing explain what those sections mean. There's also a "chat" that allows you to ask questions about the paper and attempts to return reasonable answers, an artificial-intelligence means of having a "conversation with the text".

      cc: @dwhly @remikalir @jeremydean

  12. Aug 2022
  13. Dec 2021
    1. Catala, a programming language developed by Protzenko's graduate student Denis Merigoux, who is working at the National Institute for Research in Digital Science and Technology (INRIA) in Paris, France. It is not often lawyers and programmers find themselves working together, but Catala was designed to capture and execute legal algorithms and to be understood by lawyers and programmers alike in a language "that lets you follow the very specific legal train of thought," Protzenko says.

      A domain-specific language for encoding legal interpretations.
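
      Catala itself uses its own literate syntax, but the core idea of encoding a legal rule as an executable algorithm that both lawyers and programmers can check can be illustrated with a minimal Python sketch; the rule and figures below are entirely hypothetical, not actual law or Catala code:

      ```python
      from dataclasses import dataclass

      @dataclass
      class Household:
          income: float
          dependent_children: int

      # Hypothetical rule and figures, purely for illustration:
      # the income threshold rises with each dependent child.
      BASE_THRESHOLD = 30_000.0
      PER_CHILD_INCREASE = 5_000.0

      def qualifies_for_benefit(h: Household) -> bool:
          """Executable form of a rule a lawyer could read and verify."""
          threshold = BASE_THRESHOLD + PER_CHILD_INCREASE * h.dependent_children
          return h.income < threshold

      print(qualifies_for_benefit(Household(income=38_000,
                                            dependent_children=2)))  # True
      ```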

  14. Nov 2021
  15. Jun 2021
  16. Mar 2021
    1. Originally he had used the terms usage scenarios and usage case – the latter a direct translation of his Swedish term användningsfall – but found that neither of these terms sounded natural in English, and eventually he settled on use case.
  17. Sep 2020
  18. Aug 2020
  19. Jul 2020
  20. May 2020
  21. Apr 2020
    1. Just as with wine-tasting, having a bigger vocabulary for colours allows specific colours to be perceived more readily and remembered more easily, even if not done consciously.
  22. Mar 2020
    1. This will of course depend on your perspective, but: beware Finnish and other highly inflected languages. As a grammar nerd, I actually love this stuff. But judging by my colleagues, you won’t.