2 Matching Annotations
  1. Oct 2023
  2. typeshare.co
    Under the Hood: How to Use ChatGPT's Attention Mechanism for Better Prompts
    1. chrisaldrich 22 Oct 2023
      in Public

      https://typeshare.co/go-go-golems/posts/under-the-hood-how-to-use-chatgpts-attention-mechanism-for-better-prompts


    Tags

    • artificial intelligence for writing
    • auto-regression
    • ChatGPT
    • read
    • attention mechanism
    • prompts
    • transformers

    Annotators

    • chrisaldrich

    URL

    typeshare.co/go-go-golems/posts/under-the-hood-how-to-use-chatgpts-attention-mechanism-for-better-prompts
  3. Nov 2022
  4. www.semanticscholar.org
    1706.03762.pdf
    1. mark.crowley 22 Nov 2022
      in Public
      we propose the Transformer, a model architecture eschewing recurrence and instead relying entirely on an attention mechanism to draw global dependencies between input and output. The Transformer allows for significantly more parallelization …

      They use the attention mechanism to determine global dependencies between input and output instead of recurrent links to past states. This is the essence of their new idea.
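      The idea in the note can be sketched in a few lines of NumPy: scaled dot-product attention lets every position weight every other position in a single matrix product, with no recurrent pass over past states. This is a minimal illustration, not the paper's full multi-head implementation; the function name and toy shapes are mine.

      ```python
      import numpy as np

      def scaled_dot_product_attention(Q, K, V):
          """Minimal scaled dot-product attention (after Vaswani et al., 2017).

          Each output row is a weighted sum over ALL value rows, so every
          position can attend to every other position in one step -- the
          "global dependencies" the annotation refers to -- rather than
          reaching the past only through recurrent links.
          """
          d_k = Q.shape[-1]
          scores = Q @ K.T / np.sqrt(d_k)  # (seq_q, seq_k) similarities
          # Numerically stable softmax over the key axis.
          weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
          weights = weights / weights.sum(axis=-1, keepdims=True)
          return weights @ V  # convex combination of value vectors

      # Toy self-attention: 3 positions, 4-dimensional embeddings.
      rng = np.random.default_rng(0)
      x = rng.normal(size=(3, 4))
      out = scaled_dot_product_attention(x, x, x)
      print(out.shape)  # (3, 4)
      ```

      Note how position 0 can draw on position 2 directly through `weights`, whereas an RNN would have to propagate that information step by step.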


    Tags

    • attention-mechanism
    • transformers

    Annotators

    • mark.crowley

    URL

    semanticscholar.org/reader/204e3073870fae3d05bcbc2f6a8e263d9b72e776