_______               __                   _______
       |   |   |.---.-..----.|  |--..-----..----. |    |  |.-----..--.--.--..-----.
       |       ||  _  ||  __||    < |  -__||   _| |       ||  -__||  |  |  ||__ --|
       |___|___||___._||____||__|__||_____||__|   |__|____||_____||________||_____|
                                                              on Gopher (unofficial)
   URI Visit Hacker News on the Web
       
       
       COMMENT PAGE FOR:
   URI   Understanding Transformers via N-gram Statistics
       
       
        maz1b wrote 3 min ago:
        How does this have 74 points and only one comment?
        
         on topic: couldn't one, in theory, re-publish this kind of paper for
         different kinds of LLMs, since the textual corpus LLMs are built on
         is ultimately, at some level, human effort and human input, whether
         it be writing or typing?
       
        justanotherjoe wrote 1 hour 8 min ago:
         Sounds regressive and feeds into the weird unintellectual narrative
         that LLMs are just like n-gram models (lol, lmao even)
        
         The author submitted like 10 papers this May alone. Is that weird?
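         [Editor's note: for readers unfamiliar with the comparison being
         dismissed above, an n-gram model predicts the next token purely from
         counts of fixed-length contexts seen in the corpus, with no learned
         representations. A minimal bigram sketch, illustrative only and not
         from the paper:]

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count next-token frequencies for each one-token context."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def predict(counts, context):
    """Greedy prediction: the most frequent continuation of the context."""
    return counts[context].most_common(1)[0][0]

tokens = "the cat sat on the mat and the cat slept".split()
model = train_bigram(tokens)
print(predict(model, "the"))  # "cat" follows "the" twice, "mat" once -> "cat"
```

         [The paper's question is how far such purely statistical,
         context-matching rules go toward describing what a trained
         transformer actually predicts.]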
       
       
   DIR <- back to front page