Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context (Q86226450)

From Wikidata
scientific article published on 2 June 2019
Language: English
Label: Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
Description: scientific article published on 2 June 2019

Statements

    title: Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context (English)
    author: Zihang Dai
    author: Zhilin Yang
    author: Yiming Yang
    publication date: 2 June 2019
    main subject: cs.LG
    main subject: cs.CL
    main subject: stat.ML
