attention-mechanism · GitHub Topics · GitHub

attention-mechanism

Here are 1,520 public repositories matching this topic...

RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.

  • Updated Jul 4, 2024
  • Python
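
The RWKV description above claims RNN-style constant-cost inference with GPT-style parallelizable training. As a rough illustration of that family of ideas only, here is a toy linear-attention-style recurrence with exponential decay in PyTorch; it is not RWKV's actual WKV kernel, and the function and parameter names (`toy_decayed_attention`, `decay`, `k`, `v`) are made up for this sketch.

```python
# Toy recurrence in the spirit of "linear attention with decay": the state carries
# a decayed, key-weighted sum of past values, so each new token costs O(1) like an
# RNN. This is NOT RWKV's actual kernel; it is an illustrative sketch only.
import torch

def toy_decayed_attention(k: torch.Tensor, v: torch.Tensor, decay: float = 0.9):
    """k, v: (seq_len, dim). Returns outputs of shape (seq_len, dim)."""
    num = torch.zeros(k.shape[1])      # running weighted sum of values
    den = torch.zeros(k.shape[1])      # running normalizer
    outputs = []
    for t in range(k.shape[0]):        # sequential (RNN-like) evaluation
        w = torch.exp(k[t])            # positive weight derived from the key
        num = decay * num + w * v[t]
        den = decay * den + w
        outputs.append(num / (den + 1e-8))
    return torch.stack(outputs)

out = toy_decayed_attention(torch.randn(16, 8), torch.randn(16, 8))
print(out.shape)  # torch.Size([16, 8])
```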
awesome-graph-classification
pytorch-GAT

My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!

  • Updated Nov 17, 2022
  • Jupyter Notebook
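
For context on what this repository implements, here is a minimal sketch of the GAT attention coefficients from the original paper (Veličković et al.): e_ij = LeakyReLU(aᵀ[W h_i ‖ W h_j]), softmax-normalized over each node's neighbours. It is written independently of pytorch-GAT; the single-head layout, shapes, and names are assumptions of this sketch, not the repository's API.

```python
# Single-head GAT attention: score every edge, mask non-edges, normalize per node,
# and aggregate the projected neighbour features.
import torch
import torch.nn.functional as F

def gat_attention(h, W, a, adj):
    """h: (N, F_in) node features, W: (F_in, F_out), a: (2*F_out,), adj: (N, N) 0/1 mask
    that should include self-loops."""
    z = h @ W                                   # (N, F_out) projected features
    f_out = z.shape[1]
    src = z @ a[:f_out]                         # (N,) contribution of node i
    dst = z @ a[f_out:]                         # (N,) contribution of node j
    e = F.leaky_relu(src.unsqueeze(1) + dst.unsqueeze(0), negative_slope=0.2)  # (N, N)
    e = e.masked_fill(adj == 0, float("-inf"))  # only attend along graph edges
    alpha = torch.softmax(e, dim=1)             # normalize over each node's neighbours
    return alpha @ z                            # (N, F_out) attention-weighted features

N, F_in, F_out = 5, 4, 8
h = torch.randn(N, F_in)
W = torch.randn(F_in, F_out)
a = torch.randn(2 * F_out)
adj = torch.eye(N)  # self-loops only, for a trivially valid mask
print(gat_attention(h, W, a, adj).shape)  # torch.Size([5, 8])
```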

A chatbot for the finance and judicial domains (with some casual-chat capability). Its main modules include information extraction, NLU, NLG, and a knowledge graph, and the front-end display is integrated with Django. RESTful interfaces for the nlp and kg modules have already been packaged.

  • Updated Jun 13, 2021
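
As a rough idea of what "packaged RESTful interfaces" for NLU and knowledge-graph modules might look like, here is a minimal sketch. The repository integrates with Django; Flask is used here only to keep the example short, and every route, payload field, and helper (`run_nlu`, `query_kg`) is hypothetical rather than taken from the project.

```python
# Minimal sketch of REST-style endpoints wrapping an NLU module and a knowledge-graph
# lookup. All names and payload shapes are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_nlu(text: str) -> dict:
    # Placeholder for a real NLU module (intent detection, slot filling, ...).
    return {"text": text, "intent": "unknown", "slots": {}}

def query_kg(entity: str) -> dict:
    # Placeholder for a knowledge-graph lookup.
    return {"entity": entity, "relations": []}

@app.route("/nlu", methods=["POST"])
def nlu_endpoint():
    text = request.get_json(force=True).get("text", "")
    return jsonify(run_nlu(text))

@app.route("/kg/<entity>", methods=["GET"])
def kg_endpoint(entity: str):
    return jsonify(query_kg(entity))

if __name__ == "__main__":
    app.run(port=5000)
```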
