Learn Generative AI with PyTorch (Manning Publications, 2024)
Updated Jul 9, 2024 - Jupyter Notebook
A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites
A pure-C multimodal 3D hybrid GAN using cross-attention, attention, and convolution
An enterprise-grade, production-ready multi-agent orchestration framework. Join our community: https://discord.com/servers/agora-999382051935506503
TensorFlow implementation of a 3D-CNN U-Net with grid attention and deep supervision (DSV) for pancreas segmentation, trained on the CT-82 dataset.
Contrastive-LSH Embedding and Tokenization Technique for Multivariate Time Series Classification
Implementation of the Vision Transformer, a simple way to achieve SOTA in vision classification with only a single Transformer encoder, in PyTorch
The purpose of this project is to understand how Transformers work and to build a Vision Transformer.
FlashAttention (Metal Port)
This study introduces RMFNet, a sophisticated attention-based model designed to enhance the efficacy of early-stage brain tumor detection through advanced feature extraction methodologies.
Attention heads in the Transformer architecture serve a variety of functions; this carefully compiled list summarizes them.
LSTM-ARIMA with Attention and multiplicative decomposition for sophisticated stock forecasting.
Implementation of 💍 Ring Attention, from Liu et al. at Berkeley AI, in PyTorch
Training-free, post-training, efficient sub-quadratic-complexity attention, implemented with OpenAI Triton.
An unofficial PyTorch implementation of the Infini-Attention mechanism introduced in the paper "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention". Note that the official code for the paper has not yet been released. If you encounter issues, open a PR with an explanation of the changes and the reasoning behind them.
STAD-GCN: Spatial-Temporal Attention-based Dynamic Graph Convolutional Network for Retail Market Price Prediction, PyTorch version (ESWA 2024)
A decoder-only Transformer model for text generation.
A simple but complete full-attention transformer with a set of promising experimental features from various papers
A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction
An attention-based approach to converting Indian Sign Language to text using simulated hand-gesture data
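The repositories above all build on the same core primitive, scaled dot-product attention: softmax(QK^T / sqrt(d)) V. A minimal NumPy sketch for reference (the function name is illustrative and not taken from any repo listed above):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Minimal scaled dot-product attention: softmax(QK^T / sqrt(d)) V."""
    d = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d)   # (..., L_q, L_k)
    # numerically stable softmax over the key axis
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ v                              # (..., L_q, d_v)

# toy example: 3 queries/keys/values of dimension 4
rng = np.random.default_rng(0)
q = rng.normal(size=(3, 4))
k = rng.normal(size=(3, 4))
v = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (3, 4)
```

Each output row is a convex combination of the value rows; the variants above (grid, cross-, ring, Infini-attention) change how Q, K, and V are formed or how this computation is tiled, not the formula itself.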