🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
A Python-based toolkit for comparing transformers.
A framework for few-shot evaluation of language models.
An offline CPU-first memory-scarce chat application to perform RAG on your corpus of data. Powered by OpenChat and CTranslate2.
An attention-based approach to converting Indian Sign Language to text using simulated hand-gesture data
A high-throughput and memory-efficient inference and serving engine for LLMs
A toolbox of vision models and algorithms based on MindSpore
[CVPR 2024] Official Implementation of Collaborating Foundation models for Domain Generalized Semantic Segmentation
Large Language Model Text Generation Inference
Port of OpenAI's Whisper model in C/C++
📰 Must-read papers and blogs on LLM based Long Context Modeling 🔥
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
Transformer Based Model Paper Review and PyTorch Code
SwissArmyTransformer is a flexible and powerful library to develop your own Transformer variants.
Thyroid Ultrasound Image Classification for Disease Diagnosis / An ultrasound image classification model for detecting thyroid cancer
ML Model designed to learn compositional structure of LEGO assemblies
A high-performance inference system for large language models, designed for production environments.
This is a JAX/Flax-based transformer language model trained on a Japanese dataset. It is based on the official Flax example code (lm1b).