transformers

Get started

  • Quick tour
  • Installation
  • Philosophy
  • Glossary

Using 🤗 Transformers

  • Summary of the tasks
  • Summary of the models
  • Preprocessing data
  • Training and fine-tuning
  • Model sharing and uploading
  • Summary of the tokenizers
  • Multi-lingual models

Advanced guides

  • Pretrained models
  • Examples
  • Fine-tuning with custom datasets
  • 🤗 Transformers Notebooks
  • Converting Tensorflow Checkpoints
  • Migrating from previous packages
  • How to contribute to transformers?
  • Testing
  • Exporting transformers models

Research

  • BERTology
  • Perplexity of fixed-length models
  • Benchmarks

Main Classes

  • Callbacks
  • Configuration
  • Logging
  • Models
  • Optimization
  • Model outputs
  • Pipelines
  • Processors
  • Tokenizer
  • Trainer

Models

  • ALBERT
  • Auto Classes
  • BART
  • BARThez
  • BERT
  • BertGeneration
  • Blenderbot
  • CamemBERT
  • CTRL
  • DeBERTa
  • DialoGPT
  • DistilBERT
  • DPR
  • ELECTRA
  • Encoder Decoder Models
  • FlauBERT
  • FSMT
  • Funnel Transformer
  • LayoutLM
  • Longformer
  • LXMERT
  • MarianMT
  • MBart
  • MobileBERT
  • MPNet
  • MT5
  • OpenAI GPT
  • OpenAI GPT2
  • Pegasus
  • ProphetNet
  • RAG
  • Reformer
  • RetriBERT
  • RoBERTa
  • SqueezeBERT
  • T5
  • TAPAS
  • Transformer XL
  • XLM
  • XLM-ProphetNet
  • XLM-RoBERTa
  • XLNet

Internal Helpers

  • Custom Layers and Utilities
  • Utilities for pipelines
  • Utilities for Tokenizers
  • Utilities for Trainer
  • Utilities for Generation

