transformers

Get started

  • Quick tour
  • Installation
  • Philosophy
  • Glossary

Using 🤗 Transformers

  • Summary of the tasks
  • Summary of the models
  • Preprocessing data
  • Training and fine-tuning
  • Model sharing and uploading
  • Tokenizer summary
  • Multi-lingual models

Advanced guides

  • Pretrained models
  • Examples
  • 🤗 Transformers Notebooks
  • Converting Tensorflow Checkpoints
  • Migrating from previous packages
  • TorchScript
  • How to contribute to transformers?

Research

  • BERTology
  • Benchmarks

Package Reference

  • Configuration
  • Models
  • Tokenizer
  • Pipelines
  • Optimization
  • Processors
  • Trainer
  • AutoModels
  • Encoder Decoder Models
  • BERT
  • OpenAI GPT
  • Transformer XL
  • OpenAI GPT2
  • XLM
  • XLNet
  • RoBERTa
  • DistilBERT
  • CTRL
  • CamemBERT
  • ALBERT
  • XLM-RoBERTa
  • FlauBERT
  • Bart
  • T5
  • ELECTRA
  • DialoGPT
  • Reformer
  • MarianMT
  • Longformer
  • RetriBERT
  • MobileBERT
© Copyright 2020, huggingface