Integrating with Hugging Face Transformers

mlop provides an interface for rich model logging and monitoring with Hugging Face Transformers.

Migrating from Weights & Biases

See the Migrating from Weights & Biases guide for a quickstart.

Logging Model Details

To use mlop with Hugging Face Transformers, import the MLOPCallback class and set report_to="mlop" in your TrainingArguments. To use mlop alongside other loggers, set report_to="all" instead.

import mlop
from mlop.compat.transformers import MLOPCallback
from transformers import Trainer, TrainingArguments
 
training_args = TrainingArguments(
    report_to="mlop",  # enable the mlop logger
    **kwargs,
)
trainer = Trainer(
    args=training_args,
    **kwargs,
)
 
trainer.train()
mlop.finish()  # flush and close the mlop run

Warning

At the moment, from mlop.compat.transformers import MLOPCallback must be run before initializing TrainingArguments so that mlop can register itself with Hugging Face Transformers. This requirement will be removed once mlop support is merged into transformers.

Source

See the Python code for more details.

Using MLOPCallback

The MLOPCallback class inherits from the standard transformers.trainer_callback.TrainerCallback class and conforms strictly to its API, so it can be combined freely with other callbacks.
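Because it follows the TrainerCallback contract, MLOPCallback receives the same hook calls (on_train_begin, on_log, on_save, and so on) as any other callback. The following is a minimal, self-contained sketch of that hook pattern; the TrainerCallback class below is a hypothetical stand-in for the real transformers base class, and LoggingCallback is an illustrative example, not mlop's actual implementation.

```python
class TrainerCallback:
    """Stand-in for transformers.trainer_callback.TrainerCallback.

    The real base class defines many no-op hooks; the Trainer invokes
    each hook with args, state, and control objects plus extra kwargs.
    """

    def on_log(self, args, state, control, logs=None, **kwargs):
        pass


class LoggingCallback(TrainerCallback):
    """Illustrative callback that collects every metrics dict reported."""

    def __init__(self):
        self.history = []

    def on_log(self, args, state, control, logs=None, **kwargs):
        if logs is not None:
            self.history.append(dict(logs))


# Simulate the Trainer firing the on_log hook once.
cb = LoggingCallback()
cb.on_log(args=None, state=None, control=None, logs={"loss": 0.5, "epoch": 1.0})
print(cb.history)
```

In practice, report_to="mlop" registers the callback for you; since Trainer also accepts a callbacks argument, an explicit Trainer(callbacks=[MLOPCallback()], ...) should work as well.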
