Integrating with PyTorch Lightning
mlop provides a highly performant integration with PyTorch Lightning, including dedicated support for checkpointing, logging artifacts, model details, and more.
Migrating from Weights & Biases
See the Migrating from Weights & Biases guide for a quickstart.
Logging Lightning Models
To use mlop as a logger for PyTorch Lightning or Fabric, import the dedicated Lightning logger class and pass it to the Trainer or Fabric initialization; a usage sketch follows the parameter table below.
| Parameter | Type | Description |
|---|---|---|
| `op` | `mlop.op.Op` | The initialized mlop run to log to. |
| `project` | `Optional[str]` | The name of the project (if uninitialized). |
| `name` | `Optional[str]` | The name of the run (if uninitialized). |
| `**kwargs` | `Any` | Additional keyword arguments accepted by the `mlop.init` constructor. |
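Below is a minimal sketch of wiring the logger into a Trainer or Fabric run. The import path `mlop.integrations.lightning`, the project and run names, and the training arguments are assumptions for illustration; the constructor arguments follow the parameter table above.

```python
# Minimal sketch -- the import path below is an assumption; check the mlop
# source referenced under "Source" for the module that exports MLOPLogger.
import lightning as L
# import mlop
from mlop.integrations.lightning import MLOPLogger  # assumed import path

# Option 1: let the logger initialize the run from a project and run name.
logger = MLOPLogger(project="my-project", name="baseline-run")

trainer = L.Trainer(max_epochs=10, logger=logger)
# trainer.fit(model, datamodule=datamodule)

# Option 2: pass an already-initialized mlop run instead.
# run = mlop.init(project="my-project", name="baseline-run")
# logger = MLOPLogger(op=run)

# The same logger can also be handed to Fabric:
# fabric = L.Fabric(loggers=[logger])
```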
Source: see the Python code for more details.
Using MLOPLogger
The MLOPLogger inherits from the standard lightning.pytorch.loggers.Logger class and conforms to its standard API. It exposes the following methods; a short sketch of calling them directly follows the table.
| Method | Description | Example |
|---|---|---|
| `log_metrics` | Log metrics to mlop. | `logger.log_metrics(data: Dict[str, float])` |
| `log_hyperparams` | Log hyperparameters to mlop. | `logger.log_hyperparams(params: Dict[str, Any])` |
| `log_file` | Log a file to mlop. | `logger.log_file(key: str, files: List[mlop.File], **kwargs)` |
| `log_image` | Log an image to mlop. | `logger.log_image(key: str, images: List[mlop.Image], **kwargs)` |
| `log_audio` | Log an audio file to mlop. | `logger.log_audio(key: str, audios: List[mlop.Audio], **kwargs)` |
| `log_video` | Log a video to mlop. | `logger.log_video(key: str, videos: List[mlop.Video], **kwargs)` |
| `log_checkpoint` | Log a checkpoint to mlop. | `logger.log_checkpoint(key: str, checkpoint: "ModelCheckpoint", **kwargs)` |
| `log_graph` | Log a model graph to mlop. | `logger.log_graph(model: "torch.nn.Module", **kwargs)` |
| `watch` | Watch a model. | `logger.watch(model: "torch.nn.Module", **kwargs)` |
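Because the logger follows the standard Logger interface, Lightning routes the usual `self.log(...)` calls inside a LightningModule to `log_metrics` automatically; the media and file methods can be called on the logger directly. A minimal sketch, assuming `logger` is the instance created above and that `mlop.Image` can wrap an image file path (an assumed constructor; check the mlop media documentation):

```python
import mlop

# Hyperparameters and scalar metrics, matching the signatures in the table above.
logger.log_hyperparams({"lr": 3e-4, "batch_size": 32})
logger.log_metrics({"train/loss": 0.42, "train/accuracy": 0.91})

# Media logging; mlop.Image("...") wrapping a file path is an assumption.
logger.log_image("val/predictions", [mlop.Image("predictions.png")])
```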