CSVLogger

class lightning.pytorch.loggers.CSVLogger(save_dir, name='lightning_logs', version=None, prefix='', flush_logs_every_n_steps=100)[source]

Bases: Logger, CSVLogger

Log to local file system in CSV and YAML format.

Metrics (from self.log() calls) are logged in CSV format, while hyperparameters (from self.log_hyperparams() or self.save_hyperparameters()) are logged in YAML format.

Logs are saved to os.path.join(save_dir, name, version). The CSV file is named metrics.csv and the YAML file is named hparams.yaml.

This logger supports logging to remote filesystems via fsspec. Make sure you have it installed.
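
For example, metrics can be written directly to cloud storage by passing an fsspec-style URL as the save directory. A minimal sketch; the bucket name is hypothetical and assumes the s3fs backend for fsspec is installed:

from lightning.pytorch.loggers import CSVLogger

# Any fsspec-supported URL works as save_dir; "s3://my-bucket/logs" is a placeholder
logger = CSVLogger(save_dir="s3://my-bucket/logs", name="remote_exp")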

Example:

from lightning.pytorch import Trainer
from lightning.pytorch.loggers import CSVLogger

# Basic usage
logger = CSVLogger("logs", name="my_exp_name")
trainer = Trainer(logger=logger)

Use the logger anywhere in your LightningModule as follows:

import torch
import torch.nn.functional as F
from torch import nn
from lightning.pytorch import LightningModule

class LitModel(LightningModule):
    def __init__(self, learning_rate=0.001, batch_size=32):
        super().__init__()
        # This will log hyperparameters to hparams.yaml
        self.save_hyperparameters()
        self.layer = nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        # A concrete loss so the example runs end to end
        loss = F.cross_entropy(self.layer(x), y)
        # This will log metrics to metrics.csv
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.learning_rate)

You can also manually log hyperparameters:

# Log additional hyperparameters manually
logger.log_hyperparams({"dropout": 0.2, "optimizer": "adam"})

File Structure:

The logger creates the following files in the log directory:

  • metrics.csv: Contains metrics from self.log() calls

  • hparams.yaml: Contains hyperparameters from self.save_hyperparameters() or self.log_hyperparams()
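
For the basic usage shown earlier (save_dir="logs", name="my_exp_name") on a first run, the resulting layout looks like this:

logs/
└── my_exp_name/
    └── version_0/
        ├── hparams.yaml
        └── metrics.csv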

Parameters:
  • save_dir (Union[str, Path]) – Save directory

  • name (Optional[str]) – Experiment name, optional. Defaults to 'lightning_logs'. If name is None, logs (versions) will be stored to the save dir directly.

  • version (Union[int, str, None]) – Experiment version. If version is not specified the logger inspects the save directory for existing versions, then automatically assigns the next available version.

  • prefix (str) – A string to put at the beginning of metric keys.

  • flush_logs_every_n_steps (int) – How often to flush logs to disk (defaults to every 100 steps).
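
Lowering flush_logs_every_n_steps makes metrics appear on disk sooner at the cost of more frequent writes, which can be handy when tailing metrics.csv during a long run. A minimal sketch:

from lightning.pytorch.loggers import CSVLogger

# Write buffered metrics to metrics.csv every 10 logged steps instead of every 100
logger = CSVLogger("logs", name="my_exp_name", flush_logs_every_n_steps=10)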

log_hyperparams(params=None)[source]

Log hyperparameters in YAML format.

Hyperparameters are saved to hparams.yaml in the log directory. This method is automatically called when using self.save_hyperparameters() in your LightningModule, but can also be called manually.

Parameters:

params (Union[dict[str, Any], Namespace, None]) – Dictionary or Namespace containing hyperparameters to log.

Return type:

None

Example

>>> logger = CSVLogger("logs")
>>> logger.log_hyperparams({"learning_rate": 0.001, "batch_size": 32})
>>> # hparams.yaml is written under logs/lightning_logs/version_0/ when the logger saves

property experiment: _ExperimentWriter

Actual _ExperimentWriter object. To use _ExperimentWriter features in your LightningModule, do the following.

Example:

self.logger.experiment.some_experiment_writer_function()
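
For example, the writer can be asked to flush its buffered metrics immediately. A sketch; note that _ExperimentWriter is a private class, so its interface may change between releases:

# Inside a LightningModule: force buffered metrics to be written to metrics.csv now
self.logger.experiment.save()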

property log_dir: str

The log directory for this run.

By default, it is named 'version_${self.version}' but it can be overridden by passing a string value for the constructor’s version parameter instead of None or an int.
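
For instance, a string version replaces the auto-numbered subdirectory. A minimal sketch:

from lightning.pytorch.loggers import CSVLogger

logger = CSVLogger("logs", name="my_exp_name", version="run_a")
print(logger.log_dir)  # logs/my_exp_name/run_a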

property root_dir: str

Parent directory for all checkpoint subdirectories.

If the experiment name parameter is an empty string, no experiment subdirectory is used and the checkpoint will be saved in "save_dir/version".

property save_dir: str

The current directory where logs are saved.

Returns:

The path to the current directory where logs are saved.
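
Because metrics are stored as plain CSV, they are easy to read back for analysis after training. A minimal sketch, assuming pandas is installed and a trainer has already run with this logger (so the train_loss column from the earlier example exists):

import pandas as pd

# Each logged metric gets its own column; rows where it was not logged are NaN
metrics = pd.read_csv(f"{logger.log_dir}/metrics.csv")
print(metrics["train_loss"].dropna().tail())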