csv_logs¶
Classes

- `CSVLogger`: Log to local file system in CSV and YAML format.
- `ExperimentWriter`: Experiment writer for CSVLogger.
CSV logger¶
CSV logger for basic experiment logging that does not require opening ports.
- class lightning.pytorch.loggers.csv_logs.CSVLogger(save_dir, name='lightning_logs', version=None, prefix='', flush_logs_every_n_steps=100)[source]¶
Log to local file system in CSV and YAML format.
Metrics (from `self.log()` calls) are logged in CSV format, while hyperparameters (from `self.log_hyperparams()` or `self.save_hyperparameters()`) are logged in YAML format.

Logs are saved to `os.path.join(save_dir, name, version)`. The CSV file is named `metrics.csv` and the YAML file is named `hparams.yaml`.

This logger supports logging to remote filesystems via `fsspec`. Make sure you have it installed.

Example:
```python
from lightning.pytorch import Trainer
from lightning.pytorch.loggers import CSVLogger

# Basic usage
logger = CSVLogger("logs", name="my_exp_name")
trainer = Trainer(logger=logger)
```
Use the logger anywhere in your `LightningModule` as follows:

```python
import torch
from lightning.pytorch import LightningModule

class LitModel(LightningModule):
    def __init__(self, learning_rate=0.001, batch_size=32):
        super().__init__()
        # This will log hyperparameters to hparams.yaml
        self.save_hyperparameters()

    def training_step(self, batch, batch_idx):
        loss = self.compute_loss(batch)
        # This will log metrics to metrics.csv
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.learning_rate)
```
You can also manually log hyperparameters:

```python
# Log additional hyperparameters manually
logger.log_hyperparams({"dropout": 0.2, "optimizer": "adam"})
```
File Structure:

The logger creates the following files in the log directory (a sketch of reading them back follows this list):

- `metrics.csv`: Contains metrics from `self.log()` calls
- `hparams.yaml`: Contains hyperparameters from `self.save_hyperparameters()` or `self.log_hyperparams()`
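Because the metrics file is plain CSV, it can be inspected after training with standard tooling. A minimal sketch, assuming a finished run; the `version_0` path is illustrative and should be taken from `logger.log_dir` in practice:

```python
import csv

# Illustrative path; use logger.log_dir to get the real location
with open("logs/lightning_logs/version_0/metrics.csv") as f:
    for row in csv.DictReader(f):
        # Each row holds the values logged at one step; metrics that were
        # not logged at that step appear as empty strings
        if row.get("train_loss"):
            print(row["step"], row["train_loss"])
```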
- Parameters:
  - save_dir¶ (`str`) – Save directory where logs are written.
  - name¶ (`Optional[str]`) – Experiment name. Defaults to `'lightning_logs'`. If name is `None`, logs (versions) are stored directly in the save dir. Together with save_dir and version, this determines the output path (see the sketch after this list).
  - version¶ (`Union[int, str, None]`) – Experiment version. If version is not specified, the logger inspects the save directory for existing versions, then automatically assigns the next available version.
  - prefix¶ (`str`) – A string to put at the beginning of metric keys.
  - flush_logs_every_n_steps¶ (`int`) – How often to flush logs to disk (defaults to every 100 steps).
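How save_dir, name, and version combine into the output path can be seen in a short sketch; the directory comments are assumptions based on the `os.path.join(save_dir, name, version)` rule above:

```python
from lightning.pytorch.loggers import CSVLogger

# name=None: logs are stored directly in the save dir, no name subdirectory
logger = CSVLogger("logs", name=None)

# An explicit string version replaces the auto-assigned 'version_N' directory
logger = CSVLogger("logs", name="exp", version="baseline")
# -> files land under logs/exp/baseline

# prefix is prepended to every metric key written to metrics.csv
logger = CSVLogger("logs", name="exp", prefix="run1")
```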
- log_hyperparams(params=None)[source]¶
Log hyperparameters to YAML format.
Hyperparameters are saved to `hparams.yaml` in the log directory. This method is called automatically when using `self.save_hyperparameters()` in your LightningModule, but it can also be called manually.

- Parameters:
  - params¶ (`Union[dict[str, Any], Namespace, None]`) – Dictionary or Namespace containing hyperparameters to log.
- Return type:
  `None`
Example:

```python
>>> logger = CSVLogger("logs")
>>> logger.log_hyperparams({"learning_rate": 0.001, "batch_size": 32})
>>> # This creates logs/lightning_logs/version_0/hparams.yaml
```
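Since `params` also accepts a `Namespace`, hyperparameters parsed with argparse can be passed straight through; a minimal sketch:

```python
from argparse import ArgumentParser

from lightning.pytorch.loggers import CSVLogger

parser = ArgumentParser()
parser.add_argument("--learning_rate", type=float, default=0.001)
args = parser.parse_args([])  # empty argv here so the sketch runs standalone

logger = CSVLogger("logs")
logger.log_hyperparams(args)  # Namespace works the same as a dict
```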
- property experiment: _ExperimentWriter¶
The actual `_ExperimentWriter` object. To use `_ExperimentWriter` features in your `LightningModule`, do the following.

Example:

```python
self.logger.experiment.some_experiment_writer_function()
```
- property log_dir: str¶
The log directory for this run.
By default, it is named `'version_${self.version}'`, but it can be overridden by passing a string value for the constructor's `version` parameter instead of `None` or an int.
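`log_dir` is useful for placing auxiliary artifacts next to `metrics.csv` and `hparams.yaml`. A minimal sketch; the notes file is illustrative:

```python
import os

from lightning.pytorch.loggers import CSVLogger

logger = CSVLogger("logs", name="my_exp_name")
os.makedirs(logger.log_dir, exist_ok=True)

# Write a free-form artifact alongside the logger's own files
with open(os.path.join(logger.log_dir, "notes.txt"), "w") as f:
    f.write("free-form notes for this run\n")
```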
- class lightning.pytorch.loggers.csv_logs.ExperimentWriter(log_dir)[source]¶
Bases: `_ExperimentWriter`
Experiment writer for CSVLogger.
Logs metrics in CSV format and hyperparameters in YAML format.
This logger supports logging to remote filesystems via `fsspec`. Make sure you have it installed.
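Since paths are resolved through `fsspec`, a remote URL can be used as the save dir. A sketch assuming the `s3fs` package is installed and credentials are configured; the bucket name is illustrative:

```python
from lightning.pytorch.loggers import CSVLogger

# fsspec resolves the s3:// scheme via s3fs; metrics.csv and hparams.yaml
# are written to the bucket instead of the local filesystem
logger = CSVLogger("s3://my-bucket/logs", name="remote_exp")
```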