From bd6b5170b70cf7bc141ead410ff01092f93155ab Mon Sep 17 00:00:00 2001
From: Craig
Date: Sat, 12 Apr 2025 10:49:24 +0100
Subject: [PATCH] Check off tasks in todo for logging handling

---
 todo.md | 12 ++++++------
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/todo.md b/todo.md
index 70b705f..c69816f 100644
--- a/todo.md
+++ b/todo.md
@@ -54,12 +54,12 @@ This list outlines the steps required to complete the Torchvision Finetuning pro
 - [x] Calculate total loss (`sum(...)`).
 - [x] Perform backward pass (`optimizer.zero_grad()`, `loss.backward()`, `optimizer.step()`)
 - [x] Print/log loss for the single step (and temporarily exit).
-- [ ] Implement logging setup in `utils/log_utils.py` (`setup_logging` function).
-  - [ ] Configure `logging.basicConfig` for file and console output.
-- [ ] Integrate logging into `train.py`.
-  - [ ] Call `setup_logging`.
-  - [ ] Replace `print` with `logging.info`.
-  - [ ] Log config, device, and training progress/losses.
+- [x] Implement logging setup in `utils/log_utils.py` (`setup_logging` function).
+  - [x] Configure `logging.basicConfig` for file and console output.
+- [x] Integrate logging into `train.py`.
+  - [x] Call `setup_logging`.
+  - [x] Replace `print` with `logging.info`.
+  - [x] Log config, device, and training progress/losses.
 - [ ] Implement full training loop in `train.py`.
   - [ ] Remove single-step exit.
   - [ ] Add LR scheduler (`torch.optim.lr_scheduler.StepLR`).
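
The logging tasks checked off above could be sketched as follows. This is only an illustration of what a `setup_logging` function configuring `logging.basicConfig` for file and console output might look like; the `log_file` parameter and format string are assumptions, not the actual contents of `utils/log_utils.py`:

```python
import logging
import sys


def setup_logging(log_file: str = "train.log") -> None:
    """Configure the root logger to write to both a file and the console.

    Hypothetical sketch: the real setup_logging in utils/log_utils.py
    may choose a different filename, level, or format.
    """
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s - %(levelname)s - %(message)s",
        handlers=[
            logging.FileHandler(log_file),      # file output
            logging.StreamHandler(sys.stdout),  # console output
        ],
        force=True,  # replace any handlers configured earlier
    )


if __name__ == "__main__":
    # In train.py this would be called once at startup, after which
    # print(...) calls are replaced with logging.info(...).
    setup_logging()
    logging.info("Starting training")
```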