Check off completed logging tasks in todo

Craig
2025-04-12 10:49:24 +01:00
parent 620c34bf13
commit bd6b5170b7

todo.md

@@ -54,12 +54,12 @@ This list outlines the steps required to complete the Torchvision Finetuning pro
 - [x] Calculate total loss (`sum(...)`).
 - [x] Perform backward pass (`optimizer.zero_grad()`, `loss.backward()`, `optimizer.step()`)
 - [x] Print/log loss for the single step (and temporarily exit).
-- [ ] Implement logging setup in `utils/log_utils.py` (`setup_logging` function).
-- [ ] Configure `logging.basicConfig` for file and console output.
-- [ ] Integrate logging into `train.py`.
-- [ ] Call `setup_logging`.
-- [ ] Replace `print` with `logging.info`.
-- [ ] Log config, device, and training progress/losses.
+- [x] Implement logging setup in `utils/log_utils.py` (`setup_logging` function).
+- [x] Configure `logging.basicConfig` for file and console output.
+- [x] Integrate logging into `train.py`.
+- [x] Call `setup_logging`.
+- [x] Replace `print` with `logging.info`.
+- [x] Log config, device, and training progress/losses.
 - [ ] Implement full training loop in `train.py`.
 - [ ] Remove single-step exit.
 - [ ] Add LR scheduler (`torch.optim.lr_scheduler.StepLR`).