Check off logging handling tasks in todo.md

Author: Craig
Date: 2025-04-12 10:49:24 +01:00
parent 620c34bf13
commit bd6b5170b7

todo.md (12 lines changed)

@@ -54,12 +54,12 @@ This list outlines the steps required to complete the Torchvision Finetuning pro
 - [x] Calculate total loss (`sum(...)`).
 - [x] Perform backward pass (`optimizer.zero_grad()`, `loss.backward()`, `optimizer.step()`)
 - [x] Print/log loss for the single step (and temporarily exit).
-- [ ] Implement logging setup in `utils/log_utils.py` (`setup_logging` function).
-- [ ] Configure `logging.basicConfig` for file and console output.
-- [ ] Integrate logging into `train.py`.
-- [ ] Call `setup_logging`.
-- [ ] Replace `print` with `logging.info`.
-- [ ] Log config, device, and training progress/losses.
+- [x] Implement logging setup in `utils/log_utils.py` (`setup_logging` function).
+- [x] Configure `logging.basicConfig` for file and console output.
+- [x] Integrate logging into `train.py`.
+- [x] Call `setup_logging`.
+- [x] Replace `print` with `logging.info`.
+- [x] Log config, device, and training progress/losses.
 - [ ] Implement full training loop in `train.py`.
 - [ ] Remove single-step exit.
 - [ ] Add LR scheduler (`torch.optim.lr_scheduler.StepLR`).
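The checked-off items cover the logging work: a `setup_logging` helper in `utils/log_utils.py` that configures `logging.basicConfig` for file and console output, called from `train.py` with `print` swapped for `logging.info`. The actual implementation is not part of this diff; the sketch below shows one way such a helper could look (the `log_file` default, format string, and the demo block are assumptions, not code from the repo).

```python
import logging
from pathlib import Path


def setup_logging(log_file: str = "logs/train.log", level: int = logging.INFO) -> None:
    """Route log records to both a file and the console via logging.basicConfig."""
    Path(log_file).parent.mkdir(parents=True, exist_ok=True)
    logging.basicConfig(
        level=level,
        format="%(asctime)s - %(levelname)s - %(message)s",
        handlers=[
            logging.FileHandler(log_file),  # persistent file output
            logging.StreamHandler(),        # console output
        ],
    )


if __name__ == "__main__":
    # In train.py the pattern would be: call setup_logging() once at startup,
    # then use logging.info(...) wherever print(...) was used.
    setup_logging()
    logging.info("Logging configured for file and console output.")
```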
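The remaining unchecked items (full training loop, removing the single-step exit, adding `torch.optim.lr_scheduler.StepLR`) would wrap the already-completed loss-sum and backward-pass steps in an epoch loop. A rough sketch, assuming a torchvision detection-style model that returns a dict of losses in training mode; the function name, scheduler hyperparameters, and loop structure are illustrative, not taken from `train.py`:

```python
import logging

import torch


def train(model, data_loader, optimizer, device, num_epochs: int = 10) -> None:
    """Epoch-based training loop: summed loss, backward pass, and a StepLR schedule."""
    # Placeholder schedule values; the project's actual settings are not in this diff.
    lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=3, gamma=0.1)

    model.train()
    for epoch in range(num_epochs):
        for images, targets in data_loader:
            images = [img.to(device) for img in images]
            targets = [{k: v.to(device) for k, v in t.items()} for t in targets]

            # Detection models return a dict of losses when given targets in train mode.
            loss_dict = model(images, targets)
            loss = sum(loss_dict.values())

            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

            logging.info("epoch %d - loss: %.4f", epoch, loss.item())

        lr_scheduler.step()  # decay the learning rate once per epoch
```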