Debugging with logging

Below is the new section you can insert right after Section 14.6 – Debugging with pdb (or before the Mini-Project). It follows the same clean, professional style and teaches production-grade logging with real-world examples, best practices, and integration with the Task Manager.
14.7 Debugging with logging (Production-Ready Debugging)
“Print statements are the assembly language of debugging; logging is the C of debugging.” – Modern Python Wisdom
Why logging > print()?
| print() | logging |
|---|---|
| Always outputs | Configurable levels |
| No timestamps | Built-in metadata |
| Hard to disable | Disable per module |
| No file output | Write to files, rotate |
| No structure | JSON, structured logs |
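The first row of the table is easy to demonstrate: a logger honors a configured threshold, while print() has no such switch. A minimal sketch (the logger name and level are arbitrary):

```python
import logging

# Only records at WARNING and above pass the configured threshold;
# print() always writes regardless.
logging.basicConfig(level=logging.WARNING, format="%(levelname)s: %(message)s")
log = logging.getLogger("demo")

log.debug("suppressed: below the WARNING threshold")
log.warning("emitted: at or above the threshold")
print("always emitted: print() has no level switch")
```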
1. Basic Setup
```python
# logger.py
import logging

# Create a logger
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)  # Capture everything

# Create handlers
console_handler = logging.StreamHandler()
file_handler = logging.FileHandler("app.log")

# Set levels
console_handler.setLevel(logging.INFO)
file_handler.setLevel(logging.DEBUG)

# Create formatter
formatter = logging.Formatter(
    "%(asctime)s - %(name)s - %(levelname)s - %(filename)s:%(lineno)d - %(message)s"
)
console_handler.setFormatter(formatter)
file_handler.setFormatter(formatter)

# Add handlers to logger
logger.addHandler(console_handler)
logger.addHandler(file_handler)
```
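To see the console/file split in action, here is a self-contained sketch of the same setup (a temporary directory stands in for the real app.log location; names are illustrative):

```python
import logging
import os
import pathlib
import tempfile

# Logger captures everything; the file handler records DEBUG and up,
# while the console handler only passes INFO and up.
log = logging.getLogger("split_demo")
log.setLevel(logging.DEBUG)

log_path = os.path.join(tempfile.mkdtemp(), "app.log")
file_handler = logging.FileHandler(log_path)
file_handler.setLevel(logging.DEBUG)
log.addHandler(file_handler)

console_handler = logging.StreamHandler()
console_handler.setLevel(logging.INFO)
log.addHandler(console_handler)

log.debug("file only: below the console's INFO threshold")
log.info("both: console and file")
file_handler.flush()

print(pathlib.Path(log_path).read_text())  # the file holds both records
```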
2. Log Levels (Use Wisely!)
| Level | When to Use |
|---|---|
| DEBUG | Detailed info, variable values |
| INFO | Normal operation milestones |
| WARNING | Unexpected but handled |
| ERROR | Serious problem, function failed |
| CRITICAL | App will crash or corrupt |
```python
logger.debug("User data: %s", user_dict)  # Safe: no string formatting issues
logger.info("User %s logged in", username)
logger.warning("Deprecated API used")
logger.error("Failed to save task", exc_info=True)
logger.critical("Database connection lost!")
```
3. Best Practice: Never interpolate sensitive data
```python
# BAD
logger.debug(f"User password: {password}")

# GOOD
logger.debug("Processing login for user: %s", username)
```
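You can also enforce this mechanically with a logging.Filter. The sketch below is illustrative (the SecretFilter name and regex are assumptions, and this simple version redacts only the literal message text, not % arguments):

```python
import logging
import re

class SecretFilter(logging.Filter):
    """Illustrative filter: masks anything that looks like password=... or token=..."""
    PATTERN = re.compile(r"(password|token)=\S+", re.IGNORECASE)

    def filter(self, record):
        record.msg = self.PATTERN.sub(r"\1=***", str(record.msg))
        return True  # keep the record, just redacted

logger = logging.getLogger("auth")
handler = logging.StreamHandler()
handler.addFilter(SecretFilter())
logger.addHandler(handler)

logger.warning("login failed, password=hunter2")  # logged with password=***
```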
4. Real Example: Log-Enabled Task Manager
```python
# task_manager.py (updated with logging)
import json
import logging
from dataclasses import dataclass
from pathlib import Path
from typing import List

# Configure root logger (once at app start)
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s [%(levelname)s] %(name)s:%(lineno)d - %(message)s",
    handlers=[
        logging.FileHandler("task_manager.log"),
        logging.StreamHandler()
    ]
)

logger = logging.getLogger(__name__)

@dataclass
class Task:
    title: str
    done: bool = False

class TaskManager:
    def __init__(self, file_path: str = "tasks.json"):
        self.file = Path(file_path)
        self.tasks: List[Task] = []
        logger.info("Initializing TaskManager with file: %s", self.file)
        self.load()

    def load(self):
        if self.file.exists():
            try:
                data = json.loads(self.file.read_text())
                self.tasks = [Task(**t) for t in data]
                logger.info("Loaded %d tasks from disk", len(self.tasks))
            except json.JSONDecodeError as e:
                logger.error("Corrupted JSON file: %s", e, exc_info=True)
                self.tasks = []
        else:
            logger.warning("No tasks file found. Starting fresh.")

    def save(self):
        try:
            data = [t.__dict__ for t in self.tasks]
            self.file.write_text(json.dumps(data, indent=2))
            logger.debug("Saved %d tasks to %s", len(self.tasks), self.file)
        except OSError:
            logger.critical("Failed to save tasks!", exc_info=True)

    def add(self, title: str):
        if not title.strip():
            logger.warning("Attempted to add empty task title")
            raise ValueError("Title cannot be empty")
        self.tasks.append(Task(title.strip()))
        logger.info("Added new task: %s", title.strip())
        self.save()

    def complete(self, index: int):
        if 1 <= index <= len(self.tasks):
            task = self.tasks[index - 1]
            task.done = True
            logger.info("Marked task as done: %s", task.title)
            self.save()
        else:
            logger.error("Invalid task index: %d (valid: 1-%d)", index, len(self.tasks))

    def list(self):
        logger.debug("Listing %d tasks", len(self.tasks))
        for i, task in enumerate(self.tasks, 1):
            status = "DONE" if task.done else "PENDING"
            print(f"{i}. [{status}] {task.title}")
```
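The commands in the next step assume a small command-line entry point, which the listing above omits. A minimal sketch using sys.argv (the command names are assumptions chosen to match the examples below) could be appended to task_manager.py:

```python
import sys

# Hypothetical entry point so `python task_manager.py add "Buy milk"` works.
if __name__ == "__main__":
    manager = TaskManager()
    command = sys.argv[1] if len(sys.argv) > 1 else "list"
    if command == "add":
        manager.add(sys.argv[2])
    elif command == "complete":
        manager.complete(int(sys.argv[2]))
    else:
        manager.list()
```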
5. Run & See Logs
```shell
python task_manager.py add "Buy milk"
python task_manager.py list
```

Console Output:

```
2025-04-05 10:00:01,234 [INFO] __main__:23 - Initializing TaskManager with file: tasks.json
2025-04-05 10:00:01,235 [WARNING] __main__:30 - No tasks file found. Starting fresh.
2025-04-05 10:00:01,236 [INFO] __main__:55 - Added new task: Buy milk
2025-04-05 10:00:01,236 [DEBUG] __main__:47 - Saved 1 tasks to tasks.json
1. [PENDING] Buy milk
```

The task_manager.log file is created automatically.
6. Advanced: JSON Logging (for Log Aggregators)
```shell
pip install python-json-logger
```

```python
import logging

from pythonjsonlogger import jsonlogger

logger = logging.getLogger()
handler = logging.StreamHandler()
formatter = jsonlogger.JsonFormatter(
    "%(asctime)s %(name)s %(levelname)s %(message)s %(filename)s %(lineno)d"
)
handler.setFormatter(formatter)
logger.addHandler(handler)
```
Output:
```json
{
  "asctime": "2025-04-05 10:00:01,236",
  "name": "__main__",
  "levelname": "INFO",
  "message": "Added new task: Buy milk",
  "filename": "task_manager.py",
  "lineno": 55
}
```
Perfect for ELK, Datadog, Splunk
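If you would rather avoid the extra dependency, a similar result is possible with a custom stdlib Formatter. This is a sketch (the class name and field set are chosen to mirror the output above, not a library API):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Stdlib-only sketch: serialize the record fields shown above as JSON."""
    def format(self, record):
        return json.dumps({
            "asctime": self.formatTime(record),
            "name": record.name,
            "levelname": record.levelname,
            "message": record.getMessage(),
            "filename": record.filename,
            "lineno": record.lineno,
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logging.getLogger("json_demo").addHandler(handler)
```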
7. Logging in Libraries vs Apps
| Context | Logger Name |
|---|---|
| App entrypoint | logging.getLogger(__name__) |
| Library module | logging.getLogger(__name__) (resolves to e.g. 'myapp.utils') |
| Third-party | Don’t configure; let the app do it |
Never do this in a library:
```python
logging.basicConfig(...)  # Only in the main script!
```
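What a library should do instead is attach a NullHandler, the standard-library convention that prevents "no handler found" warnings for users who haven't configured logging yet:

```python
# myapp/utils.py - a library module
import logging

logger = logging.getLogger(__name__)
logger.addHandler(logging.NullHandler())  # swallow output until the app configures logging

def do_work():
    logger.debug("work started")  # emitted only once the app sets up handlers
```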
8. Disable Noisy Logs (e.g. requests)
```python
logging.getLogger("urllib3").setLevel(logging.WARNING)
logging.getLogger("requests").setLevel(logging.WARNING)
```
9. Rotate Logs (Don’t Fill Disk!)
```python
from logging.handlers import RotatingFileHandler

handler = RotatingFileHandler("app.log", maxBytes=10_000_000, backupCount=5)
logger.addHandler(handler)
```

→ app.log, app.log.1, ..., app.log.5 (up to ~60 MB total across six files)
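Rotation can also be time-based rather than size-based; TimedRotatingFileHandler from the same module rolls the file over on a schedule (the midnight/7-day choice here is just an example):

```python
import logging
from logging.handlers import TimedRotatingFileHandler

# Roll over at midnight and keep the last 7 daily files.
handler = TimedRotatingFileHandler("app.log", when="midnight", backupCount=7)
logging.getLogger(__name__).addHandler(handler)
```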
10. Best Practices Summary
| Do | Don’t |
|---|---|
| Use logger = logging.getLogger(__name__) | Use print() in production |
| Log context (user ID, task ID) | Log passwords or tokens |
| Use structured logs in prod | Hardcode log paths |
| Configure once in main | Call basicConfig() multiple times |
| Use levels appropriately | Log everything at DEBUG in prod |
11. Quick Config for Different Environments
```python
# config/logging_config.py
LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "standard": {
            "format": "%(asctime)s [%(levelname)s] %(name)s: %(message)s"
        }
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "standard",
            "level": "INFO"
        },
        "file": {
            "class": "logging.handlers.RotatingFileHandler",
            "filename": "app.log",
            "maxBytes": 10_000_000,
            "backupCount": 5,
            "formatter": "standard",
            "level": "DEBUG"
        }
    },
    "root": {
        "handlers": ["console", "file"],
        "level": "DEBUG"
    }
}
```

```python
# In main.py
import logging
import logging.config

from config.logging_config import LOGGING_CONFIG

logging.config.dictConfig(LOGGING_CONFIG)
logger = logging.getLogger(__name__)
```
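To actually vary verbosity per environment, one common pattern is to pick handler levels from an environment variable before calling dictConfig. A self-contained sketch (the APP_ENV variable name is an assumption, not part of the config above):

```python
import logging
import logging.config
import os

# Illustrative: quiet console output in production, verbose in development.
env = os.environ.get("APP_ENV", "development")
console_level = "WARNING" if env == "production" else "DEBUG"

logging.config.dictConfig({
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        "console": {"class": "logging.StreamHandler", "level": console_level},
    },
    "root": {"handlers": ["console"], "level": "DEBUG"},
})
```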
12. VS Code: View Logs Live
- Open app.log in the editor; VS Code reloads an unmodified file when it changes on disk
- Or install a log-viewer/tail extension from the Marketplace for live following
Summary: Debugging Toolbox
| Tool | Use Case |
|---|---|
| print() | Quick REPL check |
| pdb / ipdb | Step-through, inspect state |
| logging | Production monitoring, audit trail |
| pytest | Automated verification |
| sentry.io | Error tracking in prod |
Now your app logs like a real product!
Next Step: Add user ID to logs in Task Manager:
```python
logger.info("User %s added task: %s", user_id, title)
```
Insert this section into your notes after Section 14.6.
Your Python tutorial now covers:
- Testing (pytest)
- Interactive Debugging (pdb / ipdb)
- Production Debugging (logging)
Happy Logging!