structlog - structured logging library for Python with native JSON support, context binding, and a processor pipeline. Integrates with FastAPI, Django, and the standard logging module. USE WHEN: the user mentions "structlog", "python structured logging", "context binding", or asks about "JSON logging python", "fastapi logging", "django structured logging". DO NOT USE FOR: standard Python logging (use `python-logging` instead), Node.js logging (use `pino` or `winston`), Java logging (use `slf4j` or `logback` instead).
```bash
npx skill4agent add claude-dev-suite/claude-dev-suite structlog
```

Deep Knowledge: use `mcp__documentation__fetch_docs` with `technology: structlog` for comprehensive documentation.
Install structlog:

```bash
pip install structlog
```

Configure the processor pipeline once at application startup (note: `merge_contextvars` must be in the pipeline for the contextvars-based middleware pattern below to show up in output):

```python
import structlog

structlog.configure(
    processors=[
        structlog.contextvars.merge_contextvars,  # include values bound via contextvars
        structlog.stdlib.filter_by_level,
        structlog.stdlib.add_logger_name,
        structlog.stdlib.add_log_level,
        structlog.stdlib.PositionalArgumentsFormatter(),
        structlog.processors.TimeStamper(fmt="iso"),
        structlog.processors.StackInfoRenderer(),
        structlog.processors.format_exc_info,
        structlog.processors.UnicodeDecoder(),
        structlog.processors.JSONRenderer(),
    ],
    wrapper_class=structlog.stdlib.BoundLogger,
    context_class=dict,
    logger_factory=structlog.stdlib.LoggerFactory(),
    cache_logger_on_first_use=True,
)
```
Log events as names plus key-value pairs:

```python
import structlog

log = structlog.get_logger()

log.info("user_logged_in", user_id=123, ip="192.168.1.1")
log.warning("rate_limit_exceeded", endpoint="/api/users", count=100)
log.error("database_error", error="connection timeout", retry=3)
```
Bind context once and every subsequent event carries it:

```python
log = structlog.get_logger()

# Bind context for all subsequent logs
log = log.bind(request_id="abc-123", user_id=42)
log.info("processing_started")  # includes request_id and user_id
log.info("step_completed", step=1)
log.info("processing_finished")

# Start a fresh context: new() clears previously bound values first
log = log.new(request_id="xyz-789")
```
Per-request context in FastAPI via middleware and contextvars:

```python
import uuid

import structlog
from fastapi import FastAPI, Request

app = FastAPI()

@app.middleware("http")
async def add_request_context(request: Request, call_next):
    # Clear leftovers from the previous request handled by this worker
    structlog.contextvars.clear_contextvars()
    structlog.contextvars.bind_contextvars(
        request_id=request.headers.get("X-Request-ID", str(uuid.uuid4())),
        path=request.url.path,
    )
    return await call_next(request)
```

For these values to appear in output, the configured processor pipeline must include `structlog.contextvars.merge_contextvars`.
Django integration goes through the standard `LOGGING` dict with `ProcessorFormatter`:

```python
# settings.py
import structlog

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "json": {
            "()": structlog.stdlib.ProcessorFormatter,
            "processor": structlog.processors.JSONRenderer(),
        },
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "json",
        },
    },
    "root": {
        "handlers": ["console"],
        "level": "INFO",
    },
}
```
Log exceptions with `log.exception()` from inside an `except` block:

```python
try:
    risky_operation()
except Exception:
    # Logs at ERROR level and automatically includes the stack trace
    log.exception("operation_failed", operation="risky")
```

Common anti-patterns:

| Anti-Pattern | Why It's Bad | Solution |
|---|---|---|
| Using ConsoleRenderer in production | Wastes CPU, not machine-parseable | Use JSONRenderer for production |
| Not clearing context variables | Leaks context across requests | Call `clear_contextvars()` at the start of each request |
| Logging large objects | Serialization overhead | Log only necessary fields or IDs |
| Creating new logger per request | Performance overhead | Set `cache_logger_on_first_use=True` and reuse a module-level logger |
| Missing exception logging | Loses stack traces | Use `log.exception()` in `except` blocks |
| Not configuring processors | Incomplete/inconsistent output | Configure full processor pipeline |
Troubleshooting:

| Issue | Cause | Solution |
|---|---|---|
| Plain text output instead of JSON | ConsoleRenderer configured | Switch to `JSONRenderer` in the production configuration |
| Context not appearing in logs | Not using context binding | Use `log.bind()` or `bind_contextvars()` |
| Performance issues | Too many processors | Remove unnecessary processors, use JSONRenderer |
| Missing timestamps | No TimeStamper processor | Add `structlog.processors.TimeStamper(fmt="iso")` to the pipeline |
| Logs not colorized in dev | Missing dev configuration | Use `structlog.dev.ConsoleRenderer(colors=True)` in development |
| Context bleeding across requests | Not clearing contextvars | Clear context at request start with middleware |