java-logging

Java Logging

Architecture Overview

┌─────────────────────────────────────────────────────────────┐
│                    Application Code                         │
│              (uses SLF4J API only)                          │
└─────────────────────────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────────┐
│                      SLF4J Facade                           │
│              (abstraction layer)                            │
└─────────────────────────────────────────────────────────────┘
              ┌───────────────┼───────────────┐
              ▼               ▼               ▼
        ┌──────────┐   ┌──────────┐   ┌──────────────┐
        │ Logback  │   │ Log4j2   │   │ java.util    │
        │          │   │          │   │ .logging     │
        └──────────┘   └──────────┘   └──────────────┘
Rule: Always code to SLF4J API. Implementation is a runtime dependency.
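The binding idea in the diagram can be sketched in plain Java. The names below (MiniLogger, MiniLoggerFactory) are hypothetical, invented for illustration: the application only ever talks to the facade, and the concrete backend is bound once, at runtime.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Hypothetical mini-facade illustrating the binding idea; real SLF4J
// locates its backend (Logback, Log4j2, ...) on the classpath at runtime.
interface MiniLogger {
    void info(String msg);
}

final class MiniLoggerFactory {
    // Captured output so the demo is self-checking.
    static final List<String> OUTPUT = new ArrayList<>();

    // The bound implementation; swapping it changes the backend
    // without touching any application code.
    private static Function<String, MiniLogger> binding =
            name -> msg -> OUTPUT.add("[" + name + "] " + msg);

    static void bind(Function<String, MiniLogger> impl) {
        binding = impl;
    }

    static MiniLogger getLogger(String name) {
        return binding.apply(name);
    }
}

public class FacadeDemo {
    public static void main(String[] args) {
        MiniLogger log = MiniLoggerFactory.getLogger("demo");
        log.info("first backend");
        // Rebinding simulates deploying a different runtime implementation.
        MiniLoggerFactory.bind(name -> msg ->
                MiniLoggerFactory.OUTPUT.add("json://" + name + " " + msg));
        MiniLoggerFactory.getLogger("demo").info("second backend");
        System.out.println(MiniLoggerFactory.OUTPUT);
    }
}
```

This is why only the SLF4J API belongs on the compile classpath: swapping Logback for Log4j2 is a dependency change, not a code change.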

SLF4J API Usage

Basic Logging

java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class UserService {
    private static final Logger log = LoggerFactory.getLogger(UserService.class);

    public User findUser(Long id) {
        log.debug("Finding user with id: {}", id);

        try {
            User user = repository.findById(id);
            log.info("User found: {}", user.getEmail());
            return user;
        } catch (Exception e) {
            log.error("Failed to find user with id: {}", id, e);
            throw e;
        }
    }
}

With Lombok

java
import lombok.extern.slf4j.Slf4j;

@Slf4j
public class OrderService {
    public void processOrder(Order order) {
        log.info("Processing order: {}", order.getId());
    }
}

Log Levels

| Level | Purpose                    | Example                            |
|-------|----------------------------|------------------------------------|
| TRACE | Very detailed debugging    | Loop iterations, variable values   |
| DEBUG | Debugging information      | Method entry/exit, query params    |
| INFO  | Business events            | User login, order placed           |
| WARN  | Potential issues           | Deprecated API used, retry attempt |
| ERROR | Errors requiring attention | Exception caught, operation failed |
java
log.trace("Entering loop iteration {}", i);
log.debug("Query parameters: userId={}, status={}", userId, status);
log.info("Order {} placed successfully", orderId);
log.warn("Payment retry attempt {} of {}", attempt, maxRetries);
log.error("Failed to process payment for order {}", orderId, exception);
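The table maps to a simple severity ordering: a logger emits a record only when the record's level is at least as severe as the logger's configured threshold. A stdlib-only sketch (the LogLevel enum is hypothetical; Logback and Log4j2 do an equivalent integer comparison on every call):

```java
// Hypothetical sketch of level filtering, mirroring how SLF4J backends
// decide whether a logging call produces output.
enum LogLevel {
    TRACE, DEBUG, INFO, WARN, ERROR;

    // A record passes a logger configured at 'this' threshold
    // when its level is at least as severe (ordinal order above).
    boolean allows(LogLevel recordLevel) {
        return recordLevel.ordinal() >= this.ordinal();
    }
}

public class LevelDemo {
    public static void main(String[] args) {
        LogLevel configured = LogLevel.INFO;
        System.out.println(configured.allows(LogLevel.DEBUG)); // false: filtered out
        System.out.println(configured.allows(LogLevel.ERROR)); // true: emitted
    }
}
```

This ordering is also why setting a package to DEBUG implicitly enables INFO, WARN, and ERROR for it.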

Parameterized Messages (Best Practice)

java
// GOOD - uses parameterized logging (efficient)
log.debug("Processing user {} with role {}", userId, role);

// BAD - string concatenation (always evaluated)
log.debug("Processing user " + userId + " with role " + role);

// For expensive operations, use isEnabled check
if (log.isDebugEnabled()) {
    log.debug("Complex data: {}", computeExpensiveDebugInfo());
}
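The performance point rests on when the string is built: with {} anchors the message is assembled only if the record passes the level check, while concatenation builds it unconditionally on every call. A stdlib-only sketch of the anchor substitution (AnchorFormat is a hypothetical helper; SLF4J's own formatter additionally handles escaped anchors and arrays):

```java
// Hypothetical re-implementation of {} substitution, for illustration only.
public class AnchorFormat {
    static String format(String pattern, Object... args) {
        StringBuilder sb = new StringBuilder();
        int argIndex = 0;
        int from = 0;
        int anchor;
        // Replace each {} anchor, left to right, with the next argument.
        while ((anchor = pattern.indexOf("{}", from)) >= 0 && argIndex < args.length) {
            sb.append(pattern, from, anchor).append(args[argIndex++]);
            from = anchor + 2;
        }
        return sb.append(pattern.substring(from)).toString();
    }

    public static void main(String[] args) {
        // The pattern and arguments are cheap to pass around; the final
        // string is only assembled here, i.e. after any level check.
        System.out.println(format("Processing user {} with role {}", 42, "ADMIN"));
        // Processing user 42 with role ADMIN
    }
}
```

Note that the arguments themselves are still evaluated at the call site, which is why the isDebugEnabled() guard above is still needed around genuinely expensive computations.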

Logback Configuration

Spring Boot (logback-spring.xml)

xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- Include Spring Boot defaults -->
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>

    <!-- Properties -->
    <property name="LOG_PATH" value="${LOG_PATH:-logs}"/>
    <property name="LOG_FILE" value="${LOG_FILE:-application}"/>

    <!-- Console Appender -->
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %highlight(%-5level) [%thread] %cyan(%logger{36}) - %msg%n</pattern>
        </encoder>
    </appender>

    <!-- Rolling File Appender -->
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${LOG_PATH}/${LOG_FILE}.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
            <fileNamePattern>${LOG_PATH}/${LOG_FILE}.%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
            <maxFileSize>100MB</maxFileSize>
            <maxHistory>30</maxHistory>
            <totalSizeCap>3GB</totalSizeCap>
        </rollingPolicy>
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%thread] %logger{36} - %msg%n</pattern>
            <charset>UTF-8</charset>
        </encoder>
    </appender>

    <!-- JSON Appender for Production -->
    <appender name="JSON" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${LOG_PATH}/${LOG_FILE}-json.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>${LOG_PATH}/${LOG_FILE}-json.%d{yyyy-MM-dd}.log.gz</fileNamePattern>
            <maxHistory>7</maxHistory>
        </rollingPolicy>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder">
            <includeMdcKeyName>traceId</includeMdcKeyName>
            <includeMdcKeyName>userId</includeMdcKeyName>
        </encoder>
    </appender>

    <!-- Async Appender for Performance -->
    <appender name="ASYNC_FILE" class="ch.qos.logback.classic.AsyncAppender">
        <queueSize>512</queueSize>
        <discardingThreshold>0</discardingThreshold>
        <appender-ref ref="FILE"/>
    </appender>

    <!-- Logger Configuration -->
    <logger name="com.yourcompany" level="DEBUG"/>
    <logger name="org.springframework" level="INFO"/>
    <logger name="org.hibernate.SQL" level="DEBUG"/>
    <logger name="org.hibernate.type.descriptor.sql" level="TRACE"/>

    <!-- Root Logger -->
    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="ASYNC_FILE"/>
    </root>

    <!-- Profile-specific configuration -->
    <springProfile name="prod">
        <root level="INFO">
            <appender-ref ref="JSON"/>
        </root>
    </springProfile>
</configuration>

application.yml Configuration

yaml
logging:
  level:
    root: INFO
    com.yourcompany: DEBUG
    org.springframework.web: INFO
    org.hibernate.SQL: DEBUG
  pattern:
    console: "%d{HH:mm:ss.SSS} %highlight(%-5level) [%thread] %cyan(%logger{36}) - %msg%n"
    file: "%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%thread] %logger{36} - %msg%n"
  file:
    name: logs/application.log
  logback:
    rollingpolicy:
      max-file-size: 100MB
      max-history: 30

Log4j2 Configuration

log4j2-spring.xml

xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
    <Properties>
        <Property name="LOG_PATH">logs</Property>
        <Property name="LOG_PATTERN">%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%thread] %logger{36} - %msg%n</Property>
    </Properties>

    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="${LOG_PATTERN}"/>
        </Console>

        <RollingFile name="File" fileName="${LOG_PATH}/app.log"
                     filePattern="${LOG_PATH}/app-%d{yyyy-MM-dd}-%i.log.gz">
            <PatternLayout pattern="${LOG_PATTERN}"/>
            <Policies>
                <SizeBasedTriggeringPolicy size="100MB"/>
                <TimeBasedTriggeringPolicy/>
            </Policies>
            <DefaultRolloverStrategy max="30"/>
        </RollingFile>

        <!-- Async for high performance -->
        <Async name="AsyncFile">
            <AppenderRef ref="File"/>
        </Async>
    </Appenders>

    <Loggers>
        <Logger name="com.yourcompany" level="debug"/>
        <Root level="info">
            <AppenderRef ref="Console"/>
            <AppenderRef ref="AsyncFile"/>
        </Root>
    </Loggers>
</Configuration>

Structured Logging (JSON)

Dependencies

xml
<!-- For Logback -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.4</version>
</dependency>

MDC (Mapped Diagnostic Context)

java
import org.slf4j.MDC;

@Component
public class RequestLoggingFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) {
        try {
            MDC.put("traceId", generateTraceId());
            MDC.put("userId", getCurrentUserId());
            MDC.put("requestPath", ((HttpServletRequest) request).getRequestURI());

            chain.doFilter(request, response);
        } finally {
            MDC.clear();
        }
    }
}
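MDC works because the context map is thread-local: each request thread sees only its own values, which is also why the finally block with MDC.clear() matters — pooled threads are reused, and stale context would leak into the next request. A stdlib-only sketch of the storage model (MiniMdc is hypothetical; the real org.slf4j.MDC delegates to the bound backend but is thread-local in the same way):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of MDC's per-thread storage model.
public class MiniMdc {
    private static final ThreadLocal<Map<String, String>> CTX =
            ThreadLocal.withInitial(HashMap::new);

    static void put(String key, String value) { CTX.get().put(key, value); }
    static String get(String key) { return CTX.get().get(key); }
    // Clearing prevents stale context leaking to the next task
    // that reuses this pooled thread.
    static void clear() { CTX.get().clear(); }

    public static void main(String[] args) throws InterruptedException {
        MiniMdc.put("traceId", "abc123");
        Thread other = new Thread(() ->
                // A different thread has its own (empty) map, so this prints null.
                System.out.println("other thread sees: " + MiniMdc.get("traceId")));
        other.start();
        other.join();
        System.out.println("request thread sees: " + MiniMdc.get("traceId")); // abc123
    }
}
```

The same reasoning means MDC values do not automatically propagate to worker threads you spawn; they must be copied across explicitly.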

Structured Log Output

java
import static net.logstash.logback.argument.StructuredArguments.*;

log.info("Order processed",
    kv("orderId", order.getId()),
    kv("customerId", order.getCustomerId()),
    kv("amount", order.getTotal()),
    kv("currency", "USD"));
Output:
json
{
  "@timestamp": "2025-01-15T10:30:00.000Z",
  "level": "INFO",
  "logger": "com.example.OrderService",
  "message": "Order processed",
  "orderId": "ORD-12345",
  "customerId": "CUST-789",
  "amount": 99.99,
  "currency": "USD",
  "traceId": "abc123",
  "userId": "user456"
}
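Each kv() call attaches a machine-readable field alongside the human message; the encoder later merges them with the MDC into one JSON object. The idea can be sketched without the encoder (StructuredEvent below is hypothetical and deliberately naive — it does no JSON escaping; logstash-logback-encoder does the real serialization):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of how a message plus key/value fields
// become one flat JSON log event.
public class StructuredEvent {
    static String toJson(String message, Map<String, Object> fields) {
        StringBuilder sb = new StringBuilder("{\"message\":\"").append(message).append('"');
        for (Map.Entry<String, Object> e : fields.entrySet()) {
            sb.append(",\"").append(e.getKey()).append("\":");
            Object v = e.getValue();
            // Numbers stay bare; everything else is quoted (no escaping here).
            sb.append(v instanceof Number ? v.toString() : "\"" + v + "\"");
        }
        return sb.append('}').toString();
    }

    public static void main(String[] args) {
        Map<String, Object> fields = new LinkedHashMap<>();
        fields.put("orderId", "ORD-12345");
        fields.put("amount", 99.99);
        System.out.println(toJson("Order processed", fields));
        // {"message":"Order processed","orderId":"ORD-12345","amount":99.99}
    }
}
```

The payoff is on the query side: a log platform can filter on orderId or aggregate amount without regex-parsing the message text.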

Best Practices

DO

  • Use SLF4J API everywhere
  • Use parameterized messages: log.info("User {}", userId)
  • Include correlation IDs (traceId, requestId)
  • Log at appropriate levels
  • Use async appenders in production
  • Configure log rotation
  • Use JSON format for log aggregation

DON'T

  • Don't use string concatenation in log messages
  • Don't log sensitive data (passwords, tokens, PII)
  • Don't log inside tight loops without level check
  • Don't use System.out.println for logging
  • Don't catch and swallow exceptions silently

Security

java
// BAD - logs sensitive data
log.info("User login: email={}, password={}", email, password);

// GOOD - mask sensitive data
log.info("User login: email={}", maskEmail(email));

// Helper
private String maskEmail(String email) {
    int atIndex = email.indexOf('@');
    if (atIndex > 2) {
        return email.substring(0, 2) + "***" + email.substring(atIndex);
    }
    return "***";
}
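The same masking approach extends to tokens and card numbers: keep just enough characters to correlate log lines without exposing the secret. A hedged sketch in the spirit of maskEmail above (maskToken is a hypothetical helper, not part of any library):

```java
// Hypothetical token-masking helper; keeps only the last 4 characters.
public class Masking {
    static String maskToken(String token) {
        if (token == null || token.length() <= 4) {
            return "***";
        }
        // Enough to correlate repeated occurrences, not enough to replay.
        return "***" + token.substring(token.length() - 4);
    }

    public static void main(String[] args) {
        System.out.println(maskToken("sk-live-9f8e7d6c5b4a")); // ***5b4a
    }
}
```

Masking at the call site works for ad-hoc cases; for systematic coverage, teams often add a masking converter or filter in the logging configuration instead.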

When NOT to Use This Skill

  • SLF4J API-only questions: use the slf4j skill for API usage patterns
  • Logback configuration details: use the logback skill for XML config
  • Node.js/Python projects: use language-appropriate logging skills
  • Application code patterns: focus on SLF4J API, not implementation
  • Framework migration: consult migration-specific guides

Anti-Patterns

| Anti-Pattern | Why It's Bad | Solution |
|---|---|---|
| Using System.out.println | No control, no persistence, no filtering | Use SLF4J logger |
| String concatenation in logs | Always evaluated, performance hit | Use parameterized logging: log.info("User {}", id) |
| Not using async appenders in production | Blocks application threads | Wrap with AsyncAppender |
| Logging without MDC in multi-threaded apps | Loses request context | Use MDC for correlation IDs |
| DEBUG level in production | Performance impact, disk usage | Use INFO or WARN in production |
| Not masking sensitive data | Security/compliance violation | Filter passwords, tokens, PII before logging |

Quick Troubleshooting

| Issue | Cause | Solution |
|---|---|---|
| NoClassDefFoundError: StaticLoggerBinder | Missing SLF4J implementation | Add Logback or Log4j2 dependency |
| Multiple bindings warning | Multiple implementations on classpath | Keep only one: Logback OR Log4j2 |
| Logs not appearing | Wrong log level or missing config | Check logback.xml and log levels |
| Performance degradation | Synchronous appenders | Use AsyncAppender wrapper |
| Logs not rotating | Missing rolling policy | Configure RollingFileAppender |
| MDC values not showing | Pattern missing %X{key} | Add MDC placeholders to log pattern |

Reference
