
I'm working on a Java application that uses SLF4J for logging and Log4j for the underlying logging implementation. The application frequently logs repetitive messages in quick succession, leading to cluttered log files and making it harder to track unique issues.

I'd like to prevent duplicate log messages from being logged. My current Log4j configuration includes several appenders (e.g., ConsoleAppender), and I've been looking into filtering options to address the duplication issue.

Here are the key things I'm trying to achieve:

  • Ensure that any repeated log messages are filtered out.
  • Keep the logging configuration in the log4j.xml file as simple as possible.
  • Ideally, implement a custom filter or some other solution that integrates with my existing Log4j setup.

So far, I've attempted to add filters to log4j.xml, but I'm unsure of the best way to apply this filtering logic.

Could someone suggest an efficient way to filter out duplicate log messages with Log4j? Is there an existing filter, or would I need to implement a custom one? Any guidance on how to configure it in the log4j.xml file would be greatly appreciated!

  • I'm using SLF4J for logging, with Log4j as the implementation.
  • My appender is a ConsoleAppender, but I may extend this to other appenders later.
  • I'm open to custom filters or any built-in features of Log4j that would help.
  • Can you clarify with an example what "duplicate log message" means for you? Are you talking about the same log event being logged more than once (same timestamp, message, thread and other attributes)? Or are you talking about multiple log events with the same message being printed at an excessive rate? Commented Oct 15, 2024 at 19:38
  • 1
    @Piotr P. Karwasz With duplicated message I mean that the following message is logged mutiple times: 2024-10-15 17:00:54|INFO|Setting up connection Commented Oct 16, 2024 at 8:24
  • Can you edit your question and add your Log4j 2 Core configuration file? Commented Oct 16, 2024 at 8:52
  • Looking at your solution it seems that you are receiving separate log events with the same message string (they come from separate log calls in your application). If you were using Log4j 2 instead of Log4j 1, you could use the provided BurstFilter to limit the rate of log events. Commented Oct 17, 2024 at 9:46
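To illustrate the BurstFilter mentioned above: in Log4j 2 (not Log4j 1, which the question uses), rate limiting is built in and can be attached to an appender directly in the configuration. A minimal sketch of a Log4j 2 configuration using it, assuming a console appender and the pattern from the question, might look like this (`rate` is events per second allowed on average, `maxBurst` the maximum burst size before filtering kicks in):

```xml
<Configuration>
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss}|%p|%m%n"/>
      <!-- Deny events above the configured rate; applies to INFO and below -->
      <BurstFilter level="INFO" rate="16" maxBurst="100"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="debug">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```

Note that BurstFilter limits the overall event rate rather than deduplicating identical messages, so it addresses excessive repetition rather than exact duplicates.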

2 Answers

1

I partially solved this problem yesterday. There are no duplicates in the logs anymore, but I have noticed that log output stops coming through after a short while. I assume it has something to do with the strictness of the filter in my DuplicateMessageFilter class. This is the appender I added yesterday, along with the corresponding class:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration>

<!-- List of other appenders -->

<appender name="consoleAppender" class="org.apache.log4j.ConsoleAppender">
    <param name="Target" value="System.out"/>
    <layout class="org.apache.log4j.EnhancedPatternLayout">
        <param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss}|%p|%m%n" />
    </layout>
    <filter class="com.someanonymouspackage.DuplicateMessageFilter">
    </filter>
</appender>

<!-- List of logger elements -->

<root>
    <priority value="debug" />
    <appender-ref ref="consoleAppender" />
</root>
</log4j:configuration>
import java.util.HashSet;
import java.util.Set;

import org.apache.log4j.spi.Filter;
import org.apache.log4j.spi.LoggingEvent;

public class DuplicateMessageFilter extends Filter {
    private static final Set<String> messageCache = new HashSet<>();
    private static final long CACHE_TIMEOUT = 5000;
    private long lastLogTime = 0;

    @Override
    public int decide(LoggingEvent event) {
        String message = event.getRenderedMessage();
        long currentTime = System.currentTimeMillis();

        // The cache is only cleared after a quiet period of CACHE_TIMEOUT ms.
        // Note: because lastLogTime is updated on every event, under steady
        // traffic this condition never fires and the cache never empties.
        if ((currentTime - lastLogTime) > CACHE_TIMEOUT) {
            messageCache.clear();
        }

        lastLogTime = currentTime;

        if (messageCache.contains(message)) {
            return Filter.DENY;
        } else {
            messageCache.add(message);
            return Filter.NEUTRAL;
        }
    }
}
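The symptom that logging stops coming through is consistent with the cache never being cleared: lastLogTime is updated on every event, so under steady traffic the gap between events never exceeds CACHE_TIMEOUT and the clear never happens. One alternative, sketched below under the assumption that suppressing a repeated message for a fixed time window is acceptable, is to track a last-seen timestamp per message instead of one global timer. The class name MessageDeduplicator and its API are illustrative, not part of Log4j; the clock is passed in explicitly to keep the logic testable, and the lookup logic could be called from a Filter's decide method.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical helper: remembers when each message was last allowed through,
// so an identical message is only suppressed within a fixed time window.
class MessageDeduplicator {
    private final long windowMillis;
    private final Map<String, Long> lastSeen = new ConcurrentHashMap<>();

    MessageDeduplicator(long windowMillis) {
        this.windowMillis = windowMillis;
    }

    /** Returns true if the message should be logged, i.e. it was not
     *  already allowed through within the last windowMillis milliseconds. */
    boolean shouldLog(String message, long nowMillis) {
        Long previous = lastSeen.get(message);
        if (previous != null && (nowMillis - previous) <= windowMillis) {
            return false; // duplicate within the window: suppress
        }
        lastSeen.put(message, nowMillis);
        return true;
    }
}
```

Unlike the single global timer, this expires each message independently, so unrelated messages keep flowing and a repeated message reappears once per window. The map grows with the number of distinct messages; if that matters, entries older than the window could be pruned periodically.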

3 Comments

I strongly suggest you migrate from Log4j 1 to Log4j 2 Core: Log4j 1 reached end-of-life in 2015 and has several vulnerabilities. You can find a Migrating from Log4j 1 guide on the Log4j website.
Thank you for your suggestion regarding the migration from Log4j 1 to Log4j 2 Core. I appreciate the importance of using up-to-date libraries to mitigate vulnerabilities. However, I want to clarify that this project is for work, and the decision to migrate lies with the project management team. In the meantime, could you suggest any immediate solutions or configurations within Log4j 1 that can help solve the problem regarding my question until a migration can be approved? Many thanks in advance!
If you are stuck with Log4j 1, your rate limiting mechanism looks fine to me. There are more advanced rate limiting algorithms, but if migration is imminent, it is probably not worth the effort to implement them.
-1

See the points below to fix the issue.

  1. Log through SLF4J only.
  2. Check for multiple appenders targeting the same output.
  3. Ensure only one logging implementation is present on the classpath.
  4. Set additivity="false" to avoid message propagation.
  5. Avoid duplicate logging statements in your code.
  6. Make sure there is a single SLF4J binding.

By following these steps, you should be able to prevent duplicate log messages in your Java application.
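For point 4, a minimal Log4j 1 sketch of what additivity="false" looks like in log4j.xml (the logger name com.example.service is hypothetical; consoleAppender refers to the appender defined in the question's configuration):

```xml
<!-- Events from this logger go only to consoleAppender and are not
     also propagated to the root logger's appenders. -->
<logger name="com.example.service" additivity="false">
    <level value="info"/>
    <appender-ref ref="consoleAppender"/>
</logger>
```

This only helps when the same event is being printed twice because both a named logger and the root logger have appenders attached; it does not deduplicate separate log calls with the same message.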

1 Comment

(1) has nothing to do with log duplication. (3) and (6) also don't cause log duplication, since each logging API binds to a single implementation. Regarding (4), the additivity setting exists because it is useful: if you are going to add the parent logger's appenders anyway, there is no need to set it to false.
