Upgrading Apache Spark core from 3.3.2 to >= 3.4.4 results in a StackOverflowError in logging

I have an application with embedded Spark libraries. Everything works fine with spark-core v3.3.2. After upgrading to >= 3.4.4, getting a SparkSession results in a StackOverflowError inside SLF4J.
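
Roughly what the failing call looks like (a simplified sketch; the actual builder configuration in my application has more options, and spark-sql is on the classpath):

    import org.apache.spark.sql.SparkSession;

    public class SparkSessionRepro {
        public static void main(String[] args) {
            // Simplified version of how the session is obtained; the
            // StackOverflowError is thrown from getOrCreate() after the upgrade.
            SparkSession spark = SparkSession.builder()
                    .appName("repro")
                    .master("local[*]")
                    .getOrCreate();
            spark.stop();
        }
    }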

I've tried to figure out what changed and tried reconfiguring the logging, but without any effect. This is the relevant part of the stack trace:

    Caused by: java.lang.StackOverflowError
        at java.base/java.time.Clock$SystemClock.instant(Clock.java:529)
        at java.base/java.time.Instant.now(Instant.java:273)
        at java.logging/java.util.logging.LogRecord.<init>(LogRecord.java:229)
        at org.slf4j.impl.JDK14LoggerAdapter.log(JDK14LoggerAdapter.java:576)
        at org.slf4j.impl.JDK14LoggerAdapter.log(JDK14LoggerAdapter.java:632)
        at org.slf4j.bridge.SLF4JBridgeHandler.callLocationAwareLogger(SLF4JBridgeHandler.java:232)
        at org.slf4j.bridge.SLF4JBridgeHandler.publish(SLF4JBridgeHandler.java:313)
        at java.logging/java.util.logging.Logger.log(Logger.java:979)
        at org.slf4j.impl.JDK14LoggerAdapter.log(JDK14LoggerAdapter.java:582)
        at org.slf4j.impl.JDK14LoggerAdapter.log(JDK14LoggerAdapter.java:632)
        at org.slf4j.bridge.SLF4JBridgeHandler.callLocationAwareLogger(SLF4JBridgeHandler.java:232)

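Reading the trace, it looks like a loop between the JUL-to-SLF4J bridge (SLF4JBridgeHandler) and the SLF4J-to-JUL binding (slf4j-jdk14's JDK14LoggerAdapter): each record published by java.util.logging is forwarded to SLF4J, which hands it straight back to java.util.logging. A minimal sketch of that combination, assuming the bridge is installed somewhere on the classpath as the trace suggests (this is not the actual Spark initialization code):

    import java.util.logging.Logger;
    import org.slf4j.bridge.SLF4JBridgeHandler;

    public class LoggingLoopSketch {
        public static void main(String[] args) {
            // Route java.util.logging through SLF4J ...
            SLF4JBridgeHandler.removeHandlersForRootLogger();
            SLF4JBridgeHandler.install();

            // ... while the slf4j-jdk14 binding routes SLF4J back to java.util.logging.
            // A single log call then bounces between the two until the stack overflows.
            Logger.getLogger("demo").info("hello");
        }
    }
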
Any suggestions?

I've tried switching between the slf4j-nop and slf4j-simple configurations. I've tried using Spark 3.5.4. I've tried switching logging levels.

Nothing helps.
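
For reference, a tiny diagnostic sketch (not part of the application) that prints which SLF4J logger factory ends up active, i.e. whether slf4j-jdk14, slf4j-simple, or slf4j-nop wins on the classpath:

    import org.slf4j.LoggerFactory;

    public class BindingCheck {
        public static void main(String[] args) {
            // The concrete ILoggerFactory class reveals the SLF4J binding in use,
            // e.g. org.slf4j.impl.JDK14LoggerFactory for slf4j-jdk14.
            System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
        }
    }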