Logger class issues after Java and Spark upgrade

I know there have been similar questions about this exception, but none of them solved my problem.
I have a Java app. Recently I had to upgrade from Java 17 to 21, which also led to upgrading Apache Spark to 3.5.0. Before the upgrades my app ran without any problems. After the upgrades I get the following exception when I run it:

Exception in thread "main" java.lang.reflect.InvocationTargetException
        at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:109)
        at java.base/java.lang.reflect.Method.invoke(Method.java:580)
        at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:108)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
        at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:88)
Caused by: java.lang.ClassCastException: class org.apache.logging.slf4j.SLF4JLoggerContext cannot be cast to class org.apache.logging.log4j.core.LoggerContext (org.apache.logging.slf4j.SLF4JLoggerContext and org.apache.logging.log4j.core.LoggerContext are in unnamed module of loader org.springframework.boot.loader.LaunchedURLClassLoader @20fa23c1)
        at org.apache.spark.util.Utils$.setLogLevel(Utils.scala:2318)
        at org.apache.spark.SparkContext.setLogLevel(SparkContext.scala:399)
        at org.apache.spark.api.java.JavaSparkContext.setLogLevel(JavaSparkContext.scala:673)
        at cz.cuni.matfyz.mminfer.wrappers.MongoDBSchemaLessWrapper.initiateContext(MongoDBSchemaLessWrapper.java:72)
        at cz.cuni.matfyz.mminfer.algorithms.rba.RecordBasedAlgorithm.process(RecordBasedAlgorithm.java:27)
        at cz.cuni.matfyz.mminfer.MMInferOneInAll.main(MMInferOneInAll.java:45)
        at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
        ... 5 more

This is how I run my app (the command-line options are there to ensure compatibility between Spark 3.5.0 and Java 21):

java --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED  --add-opens=java.base/java.lang.reflect=ALL-UNNAMED  --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED -jar target/MM-Infer-One-In-All-1.0-SNAPSHOT.jar C:\Users\alzbe\Documents\mff_mgr\Diplomka\Apps\temp\checkpoint srutkova yelpbusinesssample

These are the relevant dependencies in my pom file:

    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>3.5.0</version>
            <!--<scope>provided</scope>-->
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>3.5.0</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.springframework.boot/spring-boot-starter -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
            <version>2.4.13</version>
        </dependency>
        <!--https://mvnrepository.com/artifact/org.mongodb/mongo-java-driver--> 
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>mongo-java-driver</artifactId>
            <version>3.12.10</version>
        </dependency>
        <!--https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector--> 
        <dependency>
            <groupId>org.mongodb.spark</groupId>
            <artifactId>mongo-spark-connector_2.12</artifactId>
            <version>3.0.1</version>
            <scope>compile</scope>
            <type>jar</type>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.junit.jupiter</groupId>
            <artifactId>junit-jupiter</artifactId>
            <version>5.4.0</version>
            <scope>compile</scope>
        </dependency>

    </dependencies>
    

I have tried switching to different versions of Spring Boot and excluding slf4j/log4j (actually, excluding log4j leaves me unable to connect to my MongoDB). However, all of my attempts have failed.

Does anyone have any suggestions as to what the problem might be?

org.apache.logging.slf4j.SLF4JLoggerContext comes from log4j-to-slf4j, the bridge that Spring Boot's default logging setup pulls in.
org.apache.logging.log4j.core.LoggerContext comes from log4j-core, the actual Log4j 2 implementation that Spark expects.
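
The cast that blows up is inside Spark's Utils.setLogLevel. A minimal Java sketch of the failing pattern (illustrative only, not Spark's actual source):

    import org.apache.logging.log4j.LogManager;
    import org.apache.logging.log4j.core.LoggerContext;

    public class CastDemo {
        public static void main(String[] args) {
            // With log4j-core bound, LogManager.getContext(false) returns a
            // core LoggerContext and the cast succeeds. With log4j-to-slf4j
            // bound instead, it returns an SLF4JLoggerContext, and this cast
            // throws exactly the ClassCastException from the stack trace.
            LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
            System.out.println(ctx.getClass().getName());
        }
    }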

Your Spring Boot version uses (forces) Log4j 2.13, while it seems that Spark needs 2.22.
So one thing you can try is to force 2.22 (see the sketch below if you don't know how, and double-check with mvn dependency:tree that the forced version actually wins). If that doesn't work, you need to upgrade Spring: no Spring Boot release currently manages Log4j 2.22, but a combination of Spring Boot 3 and forcing 2.22 might work where Spring Boot 2 with 2.22 does not. Otherwise, unfortunately, you can't use the latest Spark.
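
To force the version in Maven, one option is importing the Log4j BOM in dependencyManagement (a sketch; 2.22.0 is the version suggested above, so double-check it against the Log4j version in Spark's own pom):

    <dependencyManagement>
        <dependencies>
            <!-- Pins all org.apache.logging.log4j artifacts to one version,
                 overriding whatever transitive versions Spring Boot and Spark bring in. -->
            <dependency>
                <groupId>org.apache.logging.log4j</groupId>
                <artifactId>log4j-bom</artifactId>
                <version>2.22.0</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

Then check what actually ends up on the classpath:

    mvn dependency:tree -Dincludes=org.apache.logging.log4j

All the Log4j artifacts listed (log4j-api, log4j-core, log4j-to-slf4j, ...) should report the same forced 2.22.x version.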
