Reputation: 381
I'm building a Java application with the Spark framework, embedded Jetty, and the Handlebars template engine. But when I get a 500 Internal Server Error, the console doesn't say anything. I have added the dependencies listed here to my pom.xml: http://sparkjava.com/documentation.html#add-a-logger, but it still does not print all exceptions/errors (such as 500 errors).
Here are my pom.xml dependencies:
<dependencies>
    <!-- FRAMEWORK: Spark -->
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.5</version>
    </dependency>
    <!-- TEMPLATES: Handlebars -->
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-template-handlebars</artifactId>
        <version>2.3</version>
    </dependency>
    <!-- DB-MAPPING: sql2o -->
    <dependency>
        <groupId>org.sql2o</groupId>
        <artifactId>sql2o</artifactId>
        <version>1.5.4</version>
    </dependency>
    <!-- DRIVERS: sqlite -->
    <dependency>
        <groupId>org.xerial</groupId>
        <artifactId>sqlite-jdbc</artifactId>
        <version>3.8.11.2</version>
    </dependency>
    <!-- LOGGER: slf4j -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.21</version>
    </dependency>
</dependencies>
How can I enable all logging for Spark?
Upvotes: 11
Views: 12121
Reputation: 5126
I'm not sure if this question meant disabling Spark's or Hadoop's built-in logging, but if that's the case, setting the log level on the SparkContext helped me.
sc.setLogLevel("ERROR");
Possible options are ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, and WARN.
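For context, here is a minimal sketch of where that call would go, assuming an Apache Spark application (not the Spark Java web framework the question is about); the app name, master, and class name are placeholder values:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class QuietLoggingExample {
    public static void main(String[] args) {
        // Placeholder app name and master; assumes Apache Spark is on the classpath
        SparkConf conf = new SparkConf().setAppName("example").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // From here on, only messages at ERROR level or above are printed
        sc.setLogLevel("ERROR");

        // ... job code ...

        sc.close();
    }
}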
Upvotes: -1
Reputation: 19070
To enable logging, just add the following dependency to your project:
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <version>1.7.21</version>
</dependency>
and you can register a catch-all Spark exception handler to log uncaught exceptions:
Spark.exception(Exception.class, (exception, request, response) -> {
    exception.printStackTrace();
});
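For completeness, here is a slightly fuller sketch that logs through SLF4J and returns a 500 body instead of only printing the stack trace; the route, logger name, and response text are arbitrary illustration choices, assuming Spark 2.5 as in the question's pom.xml:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import spark.Spark;

public class App {
    private static final Logger LOG = LoggerFactory.getLogger(App.class);

    public static void main(String[] args) {
        // Example route, just so there is something that can fail
        Spark.get("/hello", (req, res) -> "Hello");

        // Catch-all handler: log every uncaught exception and answer with a 500
        Spark.exception(Exception.class, (exception, request, response) -> {
            LOG.error("Request to {} failed", request.pathInfo(), exception);
            response.status(500);
            response.body("Internal Server Error");
        });
    }
}
With slf4j-simple on the classpath, the LOG.error call prints the full stack trace to the console, which is usually enough to see why a request ended in a 500.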
Upvotes: 17
Reputation: 89
Use log4j to provide a logging implementation. The lack of one is why you have no idea why you are getting an internal server error.
http://logging.apache.org/log4j/2.x/
Upvotes: 2
Reputation: 3
Have you added a log4j properties file? Have a look at this documentation.
Configuring Logging: Spark uses log4j for logging. You can configure it by adding a log4j.properties file in the conf directory. One way to start is to copy the existing log4j.properties.template located there.
Upvotes: -3