How to get MDC logging working for Spring Batch

In Spring Batch it would be great to keep track of the execution thread through logging. However, MDC does not seem to work:

MDC.put("process", "batchJob");
logger.info("{}; status={}", getJobName(), batchStatus.name());

Has anyone got MDC working in Spring Batch?

I solved it by adding a JobExecutionListener like this:
public class Slf4jBatchJobListener implements JobExecutionListener {

    private static final String DEFAULT_MDC_UUID_TOKEN_KEY = "Slf4jMDCFilter.UUID";
    private final Logger logger = LoggerFactory.getLogger(getClass());

    public void beforeJob(JobExecution jobExecution) {
        String token = UUID.randomUUID().toString().toUpperCase();
        MDC.put(DEFAULT_MDC_UUID_TOKEN_KEY, token);
        logger.info("Job {} with id {} starting...", jobExecution.getJobInstance().getJobName(), jobExecution.getId());
    }

    public void afterJob(JobExecution jobExecution) {
        logger.info("Job {} with id {} ended.", jobExecution.getJobInstance().getJobName(), jobExecution.getId());
        MDC.remove(DEFAULT_MDC_UUID_TOKEN_KEY);
    }
}
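To make the listener take effect it has to be registered on the job definition. A minimal sketch, assuming a Java-config setup with JobBuilderFactory (the job and step bean names are placeholders):

@Bean
public Job batchJob(JobBuilderFactory jobBuilderFactory, Step batchStep) {
    return jobBuilderFactory.get("batchJob")
            .listener(new Slf4jBatchJobListener()) // puts/removes the MDC token around the run
            .start(batchStep)
            .build();
}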
Because some jobs are multi-threaded, I also had to add a TaskDecorator in order to copy the MDC from the parent thread to the child threads, like this:
public class Slf4JTaskDecorator implements TaskDecorator {

    @Override
    public Runnable decorate(Runnable runnable) {
        // capture the parent thread's MDC before the task is handed to the pool
        Map<String, String> contextMap = MDC.getCopyOfContextMap();
        return () -> {
            try {
                MDC.setContextMap(contextMap);
                runnable.run();
            } finally {
                MDC.clear();
            }
        };
    }
}
Set the TaskDecorator on the TaskExecutor:
@Bean
public TaskExecutor taskExecutor() {
    SimpleAsyncTaskExecutor taskExecutor = new SimpleAsyncTaskExecutor("spring_batch");
    taskExecutor.setConcurrencyLimit(maxThreads);
    taskExecutor.setTaskDecorator(new Slf4JTaskDecorator());
    return taskExecutor;
}
And lastly, update the logging pattern in the properties:

logging:
  pattern:
    level: "%5p %X{Slf4jMDCFilter.UUID}"
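If you configure logging through a flat application.properties instead of YAML, the equivalent single line is:

logging.pattern.level=%5p %X{Slf4jMDCFilter.UUID}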

Related

Dynamic config Spring batch execution time and parameters via database configured parameters

I am a new Spring Batch user; please help me. Here is my requirement:
I have built several Spring Batch jobs with different names. I want to execute these jobs with different job parameters, and I would like those parameters to be configurable dynamically in a database, so that I can add new job executions with different job names and different parameters.
I also want to schedule my job executions at different times, with the crontab expressions configurable as well.
The database structure might look like:

id
task_name
spring_batch_job_name
cron_expression

I wonder if someone can guide me. Many thanks!
Here is my job setting entity:
@Entity
@Table(name = "report_tasks_manager", schema = "reconciliation", catalog = "")
public class ReportTasksManager {

    private int id;
    private String taskDesc;
    private String taskName;
    // crontab expression
    private String cronExpression;
    // class name to execute job logic
    private String methodName;
    private int state;
    private Integer conCurrent;
    private String reserved1;
    private String reserved2;
    private String reserved3;
    private Timestamp startTime;
    private Timestamp endTime;
    private Timestamp createTime;
}
I defined a class which implements the Quartz Job interface; the execute() method in this class runs the business logic, such as launching a Spring Batch job:
public class QuartzJobFactory implements Job {

    public QuartzJobFactory() {
    }

    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        System.out.println("time = {" + new Date() + "}");
        System.out.println("starting job build factory");
        ReportTasksManager reportTasksManager = (ReportTasksManager) jobExecutionContext.getMergedJobDataMap().get("scheduleJob");
        System.out.println("job name = {" + reportTasksManager.getTaskName() + "}");
    }
}
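The execute() method above only prints the task name. To actually launch a Spring Batch job from it, one option is to resolve the job by name and run it; a sketch, assuming the Quartz job is instantiated through a Spring-aware job factory so that autowiring works, and that the stored job name matches a registered Job bean:

@Autowired
private JobLauncher jobLauncher;

@Autowired
private JobLocator jobLocator; // resolves Spring Batch jobs by name

@Override
public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
    ReportTasksManager task = (ReportTasksManager) jobExecutionContext.getMergedJobDataMap().get("scheduleJob");
    try {
        // fully qualified to avoid the name clash with org.quartz.Job
        org.springframework.batch.core.Job batchJob = jobLocator.getJob(task.getTaskName());
        JobParameters parameters = new JobParametersBuilder()
                .addLong("run.timestamp", System.currentTimeMillis()) // keep each run instance unique
                .toJobParameters();
        jobLauncher.run(batchJob, parameters);
    } catch (Exception e) {
        throw new JobExecutionException(e);
    }
}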
For registering the cron triggers, I defined a REST controller that reads the job parameters from the database and configures the scheduler:
@RestController
@RequestMapping(path = "test")
public class QuartzManager {

    private SchedulerFactory schedulerFactory = new StdSchedulerFactory();

    @Autowired
    private ReportTaskManagerDAO reportTaskManagerDAO;

    @GetMapping(value = "schedule")
    public void scheduleJob() {
        // Read settings from the database
        List<ReportTasksManager> quartzList = reportTaskManagerDAO.findAll();
        if (quartzList.size() > 0) {
            quartzList.forEach(reportTasksManager -> {
                try {
                    configQuartz(reportTasksManager, schedulerFactory.getScheduler());
                } catch (SchedulerException | ClassNotFoundException e) {
                    e.printStackTrace();
                }
            });
        }
    }

    @SuppressWarnings("unchecked")
    private void configQuartz(ReportTasksManager reportTasksManager, Scheduler scheduler) throws SchedulerException, ClassNotFoundException {
        TriggerKey triggerKey = TriggerKey.triggerKey(reportTasksManager.getTaskName(), Scheduler.DEFAULT_GROUP);
        // check if the trigger is already defined in the scheduler
        CronTrigger trigger = (CronTrigger) scheduler.getTrigger(triggerKey);
        if (null == trigger) {
            // not defined yet: create a new trigger and JobDetail
            JobDetail jobDetail =
                    JobBuilder.newJob((Class<? extends Job>) Class.forName(reportTasksManager.getMethodName()))
                            .withIdentity(reportTasksManager.getTaskName(), Scheduler.DEFAULT_GROUP)
                            .build();
            jobDetail.getJobDataMap().put("scheduleJob", reportTasksManager);
            CronScheduleBuilder scheduleBuilder = CronScheduleBuilder.cronSchedule(reportTasksManager.getCronExpression());
            trigger = TriggerBuilder.newTrigger()
                    .withIdentity(reportTasksManager.getTaskName(), Scheduler.DEFAULT_GROUP)
                    .withSchedule(scheduleBuilder)
                    .build();
            scheduler.scheduleJob(jobDetail, trigger);
            scheduler.start();
        } else {
            // already defined: update the schedule
            CronScheduleBuilder scheduleBuilder = CronScheduleBuilder.cronSchedule(reportTasksManager.getCronExpression());
            trigger = trigger.getTriggerBuilder()
                    .withIdentity(triggerKey)
                    .withSchedule(scheduleBuilder)
                    .build();
            scheduler.rescheduleJob(triggerKey, trigger);
        }
    }
}
You can create a util class (populated on @PostConstruct) which loads your job config from the DB.
For example:
@Entity
public class Configuration {

    @Id
    private long id;
    private String field;
    private String value;

    // getters and setters
}

@Component
public interface ConfigurationRepo extends JpaRepository<Configuration, Long> {
}
public final class ConfigurationUtil {

    private ConfigurationUtil() {
    }

    private static List<Configuration> defaultConfiguration;

    /**
     * @return the defaultConfiguration
     */
    public static List<Configuration> getDefaultConfiguration() {
        return defaultConfiguration;
    }

    /**
     * @param defaultConfiguration the defaultConfiguration to set
     */
    public static void setDefaultConfiguration(List<Configuration> defaultConfiguration) {
        ConfigurationUtil.defaultConfiguration = defaultConfiguration;
    }

    public static String getValueByField(String field) {
        return defaultConfiguration.stream()
                .filter(s -> s.getField()
                        .equalsIgnoreCase(field))
                .findFirst()
                .get()
                .getValue();
    }
}
@Component
public class ConfigurationContextInitializer {

    @Autowired
    ConfigurationRepo configurationRepo;

    @PostConstruct
    public void init() {
        ConfigurationUtil.setDefaultConfiguration(configurationRepo.findAll());
    }
}

// To access a DB value:
ConfigurationUtil.getValueByField("JOB_NAME"); // depends on your DB key

Multi-Tenancy in a Reactive Spring Boot application using mongodb-reactive

How can we create a multi-tenant application in Spring WebFlux using the MongoDB reactive repository?
I cannot find any complete resources on the web for reactive applications; all the resources available are for non-reactive applications.
UPDATE:
In a non-reactive application we used to store contextual data in a ThreadLocal, but this cannot be done in reactive applications because of thread switching. There is a way to store contextual info in the Reactor Context inside a WebFilter, but I don't know how to get hold of that data in the ReactiveMongoDatabaseFactory class.
Thanks.
I was able to implement multi-tenancy in a Spring Reactive application using MongoDB. The main classes responsible are: a custom MongoDbFactory class, a WebFilter class (instead of a servlet Filter) for capturing tenant info, and a ThreadLocal holder class for storing tenant info. The flow is very simple:
Capture the tenant info from the request in the WebFilter and set it in the ThreadLocal. Here I am sending the tenant info in a header: X-Tenant.
Implement a custom MongoDbFactory class and override the getMongoDatabase() method to return the database based on the current tenant available in the ThreadLocal class.
Source code is:
CurrentTenantHolder.java
package com.jazasoft.demo;
public class CurrentTenantHolder {

    private static final ThreadLocal<String> currentTenant = new InheritableThreadLocal<>();

    public static String get() {
        return currentTenant.get();
    }

    public static void set(String tenant) {
        currentTenant.set(tenant);
    }

    public static String remove() {
        synchronized (currentTenant) {
            String tenant = currentTenant.get();
            currentTenant.remove();
            return tenant;
        }
    }
}
TenantContextWebFilter.java
package com.example.demo;
import org.springframework.http.server.reactive.ServerHttpRequest;
import org.springframework.stereotype.Component;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebFilter;
import org.springframework.web.server.WebFilterChain;
import reactor.core.publisher.Mono;
@Component
public class TenantContextWebFilter implements WebFilter {

    public static final String TENANT_HTTP_HEADER = "X-Tenant";

    @Override
    public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
        ServerHttpRequest request = exchange.getRequest();
        if (request.getHeaders().containsKey(TENANT_HTTP_HEADER)) {
            String tenant = request.getHeaders().getFirst(TENANT_HTTP_HEADER);
            CurrentTenantHolder.set(tenant);
        }
        return chain.filter(exchange).doOnSuccessOrError((Void v, Throwable throwable) -> CurrentTenantHolder.remove());
    }
}
MultiTenantMongoDbFactory.java
package com.example.demo;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoDatabase;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;
public class MultiTenantMongoDbFactory extends SimpleReactiveMongoDatabaseFactory {

    private final String defaultDatabase;

    public MultiTenantMongoDbFactory(MongoClient mongoClient, String databaseName) {
        super(mongoClient, databaseName);
        this.defaultDatabase = databaseName;
    }

    @Override
    public MongoDatabase getMongoDatabase() throws DataAccessException {
        final String tlName = CurrentTenantHolder.get();
        final String dbToUse = (tlName != null ? tlName : this.defaultDatabase);
        return super.getMongoDatabase(dbToUse);
    }
}
MongoDbConfig.java
package com.example.demo;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.ReactiveMongoClientFactoryBean;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
@Configuration
public class MongoDbConfig {

    @Bean
    public ReactiveMongoTemplate reactiveMongoTemplate(MultiTenantMongoDbFactory multiTenantMongoDbFactory) {
        return new ReactiveMongoTemplate(multiTenantMongoDbFactory);
    }

    @Bean
    public MultiTenantMongoDbFactory multiTenantMongoDbFactory(MongoClient mongoClient) {
        return new MultiTenantMongoDbFactory(mongoClient, "test1");
    }

    @Bean
    public ReactiveMongoClientFactoryBean mongoClient() {
        ReactiveMongoClientFactoryBean clientFactory = new ReactiveMongoClientFactoryBean();
        clientFactory.setHost("localhost");
        return clientFactory;
    }
}
UPDATE:
In reactive streams we cannot store contextual information in a ThreadLocal any more, as the request is not tied to a single thread, so this is not the correct solution.
However, contextual information can be stored in the Reactor Context inside a WebFilter, like this: chain.filter(exchange).subscriberContext(context -> context.put("tenant", tenant));. The problem is how to get hold of this contextual info in the ReactiveMongoDatabaseFactory implementation class.
Here is my very rough working solution for Spring WebFlux. They have since updated ReactiveMongoDatabaseFactory: getMongoDatabase() now returns a Mono.
Create web filter
public class TenantContextFilter implements WebFilter {

    private static final Logger LOGGER = LoggerFactory.getLogger(TenantContextFilter.class);

    @Override
    public Mono<Void> filter(ServerWebExchange swe, WebFilterChain wfc) {
        ServerHttpRequest request = swe.getRequest();
        HttpHeaders headers = request.getHeaders();
        if (headers.getFirst("X-TENANT-ID") == null) {
            LOGGER.info("Missing X-TENANT-ID header");
            throw new ResponseStatusException(HttpStatus.UNAUTHORIZED);
        }
        String tenantId = headers.getFirst("X-TENANT-ID");
        LOGGER.info(String.format("Processing request with tenant identifier [%s]", tenantId));
        return wfc.filter(swe)
                .contextWrite(TenantContextHolder.setTenantId(tenantId));
    }
}
Create class to get context (credit to somewhere I found this)
public class TenantContextHolder {

    public static final String TENANT_ID = TenantContextHolder.class.getName() + ".TENANT_ID";

    public static Context setTenantId(String id) {
        return Context.of(TENANT_ID, Mono.just(id));
    }

    public static Mono<String> getTenantId() {
        return Mono.deferContextual(contextView -> {
            if (contextView.hasKey(TENANT_ID)) {
                return contextView.get(TENANT_ID);
            }
            return Mono.empty();
        });
    }

    public static Function<Context, Context> clearContext() {
        return (context) -> context.delete(TENANT_ID);
    }
}
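For illustration, any downstream component in the reactive chain can then resolve the tenant; a hypothetical endpoint:

@GetMapping("/whoami")
public Mono<String> whoAmI() {
    // reads the value written into the Reactor context by TenantContextFilter
    return TenantContextHolder.getTenantId().defaultIfEmpty("no-tenant");
}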
My spring security setup (all requests allowed for testing)
@EnableWebFluxSecurity
@EnableReactiveMethodSecurity
public class SecurityConfig {

    @Bean
    public SecurityWebFilterChain webFilterChain(ServerHttpSecurity http) {
        return http
                .formLogin(it -> it.disable())
                .cors(it -> it.disable()) // fix this
                .httpBasic(it -> it.disable())
                .csrf(it -> it.disable())
                .securityContextRepository(NoOpServerSecurityContextRepository.getInstance())
                .authorizeExchange(it -> it.anyExchange().permitAll()) // allow anonymous
                .addFilterAt(new TenantContextFilter(), SecurityWebFiltersOrder.HTTP_BASIC)
                .build();
    }
}
Create Tenant Mongo DB Factory
I still have some clean-up work for defaults etc...
public class MultiTenantMongoDBFactory extends SimpleReactiveMongoDatabaseFactory {

    private static final Logger LOGGER = LoggerFactory.getLogger(MultiTenantMongoDBFactory.class);
    private final String defaultDb;

    public MultiTenantMongoDBFactory(MongoClient mongoClient, String databaseName) {
        super(mongoClient, databaseName);
        this.defaultDb = databaseName;
    }

    @Override
    public Mono<MongoDatabase> getMongoDatabase() throws DataAccessException {
        return TenantContextHolder.getTenantId()
                .map(id -> {
                    LOGGER.info(String.format("Database being retrieved is [%s]", id));
                    return super.getMongoDatabase(id);
                })
                .flatMap(db -> db)
                .log();
    }
}
Configuration Class
@Configuration
@EnableReactiveMongoAuditing
@EnableReactiveMongoRepositories(basePackages = {"com.order.repository"})
class MongoDbConfiguration {

    @Bean
    public ReactiveMongoDatabaseFactory reactiveMongoDatabaseFactory() {
        return new MultiTenantMongoDBFactory(MongoClients.create("mongodb://user:password@localhost:27017"), "tenant_catalog");
    }

    @Bean
    public ReactiveMongoTemplate reactiveMongoTemplate() {
        ReactiveMongoTemplate template = new ReactiveMongoTemplate(reactiveMongoDatabaseFactory());
        template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
        return template;
    }
}
Entity Class
@Document(collection = "order")
// getters
// setters
Testing
Create two Mongo databases with the same collection and put different documents in each.
In Postman I just did a GET request with the "X-TENANT-ID" header set to the database name (e.g. tenant-12343 or tenant-34383), and good to go!

Spring Batch: File not being read

I am trying to create an application that uses the spring-batch-excel extension to read Excel files uploaded through a web interface by its users, in order to parse the Excel file for addresses.
When the code runs there is no error, but all I get is the following in my log, even though I have log/syso statements throughout my Processor and Writer (these are never called; all I can imagine is that the file is not being read properly, returning no data to process/write). And yes, the file has data, several thousand records in fact.
Job: [FlowJob: [name=excelFileJob]] launched with the following parameters: [{file=Book1.xlsx}]
Executing step: [excelFileStep]
Job: [FlowJob: [name=excelFileJob]] completed with the following parameters: [{file=Book1.xlsx}] and the following status: [COMPLETED]
Below is my JobConfig
@Configuration
@EnableBatchProcessing
public class AddressExcelJobConfig {

    @Bean
    public BatchConfigurer configurer(EntityManagerFactory entityManagerFactory) {
        return new CustomBatchConfigurer(entityManagerFactory);
    }

    @Bean
    Step excelFileStep(ItemReader<AddressExcel> excelAddressReader,
                       ItemProcessor<AddressExcel, AddressExcel> excelAddressProcessor,
                       ItemWriter<AddressExcel> excelAddressWriter,
                       StepBuilderFactory stepBuilderFactory) {
        return stepBuilderFactory.get("excelFileStep")
                .<AddressExcel, AddressExcel>chunk(1)
                .reader(excelAddressReader)
                .processor(excelAddressProcessor)
                .writer(excelAddressWriter)
                .build();
    }

    @Bean
    Job excelFileJob(JobBuilderFactory jobBuilderFactory,
                     @Qualifier("excelFileStep") Step excelAddressStep) {
        return jobBuilderFactory.get("excelFileJob")
                .incrementer(new RunIdIncrementer())
                .flow(excelAddressStep)
                .end()
                .build();
    }
}
Below is my AddressExcelReader.
The late binding works fine and there is no error. I have tried loading the resource from the given file name, in addition to creating a new ClassPathResource and a FileSystemResource. All give me the same results.
@Component
@StepScope
public class AddressExcelReader implements ItemReader<AddressExcel> {

    private PoiItemReader<AddressExcel> itemReader = new PoiItemReader<AddressExcel>();

    @Override
    public AddressExcel read()
            throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        return itemReader.read();
    }

    public AddressExcelReader(@Value("#{jobParameters['file']}") String file, StorageService storageService) {
        //Resource resource = storageService.loadAsResource(file);
        //Resource testResource = new FileSystemResource("upload-dir/Book1.xlsx");
        itemReader.setResource(new ClassPathResource("/upload-dir/Book1.xlsx"));
        itemReader.setLinesToSkip(1);
        itemReader.setStrict(true);
        itemReader.setRowMapper(excelRowMapper());
    }

    public RowMapper<AddressExcel> excelRowMapper() {
        BeanWrapperRowMapper<AddressExcel> rowMapper = new BeanWrapperRowMapper<>();
        rowMapper.setTargetType(AddressExcel.class);
        return rowMapper;
    }
}
Below is my AddressExcelProcessor
@Component
public class AddressExcelProcessor implements ItemProcessor<AddressExcel, AddressExcel> {

    private static final Logger log = LoggerFactory.getLogger(AddressExcelProcessor.class);

    @Override
    public AddressExcel process(AddressExcel item) throws Exception {
        System.out.println("Converting " + item);
        log.info("Convert {}", item);
        return item;
    }
}
Again, this is never coming into play (no logs are generated). And if it matters, this is how I'm launching my job from a FileUploadController, in a @PostMapping("/") handler that first stores the uploaded file and then runs the job:
@PostMapping("/")
public String handleFileUpload(@RequestParam("file") MultipartFile file, RedirectAttributes redirectAttributes) {
    storageService.store(file);
    try {
        JobParameters jobParameters = new JobParametersBuilder()
                .addString("file", file.getOriginalFilename().toString()).toJobParameters();
        jobLauncher.run(job, jobParameters);
    } catch (JobExecutionAlreadyRunningException | JobRestartException | JobInstanceAlreadyCompleteException
            | JobParametersInvalidException e) {
        e.printStackTrace();
    }
    redirectAttributes.addFlashAttribute("message",
            "You successfully uploaded " + file.getOriginalFilename() + "!");
    return "redirect:/";
}
And last but not least, here is my AddressExcel POJO:
import lombok.Data;

@Data
public class AddressExcel {

    private String address1;
    private String address2;
    private String city;
    private String state;
    private String zip;

    public AddressExcel() {}
}
UPDATE (10/13/2016)
Following Nghia Do's comments, I also created my own RowMapper instead of using the BeanWrapper, to see if that was the issue. Still the same results.
public class AddressExcelRowMapper implements RowMapper<AddressExcel> {

    @Override
    public AddressExcel mapRow(RowSet rs) throws Exception {
        AddressExcel temp = new AddressExcel();
        temp.setAddress1(rs.getColumnValue(0));
        temp.setAddress2(rs.getColumnValue(1));
        temp.setCity(rs.getColumnValue(2));
        temp.setState(rs.getColumnValue(3));
        temp.setZip(rs.getColumnValue(4));
        return temp;
    }
}
It seems all I needed was to add the following to my ItemReader:
itemReader.afterPropertiesSet();
itemReader.open(new ExecutionContext());
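For reference, the reader constructor with that fix applied might look like this (a sketch; using a FileSystemResource built from the upload directory and job parameter is an assumption based on the code above):

public AddressExcelReader(@Value("#{jobParameters['file']}") String file, StorageService storageService) {
    itemReader.setResource(new FileSystemResource("upload-dir/" + file));
    itemReader.setLinesToSkip(1);
    itemReader.setStrict(true);
    itemReader.setRowMapper(excelRowMapper());
    try {
        itemReader.afterPropertiesSet();         // validates the reader configuration
        itemReader.open(new ExecutionContext()); // opens the workbook so read() returns rows
    } catch (Exception e) {
        throw new IllegalStateException("Could not open Excel file: " + file, e);
    }
}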

Unreachable security context using Feign RequestInterceptor

The goal is to attach some data from the security context using a RequestInterceptor, but the problem is that calling SecurityContextHolder.getContext().getAuthentication() always returns null, even though it is not null (I am 100% sure).
As I understand it, that's because the interceptor is created and run in another thread.
How can I solve this problem and get the actual data from the security context?
My service:
@FeignClient(value = "api", configuration = { FeignConfig.class })
public interface DocumentService {

    @RequestMapping(value = "/list", method = RequestMethod.GET)
    DocumentListOperation list();
}
My FeignConfig class:
@Bean
public RequestInterceptor requestInterceptor() {
    return new HeaderInterceptor(userService);
}

public class HeaderInterceptor implements RequestInterceptor {

    private UserService userService;

    public HeaderInterceptor(UserService userService) {
        this.userService = userService;
    }

    @Override
    public void apply(RequestTemplate requestTemplate) {
        Authentication a = SecurityContextHolder.getContext().getAuthentication();
        requestTemplate.header("authentication", a.toString());
    }
}
I managed to figure it out, thanks to an article I found.
First you need to initialize the HystrixRequestContext: HystrixRequestContext.initializeContext();.
You have to create your own context in which you will store the information you need to pass to the Hystrix child threads.
Here is an example:
public class UserHystrixRequestContext {

    private static final HystrixRequestVariableDefault<User> userContextVariable = new HystrixRequestVariableDefault<>();

    private UserHystrixRequestContext() {}

    public static HystrixRequestVariableDefault<User> getInstance() {
        return userContextVariable;
    }
}
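The context also has to be initialized and populated for each incoming request before the Feign/Hystrix call runs, for example in a servlet filter; a minimal sketch (casting the principal to your User type is an assumption):

@Component
public class HystrixRequestContextFilter extends OncePerRequestFilter {

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
            FilterChain chain) throws ServletException, IOException {
        HystrixRequestContext context = HystrixRequestContext.initializeContext();
        try {
            Authentication auth = SecurityContextHolder.getContext().getAuthentication();
            if (auth != null) {
                // assumption: the authenticated principal is your User object
                UserHystrixRequestContext.getInstance().set((User) auth.getPrincipal());
            }
            chain.doFilter(request, response);
        } finally {
            context.shutdown(); // releases the request variables for this request
        }
    }
}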
Then you have to register a new concurrency strategy that wraps the Callable interface:
@Component
public class CustomHystrixConcurrencyStrategy extends HystrixConcurrencyStrategy {

    public CustomHystrixConcurrencyStrategy() {
        HystrixPlugins.getInstance().registerConcurrencyStrategy(this);
    }

    @Override
    public <T> Callable<T> wrapCallable(Callable<T> callable) {
        return new HystrixContextWrapper<T>(callable);
    }

    public static class HystrixContextWrapper<V> implements Callable<V> {

        private HystrixRequestContext hystrixRequestContext;
        private Callable<V> delegate;

        public HystrixContextWrapper(Callable<V> delegate) {
            this.hystrixRequestContext = HystrixRequestContext.getContextForCurrentThread();
            this.delegate = delegate;
        }

        @Override
        public V call() throws Exception {
            HystrixRequestContext existingState = HystrixRequestContext.getContextForCurrentThread();
            try {
                HystrixRequestContext.setContextOnCurrentThread(this.hystrixRequestContext);
                return this.delegate.call();
            } finally {
                HystrixRequestContext.setContextOnCurrentThread(existingState);
            }
        }
    }
}
So before the Callable object is invoked, we set the new thread's context to the parent's context.
Once that is done, you should be able to access your newly defined context inside Hystrix child threads:

User user = UserHystrixRequestContext.getInstance().get();
Hope that will help someone.

ServletContext.log() not logging

The log output of my RemoteServiceServlet (GWT) is not shown in the log files or on stdout when using getServletContext().log("anything");.
For dependency injection I use Google Guice, and for my own log output I use slf4j-jdk14. I tried this in Tomcat 6 as well as in Jetty (GWT devmode).
To make it clear, my Servlet:
@Singleton
public class MyServiceServlet extends RemoteServiceServlet implements MyService {

    private static final Logger log = LoggerFactory.getLogger(MyServiceServlet.class);
    private final ADependency dep;

    @Inject
    public MyServiceServlet(ADependency dep) {
        getServletContext().log("THIS IS NOT SHOWN IN MY LOGS");
        log.error("THIS IS SHOWN IN MY LOGS");
        this.dep = dep;
    }
}
So, where can I find the missing log output, or where can I configure the ServletContext log?
The behavior of the ServletContext.log() method is container specific. The method I have used to make it consistent is to wrap the ServletConfig passed in through init() in order to create a wrapped ServletContext which uses our own provided logger (SLF4J in this case).
public class Slf4jServletConfigWrapper implements ServletConfig {

    private final ServletConfig config;
    private final Logger log;

    public Slf4jServletConfigWrapper(Logger log, ServletConfig config) {
        this.log = log;
        this.config = config;
    }

    public ServletContext getServletContext() {
        return new ServletContext() {
            public void log(String message, Throwable throwable) {
                log.info(message, throwable);
            }

            public void log(Exception exception, String msg) {
                log.info(msg, exception);
            }

            public void log(String msg) {
                log.info(msg);
            }
            ...
Full Slf4jServletConfigWrapper.java code
In your servlet, override the init() method to use the ServletConfig wrapper:
public void init(final ServletConfig config) throws ServletException {
    super.init(new Slf4jServletConfigWrapper(log, config));
}