Centralized logging for Spring Boot Microservices using the Elastic Stack (Elasticsearch, Logstash and Kibana)

In this article, we will learn how to use the Elastic Stack to collect the logs from different microservices into a central system.

In this article, we will:

  1. Create two Spring Boot microservices
  2. Set up Elasticsearch, Kibana and Logstash on our local system
  3. Ingest the logs from the log files into Elasticsearch using Logstash
  4. View the aggregated logs in Kibana

Step 1 : Create the microservices

Note : The microservices in this article will be simple Spring Boot projects. The focus of this article is to explain how to set up centralized logging for the services. The business logic of the services can be as complex as required.

Go to Spring Initializr

Create a Library service

Add the following dependencies :

  1. Spring Web
  2. Lombok

In "application . properties" add the following properties

# Specify the port at which you want to run the application
server.port=9090

# Specify the file location where the library service logs will be written
logging.file.name=C:/logs/service.log

Add a model class for Library Book

LibraryBook.java

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

@Data
@AllArgsConstructor
@NoArgsConstructor
public class LibraryBook
{
    private long bookId;
    private String bookName;
}

Add a Rest Controller exposing an endpoint that returns the list of all books

LibraryController.java

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

@RestController
public class LibraryController
{
    // Set up the logger
    private static final Logger logger = LoggerFactory.getLogger(LibraryController.class);


    /**
     * A REST endpoint that returns the list of all books in the library
     * @return List of all books
     */
    @GetMapping("/getAllBooks")
    public ResponseEntity<?> getAllBooks() {
        List<LibraryBook> books=getBooks();
        logger.info("/getAllBooks : Returning list of all books in the library");
        return new ResponseEntity<>(books, HttpStatus.OK);
    }

    /**
     * This method generates some static data for testing and simulation purposes
     * @return A list of library books
     */
    private List<LibraryBook> getBooks() {
        return Stream.of(new LibraryBook(1, "Atomic Habits"),
                new LibraryBook(2, "Deep Work"),
                new LibraryBook(3, "The 5 AM Club"),
                new LibraryBook(4, "Think Like A Monk"))
                .collect(Collectors.toList());
    }
}
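If you run the service now and call the endpoint, a log line should be appended to C:/logs/service.log. With Spring Boot's default file log pattern it will look roughly like this (the timestamp, process id, thread name and package prefix depend on your setup):

2021-05-30 10:15:32.123  INFO 4321 --- [nio-9090-exec-1] c.e.library.LibraryController            : /getAllBooks : Returning list of all books in the library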

Create a Book service

Add the following dependencies :

  1. Spring Web
  2. Lombok

In "application . properties" add the following properties

# Specify the port at which you want to run the application
server.port=9091

# Specify the file location where the book service logs will be written (shared with the library service in this demo)
logging.file.name=C:/logs/service.log

Add a model class for Book

Book.java

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

@Data
@AllArgsConstructor
@NoArgsConstructor
public class Book
{
    private long bookId;
    private String bookName;
    private String author;
    private String publisher;
}

Add a Rest Controller exposing an endpoint that returns the details of a particular book by its ID

BookController.java

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

@RestController
public class BookController
{
    // Set up the logger
    private static final Logger logger = LoggerFactory.getLogger(BookController.class);

    /**
     * This endpoint returns the book matching the id given as a path variable
     * @param id Id of the book
     * @return The book whose id matches the param
     */
    @GetMapping("/getBook/{id}")
    public Book getBookById(@PathVariable long id) {
        List<Book> books=getBooks();

        Book book= books.stream().
                filter(u->u.getBookId()==id).findAny().orElse(null);
        if(book!=null){
            logger.info("book found : {}",book);
            return book;
        }else{
            try {
                throw new Exception();
            } catch (Exception e) {
                e.printStackTrace();
                logger.error("No Book found with ID : {}",id);
            }
            return new Book();
        }
    }

    /**
     * This method generates some static data for testing and simulation purposes
     * @return A list of books with book details
     */
    private List<Book> getBooks() {
        return Stream.of(new Book(1, "Atomic Habits", "James Clear", "Penguin Random House"),
                new Book(2, "Deep Work","Cal Newport","Grand Central Publishing"),
                new Book(3, "The 5 AM Club", "Robin Sharma", "HarperCollins"),
                new Book(4, "Think Like A Monk","Jay Shetty","Simon & Schuster"))
                .collect(Collectors.toList());
    }
}
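To sanity-check the service before wiring up the Elastic Stack, you can call the endpoint directly. Assuming the service is running on port 9091, requesting book 1 should return the matching JSON (the field names come from the Book class):

curl http://localhost:9091/getBook/1

{"bookId":1,"bookName":"Atomic Habits","author":"James Clear","publisher":"Penguin Random House"}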

Step 2 : Set up Elasticsearch, Kibana and Logstash on our local system

Download and unzip Elasticsearch, Kibana and Logstash from the links below

  1. Elasticsearch

  2. Logstash

  3. Kibana

In the Elasticsearch folder, open a command prompt and run

bin/elasticsearch.bat

You can verify that the Elasticsearch instance is running at localhost:9200
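For example, opening that URL in a browser (or curling it) returns a small JSON document describing the node; the exact values depend on your version and machine:

curl http://localhost:9200

{
  "name" : "your-machine-name",
  "cluster_name" : "elasticsearch",
  "version" : { ... },
  "tagline" : "You Know, for Search"
}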

Next, in the Kibana directory, open config/kibana.yml in a text editor and uncomment the elasticsearch.hosts property, then save the file.
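The uncommented line typically looks like this (the default value already points at the local Elasticsearch instance):

elasticsearch.hosts: ["http://localhost:9200"]

Then open a command prompt in the Kibana directory and run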

bin/kibana.bat

You can see the Kibana instance running at localhost:5601

Next, in the Logstash folder, create a new file named library-management-logstash.conf in the bin directory

input {
    file {
        path => "C:/logs/service.log"
        start_position => "beginning"
    }
}

output {
    stdout {
        codec => rubydebug
    }

    elasticsearch {
        hosts => ["localhost:9200"]
        index => "librarymanagement-%{+YYYY.MM.dd}"
    }
}
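Optionally, you can add a filter block between the input and output sections to parse each raw line into structured fields before indexing. A minimal sketch using Logstash's built-in grok patterns, assuming the default Spring Boot log format:

filter {
    grok {
        # Split each line into a timestamp, a log level and the remaining message
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{LOGLEVEL:level}%{SPACE}%{GREEDYDATA:logmessage}" }
    }
}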

Now open a command prompt in the Logstash directory and run

bin/logstash.bat -f bin/library-management-logstash.conf

You can see the index created at localhost:9200/_cat/indices
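The response should contain a row for the new index, roughly like the following (the date suffix, UUID, document count and sizes will differ):

yellow open librarymanagement-2021.05.30 aBcD1234Qw6JxEp2 1 1 12 0 24.3kb 24.3kb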

Step 3 : View the aggregated logs in Kibana

In the Kibana dashboard, go to Management > Index Patterns and click on "Create Index Pattern". You will see the librarymanagement index that Logstash created; select it, choose @timestamp as the time field if prompted, and click on "Create Index Pattern" again.

Now you can hit the APIs multiple times

  • http://localhost:9090/getAllBooks
  • http://localhost:9091/getBook/{id}

Then click on the Discover button in the Kibana dashboard; under our librarymanagement index you will see the logs arriving in near real time.

This brings us to the end of the article. Here we used a common log file for both services; instead, you could give each service its own log file and use separate Logstash pipelines (or one pipeline with multiple inputs) to ingest the data into Elasticsearch, creating separate indices as needed, as sketched below.
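As a rough sketch of that variant, assuming each service logs to its own file (the file names here are hypothetical), a single Logstash config could route each file to its own index:

input {
    file {
        path => "C:/logs/library-service.log"
        start_position => "beginning"
        type => "library"
    }
    file {
        path => "C:/logs/book-service.log"
        start_position => "beginning"
        type => "book"
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        # The type set on each input selects the target index
        index => "%{type}-service-%{+YYYY.MM.dd}"
    }
}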

I hope you found the article useful.


Happy Coding :)