
Session-Level Batching (Hibernate 5.2 or Higher)

Updated: May 9, 2020

Motivation:

This article is useful if you want to vary the batch size per Hibernate Session. While the global (application-level) batch size can be easily set via the hibernate.jdbc.batch_size property in application.properties, starting with Hibernate 5.2 we can programmatically set a different batch size for each Session (and even change it between batches).
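For reference, in a Spring Boot application the global setting would look like this in application.properties (the value of 30 is just an illustrative choice; Spring Boot passes spring.jpa.properties.* entries straight to Hibernate):

```properties
# application.properties
# Global (application-level) JDBC batch size; 30 is only an illustrative value
spring.jpa.properties.hibernate.jdbc.batch_size=30
```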


Description:

This application is an example of batch inserts via Hibernate session-level batching (Hibernate 5.2 or higher) against MySQL (of course, you can use any other database). The implementation from this example commits the database transaction after each batch execution. This way, we avoid long-running transactions and, in case of a failure, we roll back only the failed batch and don't lose the previously committed batches. Moreover, for each batch, the Persistence Context is flushed and cleared, so we maintain a thin Persistence Context. Keeping the Persistence Context small is one of the best practices for a high-performance persistence layer, since it protects the code from memory errors and from the performance penalties caused by slow flushes.
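As a rough sketch of this pattern (assuming a Spring Boot setup with an Author entity and a batch size of 30; this is illustrative, not the exact code from the repository), each chunk is persisted, flushed, and committed in its own transaction:

```java
import java.util.List;

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

import org.hibernate.Session;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

@Service
public class BatchInsertService {

    private static final int BATCH_SIZE = 30; // illustrative session-level batch size

    @PersistenceContext
    private EntityManager entityManager;

    // Called once per chunk (from another bean, so the transactional proxy applies).
    // REQUIRES_NEW commits each batch separately: a failure rolls back only this batch.
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void persistBatch(List<Author> chunk) {

        // Session-level batching (Hibernate 5.2+): overrides hibernate.jdbc.batch_size
        Session session = entityManager.unwrap(Session.class);
        session.setJdbcBatchSize(BATCH_SIZE);

        for (Author author : chunk) {
            entityManager.persist(author);
        }

        // Flush the batched INSERTs and clear the Persistence Context,
        // keeping it thin and cheap to flush.
        entityManager.flush();
        entityManager.clear();
    }
}
```

The caller simply splits the full list of entities into chunks of BATCH_SIZE and invokes persistBatch() once per chunk; because each call runs in a new transaction, the previously committed batches survive a failure in a later one.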


Key points:

  • The batch size is set programmatically via Session#setJdbcBatchSize(Integer size) and obtained via Session#getJdbcBatchSize(), as below:
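A minimal sketch of both calls (the entityManager unwrap is an assumed Spring/JPA setup; the snippet in the GitHub repository may differ):

```java
// Unwrap the Hibernate Session from the JPA EntityManager
Session session = entityManager.unwrap(Session.class);

// Set the batch size for this Session only (overrides hibernate.jdbc.batch_size)
session.setJdbcBatchSize(25);

// Read back the session-level batch size
Integer jdbcBatchSize = session.getJdbcBatchSize();
```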

Tam Ta Da Dam! :) The complete code is available on GitHub.


If you need a deep dive into the performance recipes exposed in this repository, then I am sure you will love my book "Spring Boot Persistence Best Practices".


