Chapter 13. Batch processing
A naive approach to inserting 100,000 rows in the database using Hibernate might look like this:
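(A sketch, assuming a mapped Customer entity and an open SessionFactory named sessionFactory; the constructor arguments are elided.)

import org.hibernate.Session;
import org.hibernate.Transaction;

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();

for (int i = 0; i < 100000; i++) {
    // every new instance is queued in the session-level (first-level) cache
    Customer customer = new Customer(/* ... */);
    session.save(customer);
}

tx.commit();
session.close();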
This would fall over with an OutOfMemoryException somewhere around the 50,000th row. That is because Hibernate caches all the newly inserted Customer instances in the session-level cache. In this chapter we will show you how to avoid this problem.
If you are undertaking batch processing you will need to enable the use of JDBC batching. This is absolutely essential if you want to achieve optimal performance. Set the JDBC batch size to a reasonable number (10-50, for example):
hibernate.jdbc.batch_size 20
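The property can go in hibernate.cfg.xml or hibernate.properties; as a sketch, it can also be set programmatically, assuming the classic Configuration bootstrap:

import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

Configuration cfg = new Configuration().configure();
// match the flush interval used in the batch loop shown in Section 13.1
cfg.setProperty("hibernate.jdbc.batch_size", "20");
SessionFactory sessionFactory = cfg.buildSessionFactory();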
Hibernate transparently disables insert batching at the JDBC level if you use an identity identifier generator.
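For example, with annotation mappings an entity whose identifier comes from a database sequence remains batchable, whereas GenerationType.IDENTITY would silently turn batching off (a sketch, assuming a database with sequence support):

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class Customer {
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE) // batch-friendly, unlike IDENTITY
    private Long id;

    // ... remaining fields and accessors
}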
You can also do this kind of work in a process where interaction with the second-level cache is completely disabled:
hibernate.cache.use_second_level_cache false
However, this is not absolutely necessary, since we can explicitly set the CacheMode to disable interaction with the second-level cache.
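A sketch of this per-session alternative, using Session.setCacheMode:

import org.hibernate.CacheMode;
import org.hibernate.Session;

Session session = sessionFactory.openSession();
// skip both reading from and writing to the second-level cache for this session
session.setCacheMode(CacheMode.IGNORE);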
13.1. Batch inserts
When making new objects persistent, flush() and then clear() the session regularly in order to control the size of the first-level cache, as shown in the sketch below.
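A sketch of the insert loop from the start of this chapter, reworked to flush and clear every 20 rows (the same value as hibernate.jdbc.batch_size):

import org.hibernate.Session;
import org.hibernate.Transaction;

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();

for (int i = 0; i < 100000; i++) {
    Customer customer = new Customer(/* ... */);
    session.save(customer);
    if (i % 20 == 0) { // 20, same as the JDBC batch size
        // flush a batch of inserts to the database and release session memory
        session.flush();
        session.clear();
    }
}

tx.commit();
session.close();

Keeping the flush interval equal to the JDBC batch size means the first-level cache never holds more than one batch's worth of pending inserts.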