Recently I was working on an application that uses the iBatis data persistence framework with Oracle 10g and Java 1.4.2. The application deals with a considerably large amount of data, which it reads from files delivered over queues. Just a few days back we received a huge file with around 66976 records. The application reads and parses all the records, applies some business logic, and then inserts all of them into the database as a JDBC batch through iBatis. To our surprise it inserted only 1446 records and returned silently, behaving as if everything had gone well and all records had been inserted.
After a lot of debugging (two days of it), at 2 in the morning we figured out that the application's business logic was working just fine and was not gobbling up records due to some bug; it was the iBatis call to executeBatch that was causing the problem. All of a sudden one of the team members - Chaitanya (@chaituvj) - realized that the number of missing records was 65536, i.e. 2^16. He had a hunch that iBatis was somehow unable to handle a batch with more than 65536 records. A small PoC proved this to be true. In the PoC we also wrote some pure JDBC code that created a batch of java.sql.Statement with 66000 records and tried to insert them. It went through just fine. Debugging further into the iBatis code, we found that iBatis internally calls java.sql.PreparedStatement.executeBatch(), and up to that point everything is fine; no records are lost. We had almost concluded that iBatis was the culprit. To reinforce our finding, we extended the PoC to execute a batch of more than 65536 records against MySQL, with both iBatis and pure JDBC calls. It went fine, no issues. So once again this PoC result reinforced our understanding: iBatis was the culprit.
After one more day of brainstorming we noticed that our pure JDBC code used a batch of java.sql.Statement, while iBatis uses java.sql.PreparedStatement. Since PreparedStatement extends Statement, there seemed to be no reason to suspect a problem in PreparedStatement's implementation of executeBatch(). But since we still had no satisfactory explanation for this strange behavior, we decided to give it a try: we changed our pure JDBC PoC to use java.sql.PreparedStatement instead of java.sql.Statement and ran it to insert 66000 records. To our surprise, it inserted only 464 records, i.e. 66000 - 65536. So now we had found the real culprit - the Oracle implementation of java.sql.PreparedStatement in ojdbc14.jar.
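The numbers suggest the driver's batch row count is wrapping around at 2^16: only (total mod 65536) rows survive, which is exactly the 464 we saw for 66000 records. This is our interpretation of the symptom, not something from Oracle's documentation; the tiny snippet below just demonstrates the arithmetic behind it.

```java
public class WrapDemo {

    // Our reading of the observed behavior: the batch size appears
    // to be tracked in 16 bits, so only (total % 65536) rows are
    // actually inserted. This models the symptom, not the driver.
    static int survivingRows(int total) {
        return total % 65536; // 65536 == 2^16
    }

    public static void main(String[] args) {
        // 66000 records -> 464 rows inserted, matching our PoC run.
        System.out.println(survivingRows(66000));
    }
}
```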
This is a really strange implementation of PreparedStatement in Oracle's ojdbc driver. If any Oracle folks are reading this, I hope to hear a nice explanation for such an implementation. ;-)
[Update: 3rd Dec 09]: I posted the same issue on the OTN forum and found that it is a bug in Oracle driver version 10.1.x. Check this link for more details.
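Until you can move to a fixed driver, a practical workaround is to flush the batch in chunks that stay well below 65536 statements instead of handing the driver one giant batch. The sketch below is a minimal illustration under that assumption, not our production code: chunkSizes() computes how the records would be split, and the commented loop shows where addBatch()/executeBatch() would sit in real JDBC code. The MAX_BATCH value is an arbitrary safe choice of ours.

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedBatch {

    // Flush size chosen safely below the 65536 statements the buggy
    // driver mishandles; the exact value is our own assumption.
    static final int MAX_BATCH = 60000;

    // Computes the size of each executeBatch() call needed to insert
    // 'total' records in chunks of at most 'max' statements each.
    static List<Integer> chunkSizes(int total, int max) {
        List<Integer> sizes = new ArrayList<Integer>();
        for (int start = 0; start < total; start += max) {
            sizes.add(Integer.valueOf(Math.min(max, total - start)));
        }
        return sizes;
    }

    public static void main(String[] args) {
        // 66000 records would be flushed as two batches: 60000 + 6000.
        System.out.println(chunkSizes(66000, MAX_BATCH));

        // In real JDBC code the loop looks roughly like this
        // (bind() and records are hypothetical placeholders):
        //
        // PreparedStatement ps = conn.prepareStatement(INSERT_SQL);
        // int pending = 0;
        // for (Record r : records) {
        //     bind(ps, r);           // set this row's parameters
        //     ps.addBatch();
        //     if (++pending == MAX_BATCH) {
        //         ps.executeBatch(); // flush before nearing 65536
        //         pending = 0;
        //     }
        // }
        // if (pending > 0) ps.executeBatch(); // flush the remainder
    }
}
```

The same idea should apply through iBatis itself: call executeBatch on the SqlMapClient every few tens of thousands of statements rather than once at the very end.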