APACHE NIFI CREATES TOO MANY RECORDS IN CLICKHOUSE
Thorsten Hock @nadirhamburg

Publish Date: Jun 5

I have an MS SQL table with 9,642,846 records. I use Apache NiFi 2.4.0 to read them and store them in a local ClickHouse database.

I end up with 646,953,000 records in ClickHouse and this error message:

```
PutDatabaseRecord[id=3a5be48d-0197-1000-1a40-742829bf6f54] Failed to put Records to database for FlowFile[filename=f5f20095-4a46-42d7-8f88-56b68292df86]. Routing to failure.: java.sql.SQLException: Connection pool shut down
  Caused by: com.clickhouse.client.api.ClientException: Connection pool shut down
  Caused by: java.lang.IllegalStateException: Connection pool shut down
```
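As a sanity check on the scale of the problem, here is the duplication factor implied by the two counts above (pure arithmetic, no NiFi involved):

```python
# Rough arithmetic on the counts from the post: how many times,
# on average, each source record ended up in ClickHouse.
source_rows = 9_642_846      # rows in the MS SQL table
target_rows = 646_953_000    # rows observed in ClickHouse

duplication_factor = target_rows / source_rows
print(f"each record written ~{duplication_factor:.1f} times on average")
# → each record written ~67.1 times on average
```

So on average each source row was inserted roughly 67 times, which points at repeated re-processing rather than a one-off double insert.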

My assumption is that it has something to do with the batch size of the PutDatabaseRecord processor, but I don't know whether that is true. If I have more records than the configured batch size, records get written multiple times, but I don't understand the logic behind it.

If I have fewer incoming records than the batch size, everything is fine.
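One way the numbers above could arise (this is only a hypothesis, not NiFi's actual implementation; `run_with_retries` is a made-up name): if each batch is committed to the target as soon as it is sent, but a mid-file failure routes the *whole* FlowFile back for another attempt, then every retry re-inserts the batches that had already succeeded. A minimal simulation of that failure mode:

```python
# Hypothetical sketch (NOT NiFi source code): batches are committed
# immediately with no rollback, and a mid-file failure causes the
# entire record set to be retried from the beginning.
def run_with_retries(records, batch_size, fail_on_attempt):
    """Simulate a sink that commits each batch as it goes and, on a
    mid-file failure, retries the whole record set from scratch."""
    table = []
    for attempt in range(fail_on_attempt + 1):
        for start in range(0, len(records), batch_size):
            batch = records[start:start + batch_size]
            table.extend(batch)  # committed immediately, never rolled back
            # Earlier attempts fail partway through (e.g. connection drop),
            # so the FlowFile is routed back and everything is re-sent.
            if attempt < fail_on_attempt and start + batch_size >= len(records) // 2:
                break
    return table

rows = list(range(10))
written = run_with_retries(rows, batch_size=3, fail_on_attempt=2)
print(len(written), "rows written for", len(rows), "source rows")
# → 22 rows written for 10 source rows
```

Under this model the duplication grows with the number of failed attempts, which would fit a large multiplier like the one observed; whether PutDatabaseRecord actually behaves this way for the ClickHouse driver is exactly the open question.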
