
I want to insert a row into my_logging_table whenever a PutSQL processor in my NiFi pipeline fails. So far I have had to create a separate logging PutSQL processor for every pipeline processor, each with the following SQL statement:

insert into schema.my_logging_table values
('Failed-NiFi-processor-name', 'failed', current_timestamp)

This obviously doubles the number of NiFi processors, since each pipeline processor needs its own logging PutSQL processor so that the correct processor name can be logged.

Is there a way to have my pipeline PutSQL processors update a FlowFile attribute on failure (I'm thinking of passing on the name of the processor)?

That way I could route all failures to a single logging PutSQL processor, which would read the failing processor's name from the attribute and insert the row into the database. I noticed the UpdateAttribute processor, but I would have to build one of those for every processor as well...

1 Answer

Yes, but you'd need to stick an UpdateAttribute processor on each failure branch, since not all processors write failure attributes. You can see which attributes a processor writes in its documentation; see the Writes Attributes section for PutSQL.
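As a sketch of that approach: each per-branch UpdateAttribute sets an attribute (here called failed.processor, my own name, not a NiFi convention) to its processor's name, all failure branches feed one ReplaceText processor ("Always Replace") that builds the statement with Expression Language, and that feeds the single logging PutSQL. The FlowFile content produced by ReplaceText would look like:

```sql
-- Hypothetical sketch: ${failed.processor} is an attribute set by the
-- per-branch UpdateAttribute; NiFi Expression Language substitutes it
-- before PutSQL executes the statement.
insert into schema.my_logging_table values
('${failed.processor}', 'failed', current_timestamp)
```

Alternatively, PutSQL supports JDBC `?` placeholders populated from sql.args.N.type/sql.args.N.value attributes, which avoids quoting issues with attribute values.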

Alternatively, you could

  1. Create a flow to parse nifi-app.log and hunt for the errors
  2. Create a SiteToSiteBulletinReportingTask flow that captures all bulletins, sends them to an RPG, filters for PutSQL failures, sends to DB, etc.
