
I'm developing an ETL process in Azure SQL Database, in which I will have several T-SQL stored procedures performing automated processing on the data. These processes will occasionally fail for various reasons, so I need to implement a logging strategy that will allow me to determine the cause of the failures whenever they happen.

The simplest solution would be to create a log table in the same Azure SQL Database, but I would really like to leverage Azure Monitor's Log Analytics capabilities. I've searched all around the web, but I've found no way to send custom logs from a T-SQL stored procedure running on Azure SQL Database to an Azure Log Analytics workspace. Is there any way I can achieve this?
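
For reference, the log-table fallback I mentioned would look roughly like the sketch below: a log table plus a small helper procedure that the ETL procedures call. The object names (dbo.EtlLog, dbo.usp_WriteLog) are just placeholders I made up, not anything Azure provides.

    -- Minimal sketch of the in-database fallback: a log table plus a helper procedure.
    -- Table and procedure names are illustrative only.
    CREATE TABLE dbo.EtlLog
    (
        LogId         bigint IDENTITY(1,1) PRIMARY KEY,
        LoggedAt      datetime2(3)   NOT NULL DEFAULT SYSUTCDATETIME(),
        ProcedureName sysname        NOT NULL,
        Severity      varchar(10)    NOT NULL,   -- e.g. 'INFO' or 'ERROR'
        Message       nvarchar(4000) NOT NULL,
        ErrorNumber   int            NULL,
        RowsAffected  int            NULL
    );
    GO

    CREATE PROCEDURE dbo.usp_WriteLog
        @ProcedureName sysname,
        @Severity      varchar(10),
        @Message       nvarchar(4000),
        @ErrorNumber   int = NULL,
        @RowsAffected  int = NULL
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Append one log entry; every ETL procedure calls this after each step.
        INSERT INTO dbo.EtlLog (ProcedureName, Severity, Message, ErrorNumber, RowsAffected)
        VALUES (@ProcedureName, @Severity, @Message, @ErrorNumber, @RowsAffected);
    END;
    GO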

  • How about using Azure Data Factory (ADF) to orchestrate the running of the stored procedures, doing some work in parallel where required, e.g. with the ForEach activity, or just having parallel Stored Proc activities. Any stored procs erroring could bubble up their errors to ADF and subsequently Azure Monitor. Fetch details for the ADF pipeline run via the Monitor API. Example here. Commented Feb 8, 2022 at 22:09
  • Yeah, what you describe about orchestrating everything in ADF is exactly what I'm doing. The problem is that I want to be able to write very granular logs (practically after each DML query, i.e. insert, update, delete) without ending up with hundreds or thousands of single-DML-query stored procedures being orchestrated in ADF (there's a sketch of this pattern after these comments). Besides, I don't want only the error logs; the preceding successful logs are important to give context to the error. The successful logs are also important for analysis and for screens on the ETL process itself. Commented Feb 9, 2022 at 5:02
  • You can't write directly to Log Analytics from a stored procedure. You can only write to a log table and have another process push that into Log Analytics. Commented Nov 2, 2023 at 9:42
  • It may or may not be adequate, but ADF usually captures the native SP error in its log, and that will be written to Log Analytics, assuming that's linked up. Commented Nov 2, 2023 at 9:46
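
To make the granular-logging idea from the comments concrete, here is a rough sketch of an ETL procedure that logs after each DML statement through the dbo.usp_WriteLog helper above, and re-throws any error so it still surfaces to the caller (e.g. the ADF Stored Procedure activity). The dbo.StageSales and dbo.FactSales tables are hypothetical, for illustration only.

    -- Illustrative ETL procedure: per-statement logging plus error re-throw.
    CREATE PROCEDURE dbo.usp_LoadFactSales
    AS
    BEGIN
        SET NOCOUNT ON;
        DECLARE @rows int;

        BEGIN TRY
            EXEC dbo.usp_WriteLog N'usp_LoadFactSales', 'INFO', N'Load started';

            INSERT INTO dbo.FactSales (SaleDate, Amount)
            SELECT SaleDate, Amount FROM dbo.StageSales;
            SET @rows = @@ROWCOUNT;
            EXEC dbo.usp_WriteLog N'usp_LoadFactSales', 'INFO', N'Inserted staged rows', @RowsAffected = @rows;

            DELETE FROM dbo.StageSales;
            SET @rows = @@ROWCOUNT;
            EXEC dbo.usp_WriteLog N'usp_LoadFactSales', 'INFO', N'Cleared staging table', @RowsAffected = @rows;
        END TRY
        BEGIN CATCH
            -- Record the native error details (e.g. divide by zero, PK violation)...
            DECLARE @errMsg nvarchar(4000) = ERROR_MESSAGE(),
                    @errNum int            = ERROR_NUMBER();
            EXEC dbo.usp_WriteLog N'usp_LoadFactSales', 'ERROR', @errMsg, @errNum;

            -- ...then re-throw so the failure still bubbles up to the orchestrator
            -- instead of being swallowed by the CATCH block.
            THROW;
        END CATCH
    END;
    GO

With this pattern the log table keeps the per-statement context (including the successful steps), while ADF still sees the original error in its run history.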

1 Answer

Azure SQL Database provides audit logging out of the box.

You just need to enable it at:

  • Server level
  • Database level

And the logs can be stored in:

  • Blob Storage
  • Log Analytics


See my answer here.

So there is no need to create and maintain a separate ETL process for the logs.


6 Comments

The OP is talking about custom log entries written by stored procedures. This won't help with that.
But @Nick.Mc, MarioW is saying that "...processes will occasionally fail due to diverse reasons...", and the standard Azure logs will tell you the reason. How can MarioW know more than Azure itself? Or maybe I don't understand which logs we are talking about here.
He's talking about, for example, a divide-by-zero in a stored procedure, or a PK violation caused by inserting records in a stored procedure. The question is: will those verbose error messages appear in these logs?
Thank you @Nick.Mc. I can't tell whether the audit logs contain that, but what else could contain that information? What is the alternative? How else can we catch errors? I don't know of another way to find errors apart from diving into the audit logs; that is what they are made for.
Here's a response from Aaron Bertrand for the same kind of thing. These errors are not captured anywhere unless you specifically catch them or run an expensive server-side trace. These errors only appear to the calling process (i.e. ADF) at the time they occur, and unless they are logged there they are lost forever. stackoverflow.com/a/7416528/1690193