We are using the Execute row SQL script node to execute large dynamic SQL scripts in SQL Server.
The code is metadata driven, so none of the below can be hard-coded: the SQL is compiled at run time, and the full SQL script (as a string) is passed to the node in the SQL Field Name parameter.
It seems that if the script contains multiple commands and the first command is valid, Pentaho does not report any errors from the subsequent commands.
I have a table as shown here in SQL Server:
| ID (PK) | Name |
|---|---|
| 1 | Bob |
| 2 | Charles |
If I run the following commands as one string directly in SQL Server Management Studio, I correctly get a primary-key violation error, which stops the process as expected:
```sql
INSERT INTO dbo.table (ID, Name) VALUES (3, 'Anna')
INSERT INTO dbo.table (ID, Name) VALUES (3, 'Nathan')
```
BUT, if I run that exact same code through the Execute row SQL script node, the transformation is marked as completed with no error. This is not what should happen.
I have tried delimiting the commands with a semicolon (;), but that in turn causes a problem when the script includes dynamic variables. E.g.:
```sql
DECLARE @common_name VARCHAR(400) = 'Testface';
INSERT INTO dbo.table (ID, Name) VALUES (3, @common_name);
INSERT INTO dbo.table (ID, Name) VALUES (3, @common_name);
```
In the above scenario, the declared variable is not carried into the subsequent statements, and I get the error:

```
Must declare the scalar variable @common_name
```
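If the node splits the script on semicolons and sends each fragment to the server as its own batch (which this error suggests), the behaviour matches T-SQL's scoping rules: a variable created with DECLARE only exists for the duration of the batch that declares it. A sketch of the equivalent situation in Management Studio, using GO as the batch separator (table and values as in the example above), fails the same way:

```sql
-- GO ends the batch, so @common_name does not survive past it.
DECLARE @common_name VARCHAR(400) = 'Testface';
GO
-- Fails with "Must declare the scalar variable @common_name",
-- because this runs in a fresh batch.
INSERT INTO dbo.table (ID, Name) VALUES (3, @common_name);
GO
```

If that is what is happening, anything that shares variables has to stay in a single batch, i.e. the script must reach the server as one undelimited string.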
It's an intricate issue, and I have tried at length without success. Any help would be very gratefully appreciated.
Please note, the code example is simplified for exposition.
SET NOCOUNT ON — perhaps it can help to add it at the start of your script.
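The reasoning, as I understand it: SQL Server returns a "rows affected" count for every statement in a batch, and a JDBC client only sees an error once it has consumed the results that come before it. If Pentaho stops after the first (successful) result, later errors are never surfaced. SET NOCOUNT ON suppresses those row counts, so an error becomes the first thing the client reads. A sketch using the script from the question (I have not verified this against the Execute row SQL script node specifically, so treat it as something to try):

```sql
SET NOCOUNT ON;  -- suppress "n rows affected" messages so errors surface first

INSERT INTO dbo.table (ID, Name) VALUES (3, 'Anna')
INSERT INTO dbo.table (ID, Name) VALUES (3, 'Nathan')  -- PK violation should now be reported
```

Since the script is built dynamically, prepending the SET NOCOUNT ON line in the code that assembles the SQL string should work without hard-coding anything else.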