
I am trying to upload multiple GZip files to Amazon S3 using Pentaho 9.3. I have set the part size to the maximum in kettle.properties (see the kettle property reference), but I am still facing the S3 multipart error. Currently I am using Pentaho 7.1 and have not faced any issues to date. I want to upgrade to a higher version of Pentaho, but I constantly get an 'S3 multipart exception caught' error (see the attached error screenshot). I have also increased the JVM RAM, but it has not helped.

I have created a Pentaho job (see the SampleJob screenshot) that uploads data to an Amazon S3 bucket (see the S3 bucket screenshot). It only creates the S3 files that are small and skips the larger ones. For example, the delete output file is only a few KB, so it is uploaded every time, while the insert and update files (several MB) are never uploaded.
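For context: as far as I understand, Pentaho's S3 support sits on top of the AWS SDK for Java, where files above a multipart threshold are split into parts while smaller files go through a single PUT, which would explain why only the KB-sized file arrives. The sketch below is not Pentaho's code; it is a minimal AWS SDK v1 TransferManager example showing how the threshold and part size control which path a file takes. The bucket name, key, file path, and size values are placeholders.

```java
import java.io.File;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
import com.amazonaws.services.s3.transfer.Upload;

public class MultipartSizeDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder bucket/key/file -- replace with your own values.
        String bucket = "my-bucket";
        String key = "output/insert_data.gz";
        File file = new File("/tmp/insert_data.gz");

        // Uses the default credential chain and region configuration.
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Files larger than the multipart threshold are split into parts of at
        // least minimumUploadPartSize; smaller files are sent as a single PUT.
        // Small (KB) files therefore bypass the multipart code path entirely,
        // which is why they can succeed while large (MB) files fail.
        TransferManager tm = TransferManagerBuilder.standard()
                .withS3Client(s3)
                .withMultipartUploadThreshold(16L * 1024 * 1024)  // 16 MB threshold (assumed value)
                .withMinimumUploadPartSize(8L * 1024 * 1024)      // 8 MB parts (assumed value)
                .build();

        Upload upload = tm.upload(bucket, key, file);
        upload.waitForCompletion();  // blocks until the transfer finishes or throws
        tm.shutdownNow();            // releases the transfer threads and the client
    }
}
```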

  • Your error code is Access Denied... Isn't there an issue with your credentials? Commented Mar 5, 2024 at 9:59
  • No, there is no issue with the credentials. The issue is with the large files, as the small files are getting uploaded. Commented Mar 5, 2024 at 11:51
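
A note on the comment exchange: a multipart upload uses different S3 API calls than a single PUT (CreateMultipartUpload, UploadPart, CompleteMultipartUpload, AbortMultipartUpload), so a policy that allows small single-PUT uploads can still surface Access Denied only for large files, for example when s3:AbortMultipartUpload is missing or, reportedly, when an SSE-KMS bucket also requires kms:Decrypt for the multipart path. The hedged sketch below exercises just the multipart calls with no data, so it can help confirm whether the policy or the file size is at fault; the bucket and key are placeholders.

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.AbortMultipartUploadRequest;
import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest;
import com.amazonaws.services.s3.model.InitiateMultipartUploadResult;

public class MultipartPermissionCheck {
    public static void main(String[] args) {
        // Placeholder bucket/key -- replace with your own values.
        String bucket = "my-bucket";
        String key = "permission-check.tmp";

        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Initiating and then aborting a multipart upload exercises the
        // multipart-specific API calls without transferring any data. If this
        // fails with Access Denied while ordinary small uploads succeed, the
        // problem is the bucket policy/IAM policy, not the file size.
        InitiateMultipartUploadResult init =
                s3.initiateMultipartUpload(new InitiateMultipartUploadRequest(bucket, key));
        System.out.println("Multipart upload initiated, id = " + init.getUploadId());

        s3.abortMultipartUpload(
                new AbortMultipartUploadRequest(bucket, key, init.getUploadId()));
        System.out.println("Multipart upload aborted cleanly.");
    }
}
```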
