
I am using an S3-compatible object store (Cloudflare R2) and trying to get EMR Serverless to connect to it. R2 requires that you use the correct endpoint and pass an access key and secret key.

On my local machine, running Spark works perfectly fine. But when running it on EMR Serverless, it does not respect the parameters I try passing to it — specifically fs.s3a.access.key and fs.s3a.secret.key.

I've tried setting them on the application, as part of the job from the console, and as --conf in the Spark parameters, but none of it works.
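For reference, here is a sketch of how those --conf values get passed in an EMR Serverless job submission; the application ID, role ARN, bucket names, account ID, and keys below are all placeholders:

```shell
# Sketch: passing fs.s3a settings for R2 via sparkSubmitParameters
# (every <...> identifier is a placeholder, not a real value)
aws emr-serverless start-job-run \
  --application-id <APPLICATION_ID> \
  --execution-role-arn <EXECUTION_ROLE_ARN> \
  --job-driver '{
    "sparkSubmit": {
      "entryPoint": "s3://<CODE_BUCKET>/job.py",
      "sparkSubmitParameters": "--conf spark.hadoop.fs.s3a.endpoint=https://<ACCOUNT_ID>.r2.cloudflarestorage.com --conf spark.hadoop.fs.s3a.access.key=<R2_ACCESS_KEY> --conf spark.hadoop.fs.s3a.secret.key=<R2_SECRET_KEY> --conf spark.hadoop.fs.s3a.path.style.access=true"
    }
  }'
```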

Any ideas on how to go about this are really appreciated, thanks!

1 Answer


Use the EMR S3 connector (EMRFS, the s3:// scheme) rather than s3a — it's the connector the EMR team supports.
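That said, EMRFS targets Amazon S3 itself, so for a third-party endpoint like R2 you may still end up on s3a. In that case, a workaround sometimes used is to apply the settings to the running session's Hadoop configuration inside the job, instead of via job-level --conf. A minimal sketch, assuming PySpark and placeholder values throughout (whether EMR Serverless honors runtime-set credentials is exactly what's in question here):

```python
# Sketch: s3a settings for R2 applied at runtime
# (<ACCOUNT_ID> and the keys are placeholders, not real values)
R2_CONF = {
    "fs.s3a.endpoint": "https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    "fs.s3a.access.key": "<R2_ACCESS_KEY>",
    "fs.s3a.secret.key": "<R2_SECRET_KEY>",
    "fs.s3a.path.style.access": "true",
}

# Inside the job, these would be applied to the live Hadoop configuration
# before any s3a:// read or write:
#
#   hconf = spark.sparkContext._jsc.hadoopConfiguration()
#   for key, value in R2_CONF.items():
#       hconf.set(key, value)
#   df = spark.read.parquet("s3a://<BUCKET>/path/")
```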
