-2 votes
1 answer
19 views

I've built a Lambda function in the AWS Console, and it seems fine, but when I add an S3 Trigger, it gives me the message "Configurations overlap. Configurations on the same bucket cannot share a ...
Sarah Messer • 4,119
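Overlap errors like the one above typically mean two notifications on the same bucket match the same event type without disjoint key filters. A minimal sketch (the prefixes and ARNs below are hypothetical) of a notification configuration that keeps two Lambda triggers disjoint by prefix:

```python
def build_notification_config(lambda_arn_a, lambda_arn_b):
    """Two ObjectCreated triggers on one bucket, kept non-overlapping
    by giving each one its own key prefix filter."""
    def trigger(arn, prefix):
        return {
            "LambdaFunctionArn": arn,
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"FilterRules": [{"Name": "prefix", "Value": prefix}]}},
        }
    return {
        "LambdaFunctionConfigurations": [
            trigger(lambda_arn_a, "incoming/"),   # first function sees incoming/*
            trigger(lambda_arn_b, "processed/"),  # second function sees processed/*
        ]
    }

# Applied with boto3 (not imported here), this would look like:
# s3.put_bucket_notification_configuration(
#     Bucket="my-bucket",
#     NotificationConfiguration=build_notification_config(arn_a, arn_b))
```

Note that configuring notifications through the API replaces the whole set, so both triggers have to be written in one call rather than added one at a time in the console.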
0 votes
0 answers
47 views

We are currently backing up all of our databases to a series of S3-compatible StorageGRID buckets hosted by our data center provider. Some of our larger database backups run quite long (upwards of 8 ...
Ben Adams
0 votes
1 answer
77 views

I'm trying to do my first CSV export into an existing AWS S3 bucket. Calling with type CSV: call dbms_cloud.export_data(credential_name => 'cred', file_uri_list => 'https://s3.us-east-2....
AC' • 1
0 votes
0 answers
28 views

I’m hosting a static website on Amazon S3 behind CloudFront, with the entire setup created and deployed via Terraform. The odd behavior I’m running into is that the exact same HTML and CSS files ...
Andrea Peterson
1 vote
2 answers
79 views

Trying to deploy a CloudFormation template stored in an S3 bucket, since the template is larger than 51 KB and so can't be deployed directly (from what I'm led to believe, and have seen when trying) ...
pelagos • 1,091
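The 51 KB figure matches the documented cap on an inline template body (51,200 bytes); templates passed by S3 URL can be considerably larger (up to 1 MB, per the CloudFormation quotas). A minimal sketch of building the `create_stack` arguments with `TemplateURL` instead of `TemplateBody` (stack name, bucket, and key are hypothetical):

```python
def create_stack_args(stack_name, bucket, key, region="us-east-1"):
    """Arguments for cloudformation.create_stack that pass the template
    by S3 URL: TemplateBody is capped at 51,200 bytes in the request,
    while TemplateURL accepts an S3-hosted template up to 1 MB."""
    return {
        "StackName": stack_name,
        "TemplateURL": f"https://{bucket}.s3.{region}.amazonaws.com/{key}",
        # Needed only if the template creates IAM resources:
        "Capabilities": ["CAPABILITY_NAMED_IAM"],
    }

# cf = boto3.client("cloudformation")
# cf.create_stack(**create_stack_args("my-stack", "my-templates-bucket",
#                                     "big-template.yaml"))
```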
-4 votes
1 answer
63 views

I am using Lambda, Python, and S3. # lambda_bootstrap_train.py import boto3 import time import json import os sm = boto3.client('sagemaker', region_name='us-east-2') s3 = boto3.client('s3', ...
Clint C. • 688
-1 vote
1 answer
64 views

Here is the IAM policy (mostly written by ChatGPT): { "Version": "2012-10-17", "Statement": [ { "Sid": "AllowListAllBucketsForConsole&...
Hugo Wong
1 vote
1 answer
85 views

In a Lambda, I'm using AWS Wrangler to read data out of a date-partitioned set of parquet files and concatenate them together. I am doing this by calling wr.s3.read_parquet in a loop, compiling the loaded ...
AngusB • 80
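One way to avoid the per-call loop described above is that `wr.s3.read_parquet` accepts a list of paths (or a dataset prefix) in a single call. A sketch of building the partition path list in pure Python, assuming a hypothetical `dt=YYYY-MM-DD` partition layout:

```python
from datetime import date, timedelta

def partition_paths(bucket, prefix, start, end):
    """S3 paths for a date-partitioned parquet set, one per day inclusive.
    The dt=YYYY-MM-DD/ layout is an assumption for illustration."""
    days = (end - start).days + 1
    return [
        f"s3://{bucket}/{prefix}/dt={(start + timedelta(d)).isoformat()}/"
        for d in range(days)
    ]

# awswrangler takes the whole list at once, so the read-then-concat loop
# can collapse into a single call:
# df = wr.s3.read_parquet(path=partition_paths("my-bucket", "events",
#                                              date(2024, 1, 1), date(2024, 1, 31)))
```

Reading in one call also lets awswrangler parallelize the S3 fetches instead of serializing them through the loop.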
0 votes
1 answer
48 views

I need to create an AWS IAM policy that prevents users from disabling the "Block all public access" configuration on S3 buckets, but still allows them to re-enable it if it was already ...
Sarangan • 1,146
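A sketch of the deny-side of such a policy, as a Python dict. One caveat worth verifying: to my knowledge `s3:PutBucketPublicAccessBlock` exposes no condition key carrying the requested setting values, so a plain deny blocks both disabling *and* re-enabling; the usual compromise is to deny the action for regular users and reserve changes for a separate admin role.

```python
def deny_public_access_block_changes():
    """IAM policy denying any change to S3 Block Public Access settings.
    Assumption stated in the lead-in: no condition key distinguishes
    "disable" from "re-enable", so this denies both directions."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DenyTouchingBlockPublicAccess",
                "Effect": "Deny",
                "Action": [
                    "s3:PutBucketPublicAccessBlock",   # bucket-level setting
                    "s3:PutAccountPublicAccessBlock",  # account-level setting
                ],
                "Resource": "*",
            }
        ],
    }
```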
0 votes
1 answer
54 views

I'm using Spark version 3.4.4 and running a Spark Structured Streaming application. The data pipeline reads data from Kafka continuously, does some aggregations on the stream data, and stores the result into ...
Karthik • 1,171
2 votes
1 answer
55 views

I would like to set a file size limit on uploads through multipart upload in my application, but my problem is that, using pre-signed URLs, I can't check the file size of each part, and all the options I ...
Luiz Kohler
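For single-request uploads there is a server-side size limit mechanism: `generate_presigned_post` accepts a `content-length-range` condition, and S3 rejects uploads outside the range at POST time. As far as I know, plain presigned PUT (and presigned multipart-part) URLs cannot enforce a size cap, which is why this workaround is commonly suggested. A sketch, with hypothetical bucket and key:

```python
def size_limited_post_conditions(max_bytes):
    """Policy conditions for s3.generate_presigned_post that make S3
    reject any upload larger than max_bytes (or empty) server-side."""
    return [["content-length-range", 1, max_bytes]]

# resp = s3.generate_presigned_post(
#     Bucket="uploads-bucket",
#     Key="user/${filename}",
#     Conditions=size_limited_post_conditions(10 * 1024 * 1024),  # 10 MiB cap
#     ExpiresIn=3600)
# The client then POSTs the file with resp["fields"] to resp["url"].
```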
1 vote
0 answers
35 views

I'm setting up an Apache NiFi flow using the ListS3 processor to scan an AWS S3 bucket, but I keep getting errors, and the processor isn't listing any files from the bucket. NiFi Processor ...
vaibhav • 61
1 vote
0 answers
40 views

Trying to log the username that generates a presigned URL for the 'GET' method, for auditing. Need the info to be available server-side. What I've tried: concatenating the username to the URL with '&'; this won't ...
מרואן ח.
0 votes
0 answers
127 views

I’m working on a Node.js backend that generates a signed URL for uploading files to AWS S3 via CloudFront. However, when I hit the signed URL endpoint from curl, I always get the following response: {&...
Koperumsozhan VR
0 votes
0 answers
66 views

I am using PHP with Guzzle to send a document via Telegram Bot API using sendDocument. The file is hosted on Amazon S3 with a URL like: https://telegramfile.xxx.amazonaws.com/file/preview/chat/...
李西安子
1 vote
2 answers
532 views

I'm running a Rails 8 app with an Option model that configures Active Storage like so: has_one_attached :picture do |attachable| attachable.variant :small, resize_to_limit: [75, 75] end config/...
croceldon • 4,637
1 vote
0 answers
30 views

On Databricks I have mounted two S3 folders with virtually identical folder and file structure: s3a://mybucket/folder1 mounted to /mnt/original_folder, s3a://mybucket/folder2 mounted to /mnt/kj/test2 ...
L Xandor • 1,931
0 votes
1 answer
51 views

I have a problem using the S3 API with Go. To read an object and immediately transfer it to MinIO I use pipes: pr, pw := io.Pipe() For example, I will create an archive object using a pipe: zipWriter :=...
TASK • 345
0 votes
0 answers
48 views

I'm using the createUploadDestinationForResource call of the SP-API Uploads API to get an upload destination for a listing image. It works, but when I use the returned destination upload URL to ...
ShamilS • 1,654
-1 votes
2 answers
134 views

I'm trying to read some file from S3 with PySpark 4.0.1 and the S3AFileSystem. The standard configuration using hadoop-aws 3.4.1 works, but it requires the AWS SDK Bundle. This single dependency is ...
RobinFrcd • 5,704
0 votes
0 answers
43 views

We are hosting apple-app-site-association files across multiple domains. In the US AWS regions it works fine, but for our EU-based domains (eu-west-1) it does not. Setup: The apple-app-site-...
Kalaschni • 2,447
0 votes
0 answers
93 views

I'm implementing a file upload workflow using Amazon S3 and want to integrate AWS GuardDuty for malware protection. The goal is to automatically scan uploaded files and delete any that are flagged as ...
Tarique • 11
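GuardDuty Malware Protection for S3 records its verdict by tagging the scanned object with a `GuardDutyMalwareScanStatus` tag (and emitting an EventBridge event). A minimal sketch of the decision step for the delete-on-threat workflow described above; the surrounding Lambda wiring is assumed, and only the pure decision logic is shown:

```python
def should_quarantine(tag_set):
    """Given an object's tag set, as returned by
    s3.get_object_tagging(...)["TagSet"], decide whether the GuardDuty
    S3 malware scan flagged the object."""
    tags = {t["Key"]: t["Value"] for t in tag_set}
    return tags.get("GuardDutyMalwareScanStatus") == "THREATS_FOUND"

# In a Lambda triggered by the scan-result event, one might then do:
# tag_set = s3.get_object_tagging(Bucket=bucket, Key=key)["TagSet"]
# if should_quarantine(tag_set):
#     s3.delete_object(Bucket=bucket, Key=key)
```

Keeping the decision pure like this also makes the quarantine rule easy to unit-test without S3 access.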
0 votes
3 answers
110 views

I'm getting a 403 response when trying to use a presigned S3 PutObject URL in a JavaScript fetch call. Here's what I've verified: the IAM role that generates the presigned URL can upload an object ...
jones-chris
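A frequent cause of a 403 on a presigned PUT, even when the signing role's permissions check out, is a mismatch between what was signed and what the browser sends: any parameter included at signing time (notably `ContentType`) must be sent byte-identical by the client. A sketch of the signing-side params (bucket, key, and content type are hypothetical):

```python
def presign_put_params(bucket, key, content_type):
    """Params for s3.generate_presigned_url("put_object", ...).
    Whatever is signed here must be matched exactly by the client
    request, or S3 returns 403 SignatureDoesNotMatch."""
    return {"Bucket": bucket, "Key": key, "ContentType": content_type}

# url = s3.generate_presigned_url(
#     "put_object",
#     Params=presign_put_params("my-bucket", "uploads/photo.png", "image/png"),
#     ExpiresIn=900)
#
# The browser side must then send the identical header:
# fetch(url, {method: "PUT",
#             headers: {"Content-Type": "image/png"},
#             body: file})
```

If `ContentType` is left out at signing time, the client should likewise send no `Content-Type` header, since some fetch implementations add one implicitly.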
0 votes
1 answer
85 views

I am setting up remote logging for Airflow to S3 (MinIO). My Airflow version is 3.0.5. Here is my configuration: AIRFLOW__LOGGING__REMOTE_TASK_HANDLER_KWARGS: '{"delete_local_copy": true}' ...
Tai Nguyen Huu
0 votes
0 answers
38 views

I am trying to read all the parquet files from a dated folder in S3 using the S3FileSystem glob method: def read_parquet_files_from_s3(self, table, schema, start_date, tenant_id): bucket_name = '...
Tester_Cary
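For the glob approach above, the pattern s3fs expects is bucket-relative (no `s3://` scheme needed) with a wildcard for the files. A sketch of building the pattern, assuming a hypothetical `bucket/table/YYYY-MM-DD/*.parquet` layout:

```python
from datetime import date

def dated_glob(bucket, table, day):
    """Glob pattern for every parquet file under one dated folder.
    The bucket/table/YYYY-MM-DD/ layout is an assumption for illustration."""
    return f"{bucket}/{table}/{day:%Y-%m-%d}/*.parquet"

# With s3fs (not imported here):
# fs = s3fs.S3FileSystem()
# files = fs.glob(dated_glob("my-bucket", "events", date(2024, 5, 1)))
# files is then a list of keys suitable for passing to a parquet reader.
```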
