The SSM client in boto3 authenticates using the standard AWS config/credentials settings, allowing a Python program to run commands on a remote EC2 instance.
I would like to upload files to the EC2 instance. Previous SO questions (How to scp to ec2 instance via ssm agent using boto3 and send file) indicate that this is possible over SSH.
Is it possible to upload files to the instance using SSM without an SSH keypair?
One way to do this may be something like:
import boto3

with open('path/to/file') as f:
    contents = f.read()

ssm = boto3.client('ssm')
resp = ssm.send_command(
    InstanceIds=[...],
    DocumentName='AWS-RunShellScript',
    Parameters={'commands': [f'echo "{contents}" > file.txt']},
)
but this seems very fragile: shell quoting will break on many file contents, and SSM commands have payload size limits.
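If the inline-echo route is unavoidable, one way to make it less fragile is to base64-encode the contents so quotes and other shell metacharacters cannot break the command. This is only a sketch; the build_write_command helper is a name I made up, and SSM payload size limits still apply:

```python
import base64

def build_write_command(contents: bytes, dest: str) -> str:
    # Base64-encode so quotes, newlines, and shell metacharacters
    # in the file survive the trip through the remote shell.
    encoded = base64.b64encode(contents).decode('ascii')
    # The instance decodes the payload and writes it to dest.
    return f'echo {encoded} | base64 -d > {dest}'

# The resulting string would go in the SSM command list, e.g.
# Parameters={'commands': [build_write_command(data, '/tmp/file.txt')]}
```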
Context: I am building a script that is meant to be run by non-technical users. The script sets up a new EC2 instance and programmatically runs several commands on that instance to set up an HTTP server. As far as I know, there is not a good way to automatically generate SSH keypairs, and I don't want to manually manage a separate SSH keypair for every EC2 instance that is deployed.
Answer: Boto3 is just using the underlying AWS APIs, and it's not going to work around the SSH requirements. Instead, upload the files to an S3 bucket, then run aws s3 cp commands on the EC2 server via SSM to pull them down; or aws s3 cp them from the bucket as part of your user-data setup of the EC2 instance. (Could you use user-data instead of using SSM? Downloading the file with a curl command is another option.)