
I want to kick off a Dataflow job once I drop a file in Cloud Storage, and I have a Cloud Function that triggers on the upload. But I don't know how to start the Dataflow job with Python. Can someone help?

// Working Node.js version (CONFIG holds the job name and template path):
const { google } = require('googleapis');
const dataflow = google.dataflow('v1b3');

const kickOffDataflow = (input, output) => {
    var jobName = CONFIG.DATAFLOW_JOB_NAME;
    var templatePath = CONFIG.TEMPLETE_FILE_PATH;
    var request = {
        projectId: "test",
        requestBody: {
            jobName: jobName,
            parameters: {
                configFile: input,
                outputFile: output,
                mode: "cluster_test"
            },
            environment: {
                zone: "europe-west1-b"
            }
        },
        gcsPath: templatePath
    };
    console.log("Starting to create the " + jobName + " Dataflow job");
    return google.auth.getClient({
        scopes: ['https://www.googleapis.com/auth/cloud-platform']
    }).then(auth => {
        request.auth = auth;
        return dataflow.projects.templates.launch(request);
    }).catch(error => {
        console.error(error);
        throw error;
    });
};

  • Please show what you have tried and where exactly you don't know how to continue. Commented Feb 7, 2019 at 14:44
  • As you can see, I have written it in Node.js and it works, but I don't know how to rewrite it in Python; my boss needs us to switch to Python. Commented Feb 7, 2019 at 15:03
  • Then you should reword your question to make this clear. Note that requests to program something for you are likely to stay unanswered. Commented Feb 7, 2019 at 15:42

1 Answer


Take a look at the Dataflow Cloud Composer Example. It describes how Cloud Composer can be used in combination with Cloud Functions to trigger a Python-based Dataflow job when a new file arrives in a GCS bucket.
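
If you would rather skip Composer and launch the templated job directly from the Cloud Function, here is a minimal Python sketch of the same templates.launch call. It assumes the google-api-python-client package is listed in your function's requirements.txt, and the project ID, job name, template path and output path shown are placeholders you would replace with your own values:

from googleapiclient.discovery import build

PROJECT_ID = "test"                                           # your GCP project
TEMPLATE_PATH = "gs://your-bucket/templates/your-template"    # hypothetical template location
OUTPUT_FILE = "gs://your-bucket/output/result"                # hypothetical output location

def kick_off_dataflow(event, context):
    """Background Cloud Function triggered by a GCS object.finalize event."""
    # The event payload carries the bucket and object name of the new file.
    input_file = "gs://{}/{}".format(event["bucket"], event["name"])

    # The function's default service account credentials are picked up
    # automatically, so no explicit auth step is needed here.
    dataflow = build("dataflow", "v1b3")

    request = dataflow.projects().templates().launch(
        projectId=PROJECT_ID,
        gcsPath=TEMPLATE_PATH,
        body={
            "jobName": "my-dataflow-job",
            "parameters": {
                "configFile": input_file,
                "outputFile": OUTPUT_FILE,
                "mode": "cluster_test",
            },
            "environment": {"zone": "europe-west1-b"},
        },
    )
    response = request.execute()
    print("Launched Dataflow job: {}".format(response))

Deploy it with something like gcloud functions deploy kick_off_dataflow --runtime python37 --trigger-resource YOUR_BUCKET --trigger-event google.storage.object.finalize (adjusting names to your setup), and each new file dropped into the bucket will launch the job.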
