
According to Google's documentation, I can load a table using a schema stored in a JSON file on the command line as follows:

bq --location=location load \
--source_format=format \
project_id:dataset.table \
path_to_data_file \
path_to_schema_file

where path_to_schema_file is the path to the file that contains the schema.

Is there a way to do this in Python and pass the schema from the JSON file to LoadJobConfig().schema, or should I read the schema manually and transform it into a bigquery.TableSchema() object?

2 Answers


schema_from_json() seems to be what you're looking for.

It converts a schema file in JSON format into a list of SchemaField objects, which is what LoadJobConfig().schema expects.
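
For example, a minimal sketch (the file path, bucket, and table names are placeholders carried over from the question, and the source data is assumed to be CSV):

from google.cloud import bigquery

client = bigquery.Client()

# Parse the JSON schema file into a list of SchemaField objects.
schema = client.schema_from_json('path_to_schema_file')

job_config = bigquery.LoadJobConfig(
    schema=schema,
    source_format=bigquery.SourceFormat.CSV,  # assumption: the data file is CSV
)

load_job = client.load_table_from_uri(
    'gs://bucket/path_to_data_file',  # placeholder source URI
    'project_id.dataset.table',
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish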

Did you try initializing the LoadJobConfig with the schema?

e.g.:

import json
from google.cloud.bigquery import LoadJobConfig

with open('path_to_schema_file') as json_file:
    schema = json.load(json_file)
    job_config = LoadJobConfig(schema=schema)

2 Comments

Actually, this didn't work, as LoadJobConfig.schema only takes a list of SchemaField() objects and doesn't accept JSON.
I thought it accepted a dict, because according to the docs the type of schema can be a sequence of mappings.
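
If the installed client version does not accept mappings directly, here is a minimal sketch that builds SchemaField objects from a bq-style JSON schema file (the path is a placeholder from the question):

import json
from google.cloud import bigquery

with open('path_to_schema_file') as json_file:
    schema_dicts = json.load(json_file)  # bq-style list of {"name": ..., "type": ..., "mode": ...} entries

# Convert each mapping to a SchemaField, which LoadJobConfig.schema accepts in any version.
schema = [bigquery.SchemaField.from_api_repr(field) for field in schema_dicts]

job_config = bigquery.LoadJobConfig(schema=schema)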
