
I have a file with a simple xml structure that I want to load into a Postgres table.

<rows>
<field name="id">1</field>
<field name="age">75-84</field>
<field name="gndr">F</field>
<field name="inc">32000-47999</field>
</rows>

Is there an easy way of doing this?

  • Are you trying to throw it into a blob, or parse it out into a normalized table structure? What tool do you want to use (Perl, Java, etc.)? Commented Mar 31, 2014 at 22:12
  • Trying to parse into a normalized table. If Postgres doesn't have a function to take care of the parsing, Python would be my first choice. Commented Mar 31, 2014 at 22:20
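Since Python was named as the first choice, here is a minimal sketch of the parsing side using only the standard library. The target table and column names (`person(id, age, gndr, inc)`) are hypothetical, taken from the `name` attributes in the sample file:

```python
# Sketch: parse the <rows>/<field name="...">value</field> structure
# into a {column: value} dict ready for a parameterized INSERT.
import xml.etree.ElementTree as ET

def parse_row(xml_text):
    """Turn one <rows> element into a {column: value} dict."""
    root = ET.fromstring(xml_text)
    return {f.get("name"): f.text for f in root.iter("field")}

sample = """<rows>
<field name="id">1</field>
<field name="age">75-84</field>
<field name="gndr">F</field>
<field name="inc">32000-47999</field>
</rows>"""

row = parse_row(sample)
# With psycopg2 (hypothetical connection/table), the insert would look like:
# cur.execute(
#     "INSERT INTO person (id, age, gndr, inc) "
#     "VALUES (%(id)s, %(age)s, %(gndr)s, %(inc)s)",
#     row,
# )
print(row)
```

Using named placeholders keeps the column order out of the Python code and lets the database driver handle quoting.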

1 Answer


Well, it can be done. I wrote a complete function for the same purpose a while back.

I use pg_read_file() to read the whole file into a variable in plpgsql and proceed from there. This carries a few restrictions:

Only files within the database cluster directory and the log_directory can be accessed. Use a relative path for files in the cluster directory, and a path matching the log_directory configuration setting for log files. Use of these functions is restricted to superusers.

But I listed alternatives.
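A minimal sketch of that approach, assuming the file has been copied into the cluster's data directory as `data.xml` (a hypothetical name) and the query is run as a superuser — `xpath()` pulls each field out of the document:

```sql
SELECT (xpath('/rows/field[@name=''id'']/text()',   x))[1]::text AS id,
       (xpath('/rows/field[@name=''age'']/text()',  x))[1]::text AS age,
       (xpath('/rows/field[@name=''gndr'']/text()', x))[1]::text AS gndr,
       (xpath('/rows/field[@name=''inc'']/text()',  x))[1]::text AS inc
FROM  (SELECT pg_read_file('data.xml')::xml AS x) AS t;
```

The result can feed an `INSERT INTO ... SELECT ...` to land the data in a normalized table.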

If you can convert the XML file to JSON, it gets simpler. There are a number of tools available for that.

Starting with Postgres 9.3 you get pretty good support for the json data type.
Better yet, use the jsonb data type in Postgres 9.4+.
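The XML-to-JSON conversion is short in standard-library Python; a sketch, assuming the same file layout as above (the target table `raw_rows` with a jsonb column `doc` is hypothetical):

```python
# Sketch: flatten the <rows>/<field> structure into a JSON object
# that can be loaded into a json/jsonb column.
import json
import xml.etree.ElementTree as ET

def xml_to_json(xml_text):
    """Serialize one <rows> element as a JSON object string."""
    root = ET.fromstring(xml_text)
    return json.dumps({f.get("name"): f.text for f in root.iter("field")})

sample = """<rows>
<field name="id">1</field>
<field name="age">75-84</field>
</rows>"""

doc = xml_to_json(sample)
# On the SQL side (9.4+), a parameterized insert would look like:
# INSERT INTO raw_rows (doc) VALUES (%s::jsonb)
print(doc)
```

From a jsonb column, the individual fields are then reachable with the `->>` operator for the normalization step.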
