I'm writing a Python script that needs to access a PostgreSQL database repeatedly, executing many SELECT and INSERT queries. I am trying to reduce the script's run time.
Currently I have written a helper function to which I pass a query string, a boolean indicating whether I am inserting or retrieving data, and a list of parameters; it then executes the query:
import psycopg2

def sql_call(qry, insert, inputlist):
    # config_np() returns the database connection parameters
    params = config_np()
    with psycopg2.connect(**params) as conn:
        cur = conn.cursor()
        try:
            cur.execute(qry, inputlist)
            if insert:
                conn.commit()
                sqlrtn = True
            else:
                sqlrtn = cur.fetchall()
        except (Exception, psycopg2.DatabaseError) as error:
            print(error)
            quit()
    conn.close()
    return sqlrtn
I'm working with a few hundred thousand entries and this takes forever to run. Is there a faster way to do it?
SELECT queries can be run in parallel, e.g. with multiprocessing.
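Here is a minimal sketch of that idea, assuming the config_np() helper from your script is defined or importable in this module; the table name my_table and the id-range query are made-up placeholders. Since psycopg2 connections cannot be shared between processes, each worker opens its own connection once in the pool initializer and reuses it for every query it runs:

import multiprocessing as mp
import psycopg2

conn = None  # per-process connection, created by the pool initializer

def init_worker():
    global conn
    conn = psycopg2.connect(**config_np())  # config_np() as in your script

def run_select(job):
    qry, params = job
    with conn.cursor() as cur:
        cur.execute(qry, params)
        return cur.fetchall()

if __name__ == '__main__':
    # Hypothetical workload: fetch id ranges of my_table in parallel chunks.
    jobs = [("SELECT * FROM my_table WHERE id BETWEEN %s AND %s", (lo, lo + 9999))
            for lo in range(0, 100000, 10000)]
    with mp.Pool(processes=4, initializer=init_worker) as pool:
        results = pool.map(run_select, jobs)  # one fetchall() result per job

This also sidesteps what is likely the biggest cost in sql_call itself: it opens (and closes) a fresh connection for every single query, whereas here each worker connects once and reuses that connection for all of its queries.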