
I am facing a performance problem in my code. I make a DB connection, run a SELECT query, and then insert the results into a table. Each SELECT returns around 500 rows. Before inserting, I run the SELECT query about 8-9 times, and then insert everything at once with cursor.executemany. But the insert takes 2 minutes, which is not good. Any ideas?

def insert1(id, state, cursor):
    cursor.execute("select * from qwert where asd_id = %s", [id])
    for rd in cursor.fetchall():
        if somecondition:          # some per-row check before queuing the record
            adding.append(rd)      # 'adding' collects parameter tuples for the insert

# after all the selects have run, insert everything in one call
cursor.executemany(indata, adding)

where adding is a list of record tuples and indata is the INSERT statement.

# program starts here

cursor.execute("select * from assd")

for row in cursor.fetchall():
    if row[1] == 'aq':
        insert1(row[1], row[2], cursor)
    elif row[1] == 'qw':
        insert2(row[1], row[2], cursor)
  • Please give us some code to look at. Commented Aug 13, 2009 at 11:51
  • Can you post some example code? It's hard to suggest improvements without it. Commented Aug 13, 2009 at 11:51
  • 1
    We can't even know what database engine you're using, what module you're using, or identify where the bottleneck might remotely be. Plus my eyes are bleeding from the formatting of your post. Commented Aug 13, 2009 at 12:42
  • Maybe you can also tag this question with 'sqlalchemy' (which I assume is what you are using). Commented Aug 13, 2009 at 14:24

1 Answer


I don't really understand why you're doing this.

It seems that you want to insert a subset of rows from "assd" into one table, and another subset into another table?

Why not just do it with two SQL statements, structured like this:

insert into tab1 select * from assd where asd_id = 42 and cond1 = 'set';
insert into tab2 select * from assd where asd_id = 42 and cond2 = 'set';

That'd dramatically reduce your number of roundtrips to the database and your client-server traffic. It'd also be an order of magnitude faster.

Of course, I'd also strongly recommend that you specify your column names in both the insert and select parts of the code.
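For completeness, here is a rough sketch of issuing those two statements from Python with a DB-API driver such as psycopg2. The connection string, table names, and column names (tab1, tab2, assd, asd_id, cond1, cond2, col_a, col_b) are placeholders, not your actual schema:

# a minimal sketch, assuming psycopg2 and placeholder table/column names
import psycopg2

conn = psycopg2.connect("dbname=mydb user=me")   # hypothetical connection parameters
cur = conn.cursor()

# two set-based statements run on the server instead of thousands of client-side inserts
cur.execute(
    "insert into tab1 (col_a, col_b) "
    "select col_a, col_b from assd where asd_id = %s and cond1 = 'set'",
    [42],
)
cur.execute(
    "insert into tab2 (col_a, col_b) "
    "select col_a, col_b from assd where asd_id = %s and cond2 = 'set'",
    [42],
)

conn.commit()
cur.close()
conn.close()

Each execute here does the filtering and the inserting inside the database, so there are only two round trips no matter how many rows qualify.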


4 Comments

I have to check some conditions before inserting, and in my select query I am fetching data from 3 tables with some conditions. I am using psycopg2 for the DB connection and a PostgreSQL database.
The answer still stands - for best performance you should try to move your logic into your SQL. You could use unions to combine the result sets of the three source tables (given that the columns are of the same types); see the sketch after these comments. The idea is that, say, six queries to insert 60k rows will always be faster than 60006 queries with their associated round trips.
I am using regex in my logic, and I think regex cannot be used in DB scripts. Since everything else is in Python, I want some uniformity.
As requested by others, what database are you using? Regex may be supported by your database.
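To illustrate the two comments above: PostgreSQL does support POSIX regular expressions (the ~ operator), so the union-the-sources idea can be pushed into a single server-side statement. A minimal sketch, again assuming psycopg2 and purely placeholder table/column names (target_tab, src1, src2, src3, col_a, col_b, asd_id):

# sketch only: the UNION ALL combines the three source selects and the ~ operator
# applies PostgreSQL's POSIX regex filter on the server side; names are placeholders
import psycopg2

conn = psycopg2.connect("dbname=mydb user=me")   # hypothetical connection parameters
cur = conn.cursor()
cur.execute(
    "insert into target_tab (col_a, col_b) "
    "select col_a, col_b from src1 where asd_id = %s and col_b ~ %s "
    "union all "
    "select col_a, col_b from src2 where asd_id = %s and col_b ~ %s "
    "union all "
    "select col_a, col_b from src3 where asd_id = %s and col_b ~ %s",
    [42, '^aq', 42, '^aq', 42, '^aq'],
)
conn.commit()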
