I have a very simple Python application (using cx_Oracle) that writes a BLOB to an Oracle database, but it is painfully slow with large files. The application first downloads an image file from another system via a SOAP API, and the download completes in a second or two. However, writing a file of, say, 100 KB to the DB takes a second or two, while a 1.5 MB file takes just over a minute.
My SQL is :-
INSERT INTO my_data (doc_id, doc_content) VALUES (:blobid, :blobdata)

where :blobid is bound to id_variable and :blobdata is bound to blob_variable.
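That is, in cx_Oracle I execute it with named bind variables, roughly like this (the full code is further down):

cursor.execute(
    "INSERT INTO my_data (doc_id, doc_content) VALUES (:blobid, :blobdata)",
    blobid=id_variable,       # the document id
    blobdata=blob_variable,   # the raw file contents
)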
Is there a reason it's taking so long? Can I speed it up? I'd have thought it shouldn't take more than a few seconds.
(Disclaimer! I have never used cx_Oracle, or indeed Oracle itself, before. I am comparing performance with Postgres, which I have used for this kind of thing, and whose performance is vastly better.)
The full code block is here :-
import cx_Oracle

def upload_docs(doc, blob, conn):
    sql = "INSERT INTO my_data (doc_id, doc_content) VALUES (:dociddata, :blobdata)"
    cursor = conn.cursor()
    cursor.execute(sql, dociddata=doc, blobdata=blob)
    conn.commit()
    cursor.close()

conn = cx_Oracle.connect(user="Myuser", password="MyPassword",
                         dsn="ocm.server.here.com:1527/some_name", encoding="UTF-8")

doc_csv = "/tmp/document_list.csv"
with open(doc_csv, 'r') as csv_file:
    for line in csv_file:
        splitLineArray = line.split(',')
        documentId = splitLineArray[17]
        # Pull the document down from the SOAP API (client is the SOAP client, created elsewhere)
        documentData = client.service.getDocument(int(documentId))
        upload_docs(documentId, documentData, conn)
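One thing I did wonder: do I need to tell cx_Oracle up front that :blobdata is a BLOB? Skimming the docs I found cursor.setinputsizes(), and I was considering something like the sketch below, but I don't know whether it is the right approach or whether it would help at all:

def upload_docs(doc, blob, conn):
    sql = "INSERT INTO my_data (doc_id, doc_content) VALUES (:dociddata, :blobdata)"
    cursor = conn.cursor()
    # Untested guess on my part: declare the bind type before executing.
    # cx_Oracle.DB_TYPE_BLOB is the cx_Oracle 8 name; older releases call it cx_Oracle.BLOB.
    cursor.setinputsizes(blobdata=cx_Oracle.DB_TYPE_BLOB)
    cursor.execute(sql, dociddata=doc, blobdata=blob)
    conn.commit()
    cursor.close()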