
I get 8000 rows back daily from an API and I have to save these results in my database. Doing this one by one is crazy, it takes way too long. Therefore I decided to give goroutines a try, only to find out that of the 8000 rows, only about 1000 actually got inserted.

wg := sync.WaitGroup{}
wg.Add(len(xyz.Hours))

for _, e := range xyz.Hours {
    go createDatabaseEntry(e, &wg)
}

Using the goroutines it is super fast, but only around 1000 rows get inserted and the rest are missing for whatever reason.

Can someone tell me what would be the appropriate way to store this in my Postgres database?

I am using GORM as database library.

When I use the code below (the body of the createDatabaseEntry function) it takes well over 5 minutes to complete.

for _, e := range xyz.Hours {
    msInt, _ := strconv.ParseInt(strconv.Itoa(e.Timestamp), 10, 64)
    t := time.Unix(0, msInt*int64(time.Millisecond))

    ef := models.Xyz{
        Timestamp: t,
        Unit:      e.Unit,
        Value:     e.Value,
    }

    db.Create(&ef)
}

It's not the first two lines (the strconv calls) that make it slow; I removed them and adding the rows to Postgres is still extremely slow.

I also tried to do a batch insert, but then I get a long list of errors from GORM and I don't understand why.

var test []models.XyzFlow

for _, e := range xyz.Hours {
    msInt, _ := strconv.ParseInt(strconv.Itoa(e.Timestamp), 10, 64)
    t := time.Unix(0, msInt*int64(time.Millisecond))

    test = append(test, models.XyzFlow{
        Timestamp: t,
        Unit:      e.Unit,
        Value:     e.Value,
    })
}

db.Create(&test)

2020/10/29 23:47:39 http: panic serving [::1]:52886: reflect: call of reflect.Value.Interface on zero Value goroutine 35 [running]:

6 Comments
  • "it takes way to long" --- 8k rows should take about 2-5 seconds to insert. Is it too slow for you? createDatabaseEntry --- what is its actual implementation? Commented Oct 29, 2020 at 22:32
  • With the code below it takes a lot of minutes... I haven't sat it out yet so I don't know exactly, but it takes well over 5 minutes. (Can't place code here so I'll put it in my main post) Commented Oct 29, 2020 at 22:40
  • 2
    Something else you might want to check out, to use instead of or alongside goroutine dispatch, is batch inserts (insert multiple rows with one SQL statement/DB request). Details here in the gorm docs: gorm.io/docs/create.html under the "Batch Insert" heading. Commented Oct 29, 2020 at 22:44
  • Very good call Mikerowehl, thank you very much! I tried that indeed, but I always end up with an error that I don't really understand. I am kinda new to Go, that's why I tried the goroutine stuff (which was really, really fast). I edited my main post with my "batch insert" and the given error. Commented Oct 29, 2020 at 22:49
  • So, where is the complete createDatabaseEntry declaration? Does it mean you don't check for errors anywhere at all? Commented Oct 29, 2020 at 23:50

1 Answer


It would be helpful for you to refer to Transactions in GORM.

https://gorm.io/docs/transactions.html#A-Specific-Example

If you run all of the inserts within a single transaction and commit once, I don't think there will be any significant performance load.


2 Comments

Thank you Jihoon, I will give this a try and let you know the outcome!
Not sure why, but it is super fast now, like 2-3 seconds in total to add 8,000 to 9,000 rows. Thanks for your answer Jihoon Yeo (and everyone else too!!)
