I get about 8,000 rows back daily from an API and have to save them to my database. Inserting them one by one takes way too long, so I decided to give goroutines a try, only to find that out of the 8,000 rows only about 1,000 actually got inserted.
wg := sync.WaitGroup{}
wg.Add(len(xyz.Hours))
for _, e := range xyz.Hours {
    go createDatabaseEntry(e, &wg)
}
Using goroutines it is super fast, but only around 1,000 rows get inserted and the rest are missing for whatever reason.
Can someone tell me the appropriate way to store this in my Postgres database?
I am using GORM as the database library.
When I use the code below (the createDatabaseEntry function) it takes well over 5 minutes to complete.
for _, e := range xyz.Hours {
    msInt, _ := strconv.ParseInt(strconv.Itoa(e.Timestamp), 10, 64)
    t := time.Unix(0, msInt*int64(time.Millisecond))
    ef := models.Xyz{
        Timestamp: t,
        Unit:      e.Unit,
        Value:     e.Value,
    }
    db.Create(&ef)
}
It is not the first two lines (the strconv calls) that make it slow; I removed them and inserting the rows into Postgres is still extremely slow.
I also tried to do a batch insert, but then I get a long list of errors from GORM and I don't understand why.
var test []models.XyzFlow
for _, e := range xyz.Hours {
    msInt, _ := strconv.ParseInt(strconv.Itoa(e.Timestamp), 10, 64)
    t := time.Unix(0, msInt*int64(time.Millisecond))
    test = append(test, models.XyzFlow{
        Timestamp: t,
        Unit:      e.Unit,
        Value:     e.Value,
    })
}
db.Create(&test)
2020/10/29 23:47:39 http: panic serving [::1]:52886: reflect: call of reflect.Value.Interface on zero Value goroutine 35 [running]:
createDatabaseEntry — what is its actual implementation? You never show the createDatabaseEntry declaration. Does that mean you don't check for errors anywhere at all?