I have a PostgreSQL table with ~50 million rows, and I want to write Go code that selects ~1 million rows from this table and processes them efficiently.
Previously I used Node.js with the NPM module pg-query-stream to generate a readable stream of the matching records, so I could process them like any other readable object stream.
Here is a simplified version of the code I used to process the data:
const pg = require('pg');
const QueryStream = require('pg-query-stream');

// process 1,000,000 rows without blowing up your memory usage
pg.connect((err, client, done) => {
  if (err) throw err;
  const query = new QueryStream('SELECT * FROM generate_series(0, $1) num', [1000000]);
  const stream = client.query(query);
  // release the client when the stream is finished
  stream.on('end', done);
  stream.on('data', function(data) {
    // pause the stream while the async handler runs, resume when it finishes
    stream.pause();
    funcDoSomethingWithDataAsync(data, function(error) {
      if (error) throw error;
      stream.resume();
    });
  });
});
How can I emulate a readable stream of database records in Go? Does sql.Scanner in Go work with streaming query results the way the Node.js module does?
I already have optimized queries that work fine; I just want to stream the query execution results to Go, the way it's done in the Node.js library.
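For context, my current understanding is that a row-by-row loop over *sql.Rows is the closest Go equivalent, something like the minimal sketch below (the lib/pq driver import and the connection string are placeholder assumptions, not part of my actual setup). What I can't tell is whether rows.Next() actually streams rows from the server as they arrive, or whether the driver buffers the entire result set in memory first:

package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/lib/pq" // driver choice is an assumption; any database/sql Postgres driver registers itself this way
)

func main() {
	// connection string is a placeholder
	db, err := sql.Open("postgres", "postgres://user:pass@localhost/mydb?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	rows, err := db.Query("SELECT num FROM generate_series(0, $1) num", 1000000)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	// iterate one row at a time, the way stream.on('data', ...) delivers records
	for rows.Next() {
		var num int
		if err := rows.Scan(&num); err != nil {
			log.Fatal(err)
		}
		fmt.Println(num) // placeholder for real per-row processing
	}
	// check for errors encountered during iteration
	if err := rows.Err(); err != nil {
		log.Fatal(err)
	}
}

Is this loop genuinely streaming like the pg-query-stream version, or do I need a cursor-based approach to avoid loading all ~1 million rows at once?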