I'm looking to learn node.js and mongodb, which look suitable for something I'd like to make. As a little project to help me learn, I thought I'd copy the "posts" table from a phpbb3 forum I have into a mongodb collection, so I did something like this, where db is a mongodb database connection and client is a mysql database connection:
db.collection('posts', function (err, data) {
    client.query('select * from phpbb_posts', function (err, rs) {
        data.insert(rs);
    });
});
This works ok when I do it on small tables, but my posts table has about 100,000 rows, and the query doesn't return even when I leave it running for an hour. I suspect that it's trying to load the entire table into memory and then insert it.
So what I would like to do is read a chunk of rows at a time and insert them. However, I can't see how to read a subset of the rows in node.js, and, more of a problem, I can't understand how to iterate through the queries one at a time when I only get notified via a callback that the previous one has finished.
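To show the kind of thing I mean, here is a rough sketch of the chaining pattern I imagine, with hypothetical fetchBatch/saveBatch functions standing in for client.query (with LIMIT/OFFSET) and data.insert, and a fake in-memory "table" so it runs on its own — the next chunk is only requested from inside the callback of the previous insert:

```javascript
// Hypothetical stand-ins for the real driver calls, so the pattern
// itself can be run and checked without mysql/mongodb.
var TOTAL = 10;  // pretend the posts table has 10 rows
var LIMIT = 3;   // chunk size

function fetchBatch(offset, limit, cb) {
  // stand-in for: client.query('select * from phpbb_posts limit ? offset ?', ...)
  var rows = [];
  for (var i = offset; i < Math.min(offset + limit, TOTAL); i++) {
    rows.push({ id: i });
  }
  cb(null, rows);
}

var inserted = [];
function saveBatch(rows, cb) {
  // stand-in for: data.insert(rows, cb)
  inserted = inserted.concat(rows);
  cb(null);
}

// The actual pattern: fetch a chunk, insert it, and only then
// recurse to fetch the next chunk, until a fetch comes back empty.
function copyFrom(offset, done) {
  fetchBatch(offset, LIMIT, function (err, rows) {
    if (err) return done(err);
    if (rows.length === 0) return done(null);      // no more rows: finished
    saveBatch(rows, function (err) {
      if (err) return done(err);
      copyFrom(offset + rows.length, done);        // next chunk after insert completes
    });
  });
}

copyFrom(0, function (err) {
  if (err) throw err;
  console.log('copied ' + inserted.length + ' rows'); // prints "copied 10 rows"
});
```

Is something along these lines the right way to do it with the real drivers, or is there a more idiomatic node.js approach?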
Any ideas how I can best do this? (I'm looking for solutions using node.js as I'd like to know how to solve this kind of problem, I could no doubt do it easily some other way)