[HELP] How to do “for each” for 1,000,000 rows using node-mysql? #1370
http://stackoverflow.com/questions/36015279/how-to-do-for-each-for-1-000-000-rows-using-node-mysql

I'm using Node.js. I want to do something with 1,000,000 rows without loading all the rows into memory.

Before, when I used ASP Classic, I did: …

In node-mysql I didn't find anything similar to a cursor, only …

The problem is that I don't want to load all the rows at one time.

Comments
You can use LIMIT and implement some pagination, or, if possible, filter the rows with WHERE. For example: `select * from bigdata where field='something' limit 0, 1000`
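A minimal sketch of that pagination approach, assuming a `connection` from `mysql.createConnection` and a hypothetical `processRow` handler:

```js
var mysql = require('mysql');

var connection = mysql.createConnection({ /* your credentials */ });
var pageSize = 1000;

function fetchPage(offset) {
  // ? placeholders are escaped by the driver
  connection.query('select * from bigdata limit ?, ?', [offset, pageSize], function(err, rows) {
    if (err) throw err;
    rows.forEach(processRow); // hypothetical per-row handler
    if (rows.length === pageSize) {
      fetchPage(offset + pageSize); // there may be more rows
    } else {
      connection.end(); // last (partial) page reached
    }
  });
}

fetchPage(0);
```

Note that large OFFSET values get slower, since MySQL still scans past all the skipped rows; paginating on an indexed column instead (`where id > ? order by id limit ?`) scales better.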
Thanks. I hoped maybe there was something built into the module.
Node 4+ example:

```js
var stream = require('stream');

connection.query('select * from bigdata')
  .stream()
  .pipe(stream.Transform({
    objectMode: true,
    transform: function(data, encoding, callback) {
      // do something with data...
      callback();
    }
  }))
  .on('finish', function() { console.log('done'); });
```

Node 0.12 example:

```js
var Transform = require('stream').Transform,
    myTransform = new Transform({objectMode: true});

myTransform._transform = function(data, encoding, callback) {
  // do something with data...
  callback();
};

connection.query('select * from bigdata')
  .stream()
  .pipe(myTransform)
  .on('finish', function() { console.log('done'); });
```
@ZJONSSON Thanks.
Clarified the answer above a little. In the transform stream you will only handle one record at a time (and call `callback()` when you are done with it). Each record coming into the transform will be an object representing a single row. For this reason the transform (and any other stream processing the data) needs to be defined with the option `objectMode: true`.
Closing - please reopen if there are any further issues.
It's not working for me. Only the finish handler is called:

```js
var stream = require('stream');
var util = require('util');

connection.query('select id from stats limit 100')
  .stream()
  .pipe(stream.Transform({
    objectMode: true,
    transform: function(data, encoding, callback) {
      // this never gets called
      res.write(util.inspect(data)); // res is an HTTP response in scope
      callback();
    }
  }))
  .on('finish', function() { console.log('done'); res.end(); });
```
Now it's working. I had used … by mistake.
It's not working with a pool:

```js
var conn = mysql.createPool(data);

conn
  .query('select * from stats limit 100')
  .stream()
  .pipe(require('stream').Transform({
    objectMode: true,
    transform: function(object, encoding, callback) {
      console.log(object);
      callback();
    }
  }));

conn.end();
```
At first glance it seems you are closing the connection immediately. Wrapping it inside the finish event handler, i.e. `.on('finish', function() { conn.end(); })`, should fix it.
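For reference, a version of the snippet above with the pool closed only after the stream finishes - a sketch, error handling omitted:

```js
var stream = require('stream');
var conn = mysql.createPool(data);

conn.query('select * from stats limit 100')
  .stream()
  .pipe(stream.Transform({
    objectMode: true,
    transform: function(object, encoding, callback) {
      console.log(object);
      callback();
    }
  }))
  .on('finish', function() {
    conn.end(); // close the pool only after all rows have been processed
  });
```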
You're right. You can close it again. Thanks again.
Hi! I have a problem similar to @AminaG's, but I don't want to receive a single object/row in each transform callback. Instead, I want to receive a small array of a certain size containing the objects/rows. Is this possible?

```js
var stream = require('stream');

connection.query('select * from bigdata')
  .stream()
  .pipe(stream.Transform({
    objectMode: true,
    transform: function(rows, encoding, callback) {
      rows.forEach(row => { /* do something with each row */ });
      callback();
    }
  }))
  .on('finish', () => connection.end());
```
Hi @iSuperMan. Streams are defined by Node.js core, and object-mode streams will only ever give your transform function one object at a time. Since each object is a row, you only get one row at a time. If you want to get multiple rows at once, you would need to batch them yourself, but that is a general Node.js streams question, not really anything to do with this module.
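For what it's worth, here is a sketch of how such batching could be done with plain Node.js streams (assuming Node 4+; `createBatcher` is a hypothetical helper, not part of this module):

```js
var stream = require('stream');

// Collects incoming rows into arrays of at most `batchSize`
// and pushes each batch downstream as a single object.
function createBatcher(batchSize) {
  var batch = [];
  return new stream.Transform({
    objectMode: true,
    transform: function(row, encoding, callback) {
      batch.push(row);
      if (batch.length >= batchSize) {
        this.push(batch); // emit a full batch
        batch = [];
      }
      callback();
    },
    flush: function(callback) {
      if (batch.length) this.push(batch); // emit the final partial batch
      callback();
    }
  });
}

connection.query('select * from bigdata')
  .stream()
  .pipe(createBatcher(100))
  .pipe(stream.Transform({
    objectMode: true,
    transform: function(rows, encoding, callback) {
      rows.forEach(function(row) { /* do something with each row */ });
      callback();
    }
  }))
  .on('finish', function() { connection.end(); });
```

The `flush` step ensures the last partial batch is not lost when the source stream ends.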