Cannot retrieve a big field #922
Here is some sample code I use to reproduce the problem. Note the indexOf result at the end: 16777287 (for reference, MAX_PACKET_LENGTH = Math.pow(2, 24) - 1 = 16777215).

Code to reproduce:

var mysql = require('mysql');
var randomstring = require("randomstring");
var host = 'localhost',
user = 'user',
password = 'xxxx',
db = 'yyyyy'
;
var connection = mysql.createConnection({
host : host,
user : user,
password : password,
database: db
});
connection.connect();
var size = (1024 * 1024) * 25;
// will produce a very long base64 string ~33MB
// This should work with a <= 16MB string
var bigField = new Buffer(randomstring.generate(size)).toString('base64');
console.log('size', bigField.length);
console.log('sample at start', bigField.substring(0, 200));
connection.query('CREATE TEMPORARY TABLE bigFieldTest (' +
' id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,' +
' bigField LONGTEXT NOT NULL' +
')', function(err) {
if (err) throw err;
connection.query('INSERT INTO bigFieldTest(bigField) VALUES(?)', [bigField], function(err, rows, fields) {
if (err) throw err;
connection.query('SELECT id,bigField FROM bigFieldTest', function(err, rows) {
if (err) throw err;
// the id returned is NaN, so it messes up the whole result
console.log('result id', rows[0].id);
console.log('result length', rows[0].bigField.length);
console.log('result sample', rows[0].bigField.substring(0, 100));
// it will return 16777287
console.log('indexOf', bigField.indexOf(rows[0].bigField));
});
});
});

PS: I know this may not be the ideal way of storing and retrieving data. If I had the choice I would probably go for MongoDB+GridFS. |
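For context, the numbers in the repro line up with the MySQL protocol's per-packet payload limit of 2^24 - 1 bytes. A quick sanity check (an illustration based on the values above, not part of the original report):

// Relating the repro's indexOf result to the MySQL packet size limit.
var MAX_PACKET_LENGTH = Math.pow(2, 24) - 1; // 16777215 bytes, the maximum payload of one protocol packet
var indexOfResult = 16777287; // offset reported by the repro script

// The retrieved value only matches the original starting roughly 16MB in,
// which suggests about one full packet of data is being lost or misparsed.
console.log(indexOfResult - MAX_PACKET_LENGTH); // 72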
@antoinegomez can you confirm which version of this module you are using real quick? |
Oh, hey, it looks like I already fixed it a long time ago :) Can you run |
Actually, I can confirm it's fixed now, since I added a failing test case to go along with that patch that was on
I'm not going to judge you :) I can't thank you enough for the detailed report and code to re-create the issue. |
This has been published as 2.5.2 |
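For anyone landing on this later, a quick way to confirm which driver version is installed after upgrading (a minimal sketch; it assumes the package's package.json can be required directly, which works for this plain CommonJS module):

// Print the installed mysql driver version to confirm it is >= 2.5.2,
// the release that contains this fix.
var mysqlVersion = require('mysql/package.json').version;
console.log('mysql driver version:', mysqlVersion);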
Thanks @antoinegomez for the self-contained code! Fixing this for mysql2 :) |
Hello, thank you for the fast reply and for handling this error. I was AFK the whole weekend. Thanks again, will update to the new version. |
It's no problem, @antoinegomez, enjoy the AFK time :) If you do try out 2.5.2 and it works, it would be lovely if you popped in to confirm it's indeed fixed :) |
Tested and it works well, great job
Hello!
I am stuck while trying to load data from a table with a big field.
The size of the field content is about 22MB. The data is stored in base64.
The field is a LONGTEXT; I also tested with a LONGBLOB with the same result: the output is only a 100-character base64 string. Once decoded it looks like non-printable/binary data.
I use Knex.js, but it uses this driver underneath. I also tested with the driver directly: same problem.
I can insert data into the table, so the max packet size option on the MySQL server seems OK.
Tried to retrieve the data with streams and it produced the same error (see the sketch after this comment).
Tested on two machines using GNU/Linux and mysql >5.5.
I managed to reproduce it by generating a random string of >22MB and then trying to read it back.
I also tested with smaller data and found that retrieval starts to fail somewhere between 16 and 17MB.
What is the best way to retrieve such big data from MySQL with Node.js?
Thanks!
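Regarding the streams attempt mentioned above, the driver's row-streaming interface looks roughly like the sketch below (connection options and table name are taken from the repro code; each field is still decoded in full, which is why this hit the same limit before the 2.5.2 fix):

// Minimal sketch of row-by-row streaming with this driver.
var mysql = require('mysql');
var connection = mysql.createConnection({
  host: 'localhost',
  user: 'user',
  password: 'xxxx',
  database: 'yyyyy'
});
connection.connect();

connection.query('SELECT id, bigField FROM bigFieldTest')
  .on('error', function (err) {
    console.error('query error', err);
  })
  .on('result', function (row) {
    // Rows arrive one at a time instead of being buffered into an array,
    // but each individual field is still parsed in full before this fires.
    console.log('row', row.id, row.bigField.length);
  })
  .on('end', function () {
    connection.end();
  });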