Description of the bug:
I followed the stack in the code:

- The error originates in processStream.
- processStream is called exclusively from generateContentStream.
- generateContentStream is called from sendMessageStream (the only function in the chain I call from outside the library).

Let's look at why exactly response.body is undefined. The response is generated by a call to makeModelRequest; that function, and a helper function it calls, are defined here.

I can't figure out how it is possible that body is undefined while response.ok is true.
Please help :)
Streaming is so cool!
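For context, the crash point is presumably where the SDK pipes response.body into a text decoder. A minimal sketch of that step with a defensive guard (processStreamSafe is a hypothetical helper for illustration, not SDK code) shows how an undefined body produces exactly this kind of failure:

```javascript
// Hypothetical guard, not part of the SDK: fail with a clear message when
// the runtime's fetch() Response has no streaming body.
function processStreamSafe(response) {
  if (!response.body || typeof response.body.pipeThrough !== "function") {
    throw new Error(
      "response.body is undefined or not a ReadableStream; " +
        "this runtime's fetch() may not support streaming responses."
    );
  }
  // Standard streaming path (browsers, Node 18+): decode bytes to text.
  return response.body.pipeThrough(new TextDecoderStream());
}
```

With a guard like this, a runtime whose fetch reports ok: true but exposes no body stream would fail with an explicit message instead of "Cannot read property 'pipeThrough' of undefined".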
Actual vs expected behavior:
```javascript
import { GoogleGenerativeAI } from "@google/generative-ai";

const apiKey = "YOUR_API_KEY"; // get your own key
const genAI = new GoogleGenerativeAI(apiKey);
const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });
const chat = model.startChat({});

try {
  const request = await chat.sendMessageStream("Hello");
  for await (const chunk of request.stream) {
    if (chunk.promptFeedback) {
      console.error(`${chunk.promptFeedback.blockReason} ${chunk.promptFeedback.safetyRatings} ${chunk.promptFeedback.blockReasonMessage}`);
    }
  }
  const response = await request.response;
  console.debug(`Test Stream Response: ${JSON.stringify(response)}`);
} catch (e) {
  console.error(e);
}

try {
  const request = await chat.sendMessage("Hello");
  const response = request.response;
  console.debug(`Test Non-Stream Response: ${JSON.stringify(response)}`);
} catch (e) {
  console.error(e);
}
```
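One way to narrow this down (this is an assumption about the cause, not a confirmed diagnosis) is to check whether the runtime's Response implementation exposes bodies as ReadableStreams at all; a polyfilled or non-standard fetch may not, which would explain an undefined .body alongside ok: true:

```javascript
// Quick capability check: on standard runtimes (browsers, Node 18+) a
// Response body is a ReadableStream. If this prints false, the streaming
// path in the SDK cannot work in this environment.
const body = new Response("hello").body;
console.log("ReadableStream available:", typeof ReadableStream !== "undefined");
console.log("Response.body is a ReadableStream:", body instanceof ReadableStream);
```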
Expected result:
```
Test Stream Response: {"candidates":[{"content":{"parts":[{"text":"Hello there! How can I help you today?\n"}],"role":"model"},"finishReason":"STOP","avgLogprobs":-0.0011556809768080711}],"usageMetadata":{"promptTokenCount":2,"candidatesTokenCount":11,"totalTokenCount":13},"modelVersion":"gemini-1.5-flash"}
Test Non-Stream Response: {"candidates":[{"content":{"parts":[{"text":"Hello there! How can I help you today?\n"}],"role":"model"},"finishReason":"STOP","avgLogprobs":-0.0011556809768080711}],"usageMetadata":{"promptTokenCount":2,"candidatesTokenCount":11,"totalTokenCount":13},"modelVersion":"gemini-1.5-flash"}
```
Actual result:
```
Cannot read property 'pipeThrough' of undefined
Test Non-Stream Response: {"candidates":[{"content":{"parts":[{"text":"Hello there! How can I help you today?\n"}],"role":"model"},"finishReason":"STOP","avgLogprobs":-0.0011556809768080711}],"usageMetadata":{"promptTokenCount":2,"candidatesTokenCount":11,"totalTokenCount":13},"modelVersion":"gemini-1.5-flash"}
```
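Until the root cause is found, one possible workaround (sendWithFallback is a sketch I wrote for this report, not an SDK API) is to fall back to the non-streaming sendMessage when the streaming call throws, since the non-streaming path works:

```javascript
// Hypothetical fallback: try streaming first, and on failure (e.g. the
// "Cannot read property 'pipeThrough' of undefined" error) retry with
// the non-streaming sendMessage call.
async function sendWithFallback(chat, prompt) {
  try {
    const request = await chat.sendMessageStream(prompt);
    for await (const chunk of request.stream) {
      // consume streamed chunks here
    }
    return await request.response;
  } catch (e) {
    const result = await chat.sendMessage(prompt);
    return result.response;
  }
}
```

This loses the incremental output, of course, but keeps the bot responsive while the streaming bug is investigated.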
Any other information you'd like to share?
```json
"@google/generative-ai": "^0.21.0",
```