Support for response streaming #154

Closed
@serg06

Description

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

AWS just released the ability to stream a Lambda function's response: https://aws.amazon.com/blogs/compute/introducing-aws-lambda-response-streaming/

It would be great if this framework supported it.

It requires wrapping the handler in streamifyResponse().
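
For reference, a minimal sketch of what that wrapping looks like in a plain Node.js Lambda handler. It assumes the runtime-provided global awslambda.streamifyResponse and the writable responseStream it passes in; the handler body is illustrative, not this framework's API:

// Minimal sketch: the Lambda Node.js runtime exposes a global
// `awslambda.streamifyResponse` that hands the handler a writable
// response stream instead of expecting a returned payload.
exports.handler = awslambda.streamifyResponse(
  async (event, responseStream, context) => {
    responseStream.setContentType('text/html');
    responseStream.write('Hello ');
    // Anything written here is flushed to the client immediately.
    responseStream.write('World');
    responseStream.end();
  }
);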

Motivation

This is an extremely useful feature that greatly decreases time to first byte (TTFB).

Example

const stream = require('node:stream');

// API which returns chunked data.
// Locally TTFB is ~0, but on Lambda TTFB is 5s, presumably because
// the handler is not wrapped in streamifyResponse.
// (fastify_streamtest is the Fastify instance; its setup is omitted here.)
fastify_streamtest.get('/chunked', (request, reply) => {
  // Create a readable stream to hold the response chunks
  const buffer = new stream.Readable({ read() {} });

  // Generate 5 chunks with a 1-second interval
  let count = 5;
  const emit = () => {
    const data = `Hello World ${count}`;
    console.log(`sending "${data}"`);
    buffer.push(data);

    count--;
    if (count > 0) {
      setTimeout(emit, 1000);
    } else {
      console.log('end sending.');
      buffer.push(null);
    }
  };

  emit();
  void reply.type('text/html').send(buffer);
});
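
For comparison, a hedged sketch of what the same chunked behaviour could look like once the Lambda entry point is wrapped with streamifyResponse. This is illustrative glue code, not the adapter's actual API, and it assumes the awslambda global provided by the Lambda Node.js runtime:

const { Readable } = require('node:stream');
const { pipeline } = require('node:stream/promises');

// Illustrative only: pipe a readable body into Lambda's response
// stream so each chunk reaches the client as soon as it is pushed,
// rather than after the whole response has been buffered.
exports.handler = awslambda.streamifyResponse(
  async (event, responseStream, context) => {
    responseStream.setContentType('text/html');

    // Mirror the route above: five chunks, one second apart.
    const body = Readable.from((async function* () {
      for (let count = 5; count > 0; count--) {
        yield `Hello World ${count}`;
        await new Promise((resolve) => setTimeout(resolve, 1000));
      }
    })());

    await pipeline(body, responseStream);
  }
);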
