
Handle file uploads #70

Closed
Rich-Harris opened this issue Oct 27, 2020 · 71 comments
Labels
feature request New feature or request p1-important SvelteKit cannot be used by a large number of people, basic functionality is missing, etc.

Comments

@Rich-Harris
Member

Currently, the body parser bails on multipart forms that include files

@Rich-Harris Rich-Harris added this to the 1.0 milestone Oct 29, 2020
@benmccann
Member

It looks like the way you'd do this in Sapper is to add something like Multer or Formidable as middleware, which would parse the incoming request and stick the files on the request object. Probably the way to do this here would be to expose a way to add middleware so that users can utilize these packages; I wonder if this is essentially a duplicate of #334. (Side note: I wonder if we should have adapter-polka and adapter-express instead of adapter-node.)

I was trying to figure out how Netlify, Lambda, etc. handle this. All the examples I see upload directly to S3 from the browser. I couldn't find how you'd handle a form upload inside a Lambda, so I guess that's something that's generally not supported.

@thgh

thgh commented Mar 24, 2021

Here is an example of file uploads to lambda: https://github.com/mpxr/lambda-file-upload/blob/master/handler.js

I may be biased, but I think serverless APIs typically don't use global middleware and instead have several awaits (one per middleware) at the start of the function. See https://nextjs.org/docs/api-routes/api-middlewares#connectexpress-middleware-support


@axmad386

I think hooks are enough for handling middleware, except that the request (ServerRequest) is a little different from Polka's or Express's. Multer requires request.pipe and unpipe to handle file uploads. Formidable requires request.on, which is missing from the request object in the handle hook. Correct me if I'm wrong.

@benmccann
Member

You'd have to convert from the SvelteKit request to an Express request first, before calling the middleware

@axmad386

You'd have to convert from SvelteKit request to Express request first before calling the middleware

That's exactly what I want, but how? Any references?

@vonadz

vonadz commented May 2, 2021

Would it be possible to comment out this section

if (directives.filename) {
// TODO we probably don't want to do this automatically
throw new Error('File upload is not yet implemented');
}

for now? This will give users the chance to implement their own solutions in the handle hook.

EDIT: If anyone is trying to find a solution for this, or a work around, here is what I did for an app that will work with the adapter-node:

  1. Comment out the throw Error part of the code I linked above in your node_module @sveltejs/kit (I found this by just using
    grep -rin 'File upload is not yet implemented' ./node_modules/@sveltejs/)
  2. Install busboy
  3. Add the code below to your hooks file
import busboy from 'busboy'
import type { Handle } from '@sveltejs/kit'
export const handle: Handle = async ({ request, render }) => {
  console.log('in handle')
  console.log(request)
  if (request.path === 'your/path') {
    const files = {}
    await new Promise((resolve, reject) => {
      const bb = new busboy({
        headers: { 'content-type': request.headers['content-type'] },
      })
      bb.on('file', (fieldname, file, filename, encoding, mimetype) => {
        console.log(
          'File [%s]: filename=%j; encoding=%j; mimetype=%j',
          fieldname,
          filename,
          encoding,
          mimetype
        )
        console.log(file)
        const buffers = []
        files[fieldname] = {
          filename,
          encoding,
          mimetype,
        }
        file
          .on('data', (fileData) => {
            console.log('File [%s] got %d bytes', fieldname, fileData.length)
            buffers.push(fileData) //this adds files as buffers, which can be hard on memory. You can also write to a file using streams here.
          })
          .on('end', () => {
            console.log('File [%s] Finished', fieldname)
            const buffer = Buffer.concat(buffers)
            files[fieldname]['file'] = buffer
          })
      })
        .on('field', (fieldname, val) => {
          console.log('Field [%s]: value: %j', fieldname, val)
          request.body[fieldname] = val
        })
        .on('finish', () => {
          console.log('Done parsing form!')
          request.body['files'] = files
          resolve(request.body)
        })
        .on('error', (err) => {
          console.log('failed', err)
          reject(err)
        })

      bb.end(request.rawBody)
    })
  }
  const response = await render(request)
  return {
    ...response,
    headers: {
      ...response.headers,
    },
  }
}

EDIT 2: I can't seem to get the above code to actually write non-broken files. I've tried a couple of different methods, including substituting busboy with something like aws-lambda-multipart-parser, which gave files of the correct size, but still corrupted. Maybe there's an issue in the encoding/decoding of the rawBody that happens here:

if (!isNaN(length)) {
	data = new Uint8Array(length);
	let i = 0;
	req.on('data', (chunk) => {
		// TODO maybe there's a simpler way to copy data between buffers?
		for (let j = 0; j < chunk.length; j += 1) {
			data[i++] = chunk[j];
		}
	});
} else {

EDIT 3: changing the encoding for content-type multipart/form-data in the above code to 'ascii' gives me files of the exact size that are sent, but they're still corrupted for some reason. Anyone know what I might be missing here?
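A plausible explanation for these symptoms (a sketch, assuming the raw bytes passed through a lossy string decode at some point; this is a hypothesis, not confirmed in the thread):

```javascript
// Node's 'ascii' decoding preserves length but strips the high bit of each
// byte, and 'utf8' replaces invalid byte sequences entirely; only 'latin1'
// (a.k.a. 'binary') round-trips every byte unchanged.
const original = Buffer.from([0xff, 0xd8, 0xff, 0xe0]); // JPEG magic bytes

const viaUtf8 = Buffer.from(original.toString('utf8'), 'utf8');
const viaAscii = Buffer.from(original.toString('ascii'), 'ascii');
const viaLatin1 = Buffer.from(original.toString('latin1'), 'latin1');

console.log(viaUtf8.equals(original)); // false (and usually a different length)
console.log(viaAscii.length === original.length, viaAscii.equals(original)); // true false
console.log(viaLatin1.equals(original)); // true
```

If that is what's happening, it would explain files that are exactly the right size with 'ascii' but still corrupted; decoding the body as 'latin1' before handing it to the parser might yield intact bytes.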

@MTyson

MTyson commented May 8, 2021

I just did this in Sapper, and I'll throw out my thoughts in case it might help.

I used formidable and filepond. Filepond sends the chunks, and I used formidable to write a temp file to disk (via node fs). Upon submitting the form, a unique id is used to retrieve the file and write it to its permanent home (mongodb in this case).

In a pure serverless environment, you could override formidable and handle the chunks yourself, writing them to memory, and do the same process. (This assumes we can import formidable into sveltekit).

In the end, there should be a simple way to interact with the filesystem if there is one at hand.

@vonadz

vonadz commented May 9, 2021

I just did this in Sapper, and I'll throw out my thoughts in case it might help.

I used formidable and filepond. Filepond sends the chunks, and I used formidable to write a temp file to disk (via node fs). Upon submitting the form, a unique id is used to retrieve the file and write it to its permanent home (mongodb in this case).

In a pure serverless environment, you could override formidable and handle the chunks yourself, writing them to memory, and do the same process. (This assumes we can import formidable into sveltekit).

In the end, there should be a simple way to interact with the filesystem if there is one at hand.

Yeah, I played around with using formidable, but I don't think it's a very nice solution. From my understanding, the handle hook would be the natural place for handling file uploads. The issue is that the request in the handle hook isn't a stream of chunks, as is normally expected by something like formidable or express-fileupload, but an object with a rawBody that is the complete body in string form. The reason I was using busboy is that it can parse form data, both fields and files, out of a string.
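As a stripped-down illustration of what a parser like busboy does when handed an already-buffered body (illustrative only, not robust; the boundary and field names here are made up):

```javascript
// Minimal sketch of parsing a buffered multipart/form-data body: everything
// is already in memory, so parsing is string splitting rather than streaming.
function parseMultipart(raw, boundary) {
  const parts = raw.split(`--${boundary}`).slice(1, -1); // drop preamble and closing "--"
  return parts.map((part) => {
    const [head, ...bodyPieces] = part.split('\r\n\r\n');
    const body = bodyPieces.join('\r\n\r\n').replace(/\r\n$/, '');
    return {
      name: /name="([^"]*)"/.exec(head)?.[1],
      filename: /filename="([^"]*)"/.exec(head)?.[1],
      body
    };
  });
}

const raw =
  '--XYZ\r\nContent-Disposition: form-data; name="greeting"\r\n\r\nhello\r\n--XYZ--\r\n';
console.log(parseMultipart(raw, 'XYZ'));
// → [{ name: 'greeting', filename: undefined, body: 'hello' }]
```

Note that binary file payloads would need the byte-safety caveats discussed above; this sketch only shows the boundary/header mechanics.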

@MTyson

MTyson commented May 11, 2021

handle hook isn't a stream of chunks

Interesting. So something in sveltekit itself is handling the chunks and marshalling them into a string of bits?

I wonder if that could pose a limitation in cases where one wanted to stream the chunks directly to a datastore...?

@juanmait

handle hook isn't a stream of chunks

Interesting. So something in sveltekit itself is handling the chunks and marshalling them into a string of bits?

I wonder if that could pose a limitation in cases where one wanted to stream the chunks directly to a datastore...?

I'm looking for exactly that. I'm working with streaming on the browser side and have now reached the point of wanting to stream the result of the client pipeline to the server. It seems like there's no way to access the raw request in SvelteKit endpoints. I was planning to use Cloudflare Workers, but am first giving it a try with Node.js.

I would vote for having a way to access the plain raw request object on whatever platform you use. I guess this would be the job of each adapter.

@avhust

avhust commented May 27, 2021

A text/plain Content-Type allows sending data (e.g. a base64-encoded image).
I use this code with SvelteKit to upload images:

// <input type="file" bind:files>
const getBase64 = (image) => {
	const reader = new FileReader();
	reader.onload = (e) => {
		uploadImage(e.target.result);
	};
	reader.readAsDataURL(image);
};
const uploadImage = async (imgBase64) => {
	const imgData = imgBase64.split(',');
	await fetch(`/api/upload`, {
		method: 'POST',
		headers: {
			'Content-Type': 'text/plain',
			Accept: 'application/json'
		},
		body: imgData[1]
	});
};
getBase64(files[0]);

and in hooks.ts to capture the image and process it with sharp:

export const handle: Handle = async ({ request, render }) => {
	//...
	// Upload Image
	if (request.path === '/api/upload' && request.method === 'POST') {
		const imageName = 'some.webp';
		try {
			await sharp(Buffer.from(request.rawBody, 'base64'))
				.rotate()
				.resize({ width: 200, height: 200 })
				.webp()
				.toFile(`./${imageName}`);
		} catch (err) {
			console.log(err);
		}
		return {
			status: 201,
			body: JSON.stringify({
				src: imageName
			}),
			headers: {
				'Content-Type': 'application/json'
			}
		};
	}
	// ...
};

@juanmait

@avhust It's a good solution for relatively small uploads. But if you're dealing with sizes in the range of gigabytes, you can't afford to buffer the whole rawBody, especially in an environment like Vercel or Cloudflare. Streaming allows you to start processing the data as soon as it starts coming in; you can control memory usage, and even pause the stream to control CPU usage as well.
I guess I could use some sort of proxy to redirect uploads to something outside SvelteKit, or use CORS. But it's an inconvenience 😅

@benmccann
Member

There's an issue about streaming here: #1563

@MTyson

MTyson commented May 30, 2021

It would seem strange to have the client-side content type dictated by the framework. Feels like a workaround right out of the box.

@geoidesic

https://github.com/mscdex/busboy ftw! @vonadz any luck with this yet?

@vonadz

vonadz commented Jun 26, 2021

https://github.com/mscdex/busboy ftw! @vonadz any luck with this yet?

Nope. I just decided to use an external service (transloadit) to handle my file uploads for me. It made more sense for my use case anyway.

@geoidesic

geoidesic commented Jun 26, 2021

@avhust I tried your work-around but I still get the dreaded Error: File upload is not yet implemented response. Doesn't seem to trigger any console.logs in the hooks.ts file, so I'm not convinced it's even getting that far.

export const handle: Handle = async ({ request, resolve }) => {
  console.log('handle hooks.ts')       //<---- this does get output on a normal request but not on a file upload
  const cookies = cookie.parse(request.headers.cookie || '');
  request.locals.userid = cookies.userid || uuid();

  // TODO https://github.com/sveltejs/kit/issues/1046
  if (request.query.has('_method')) {
    request.method = request.query.get('_method').toUpperCase();
  }

  // Upload Image
	if (request.path === '/examples/api/upload' && request.method === 'POST') {
    console.log('yo')         //<---- this does NOT get output on the route
    ...

@avhust

avhust commented Jun 26, 2021

@geoidesic Looks like you're not using 'Content-Type': 'text/plain' in your POST request.

@benmccann benmccann added the feature request New feature or request label Jul 15, 2021
@UltraCakeBakery

UltraCakeBakery commented Jul 30, 2021

As I am writing this, I'm filling up my production SvelteKit server's error-log budget... SvelteKit throws/logs an error, even after build. Anyone can simply add an empty file to their request, and this breaks the endpoint.

It is also annoying to work around the endpoint forcing a 500 error.

Couldn't a development-environment-only warning be thrown instead, with the form data entry set to null?
I have security concerns, and a poor developer experience, because of the current implementation.

@joelhickok

I'm using the solution provided by @avhust for now. Can't wait until sveltejs/kit supports better server-side customizations and middleware. It's kind of a handicapped tool at this point, powerful as it is. Yes, I understand it's still under development.

My example uses .xlsx instead of an image. This works fine without using the text/plain Content-Type.

// client side code
const reader = new FileReader()

reader.onload = async (e) => {
    const file = e.target.result.split(',')

    const response = await request.post('/upload.json', file[1], {
        params: {
            filename: encodeURIComponent('file.xlsx'), // just to pass in the actual filename to use
        },
    }).catch(err => err)
}

reader.readAsDataURL(fileInput.files[0])

// hooks.js or .ts
export async function handle({ request, resolve }) {
    if (request.path === '/upload.json' && request.method === 'POST') {
        const filename = request.query.get('filename')
        try {
            // this example show use for a file such as .xlsx instead of the previous example
            // using sharp for an image
            fs.writeFileSync(`static/uploads/${filename}`, request.rawBody, { encoding: 'base64' })
            // file now available for use in local upload path, so create additional logic elsewhere
        } catch (err) {
            console.log(err)
        }
    }

    const response = await resolve(request)

    return {
        ...response,
        headers: {
            ...response.headers,
        }
    }

}

// upload.json.js
export async function post(req) {
    console.log(`HANDLER: ${req.path}`)
    try {
        // do something else, or just return a response letting
        // the client know the upload went okay

        return {
            body: {
                success: true,
                filename: req.query.get('filename'),
            },
        }
    } catch (err) {
        return {
            body: err.message
        }
    }
}

@benmccann benmccann added the p1-important SvelteKit cannot be used by a large number of people, basic functionality is missing, etc. label Aug 4, 2021
@benmccann benmccann removed this from the 1.0 milestone Aug 4, 2021
@benmccann benmccann added the p2-nice-to-have SvelteKit cannot be used by a small number of people, quality of life improvements, etc. label Aug 4, 2021
@yanick

yanick commented Nov 22, 2021

The secret lies in the docs of https://github.com/sveltejs/kit/tree/master/packages/adapter-node

I've updated my example. The key is the file src/server.mjs, which is the entry point for the prod server.

@vonadz

vonadz commented Nov 22, 2021

The secret lies in the docs of https://github.com/sveltejs/kit/tree/master/packages/adapter-node

I've updated my example. The key is the file src/server.mjs, which is the entry point for the prod server.

Oh yeah the whole middleware section is a new update since I last tried implementing this, which substantially simplifies things.

@jack-y

jack-y commented Nov 22, 2021

@yanick Great job! Thank you so much. I've updated my repo too.

FYI, I tried with fastify instead of polka, but without success.
As the doc says: "The adapter exports a middleware (req, res, next) => {} that's compatible with Express / Connect / Polka."

@m4tr1k

m4tr1k commented Dec 8, 2021

Hi everyone!

So I came across this problem as well, since I needed a way to access the file in a SvelteKit endpoint, but I didn't want to use the above adapter-node solutions because of my backend implementation.
So I decided to rewrite the get_multipart function to support file uploading. I used a dependency (parse-multipart-data) to parse the multipart form-data request, and then I just had to present that information in a different way.

That's my implementation:

import multipart from 'parse-multipart-data';

(...)

function parse_body(raw, headers) {
	(...)
	case 'multipart/form-data': {
		const boundary = directives.find((directive) => directive.startsWith('boundary='));
		if (!boundary) throw new Error('Missing boundary');
		// I added the raw body argument to use it in parse-multipart-data
		return get_multipart(text(), boundary.slice('boundary='.length), raw);
	}
}

(...)

function get_multipart(text, boundary, raw) {
	let data = {};
	const parts = text.split(`--${boundary}`);

	if (parts[0] !== '' || parts[parts.length - 1].trim() !== '--') {
		throw new Error('Malformed form data');
	}


	let names = [];
	parts.slice(1, -1).forEach((part) => {
		const match = /\s*([\s\S]+?)\r\n\r\n([\s\S]*)\s*/.exec(part);
		if (!match) {
			throw new Error('Malformed form data');
		}
		const raw_headers = match[1];
		raw_headers.split('\r\n').forEach((str) => {
			const [raw_header, ...raw_directives] = str.split('; ');
			let [name, value] = raw_header.split(': ');

			name = name.toLowerCase();

			/** @type {Record<string, string>} */
			const directives = {};
			raw_directives.forEach((raw_directive) => {
				const [name, value] = raw_directive.split('=');
				directives[name] = JSON.parse(value);
			});

			if (name === 'content-disposition') {
				if (value !== 'form-data') throw new Error('Malformed form data');

				if (directives.name) {
					names = [...names, directives.name];
				}
			}
		});
	});

	const bodyParts = multipart.parse(raw, boundary);

	for(let i = 0; i < bodyParts.length; i++) {
		const bodyPart = bodyParts[i];
		if(!bodyPart.type){
			data[names[i]] = bodyPart.data.toString();
		} else {
			data[names[i]] = {
				data: bodyPart.data,
				type: bodyPart.type
			};
		}
	}

	return data;
}

Please keep in mind that I'm aware this may not be the smartest or most effective way to do this (you can see that I did not remove part of the existing code, and there are probably more elegant approaches), but as a stopgap until the full implementation is done I think it's a pretty good solution. I'd like to know your opinion on it as well.

Thanks 😄

@KrustyC

KrustyC commented Dec 14, 2021

I am running into the same issue and am wondering what the latest is on this. Is it going to be part of release 1.0? If not, is there any estimate of when it will be made available? I think it's an important feature and a very common use case (as proven by all the comments above). I would be happy to help if possible, although I have never worked with the svelte-kit codebase before.

@m4tr1k

m4tr1k commented Dec 14, 2021

@KrustyC hey, check out my solution right above your comment. I worked out a way to access the files in SvelteKit endpoints. Hope it helps!

@Rich-Harris
Member Author

RFC

Prompted by #3086 (thanks @pixelmund), I've spent some time thinking about how we should rethink request body handling to enable both file uploads and other large bodies. I'd be grateful for feedback on the following:

  • if the request's content-type is text/* or application/json, keep the existing behaviour (i.e. request.body contains text or a parsed JSON object)

  • if the request's content-type is application/x-www-form-urlencoded or multipart/form-data, request.body is an async iterable, each of whose values represents one of the form values and is also an async iterable, whose values correspond to the actual data:

    export async function post({ body }) {
      for await (const entry of body) {
        console.log(`receiving ${entry.name}`);
        for await (const chunk of entry) {
          console.log(`received ${chunk.length} bytes of data`);
        }
      }
    
      return {
        status: 201
      };
    }

    It would be great if we could keep the existing behaviour for application/x-www-form-urlencoded, but we can't, because in their infinite wisdom the web's architects decided that a regular form submission should have that content type, but a fetch with the exact same data (i.e. body: new FormData(event.target)) will have a multipart/form-data content type. In order for endpoints to behave the same way with JS-triggered form submissions as JS-less ones, we need to have consistent behaviour.

    We could recover some of the lost convenience by exposing a body.toFormData() method:

    // src/routes/multiply.js
    export async function post({ body }) {
    - const a = +body.get('a');
    - const b = +body.get('b');
    + const data = await body.toFormData();
    + const a = +data.get('a');
    + const b = +data.get('b');
    
      return {
        status: 200,
        body: a * b
      };
    }

    (In the Lambda case, where the body is already buffered, this might seem like more work than is necessary, but there's a lot of value in having a consistent interface.)

  • Finally, for all other cases (application/octet-stream, image/* and so on) we can expose the raw data as an async iterable:

    // src/routes/upload-image.js
    import { appendFileSync } from 'fs'; // purely for demo purposes
    import { v4 as uuid } from '@lukeed/uuid';
    
    export async function post({ headers, body }) {
      if (headers['content-type'] !== 'image/jpeg') {
        return {
          status: 400,
          body: 'expected a JPEG'
        };
      }
    
      const id = uuid();
      const file = `static/__images/${id}.jpg`;
    
      for await (const chunk of body) {
        appendFileSync(file, chunk);
      }
    
      return {
        status: 307,
        location: `/view/${id}`
      };
    }

This would, of course, be a breaking change. We'd also need to figure out what to do about rawBody, which we currently expose for use cases like #831. I think the logical choice here would be to make it available for text/JSON bodies, but otherwise have it be null.
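The urlencoded/multipart asymmetry described above can be reproduced with the standard Request constructor: the same logical form fields get two different content types depending on how the body is packaged (this sketch assumes Node 18+ globals; the URL is arbitrary):

```javascript
// Hedged demonstration: identical empty form bodies, two content types.
const asFormData = new Request('http://example.test/', {
  method: 'POST',
  body: new FormData() // what body: new FormData(event.target) produces
});
const asUrlencoded = new Request('http://example.test/', {
  method: 'POST',
  body: new URLSearchParams() // what a JS-less <form> submission sends
});

console.log(asFormData.headers.get('content-type'));
// e.g. "multipart/form-data; boundary=----..."
console.log(asUrlencoded.headers.get('content-type'));
// "application/x-www-form-urlencoded;charset=UTF-8"
```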

@thgh

thgh commented Dec 29, 2021

How about using the standard Request object?

// src/routes/multiply.js
- export async function post({ body }) {
- const data = await body.toFormData();
+ export async function post(req: Request) {
+ const data = await req.formData();
const a = +data.get('a');
const b = +data.get('b');

  return {
    status: 200,
    body: a * b
  };
}

@maxicarlos08

That doesn't exist in Nodejs as far as I know...

@thgh

thgh commented Dec 29, 2021

That doesn't exist in Nodejs as far as I know...

The node adapter depends on node-fetch.
And node-fetch seems to have the formData method.

So this should work:

import { Request } from 'node-fetch'

const data: FormData = await new Request(url, { headers, body }).formData();
const file: File = data.get('file-input')
console.log(file.name)
console.log(file.size)
console.log(file.type)

@Rich-Harris
Member Author

@thgh I've thought about it. There is something nice about aligning with the platform. But it risks being a straitjacket, and I think we can provide better ergonomics when it comes to streaming. Glossing over the fact that params and locals need to be present in the argument signature, this is how you'd process an octet stream using Request...

export async function post(req) {
  const reader = req.body.getReader();

  let chunk;
  while (chunk = await reader.read()) {
    if (chunk.done) {
      return {
        status: 204
      };
    }

    await do_something_with(chunk.value);
  }
}

...versus the proposal:

export async function post({ body }) {
  for await (const chunk of body) {
    await do_something_with(chunk);
  }

  return {
    status: 204
  };
}

For forms with files (where we want to handle things in streaming fashion, i.e. formData() is a non-starter), it gets much more complicated; we'd need to provide a helper of some kind that would turn Request into an iterable-of-iterables as above (or a stream-of-streams, if we wanted to do things that way, but I think this is a case where language > platform).

Even for trivial cases, it's an extra hoop to jump through — if the content-type is application/json then calling await req.json() feels like busywork.

That doesn't exist in Nodejs as far as I know...

That's not a problem, it's provided in SvelteKit (either by the environment, if you're using Deno etc, or the adapter if you're using Node). But it is true that we'd be creating a Request object ourselves, which feels like unnecessary overhead.

@Rich-Harris
Member Author

So this should work:

import { Request } from 'node-fetch'

const data: FormData = await new Request(url, { headers, body }).formData();
const file: File = data.get('file-input')
console.log(file.name)
console.log(file.size)
console.log(file.type)

Unfortunately this assumes that the body is already buffered

@darbymanning

Sorry if a tad off-topic, but while processing files in requests to endpoints isn't available yet, is it possible to add a platform-specific serverless function that's handled by the provider? E.g. a platform-specific endpoint not managed through SvelteKit (in my case a Vercel serverless function). I haven't checked, but I assume they can support form data being provided. I love the platform-agnostic approach SK has, but as a temporary solution perhaps that could work until we have support in SK endpoints.

An alternative to this, I guess, is making a whole separate app/site in Vercel which is 'vanilla', purely to create the serverless function. Neither is a great solution, but I do kinda need the functionality!

@yanick

yanick commented Jan 4, 2022

Just to drop info to prevent some teeth-gnashing: the latest node-adapter versions renamed the produced middleware.js to handler.js. I'll try to update my example repo to reflect that soon. (@jack-y, that might be of interest to you too)

❤️

@jack-y

jack-y commented Jan 8, 2022

@yanick, thank you so much for letting me know! My example repo is also updated.

@jack-y

jack-y commented Jan 9, 2022

I just created an example repo using FileReader and an endpoint. No FormData.
Beware: this IS NOT a solution for the initial issue. It's just a workaround that might help.

@benmccann
Member

@Rich-Harris your suggestions look pretty reasonable to me. The only question in my mind would be making sure we nail down the streaming API, which this seems to build on top of. There were some interesting use cases in that thread, so we might want to check if we can handle cancellation or jump around a video stream. The streaming portion is not my area of expertise, but the changes to the form API here look good to me.

@quentincaffeino

quentincaffeino commented Jan 9, 2022

@Rich-Harris

if the request's content-type is text/* or application/json, keep the existing behaviour (i.e. request.body contains text or a parsed JSON object)
if the request's content-type is application/x-www-form-urlencoded or multipart/form-data, request.body is an async iterable, each of whose values represents one of the form values and is also an async iterable, whose values correspond to the actual data:

I don't really like having it hard-coded like that.

For example from personal experience with JSON I see two reasons why streaming might be useful:

  1. Processing large JSON inputs just isn't possible with regular JSON.parse or res.json. Yes, having big JSONs is very niche, but asynchronous parsing could speed up some apps.
  2. Semantic Web and JSON-LD. This one is even more niche. One can parse a JSON-LD string into a JS object with proxies of linked data. The problem is that not every client sets the correct header: since it's still JSON, many just use regular application/json instead of application/ld+json.

There are most definitely other cases that I'm missing. But generally speaking, my point is that streaming any of these types of data could be useful.

My proposal, which I believe I haven't seen yet, is to leave the behaviour as is by default and export a config object (or separate properties) from the endpoint, similar to how it's done in .svelte files (e.g. <svelte:options accessors={true}/>).

I imagine it looking something like this:

export const postConfig = {
	body: "default" | "raw" | "stream"
};

The reason I believe it is better to use an object is that it would allow other extensions to the config in the future (middlewares?).

Hope my idea isn't too crazy.

Edit:

Did a little research; it turns out my proposal is similar to what Next.js has. It allows opting out of body parsing: https://nextjs.org/docs/api-routes/api-middlewares#custom-config

@Rich-Harris
Member Author

@benmccann

There were some interesting use cases in that thread, so we might want to check if we can handle cancellation or jump around a video stream.

#1563 is about responses rather than requests, so the issues are related but different. Reading through it, I am warming to the idea of supporting ReadableStream responses directly. Firstly, as the issue points out, it gives us an avenue for cancellation:

return {
  status: 200,
  body: new ReadableStream({
    start(controller) {
      // streaming logic goes here
    },
    cancel() {
      // cleanup goes here
    }
  })
};

Achieving the same thing with async iterables would involve some contortions.
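For comparison, the contortions are tractable but manual; a sketch (my own illustration, not Kit code) of adapting an async iterable so the consumer gains cancel():

```javascript
// Wrap an async iterable in a ReadableStream. Cancellation from the consumer
// is routed back to the generator via it.return(), so the generator's
// finally block becomes the cleanup hook.
function iterableToStream(iterable) {
  const it = iterable[Symbol.asyncIterator]();
  return new ReadableStream({
    async pull(controller) {
      const { value, done } = await it.next();
      if (done) controller.close();
      else controller.enqueue(value);
    },
    async cancel() {
      await it.return?.(); // runs the generator's finally block
    }
  });
}
```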

Secondly, with a minor change — allowing headers to be a Headers object rather than a POJO — we would get support for this for free, which would make it incredibly easy to proxy to external sites and APIs:

export const get = ({ url }) => fetch('https://en.wikipedia.org' + url.pathname);

I think this fetch would be cancellable by the underlying platform (res.body.cancel()) even if we don't pass in an AbortSignal? (I'm not particularly well versed on this.)

Thirdly, it means less work, since we wouldn't need to convert the iterable to a ReadableStream in order to pass the response to platforms that expect Response objects, like Cloudflare Workers and Deno. (This might even save SvelteKit users money, in some cases.)

Of course, this involves a change to the contract between adapters and Kit. If app.render returns a Response, we can delete some code from adapter-cloudflare, but Node- and lambda-shaped adapters would need to do some more work to pipe a ReadableStream to a Node res or buffer it for a lambda return value (Kit can expose helpers for doing that), and adapter-node would need to listen for req.on('close') to cancel the body.
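The piping half of that adapter work might look like this on the Node side (a sketch under the assumption of Node 17+, where stream.Readable.fromWeb exists; not actual adapter code):

```javascript
import { Readable } from 'node:stream';

// Adapt a web ReadableStream body for a Node response object: pipe()
// propagates backpressure, and destroying the Node side tears down the
// web stream.
function sendWebStream(webStream, res) {
  Readable.fromWeb(webStream).pipe(res);
}
```

The lambda-shaped adapters would instead buffer the stream into a single body before returning, as noted above.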

Given all the above it might seem odd to still favour async iterables for request bodies, but I do — I think the ergonomics for consuming versus producing streams are sufficiently different, and cancellation is easier to model (if the requester cancels for await (const chunk of body) can throw; the endpoint could proactively cancel the request by returning).


Re video streams — the endpoint will always need to handle Range headers, Kit can't do that for you.


@quentincaffeino

My proposal would be and I believe I haven't seen this yet: is to leave it as is by default and export config object (or separate properties) from the endpoint

I like this. I think that's probably a separate conversation as the configuration could take many forms, and it's something that could be implemented after we've nailed the basics of streamed requests/responses.

@Rich-Harris
Member Author

It's now possible to upload files, as long as they're not large: #3384.

Handling streams is still a TODO, but there's an issue for it. Closing this in favour of #3419

@maxicarlos08

Could we make use of the Request and Response globals that will soon be available in node nodejs/node#41749?
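For what it's worth, with those globals available (unflagged from Node 18), the parsing side would reduce to the standard body mixin; a sketch:

```javascript
// Hedged sketch: parse a multipart body with nothing but the WHATWG
// Request/FormData globals; the URL is arbitrary.
async function demo() {
  const fd = new FormData();
  fd.set('greeting', 'hello');
  const req = new Request('http://example.test/upload', { method: 'POST', body: fd });
  const parsed = await req.formData(); // standard multipart parsing, no framework code
  return parsed.get('greeting');
}

demo().then(console.log); // "hello"
```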
