
Review OffscreenCanvas, including ImageBitmapRenderingContext #141

Closed
junov opened this issue Oct 24, 2016 · 58 comments
@junov

junov commented Oct 24, 2016

Requesting that the TAG review the ImageBitmapRenderingContext interface spec'ed here: https://html.spec.whatwg.org/multipage/scripting.html#the-imagebitmap-rendering-context

@slightlyoff
Member

Hey Justin,

Do you happen to have an explainer doc somewhere for this feature? Or sample code that shows the use-cases and how this API addresses them?

Regards

@junov
Author

junov commented Oct 31, 2016

Hi, the original proposal doc was part of the OffscreenCanvas feature proposal. That doc was cleaned up after ImageBitmapRenderingContext landed in the WHATWG standard (to avoid confusion with respect to landed vs. unlanded parts of the spec). Here is the pre-cleanup revision of the OffscreenCanvas proposal, which includes ImageBitmapRenderingContext: https://wiki.whatwg.org/index.php?title=OffscreenCanvas&oldid=10087

@junov
Author

junov commented Oct 31, 2016

@domenic
Member

domenic commented Oct 31, 2016

We'd be happy to have more examples and an introduction section (cf. https://html.spec.whatwg.org/#introduction-13) in the spec. Last we talked, I think you were waiting to do that until OffscreenCanvas also got integrated into the spec, but maybe there's value in doing it separately as well.

@junov
Author

junov commented Oct 31, 2016

Right. I'll take care of that.

@dbaron dbaron changed the title ImageBitmapRenderingContext ImageBitmapRenderingContext (part of offscreen canvas) Nov 2, 2016
@dbaron
Member

dbaron commented Nov 2, 2016

The TAG has been looking into this at our face-to-face meeting just now. It seems like it might be helpful to see a little more of an explainer, with examples. (The best explainer we know of right now is from #141 (comment).) We're also, at least so far, a little confused by the naming of some of the objects and methods here.

It's not clear to me how the ImageBitmapRenderingContext part of the proposal relates to the transferToImageBitmap()/commit() part of the proposal.

And there seems to be a good bit of concern about lack of things like requestAnimationFrame, and lack of ability to synchronize with audio/video.

So, again, it would be good to see some more end-to-end examples of how this is to be used.

@junov
Author

junov commented Nov 14, 2016

Added an intro with an example: whatwg/html#2045

The example is a bit weak for now, there will be stronger examples once OffscreenCanvas is added.

It's not clear to me how the ImageBitmapRenderingContext part of the proposal relates to the transferToImageBitmap()/commit() part of the proposal.

OffscreenCanvas has two modes of operation: the ImageBitmap way, and the commit way. The idea is that you'd typically use one or the other (ImageBitmaps or commit) depending on the requirements of your use case.

The commit way requires no script intervention on the main thread (the browsing context's event loop). You just call commit and the results are pushed to the display. This path is the most performant, and allows implementations to take all sorts of shortcuts to reduce graphics update overhead. However, the commit flow does not allow committed frames to be synchronized with graphics updates of the surrounding page, which can be a problem. The association between the OffscreenCanvas and its placeholder canvas element is established when the OffscreenCanvas is created by calling transferControlToOffscreen() on the placeholder canvas.

The other way to use OffscreenCanvas is to produce explicit frames by calling transferToImageBitmap. The resulting ImageBitmap object can then be postMessage'd back to the main event loop, where it can be taken in with ImageBitmapRenderingContext.transferFromImageBitmap(). Then the graphics update that reveals the new canvas content can be synchronized with other changes to the DOM.
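The ImageBitmap flow described here can be sketched roughly like this (a minimal illustration, not normative; a 2D rendering context and the worker file name are assumptions for brevity):

```javascript
// worker.js -- render a frame off the main thread
const offscreen = new OffscreenCanvas(300, 150);
const ctx2d = offscreen.getContext('2d');

function renderFrame() {
  ctx2d.fillStyle = 'teal';
  ctx2d.fillRect(0, 0, 300, 150);
  // Detach the frame as an ImageBitmap and transfer it (zero-copy).
  const bitmap = offscreen.transferToImageBitmap();
  postMessage(bitmap, [bitmap]);
}
renderFrame();

// main.js -- display the frame in sync with other DOM updates
const canvas = document.querySelector('canvas');
const bitmapCtx = canvas.getContext('bitmaprenderer');
const worker = new Worker('worker.js');
worker.onmessage = (e) => {
  bitmapCtx.transferFromImageBitmap(e.data);
};
```

Because the display step happens on the main event loop, it can be scheduled alongside other DOM mutations in the same frame.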

And there seems to be a good bit of concern about lack of things like requestAnimationFrame, and lack of ability to synchronize with audio/video.

requestAnimationFrame is on the way.

Synchronization with audio/video cannot be done on a worker (without major additions to the API) since those tags are not accessible in workers. However, by using the ImageBitmap flow (as opposed to commit), it would be possible to render in workers, and take care of synchronizing the graphics update with audio/video in the main event loop. But this really is not something OffscreenCanvas is well suited for, IMHO.

@dbaron
Member

dbaron commented Feb 8, 2017

Given #144, we're going to make this issue cover all of OffscreenCanvas.

@dbaron dbaron changed the title ImageBitmapRenderingContext (part of offscreen canvas) Review OffscreenCanvas, including ImageBitmapRenderingContext Feb 8, 2017
@dbaron dbaron added this to the tag-telcon-2017-03-08 milestone Feb 8, 2017
@dbaron
Member

dbaron commented Apr 28, 2017

OK, we've just discussed this in a breakout at our face-to-face meeting.

I think we're more comfortable with the two different modes of operation now, and why both modes are valuable (avoiding (maybe) the UI thread for speed vs. letting the UI thread control when the update happens to synchronize with other changes), although it again took us a while to step through both of them and understand how they work. There was a little bit of concern about the number of objects involved (and types in general), although we don't have any suggestions for how to reduce that.

I think @travisleithead was a little concerned about why both ImageData and ImageBitmap exist, although that's water under the bridge at this point because it's already shipping.

I think we're pretty close to being able to close this issue, but curious first if you have any feedback on the above comments.

@cynthia cynthia added the Progress: pending external feedback The TAG is waiting on response to comments/questions asked by the TAG during the review label Apr 28, 2017
@travisleithead
Contributor

Some of our raw notes...

@dbaron
Member

dbaron commented May 30, 2017

@junov, curious if you have any feedback on #141 (comment) above -- particularly on whether there was any consideration of ways to reduce the number of objects involved here. It seems a little awkward, but we also didn't see any obvious alternatives.

@kenrussell

The current thinking is that the performance and latency of transferToImageBitmap / transferFromImageBitmap will be OK for multithreaded content which absolutely has to perform updates in sync with the DOM.

We want to gain experience with both this and the commit()-based rendering style which is fully decoupled from DOM updates. I don't see a good way to change commit() to make it optionally, implicitly, sync up with DOM updates on the main thread.

@greggman

greggman commented May 17, 2018

I feel like the problem with the current APIs is that they ignore what's really happening under the hood: canvases are just pairs of textures attached to framebuffers. WebGL just followed the path of least resistance and thought of canvases like OS windows. Parts of it were not actually designed; they were just done, without much thought, based on what 2D canvas was doing (if thought was given, I'm sorry; I'm just trying to provoke some thought now).

In particular, the way contexts work right now assumes every canvas should have its own context. Did anyone question that? It would have been just as valid to create contexts separately from canvases and then bind a context to whichever canvas you currently want to render to. If important, it could have been that whichever context is first bound to a canvas is the only context that can ever be bound to that canvas, but the same context could be used with as many canvases as you want.

That design would have solved sharing resources and broken absolutely nothing.

Imagine the WebGL API was this

const gl = new WebGLRenderingContext();
gl.bindCanvas(someCanvas);  // we're now rendering to someCanvas
gl.draw(...)
gl.bindCanvas(someOtherCanvas);  // an implicit GL flush, and copy/flip to someCanvas happens here
gl.draw(...);  // renders to someOtherCanvas

Similarly, people have wanted to be able to not commit the changes (no implicit swapbuffers/resolve). So commit could have been part of the original API

const gl = new WebGLRenderingContext();
gl.bindCanvas(someCanvas);  // we're now rendering to someCanvas
gl.draw(...)
gl.commit();  // does swapbuffers/resolve for someCanvas
gl.bindCanvas(someOtherCanvas);
gl.draw(...);  // renders to someOtherCanvas
gl.commit();  // does swapbuffers/resolve for someOtherCanvas

That would have solved all of the resource sharing issues. It would also have solved rendering over multiple frames without having to manually render to a framebuffer. Note that native apps expect this behavior; nothing shows up until they call swapbuffers. They don't have that option in the current WebGL API and instead have to write other solutions.

If that was how WebGL worked before workers arrived, would the suggested APIs for workers change? My point is that canvases are really just two textures: a displayBuffer and a drawingBuffer. They don't really need a context per canvas, and they don't need to implicitly swap. We went that way mostly because we followed canvas 2d. If we were to imagine a past where we had chosen this other path, would we be coming up with better/simpler solutions? I worry about an API where I'm told "browsers will figure out magic behind the scenes to make things performant" when it's possible a different API could remove the magic and just be performant by design.

@dbaron
Member

dbaron commented May 17, 2018

As someone coming at this with a bit less graphics background: what are the resources that you're talking about sharing?

@greggman

Resources are basically textures and geometry data. As it is now, if you load a 2048x2048 texture (16 meg) into WebGL, it can only be used directly on a single canvas. If you want to use the same texture in another canvas, you have to load that texture again into the context for the 2nd canvas, because textures (and geometry) cannot be shared across contexts (one context per canvas; all resources belong to a single context).

The creative workaround is to make a canvas offscreen, render to that, then draw that canvas into the other canvases. That's slow. It can be optimized but no amount of optimization will erase the large copy that's happening to copy from one canvas to another.
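This workaround can be sketched like this (illustrative only; the `canvas.view` selector and the per-view scene drawing are assumptions, not part of any proposal):

```javascript
// One hidden canvas owns the single WebGL context; each visible
// canvas gets a 2D context and receives a copy of the rendered frame.
const glCanvas = document.createElement('canvas');
const gl = glCanvas.getContext('webgl');

const views = [...document.querySelectorAll('canvas.view')].map((c) => ({
  canvas: c,
  ctx: c.getContext('2d'),
}));

function renderAll() {
  for (const { canvas, ctx } of views) {
    glCanvas.width = canvas.width;    // resizing clears the drawing buffer
    glCanvas.height = canvas.height;
    gl.viewport(0, 0, canvas.width, canvas.height);
    // ...draw this view's scene with gl...
    ctx.drawImage(glCanvas, 0, 0);    // the large per-view copy described above
  }
  requestAnimationFrame(renderAll);
}
requestAnimationFrame(renderAll);
```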

Why one context per canvas? Because honestly I think it was just assumed to be the right thing to do (if other ways were discussed I'm unaware of them). Why can't you share resources across contexts? Because OpenGL has some very strict rules on how 2 contexts share resources and when changes to a resource made in one context are visible in another context. Those rules are extremely hard to enforce so as to make sure that sharing will work correctly everywhere across all platforms, something important for WebGL.

So, sharing was never enabled, but that's all based on the idea that there should be one context per canvas. If instead contexts and canvases were disconnected, and we recognized that canvases are really just textures themselves, the entire problem of sharing disappears. If you want to share, make one context and use it with as many canvases as you want. If you don't want to share, make a new context.

@dbaron
Member

dbaron commented May 17, 2018

So in the OpenGL API, these resources are scoped to a GL context? It seems like one could also address that by retaining the concept of a canvas context, but adding a GL context object (distinct from the canvas context) that could be shared between multiple canvas contexts by passing it to them when they're initialized?

@kenrussell

@greggman if bindCanvas were the primitive of rendering to multiple canvases from a single WebGL context, there would still be synchronization issues if rendering from workers. Fundamentally, workers are not synced with the main thread. It'd be possible to invent new web primitives like the concept of swap groups (see GLX_SGIX_swap_group and GLX_SGIX_swap_barrier), but after much design it was decided to phrase these primitives in terms of existing primitives on the web platform (ImageBitmap, Transferables, postMessage). For rendering to a single canvas from a worker, when those updates are not synchronized with the main thread's DOM updates, OffscreenCanvas and commit() will have excellent performance.

The recycling of older textures would be roughly equally complicated if binding a single WebGL context to multiple canvases or OffscreenCanvases, as it would be if resizing a single OffscreenCanvas to multiple dimensions repeatedly and calling transferToImageBitmap / ImageBitmapRenderingContext.transferFromImageBitmap to display the frames. The bindCanvas model would have new gotchas, like what would happen if attempting to bindCanvas one canvas to multiple WebGL contexts, that would have to be thought through.

I think we should finish implementing the current APIs and measure the performance characteristics. In Chrome the implementation is finally almost done. If it turns out the ergonomics of the API aren't good or it doesn't perform well for real-world uses then we can look into recasting it.

@RByers

RByers commented Jun 28, 2018

Note that there's now a blink intent-to-ship thread for this feature

@jhm-ciberman

Hello. Is there currently a way to have a single WebGLRenderingContext draw to multiple HTMLCanvasElements (or OffscreenCanvases)?

@greggman

greggman commented Jun 30, 2018

I'm not sure where to bring this up but under the current blocking OffscreenCanvas.commit() proposal how are pages that are not the front tab handled?

With rAF the browser just doesn't call the callback. With blocking commit though what's the plan?

  • if commit blocks forever then the worker is stuck unable to process other events.
  • if commit is a no-op or doesn't block then the worker is wasting time even though the user is not viewing the page

I can imagine the following patterns with commit

  1. commit in spin loop

    // in worker
    
    while(true) {
        render();
        offscreenCanvas.commit();
    }
    
  2. commit in raf loop

     // in worker
     const socket = new WebSocket("ws://www.example.com/socketserver");
     socket.onmessage = handleMessagesFromServer;
    
     function loop() {
         render();
         requestAnimationFrame(loop);
         offscreenCanvas.commit();
     }
     requestAnimationFrame(loop);
    

In case 1 above, blocking commit if the page is not the front page seems reasonable. In case 2 it does not, because the worker was expecting to be able to process events. How will browsers handle the 2 cases?

@greggman

greggman commented Jul 2, 2018

So I tried it and at least in Chrome commit seems to be broken

This example does this

const appInfo = {
  clientWidth: 300,
  clientHeight: 150,
};
function render() {
  // resize the drawing buffer if its size does not match the client size
  // ...render scene...
  requestAnimationFrame(render);
  gl.commit();
}
onmessage = (e) => {
  // update appInfo.clientWidth and appInfo.clientHeight from e.data
};

The worker has no way of knowing what size the drawing buffer needs to be, so the main thread sends that info whenever it changes. But once the worker starts, no messages ever arrive from the main thread even though the main thread is sending them. Given that gl.commit is synchronous and many other things are going on, it seemed best to call rAF before gl.commit so that the next animation frame comes as soon as possible.

In this sample I swapped the order to call rAF after gl.commit, but it also never receives messages from the main thread.

Also note that with gl.commit, no events are delivered, period. Here's an example that tries to load a texture using fetch. The fetch callback is never invoked.

This seems far from completely specced. Having commit basically make the entire rest of the platform stop working seems wrong, but the spec does not make it clear what is supposed to happen. My guess is Chrome promotes rAF events to the top of the event queue, so regardless of what other events are pending the rAF event gets run first, and then gl.commit blocks processing.

That could be a bug in Chrome, but AFAICT it's not wrong based on the spec. I think the spec should be clear about how these messages get processed when there's a rAF+commit loop as well as a commit-only loop.

Here's a sample with a commit loop (no rAF) as in

  while(true) {
      ...render...
      gl.commit();
  }

It also tries to download an image with fetch and to update the canvas size via messages passed from the main thread. No messages ever arrive and the fetch callback is never called.

@greggman

greggman commented Jul 4, 2018

@rocallahan, as the original rAF creator, do you have any input on this seemingly platform-breaking API?

@greggman

greggman commented Jul 4, 2018

Here's yet another use case that's not working

Let's say you'd prefer to run at a smooth fps below 60fps instead of a janky framerate that is sometimes 60 and sometimes something slower.

So you do something like

  let then = 0;
  function loop(time) {
    time *= 0.001;
    const elapsed = time - then;
    if (elapsed > 1 / 15) {
      then = time;
      render();
      gl.commit();
    }
    requestAnimationFrame(loop);
  }
  requestAnimationFrame(loop);

http://jsfiddle.net/greggman/y36aujrf/

Apparently this is not supposed to work?! It flickers like crazy in Chrome and the bug report response is "works as intended". Really?

@kenrussell

It looks like when requestAnimationFrame was added to workers (which is still in progress in whatwg/html#3677 ), the behavior chosen was to automatically commit frames for OffscreenCanvas contexts which are modified in the rAF callback. This behavior was probably chosen in order to behave most similarly to rAF on the main thread.

In this definition it's an error to call commit() from inside a rAF callback on a worker. The spec and implementations should be updated to throw an exception from commit() in this case, to make it clearer.

Thanks for putting together your examples. Per your Chrome bug report http://crbug.com/859275 , here's a revised version of your resizing example which removes the call to commit() and which works well:

http://jsfiddle.net/gmpkvu30/2/

and a revised version of your 15 FPS animation example which removes the call to commit() and which works well:

http://jsfiddle.net/y36aujrf/3/

commit() can usefully be used within setTimeout() callbacks on a worker, and also within WebAssembly threads, where it's basically the only way to get frames out for display. Yes, in this case events aren't processed on the thread, but WebAssembly threads will expect to use the heap's SharedArrayBuffer to transfer data in and out.
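Under that definition, the working pattern is simply a worker-side rAF loop with no explicit commit() call (a sketch based on the description above; the shape of the incoming message is an assumption):

```javascript
// worker.js: a frame rendered inside a rAF callback is displayed
// automatically; calling commit() here would be an error.
onmessage = (e) => {
  const offscreen = e.data.canvas;   // sent via transferControlToOffscreen()
  const gl = offscreen.getContext('webgl');

  function loop(time) {
    gl.clearColor(0, 0, 0, 1);
    gl.clear(gl.COLOR_BUFFER_BIT);
    // ...draw the scene with gl...
    requestAnimationFrame(loop);     // the frame is committed implicitly
  }
  requestAnimationFrame(loop);
};
```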

@travisleithead
Contributor

Note, we've taken a look at this issue jointly with #288, please refer to that issue for our latest comments.

@travisleithead
Contributor

Let's consolidate our discussion into #288. Closing this issue.

@slightlyoff
Member

Just to clarify from the F2F discussion today, we're happy with the shape of ImageBitmapRenderingContext and OffscreenCanvas, modulo the discussion of commit(), which is what's addressed over in #288. Thanks again to everyone who has participated here and has helped to improve our level of understanding of this complex space.

@Swoorup

Swoorup commented Jan 17, 2024

@greggman

I'm not sure if this belongs here but I don't know where else to post it sooo...

I'm hoping OffscreenCanvas allows one WebGL context to be used to efficiently update multiple canvases and it's not clear to me how that is solved in the current proposal or if it's even supposed to.

I am reading this thread to gain an understanding of multiple-canvas rendering, and am a bit confused as to the best way of doing this right now.

@greggman

For WebGL see this for some ideas on multiple canvases. Even on WebGPU, which is designed from the ground up to render to multiple canvases, the technique at the bottom of that article is the fastest technique.

For more details on multiple canvas you should probably ask on stackoverflow or some other more general help forum/discussion

@Swoorup

Swoorup commented Jan 18, 2024

For WebGL see this for some ideas on multiple canvases. Even on WebGPU, which is designed from the ground up to render to multiple canvases, the technique at

I am already using that technique, the only caveat being that there is a certain lag/sync issue when scrolling. From your earlier answer, I was instead pondering drawing to a large canvas (where views are allocated at certain offsets) rather than constantly switching draw textures/buffers, then generating a bitmap and subsequently rendering parts of the image using ctx.drawImage(bmp, x, y).
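That idea might look roughly like this (a sketch under stated assumptions: the `views` array, holding per-view offsets, sizes, and 2D contexts, is hypothetical, as is the atlas size):

```javascript
// Render every view into one large OffscreenCanvas ("atlas"), then
// blit each view's sub-rectangle into its own visible canvas.
const atlas = new OffscreenCanvas(2048, 2048);
const gl = atlas.getContext('webgl');

function frame() {
  for (const v of views) {
    // confine this view's drawing to its allotted region of the atlas
    gl.viewport(v.sx, v.sy, v.w, v.h);
    gl.enable(gl.SCISSOR_TEST);
    gl.scissor(v.sx, v.sy, v.w, v.h);
    // ...draw this view's scene...
  }
  const bmp = atlas.transferToImageBitmap();
  for (const v of views) {
    // copy the view's region out of the bitmap into its visible canvas
    v.ctx.drawImage(bmp, v.sx, v.sy, v.w, v.h, 0, 0, v.w, v.h);
  }
  bmp.close();
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```

One wrinkle to verify in practice: WebGL's y-axis is flipped relative to 2D canvas coordinates, so the source rectangle passed to drawImage may need its y offset adjusted.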
