
Dealing with integrated and dedicated graphics cards #873

@BryanChrisBrown

Description


Hi all,

This is admittedly more of an implementation question than a spec question.

I’ve been using WebCodecs with great success on systems with a single GPU; so far both my M1 MacBook and my Windows desktop (Ryzen CPU + NVIDIA GPU) work fantastically, especially with the WebGPU interop, which is crucial for my use case: decoding videos and running them through a graphics pipeline for Looking Glass holographic displays.
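For context, the core of the interop path looks roughly like the sketch below (the codec string and the render step are placeholders for illustration; error handling and device setup are simplified):

```ts
// Minimal sketch of the WebCodecs → WebGPU interop path (assumptions noted inline).
const adapter = await navigator.gpu.requestAdapter();
const device = await adapter!.requestDevice();

const decoder = new VideoDecoder({
  output: (frame: VideoFrame) => {
    // Import the decoded frame as an external texture. If the frame was decoded
    // on a different GPU than `device`, the browser has to copy it across
    // adapters, which is the overhead described in this issue.
    const externalTexture = device.importExternalTexture({ source: frame });
    // ... bind externalTexture in a render/compute pass here ...
    frame.close(); // release the frame promptly so the decoder can recycle it
  },
  error: (e: DOMException) => console.error(e),
});

decoder.configure({
  codec: "avc1.640028", // placeholder codec string
  hardwareAcceleration: "prefer-hardware",
});
```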

I’ve encountered an interesting problem on Windows-based laptops, which commonly have two graphics cards: integrated graphics on the CPU and a dedicated GPU. I’ve tested both Intel/NVIDIA and AMD/NVIDIA combinations. In both cases the dual-GPU setup appears to introduce an extra copy chain (CPU → GPU → GPU → CPU), which greatly affects performance when using the resulting video frames with WebGPU.

Right now my M1 MacBook is outperforming my NVIDIA RTX 3060, which is a bit unexpected 😅

My main question is: is there a way to force WebCodecs to use a specific GPU on the system? So far I’ve tried setting the Windows graphics performance preference for Chrome to the NVIDIA card. I’ve also tried adjusting the ANGLE backend; changing from Chrome’s default to D3D11on12 has helped a bit, but there’s still an extra copy in there.
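For reference, these are the only GPU-related hints I’m aware of at the API level; as far as I can tell, neither one guarantees which physical GPU the decoder itself runs on:

```ts
// 1. Ask WebGPU for the high-performance adapter (usually the dedicated GPU).
const adapter = await navigator.gpu.requestAdapter({
  powerPreference: "high-performance",
});

// 2. Ask WebCodecs for hardware decoding (but not *which* hardware decoder).
const support = await VideoDecoder.isConfigSupported({
  codec: "avc1.640028", // placeholder codec string
  hardwareAcceleration: "prefer-hardware",
});
console.log(support.supported, support.config);
```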

I’m happy to share an example that demonstrates my current pipeline, but I’m curious whether there’s another flag or checkbox I can look at in the Chrome configuration.

I’ve been really enjoying using WebCodecs and WebGPU; these new APIs are fantastic for what they unlock! Many thanks to all the contributors!
