[Question]: Does AMF (AMD GPU) support hardware encoding in GPU paravirtualization (GPU-PV)? #509
Comments
Sounds like a bug with newer AMD drivers: jamesstringerparsec/Easy-GPU-PV#392
Based on a conversation with the virtualization team, it looks like a known and recently fixed issue. We will post an update when the fix appears in a public driver.
Hi @MikhailAMD, I have a similar problem to what @Zhenia-l originally posted. I am using driver version 24.1.1, and with GPU-P I am able to use AMF correctly in a guest Windows Server or Windows client VM, as @Zhenia-l has been trying to do with a newer driver. However, I'm trying to get the same workflow working in a Windows Server based container with the GPU passed through from the host itself, and even after I put the required DLLs in place, I am getting a similar error to the one above. Is there any reason why this wouldn't work in a Windows container as opposed to a GPU-P enabled VM?
Manually copying DLLs is an unsupported configuration; many things can go wrong. But based on the log you provided, AMF failed to load another DLL it uses: amdenc64.dll.
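As a quick way to check which AMF-related DLLs actually load on a given machine, a minimal sketch like the following can help. The DLL names come from the comments in this thread; the directory argument is a placeholder, and the probe only does anything on Windows:

```python
import ctypes
import os
import sys

def probe_dlls(directory, names):
    """Try to load each DLL by absolute path and report success or failure.

    Returns a dict of name -> bool, or None on non-Windows hosts where
    ctypes.WinDLL is unavailable.
    """
    if sys.platform != "win32":
        return None
    results = {}
    for name in names:
        path = os.path.join(directory, name)
        try:
            ctypes.WinDLL(path)  # LoadLibrary; raises OSError if a dependency is missing
            results[name] = True
        except OSError:
            results[name] = False
    return results

# Hypothetical usage on a Windows host (the path is an example, not from the thread):
# probe_dlls(r"C:\Windows\System32", ["amfrt64.dll", "amdenc64.dll"])
```

A failed load here would match the symptom described above, where amfrt64.dll loads but its dependency amdenc64.dll does not.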
Thank you for the quick reply @MikhailAMD. Also, I did try with
I see. The DLL was loaded, but AMF failed to initialize encoding with the D3D11 driver. Could it be some mismatch between the AMF DLL(s) and the D3D11 driver versions? They can only be used together from the same driver package.
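The same-package requirement above can be expressed as a simple check. This is only an illustrative sketch; treating an exact version match as "same package" is an assumption for the example, not AMD's official compatibility rule:

```python
def parse_driver_version(text: str) -> tuple:
    """Parse a dotted driver version such as '24.1.1' into a tuple of ints."""
    return tuple(int(part) for part in text.strip().split("."))

def same_package(amf_version: str, d3d_version: str) -> bool:
    """AMF DLLs and the D3D11 driver must come from the same driver package;
    here we approximate that with an exact version match."""
    return parse_driver_version(amf_version) == parse_driver_version(d3d_version)

print(same_package("24.1.1", "24.1.1"))   # True  (matching package)
print(same_package("24.1.1", "24.12.1"))  # False (mismatch; plausible init-failure cause)
```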
Hmm, I'm not sure myself. I am using the same driver package as the host; in fact, containers seem to copy the drivers in automatically. When I follow almost the same method for GPU-P VMs, it works, so I wonder if there are some limitations being set within containers, because DirectX graphics frameworks work out of the box natively in containers, but external ones like AMF do not.
Hmmm... the AMF runtime should be in HostDriverStore (amfrtdrv64.dll). System32\amfrt64.dll is just a stub.
Yes, the Windows container seems to automatically create a replica of the host drivers. Would you be able to guide me on how I can attach a debugger and log full paths and versions from the container?
If you can run the app in the container from Visual Studio for debugging, VS has a Modules view with the list of loaded DLLs, their paths, etc. But how to debug with VS inside a container, I don't have experience with.
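Once a module list is obtained (from the VS Modules view or otherwise), telling the stub apart from the real runtime is mostly a question of where each DLL was loaded from. A hedged sketch using the HostDriverStore-vs-System32 distinction mentioned above (the FileRepository folder name below is hypothetical):

```python
from pathlib import PureWindowsPath

def classify_module(path: str) -> str:
    """Classify a loaded-module path: per this thread, the real AMF runtime is
    expected under HostDriverStore, while System32\\amfrt64.dll is just a stub."""
    parts = {p.lower() for p in PureWindowsPath(path).parts}
    if "hostdriverstore" in parts:
        return "host driver store (real runtime expected here)"
    if "system32" in parts:
        return "System32 (stub)"
    return "other"

# Example paths; the .inf_amd64 folder name is a made-up placeholder.
for p in (r"C:\Windows\System32\amfrt64.dll",
          r"C:\Windows\System32\HostDriverStore\FileRepository\example.inf_amd64\amfrtdrv64.dll"):
    print(p, "->", classify_module(p))
```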
Thank you, I'll try to triage this a bit more tomorrow. With respect to the driver version, I'm using an older driver because it works well with my Windows Server host. Traditionally, AMD drivers don't install on Windows Server for some reason, so I have to use a custom one from RDN-ID that is signed for it, which hasn't been updated in some time. I understand that of course would not be supported, but my driver install is basically stock, just signed to install on Windows Server. Additionally, as per the link that @Zhenia-l posted, isn't there a bug with newer versions that causes encoding to fail?
Hi @MikhailAMD, I went down the path of updating my driver to 24.12.1 and noticed something really weird that I was hoping you could help with before I triage the AMF-in-a-container issue. Since I'm running Windows Server 2025, I can't normally install the official driver without modifying the INF to support my version of Windows, so for all the installations mentioned below I had to use that method, followed by enabling test-signing mode, to get it to install. My graphics are the Radeon 780M, built into the Ryzen 7840HS processor. When I try to install 24.12.1, it installs the driver, but Windows disables it right away with a bunch of Code 31 or Code 43 errors. If I restart my computer, it goes into a boot loop. Ironically, if I uninstall Hyper-V, the driver loads properly; I'm not sure why, but Hyper-V is required for me, so I can't disable it. I tested 24.10.1 and 24.9.1 as well, and they both failed in the same way. Right after installing, when Windows disables the driver, I see the following in the System log in Event Viewer:
If I go back to version 24.3.1 or 24.1.1 and install those, they work perfectly and survive reboots, even with Hyper-V enabled. Any idea why this might be happening? Is there some problem with newer drivers and Hyper-V?
Hi @MikhailAMD, I did some more testing and found something definitely wrong with the latest drivers. I went ahead and dual-booted a normal Windows 11 Pro client install on my machine to rule out any compatibility issues with Windows Server. Even on Windows 11, the latest 24.12.1 drivers fail to install with Code 31 or Code 43 errors IF Hyper-V is installed, and if I restart my computer with them installed, the machine enters the same boot loop. So either I disable Hyper-V to use the latest drivers, or I downgrade them. After some extensive testing, I found that the latest driver that DOES work with Hyper-V installed is 24.8.1; anything above it exhibits the behaviour I mentioned above. The problem with 24.8.1, though, is that it has the AMF bug which you had mentioned and that @Zhenia-l initially opened this issue for. So I'm having to resort to 24.3.1, the latest version that doesn't have the AMF bug and that works with Hyper-V installed. AMF still doesn't work inside Windows containers, however, so I will need to begin triaging that further now that I have the driver version pinned to something more stable. Have you seen this behaviour on your end with the latest drivers as well? I'll get back to you about more testing with AMF in Windows containers when I get a chance.
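The version observations in this thread can be summarized in a small lookup. Note this only encodes what is reported above for one specific setup (Radeon 780M, Hyper-V enabled, modified-INF installs) and is in no way an official compatibility matrix:

```python
def thread_findings(version: str, hyperv_enabled: bool) -> str:
    """Summarize driver-version behaviour as reported in this thread only.

    Reported observations:
      * above 24.8.1 with Hyper-V: install fails (Code 31/43, boot loop)
      * above 24.3.1 up to 24.8.1: installs, but hits the AMF encoding bug
      * 24.3.1 and below: reported working
    """
    v = tuple(int(x) for x in version.split("."))
    if hyperv_enabled and v > (24, 8, 1):
        return "install fails (Code 31/43, boot loop)"
    if (24, 3, 1) < v <= (24, 8, 1):
        return "installs, but AMF encoding bug reported"
    if v <= (24, 3, 1):
        return "reported working"
    return "not covered by this thread"

print(thread_findings("24.12.1", True))  # install fails (Code 31/43, boot loop)
print(thread_findings("24.8.1", True))   # installs, but AMF encoding bug reported
print(thread_findings("24.3.1", True))   # reported working
```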
GPU: RX 6700 XT
host/VM: Windows 11 24H2
driver: 24.9.1
I created a VM using Hyper-V GPU paravirtualization. The GPU is visible in the VM, and I can run games without problems, just as on the host computer. But I ran into a problem trying to use hardware encoding in Sunshine when starting a stream. I get the following errors:
When I try to use H.265, the client reports that the computer (VM) does not support H.265.
Hardware encoding (both H.264 and H.265) works without issues on the host. Software encoding in the VM also works.
Should hardware encoding work in the use case described above?