Remote desktop graphics card

So as most people know, when you use RDP to connect to your desktop, it disables the graphics card and uses generic CUDA.

I don't want Windows to revert to using CUDA instead of the Graphics Card. I have a GTX 780ti in the computer but it isn't being used by RDP. Is there any way to force Windows to use the hardware graphics card?

I've tried TightVNC, RealVNC and LogMeIn, but I want to use RDP as it is the fastest and works best for me.


3 Answers

Firstly, you are getting your terms mixed up. CUDA is an NVIDIA technology for programming their GPU (and other things, but that's the simplest description).

Microsoft's RDP uses its own graphics driver, which converts the rendered screen into network packets to send to the client.

This is the core of how RDP works and you cannot change it.

On the server, RDP uses its own video driver to render display output by constructing the rendering information into network packets by using RDP protocol and sending them over the network to the client. On the client, RDP receives rendering data and interprets the packets into corresponding Microsoft Windows graphics device interface (GDI) API calls.

Source: Microsoft's Remote Desktop Protocol documentation on MSDN ((v=vs.85).aspx)


Everything in the above answer is correct except for "This is the core of how RDP works and you cannot change it". Never say never.

There are two ways to utilize a better graphics driver over RDP without slow, laggy third-party software and without modifying any Windows DLLs.

  1. (super hard) Install Windows Server 2012 R2 on a physical host. Then use Hyper-V to create a virtual desktop environment and install your OS as one of those virtual desktops. Install and configure the server roles for Remote Desktop Services. You will then be able to add a virtualized GPU to the virtual machines running on the server. When you RDP into those machines they will use RemoteFX. RemoteFX is capable of 3D rendering and DX11.

  2. (medium hard) Install Windows Server 2008 R2 on a physical host and install the Remote Desktop Services server role. With this installed, there is a registry setting that allows you to pass your physical GPU's rendering on to RDP users. There is also one that lets you use the RemoteFX vGPU if you want. Yes, you can even run the server with no physical GPU. This method ONLY works on Windows Server 2008 R2.
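For option 1, attaching the RemoteFX vGPU can also be done from PowerShell on the Hyper-V host. A minimal sketch, assuming the Hyper-V module is installed and a VM named "Win10-VDI" (that name is a placeholder, not from the original setup):

```powershell
# Run elevated on the Windows Server 2012 R2 Hyper-V host.
# "Win10-VDI" is a hypothetical VM name -- substitute your own.
Stop-VM -Name "Win10-VDI"                          # the vGPU can only be added while the VM is off
Add-VMRemoteFx3dVideoAdapter -VMName "Win10-VDI"   # attach the RemoteFX 3D video adapter
Set-VMRemoteFx3dVideoAdapter -VMName "Win10-VDI" `
    -MaximumResolution "1920x1200" -MonitorCount 1 # optional: cap resolution and monitor count
Start-VM -Name "Win10-VDI"
```

After the VM boots, RDP sessions to it should render through the RemoteFX adapter rather than the software driver.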

RDP stands for Remote Desktop PROTOCOL. It is simply a step-by-step procedure for breaking the image, sound, and control variables down into network packet frames to send. RDP itself has nothing to do with rendering or hardware acceleration. If you look at your Event Viewer right after you "RDP" into a machine, you can find where Windows originally loads the graphics drivers for your local machine, then immediately after, disables those and loads the default terrible driver.


One of many RDP group policy settings on the RDP host (Computer Configuration \ Administrative Templates \ Windows Components \ Remote Desktop Services \ Remote Desktop Session Host \ Remote Session Environment): "Use the hardware default graphics adapter for all Remote Desktop Services sessions" (from a W10 Enterprise 1809).

The direct registry value to enable it (REG_DWORD): HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services\bEnumerateHWBeforeSW = 1
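The same setting as an importable .reg file, built from the key and value above (a sketch; double-check the path against your own policy store before importing):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services]
"bEnumerateHWBeforeSW"=dword:00000001
```

Save it as, e.g., rdp-hw.reg and import with regedit; a reboot or sign-out of the RDP host may be needed before new sessions pick it up.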

But how does it play in real life?

TEST-SPEC:

  • Host-server: openSUSE KVM/QEMU hypervisor with GPU pass-through
  • Virtual-Machine: Win10 Enterprise 1809
  • Video: mp4 file, 2K resolution, played in VLC in the VM

TEST-RESULTS:

RDP-HARDWARE-RENDERING (in Win10 VM RDP host):


TEST-COMMENTS:

Sound and video seem to play fine, no lag or missing frames in the RDP session. Very interesting to see that VLC uses < 1% CPU!

Overall system watt usage on the KVM host: 30-35 W. When not playing video, usage is about 20-25 W (yes, it is low, but believe me, it is true!).

RDP-SOFTWARE-RENDERING (in Win10 VM RDP host):


TEST-COMMENTS:

The GPU is not used at all; good to see theory and practice match. And the CPU is at 100%, primarily from the VLC process. When the video starts to play, it takes longer for it to "initialize": in the first few seconds I got a lot of lag/missing frames, but it eventually starts to play okay, incl. sound. BUT the Win10 is SLOW. You cannot do anything else, e.g. office work, as a result of the 100% CPU usage. I would expect other RDP sessions on the server to be useless too.

Overall system watt usage: 40-45 W, so you do achieve better power consumption using hardware rendering.

ADDITIONAL GENERAL COMMENTS ON MY SETUP:

The Nvidia GeForce GT 1030 is a low-profile PCIe display adapter, so it fits in my 2U server, uses a passive heat-sink, and has rather low power consumption. I was never able to pass through a GeForce GT 520! The built-in Intel GPU cannot be used for pass-through on my mobo. I don't know if you can enable hardware rendering if you install the Windows host directly on bare metal; I suppose so (and the integrated GPU would probably be preferable to an AMD/Nvidia GPU device unless you use some 3D-hungry software). When I did a 3D benchmark in GFXBench, the 3D rendering was passed to the GPU even when RDP HW rendering was disabled. So it seems the setting only applies to 2D/video (?)
