Why can't the MSVC runtime see the graphics card on my Azure VM?

Tommy Herbert 1 Reputation point

I have this function in my Visual C++ app:

wstring TestHarness::getAcceleratorDetails() {
    auto acc = accelerator(accelerator::default_accelerator);
    wstring details{L"default accelerator: " + acc.device_path + L" (" + acc.description + L")"};
    auto accs = accelerator::get_all();
    details += accs.size() == 1 ? L"\nthere are no other accelerators" : L"\nother accelerators:";
    for (const auto& other : accs) {
        if (other.device_path != acc.device_path) {
            details += L"\n" + other.device_path + L" (" + other.description + L")";
        }
    }
    return details;
}

(And at the top of the file I include `<amp.h>` and declare that I'm `using namespace concurrency;`.)
On my laptop I get this output:

default accelerator: PCI\VEN_8086&DEV_5916&SUBSYS_075B1028&REV_02\3&11583659&0&10 (Intel(R) HD Graphics 620)
other accelerators:
direct3d\warp (Microsoft Basic Render Driver)
direct3d\ref (Microsoft Basic Render Driver)
cpu (CPU accelerator)

I wanted to test my code against a separate graphics card, so I spun up a virtual machine on Azure with their Nvidia driver extension. The driver appears to have installed correctly:

C:\Program Files\NVIDIA Corporation\NVSMI>nvidia-smi.exe -L
GPU 0: Tesla K80 (UUID: GPU-9742a843-bed1-45e1-f291-c9337e415fd9)

But when I copy my executable and runtime DLLs to the VM and run my app, the graphics card isn't listed:

default accelerator: direct3d\warp (Microsoft Basic Render Driver)
other accelerators:
cpu (CPU accelerator)

Why is that? And how else can I check if the graphics card is available to the operating system?
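One way I could imagine double-checking (a sketch, not something I've verified on the VM): since C++ AMP sits on top of DirectX 11, enumerate the DXGI adapters directly and see whether the Tesla K80 shows up at all. If DXGI only reports the Basic Render Driver, the card isn't visible to DirectX, which would explain why AMP falls back to WARP.

```cpp
// Sketch: list every DXGI adapter the OS exposes to DirectX.
// If the NVIDIA card is absent here, C++ AMP can't see it either.
#include <dxgi.h>
#include <iostream>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                 reinterpret_cast<void**>(&factory)))) {
        return 1;
    }
    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        std::wcout << desc.Description << L"\n";  // e.g. adapter name
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

(This is Windows-only and needs to be built with the Windows SDK; `dxgi.lib` is linked via the pragma.)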
