ILearningModelDeviceFactoryNative.CreateFromD3D12CommandQueue method
Creates a LearningModelDevice that will run inference on the user-specified ID3D12CommandQueue.
HRESULT CreateFromD3D12CommandQueue(
    ID3D12CommandQueue * value,
    [out] IUnknown ** result);
Parameters
Name | Type | Description |
---|---|---|
value | ID3D12CommandQueue* | The ID3D12CommandQueue that the LearningModelDevice will run against. |
result | IUnknown** | The LearningModelDevice to be created. |
Returns
HRESULT The result of the operation.
Examples
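The example below uses C++/WinRT (winrt::com_ptr and winrt::get_activation_factory) together with the Windows.AI.MachineLearning projection and the interface declared in windows.ai.machinelearning.native.h. The CHECK_HRESULT macro is not defined by the header; a minimal sketch, assuming you simply want to fail fast on any error, could forward to winrt::check_hresult, which throws winrt::hresult_error on a failing HRESULT:

// Hypothetical helper assumed by the sample below: winrt::check_hresult
// throws winrt::hresult_error when the HRESULT indicates failure.
#define CHECK_HRESULT(hr) winrt::check_hresult(hr)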
// 1. create the d3d device.
com_ptr<ID3D12Device> pD3D12Device = nullptr;
CHECK_HRESULT(D3D12CreateDevice(
    nullptr,
    D3D_FEATURE_LEVEL::D3D_FEATURE_LEVEL_11_0,
    __uuidof(ID3D12Device),
    reinterpret_cast<void**>(&pD3D12Device)));

// 2. create the command queue.
com_ptr<ID3D12CommandQueue> dxQueue = nullptr;
D3D12_COMMAND_QUEUE_DESC commandQueueDesc = {};
commandQueueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
CHECK_HRESULT(pD3D12Device->CreateCommandQueue(
    &commandQueueDesc,
    __uuidof(ID3D12CommandQueue),
    reinterpret_cast<void**>(&dxQueue)));
// 3. create the LearningModelDevice from the command queue.
com_ptr<ILearningModelDeviceFactoryNative> devicefactory =
    get_activation_factory<LearningModelDevice, ILearningModelDeviceFactoryNative>();
com_ptr<::IUnknown> spUnk;
CHECK_HRESULT(devicefactory->CreateFromD3D12CommandQueue(dxQueue.get(), spUnk.put()));
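The returned IUnknown can then be converted into the projected LearningModelDevice and used like any other device. A minimal sketch, assuming C++/WinRT and a LearningModel named model that has already been loaded (for example, with LearningModel::LoadFromFilePath); model and session are illustrative names, not part of the API:

// 4. convert the IUnknown into the projected LearningModelDevice and
//    create a session against it (model is assumed to be loaded already).
LearningModelDevice device = spUnk.as<LearningModelDevice>();
LearningModelSession session(model, device);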
See also
ILearningModelDeviceFactoryNative
Requirements
Requirement | Value |
---|---|
Minimum supported client | Windows 10, build 17763 |
Minimum supported server | Windows Server 2019 with Desktop Experience |
Header | windows.ai.machinelearning.native.h |
Note
Use the following resources for help with Windows ML:
- To ask or answer technical questions about Windows ML, please use the windows-machine-learning tag on Stack Overflow.
- To report a bug, please file an issue on our GitHub.