https://onnxruntime.ai/docs/build/inferencing.html
Choose a temporary build directory such as D:\workspace\onnx_build.
Choose a workspace, cd into it, and clone the onnxruntime repo.
Open a Visual Studio 2022 developer command prompt for the following steps (Python must be installed and on the PATH).
At the time of this post, 1.16.3 is the latest release; adjust the git checkout below to whatever release is current when you do this build.
git clone --recursive https://github.com/Microsoft/onnxruntime.git
cd onnxruntime
git checkout v1.16.3
.\build.bat --use_dml --config RelWithDebInfo --build_shared_lib --parallel --compile_no_warning_as_error --skip_submodule_sync --cmake_extra_defines CMAKE_INSTALL_PREFIX=D:\workspace\onnx_build
cd into the build output directory (by default build\Windows\RelWithDebInfo) and run:
msbuild INSTALL.vcxproj
This installs the headers, libraries, and DLLs into the CMAKE_INSTALL_PREFIX path set above.
Now copy the contents of D:\workspace\onnx_build\ to `C:\Program Files\onnxruntime\` from an elevated (administrator) prompt, since writing to Program Files requires elevation.
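If you prefer to do the copy from the command line, something like the following works from an elevated prompt; this is a sketch using the paths from the steps above, so adjust them to your layout:

```shell
:: Run from an elevated (administrator) command prompt.
:: /E copies all subdirectories, including empty ones.
robocopy "D:\workspace\onnx_build" "C:\Program Files\onnxruntime" /E
```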
Now edit the environment variables and create a new variable:
ML_ONNX_BUILD=C:\Program Files\onnxruntime
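Instead of using the environment-variables dialog, the variable can also be created from the command line with setx (a sketch; /M writes the machine-wide environment and requires an elevated prompt):

```shell
:: setx persists the variable; without /M it is written per-user.
:: The new value only applies to processes started afterwards,
:: so restart Visual Studio after running this.
setx ML_ONNX_BUILD "C:\Program Files\onnxruntime" /M
```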
Now you can use the variable $(ML_ONNX_BUILD) in your Visual Studio project settings to add additional include directories and additional library directories. You can also add a build event to copy the DLL files into your output directory.
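As a concrete sketch of those project settings, the values below assume the install prefix has the usual include\, lib\, and bin\ subdirectories produced by the install step (verify the actual layout of your copy before relying on it):

```shell
:: C/C++ > General > Additional Include Directories:
::   $(ML_ONNX_BUILD)\include
:: Linker > General > Additional Library Directories:
::   $(ML_ONNX_BUILD)\lib
:: Linker > Input > Additional Dependencies:
::   onnxruntime.lib
:: Build Events > Post-Build Event (sketch; assumes DLLs land in bin\):
xcopy /Y /D "$(ML_ONNX_BUILD)\bin\*.dll" "$(OutDir)"
```

$(OutDir) is the standard Visual Studio macro for the project's output directory, so the copied DLLs end up next to the built executable.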