3. Adding a new execution provider to ONNX Runtime


An execution provider is essentially an executor targeting a specific hardware platform. ONNX Runtime currently provides the following execution providers:

  • MLAS (Microsoft Linear Algebra Subprograms)

  • NVIDIA CUDA

  • Intel MKL-ML

  • Intel DNNL - subgraph optimization

  • Intel nGraph

  • NVIDIA TensorRT

  • Intel OpenVINO

  • Nuphar Model Compiler

  • DirectML

  • ACL (in preview, for ARM Compute Library)
These ten execution providers cover ARM, Intel CPUs, NVIDIA GPUs, and several operating systems, but that is still far from enough for today's fragmented AI hardware market. If you want to add an execution provider for your own custom platform, follow the steps below.

  1. Create a new folder under onnxruntime/core/providers, named after your provider, e.g. your_provider_name.
  2. Create a new folder under include/onnxruntime/core/providers; it should have the same name as in step 1.
  3. Create a new class that inherits from IExecutionProvider. Its source code goes under 'onnxruntime/core/providers/[your_provider_name]'.
  4. Create a header file under include/onnxruntime/core/providers/[your_provider_name]. It should declare one function for creating an OrtProviderFactoryInterface. You can use 'include/onnxruntime/core/providers/cpu/cpu_provider_factory.h' as a template; note that a function for creating MemoryInfo is not required.
  5. Put a symbols.txt file under 'onnxruntime/core/providers/[your_provider_name]'. It should list all the function names exported by this execution provider; usually a single function for creating the provider factory is enough.
  6. Add your execution provider to onnxruntime_providers.cmake and build it as a static library.
  7. Add your provider library to the target_link_libraries call in cmake/onnxruntime.cmake, so that onnxruntime links against it.
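Steps 6 and 7 amount to two small CMake changes. A sketch, assuming the provider folder from step 1 (the file-list variable name is hypothetical; the real variable and target names in the onnxruntime tree may differ):

```cmake
# In onnxruntime_providers.cmake: build the new provider as a static lib.
file(GLOB onnxruntime_providers_your_provider_name_srcs
  "${ONNXRUNTIME_ROOT}/core/providers/your_provider_name/*.cc"
)
add_library(onnxruntime_providers_your_provider_name STATIC
  ${onnxruntime_providers_your_provider_name_srcs}
)

# In cmake/onnxruntime.cmake: add the new library to the existing
# target_link_libraries call so onnxruntime links against it.
target_link_libraries(onnxruntime PRIVATE
  onnxruntime_providers_your_provider_name
)
```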

Examples:

Using the execution provider

  1. Create a factory for the provider, using the C function you exported in 'symbols.txt'.
  2. Put the provider factory into the session options.
  3. Create a session from those session options.
    e.g.
  OrtEnv* env;
  OrtSession* session;
  /* Initialize the runtime environment. */
  OrtInitialize(ORT_LOGGING_LEVEL_WARNING, "test", &env);
  OrtSessionOptions* session_option = OrtCreateSessionOptions();
  /* Create the provider factory (here the CUDA provider, device id 0)
     and append it to the session options. */
  OrtProviderFactoryInterface** factory;
  OrtCreateCUDAExecutionProviderFactory(0, &factory);
  OrtSessionOptionsAppendExecutionProvider(session_option, factory);
  OrtReleaseObject(factory);
  /* Create the session; the appended provider is preferred for the
     nodes it supports. */
  OrtCreateSession(env, model_path, session_option, &session);