Unable to find an entry point named ‘OrtGetApiBase’ in DLL ‘onnxruntime’ with Microsoft.ML.OnnxTransformer 1.5.0

While building an application around an ONNX model I downloaded from the Microsoft Custom Vision portal, I got the following exception:

System.TypeInitializationException: 'The type initializer for 'Microsoft.ML.OnnxRuntime.NativeMethods' threw an exception.'

EntryPointNotFoundException: Unable to find an entry point named 'OrtGetApiBase' in DLL 'onnxruntime'.

Searching for the error online did not yield any solutions.

After solving the problem, I wanted to share the solution for anyone else running into the same exception.

The premise

I started by following the tutorial Detect objects using ONNX in ML.NET.

The tutorial instructs you to install the following NuGet packages:

  • Microsoft.ML
  • Microsoft.ML.ImageAnalytics
  • Microsoft.ML.OnnxTransformer

It does not mention any other dependencies to add, or steps to take, to make the application run.
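
For reference, adding the tutorial's packages from the command line instead of the NuGet package manager looks like this:

  dotnet add package Microsoft.ML
  dotnet add package Microsoft.ML.ImageAnalytics
  dotnet add package Microsoft.ML.OnnxTransformer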

The problem

I was very unlucky in my timing: the Microsoft machine learning libraries were updated to version 1.5.0 on May 27, two weeks before I started working on the application.

Before version 1.5.0, the Microsoft.ML.OnnxTransformer package had a dependency on Microsoft.ML.OnnxRuntime >= 0.5.1. Version 1.5.0 instead depends on Microsoft.ML.OnnxRuntime.Managed.

The release notes of ONNX Runtime v1.2.0 mention the change in package structure:

Nuget package structure updated. There is now a separate Managed Assembly (Microsoft.ML.OnnxRuntime.Managed) shared between the CPU and GPU Nuget packages. The "native" Nuget will depend on the "managed" Nuget to bring it into relevant projects automatically. 

...

Note that this should be transparent for customers installing the Nuget packages.

Commenting on that last remark: it is indeed transparent if you start from the Microsoft.ML.OnnxRuntime package, since that now depends on the managed one. But the Microsoft.ML.OnnxTransformer package does not depend directly on Microsoft.ML.OnnxRuntime, because its authors leave the developer free to choose between a CPU and a GPU runtime.
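
To illustrate, with only the tutorial's three packages referenced, the project file ends up roughly like the sketch below: NuGet restores Microsoft.ML.OnnxRuntime.Managed transitively, but no package brings in the native onnxruntime library.

  <ItemGroup>
    <!-- Packages installed per the tutorial -->
    <PackageReference Include="Microsoft.ML" Version="1.5.0" />
    <PackageReference Include="Microsoft.ML.ImageAnalytics" Version="1.5.0" />
    <PackageReference Include="Microsoft.ML.OnnxTransformer" Version="1.5.0" />
    <!-- Restored transitively: Microsoft.ML.OnnxRuntime.Managed (managed wrapper only) -->
    <!-- Missing: a runtime package that ships the native onnxruntime binary -->
  </ItemGroup>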

The GitHub issue introducing this change even predicted that the new package structure would lead to runtime exceptions and confused developers.

The solution

Now that I knew I was missing the actual native runtime, the fix was easy: just add one of the runtime packages to your project (the CPU variant is shown after the list):

  • Microsoft.ML.OnnxRuntime
  • Microsoft.ML.OnnxRuntime.Gpu
  • Microsoft.ML.OnnxRuntime.MKLML
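
For a CPU-only application the first one is enough:

  dotnet add package Microsoft.ML.OnnxRuntime

This package ships the native onnxruntime library for the supported platforms, which provides the OrtGetApiBase entry point that Microsoft.ML.OnnxRuntime.Managed was failing to find.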

Now my application works, and I can start classifying images locally.
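
For completeness, here is a minimal sketch of the scoring pipeline, along the lines of the tutorial, that runs once the native runtime is in place. The model path, the input/output column names ("data", "model_outputs0") and the 416x416 image size are assumptions based on my Custom Vision export; inspect your own model (for example with Netron) to find the real names.

  using System.Collections.Generic;
  using Microsoft.ML;

  public class ImageInput
  {
      public string ImagePath { get; set; }
  }

  class Program
  {
      static void Main()
      {
          var mlContext = new MLContext();

          // Column names, image size and model path are assumptions for illustration;
          // check your exported ONNX model for the actual input/output names.
          var pipeline = mlContext.Transforms.LoadImages(
                  outputColumnName: "data", imageFolder: "", inputColumnName: nameof(ImageInput.ImagePath))
              .Append(mlContext.Transforms.ResizeImages(
                  outputColumnName: "data", imageWidth: 416, imageHeight: 416, inputColumnName: "data"))
              .Append(mlContext.Transforms.ExtractPixels(outputColumnName: "data"))
              .Append(mlContext.Transforms.ApplyOnnxModel(
                  modelFile: "model.onnx",
                  outputColumnNames: new[] { "model_outputs0" },
                  inputColumnNames: new[] { "data" }));

          // Fitting on an empty list materializes the transformer; this is the point where
          // the native onnxruntime library is loaded (and where the exception used to surface).
          var emptyData = mlContext.Data.LoadFromEnumerable(new List<ImageInput>());
          var model = pipeline.Fit(emptyData);
      }
  }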