OpenVINO™ Model Server
is a scalable inference server for models optimized with OpenVINO™ for Intel
CPU, iGPU, GPU, and NPU.
Set `apiBase`
to the URL of a running OVMS instance. Refer to this demo in the official OVMS documentation to easily set up your own local server.
Example configuration once OVMS is launched:
config.yaml
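A minimal sketch of such a configuration, assuming OVMS is serving its OpenAI-compatible API locally (the model name, port, and `/v3` path are illustrative assumptions and should match your own OVMS deployment):

```yaml
# Hypothetical example: adjust name, model, and apiBase to your OVMS setup
models:
  - name: OVMS Local Model          # display name, your choice
    provider: openai                # OVMS exposes an OpenAI-compatible API
    model: Qwen/Qwen2-7B-Instruct   # must match the model name served by OVMS
    apiBase: http://localhost:8000/v3  # base URL of the running OVMS instance
    roles:
      - chat
```

With this in place, requests are routed to the local OVMS endpoint instead of a hosted provider; no API key is required for a local instance.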