This article guides you through sideloading the models required for the Procyon AI Text Generation Benchmark. Note that an active internet connection is still required to run the test.
Models for ONNX Runtime
| Model | Download Link |
| --- | --- |
| Phi-3.5-mini-instruct-onnx-int4 | benchmarks.ul.com/downloads/onnxdml_Phi-3.5-mini-instruct-onnx-int4.zip |
| Mistral-7B-Instruct-v0.2-onnx-int4 | benchmarks.ul.com/downloads/onnxdml_mistral-7b-instruct-v0.2-ONNX.zip |
| llama-3_1-8b-instruct-onnx-int4 | benchmarks.ul.com/downloads/onnxdml_llama-3_1-8b-instruct-onnx-int4.zip |
| Llama-2-13b-chat-hf-onnx-int4 | benchmarks.ul.com/downloads/onnxdml_Llama-2-13b-chat-hf-onnx-int4.zip |
Models for OpenVINO 2025.0 Runtime Update (Workload 1.0.82)
| Model | Download Link |
| --- | --- |
| Phi-3.5-mini-instruct-ov-int4 | benchmarks.ul.com/downloads/openvino_2025_0_Phi-3.5-mini-instruct-ov-int4.zip |
| Mistral-7B-Instruct-v0.2-ov-int4 | benchmarks.ul.com/downloads/openvino_2025_0_Mistral-7B-Instruct-v0.2-ov-int4.zip |
| llama-3_1-8b-instruct-ov-int4 | benchmarks.ul.com/downloads/openvino_2025_0_llama-3_1-8b-instruct-ov-int4.zip |
| Llama-2-13b-chat-hf-ov-int4 | benchmarks.ul.com/downloads/openvino_2025_0_Llama-2-13b-chat-hf-ov-int4.zip |
Models for OpenVINO Runtime (launch version, Workload 1.0.73)

| Model | Download Link |
| --- | --- |
| Phi-3.5-mini-instruct-ov-int4 | benchmarks.ul.com/downloads/openvino_Phi-3.5-mini-instruct-ov-int4.zip |
| Mistral-7B-Instruct-v0.2-ov-int4 | benchmarks.ul.com/downloads/openvino_Mistral-7B-Instruct-v0.2-ov-int4.zip |
| llama-3_1-8b-instruct-ov-int4 | benchmarks.ul.com/downloads/openvino_llama-3_1-8b-instruct-ov-int4.zip |
| Llama-2-13b-chat-hf-ov-int4 | benchmarks.ul.com/downloads/openvino_Llama-2-13b-chat-hf-ov-int4.zip |

Installing the Models

You can download individual models or all of them, depending on your configuration and benchmarking needs.

1. By default, the benchmark is installed in:
   %ProgramData%\UL\Procyon\chops\dlc\ai-textgeneration-benchmark\
2. Create a subfolder named "models" if it does not exist:
   %ProgramData%\UL\Procyon\chops\dlc\ai-textgeneration-benchmark\models
3. Unzip the downloaded models and copy them to the models folder. Note that the extracted folder names should not contain the prefix "zip_". With all models downloaded, the models directory should look like this:
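The install steps above can be sketched as a small script. This is a minimal illustration, not an official tool: it assumes the default install path shown in step 1 and that each downloaded archive contains a top-level folder named after the model, so no renaming is needed after extraction.

```python
import os
import zipfile
from pathlib import Path

# Default benchmark install location from step 1 (Windows only;
# %ProgramData% usually resolves to C:\ProgramData).
DEFAULT_BENCH_DIR = Path(
    os.path.expandvars(r"%ProgramData%")
) / "UL" / "Procyon" / "chops" / "dlc" / "ai-textgeneration-benchmark"


def install_model(zip_path: Path, bench_dir: Path = DEFAULT_BENCH_DIR) -> Path:
    """Extract a downloaded model zip into the benchmark's models folder.

    Step 2: create the "models" subfolder if it does not exist.
    Step 3: unzip the model into it. Assumes the archive already
    contains a correctly named top-level model folder.
    """
    models_dir = bench_dir / "models"
    models_dir.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(models_dir)
    return models_dir
```

Pass a different `bench_dir` if your benchmark is installed somewhere other than the default location.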