Reputation: 2710
I am trying to convert the OpenAI Whisper model to ONNX with Olive and merge the model files into one file, using:
python prepare_whisper_configs.py --model_name openai/whisper-tiny.en
python -m olive.workflows.run --config whisper_cpu_fp32.json --setup
python -m olive.workflows.run --config whisper_cpu_fp32.json
Github: https://github.com/microsoft/Olive
Olive Documentation: https://microsoft.github.io/Olive/overview/quicktour.html
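For reference, the same workflow can also be invoked from Python instead of the CLI (a minimal sketch based on the quick tour linked above; whisper_cpu_fp32.json is the config generated by prepare_whisper_configs.py):
# Minimal sketch: run the generated Olive workflow config from Python,
# equivalent to "python -m olive.workflows.run --config whisper_cpu_fp32.json".
from olive.workflows import run as olive_run
olive_run("whisper_cpu_fp32.json")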
I am getting this error:
Traceback (most recent call last):
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\workflows\run\__main__.py", line 17, in <module>
    run(**vars(args))
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\workflows\run\run.py", line 187, in run
    return engine.run(
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\engine\engine.py", line 347, in run
    run_result = self.run_accelerator(
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\engine\engine.py", line 412, in run_accelerator
    return self.run_no_search(
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\engine\engine.py", line 483, in run_no_search
    should_prune, signal, model_ids = self._run_passes(
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\engine\engine.py", line 887, in _run_passes
    model_config, model_id = self._run_pass(
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\engine\engine.py", line 985, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, data_root, output_model_path, pass_search_point)
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\systems\local.py", line 32, in run_pass
    output_model = the_pass.run(model, data_root, output_model_path, point)
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\passes\olive_pass.py", line 367, in run
    output_model = self._run_for_config(model, data_root, config, output_model_path)
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\passes\onnx\conversion.py", line 58, in _run_for_config
    return self._convert_model_on_device(model, data_root, config, output_model_path, "cpu")
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\passes\onnx\conversion.py", line 73, in _convert_model_on_device
    component_model = model.get_component(component_name)
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\model\__init__.py", line 696, in get_component
    user_module_loader = UserModuleLoader(self.model_script, self.script_dir)
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\common\user_module_loader.py", line 22, in __init__
    self.user_module = import_user_module(user_script, script_dir)
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\common\import_lib.py", line 39, in import_user_module
    return import_module_from_file(user_script)
  File "C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\common\import_lib.py", line 27, in import_module_from_file
    spec.loader.exec_module(new_module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\OpenAI\Olive\examples\whisper\code\user_script.py", line 10, in <module>
    from olive.model import PyTorchModelHandler
ImportError: cannot import name 'PyTorchModelHandler' from 'olive.model' (C:\Users\User\anaconda3\envs\OliveEnv\lib\site-packages\olive\model\__init__.py)
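A quick diagnostic (my own sketch, not something from the Olive docs) is to print where olive.model is loaded from and which model classes it exports, to see whether PyTorchModelHandler exists in the installed olive-ai at all:
# Diagnostic sketch: show where olive.model is imported from and which
# "Model" classes it exports, to confirm whether PyTorchModelHandler is present.
import olive.model
print(olive.model.__file__)
print([name for name in dir(olive.model) if "Model" in name])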
I have a Conda environment: OliveEnv
Packages installed:
(OliveEnv) C:\OpenAI\Olive\examples\whisper>pip list
Package Version
---------------------- ----------
alembic 1.13.1
certifi 2024.2.2
charset-normalizer 3.3.2
colorama 0.4.6
coloredlogs 15.0.1
colorlog 6.8.2
contextlib2 21.6.0
contourpy 1.2.0
cycler 0.12.1
Deprecated 1.2.14
filelock 3.13.1
flatbuffers 23.5.26
fonttools 4.48.1
fsspec 2024.2.0
greenlet 3.0.3
huggingface-hub 0.20.3
humanfriendly 10.0
idna 3.6
Jinja2 3.1.3
joblib 1.3.2
kiwisolver 1.4.5
Mako 1.3.2
MarkupSafe 2.1.5
matplotlib 3.8.2
mpmath 1.3.0
networkx 3.2.1
neural-compressor 2.4.1
numpy 1.26.4
olive-ai 0.3.2
onnx 1.15.0
onnxruntime 1.17.0
opencv-python-headless 4.9.0.80
optuna 3.5.0
packaging 23.2
pandas 2.2.0
pillow 10.2.0
pip 23.3.1
prettytable 3.9.0
protobuf 3.20.3
psutil 5.9.8
py-cpuinfo 9.0.0
pycocotools 2.0.7
pydantic 1.10.14
pyparsing 3.1.1
pyreadline3 3.4.1
python-dateutil 2.8.2
pytz 2024.1
PyYAML 6.0.1
regex 2023.12.25
requests 2.31.0
safetensors 0.4.2
schema 0.7.5
scikit-learn 1.4.0
scipy 1.12.0
setuptools 68.2.2
six 1.16.0
SQLAlchemy 2.0.25
sympy 1.12
tabulate 0.9.0
threadpoolctl 3.2.0
tokenizers 0.15.1
torch 2.2.0
torchmetrics 0.10.3
tqdm 4.66.1
transformers 4.37.2
typing_extensions 4.9.0
tzdata 2023.4
urllib3 2.2.0
wcwidth 0.2.13
wheel 0.41.2
wrapt 1.16.0
I have tried three different versions of olive-ai: 0.3.1, 0.3.2, and 0.4.0.
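To confirm which version is actually active in OliveEnv when switching between them, a minimal check using the standard library:
# Print the olive-ai version that is actually installed in the active environment.
import importlib.metadata
print(importlib.metadata.version("olive-ai"))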
Two very good videos showing the use of Olive are from MS Build:
https://www.youtube.com/watch?v=7_0N1VL5ZmA https://www.youtube.com/watch?v=Qj-l0tGKPf8
Upvotes: 0
Views: 670
Reputation: 2710
I installed from git:
pip install git+https://github.com/microsoft/Olive#egg=olive-ai[cpu]
See: https://microsoft.github.io/Olive/0.2.1/getstarted/installation.html
Installing git: https://git-scm.com/download/win
See: https://www.activestate.com/resources/quick-reads/pip-install-git/
I have no explanation, but it is now working. Odd!
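To verify that the reinstall is what fixed it, the import that previously failed can be checked directly (a minimal sketch; olive may not expose __version__ on every release, hence the getattr):
# Verify the failing import now resolves after the git install.
import olive
from olive.model import PyTorchModelHandler  # previously raised ImportError
print(getattr(olive, "__version__", "version attribute not available"))
print(PyTorchModelHandler)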
Upvotes: -1