nndocs:a770
Holy crap this takes a long time. But now I can inst--hang on I'm not sure why this is necessary:

  conda activate textgen
  conda install intel-extension-for-pytorch=2.1.10 pytorch=2.1.0 -c intel -c conda-forge

Holy crap this takes a long time. IPEX and pytorch weren't ...

  python -c "
  2.1.0a0+cxx11.abi
  2.1.10+xpu
  [0]: _DeviceProperties(name='
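The python -c one-liner above is cut off in this copy; judging by the three output lines, it's presumably the usual IPEX install check. A minimal sketch of the same check written out as a script (nothing beyond stock torch/IPEX calls; assumes the textgen env is active):

<code python>
# Minimal sketch of the usual IPEX install check (the one-liner above is truncated).
import torch
import intel_extension_for_pytorch as ipex  # registers the torch.xpu backend

print(torch.__version__)   # expected: 2.1.0a0+cxx11.abi
print(ipex.__version__)    # expected: 2.1.10+xpu

# List every XPU device IPEX can see; the A770 should show up here.
for i in range(torch.xpu.device_count()):
    print(f"[{i}]: {torch.xpu.get_device_properties(i)}")
</code>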
| + | |||
| + | Interestingly, | ||
| + | (textgen) root@sadness: | ||
| + | Platform #0: Intel(R) OpenCL | ||
| + | `-- Device #0: Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz | ||
| + | Platform #1: Intel(R) OpenCL Graphics | ||
| + | `-- Device #0: Intel(R) Arc(TM) A770 Graphics | ||
| + | |||
And llama.cpp... I have no explanation for this output.

  GGML_SYCL_DEBUG=0
  ggml_init_sycl:
  ggml_init_sycl:
  found 3 SYCL devices:
    Device 0: Intel(R) Arc(TM) A770 Graphics,
      max compute_units 512, max work group size 1024, max sub group size 32, global mem size 255012864
    Device 1: Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz,
      max compute_units 8, max work group size 8192, max sub group size 64, global mem size 33517965312
    Device 2: Intel(R) Arc(TM) A770 Graphics,
      max compute_units 512, max work group size 1024, max sub group size 32, global mem size 255012864
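One guess about the duplicate entries: the same A770 may simply be enumerated once per SYCL backend (Level Zero and OpenCL), but that's a guess. In any case, a minimal smoke test of the SYCL build through llama-cpp-python would look roughly like this; the model path is a placeholder, and whether main_gpu is honored (as opposed to the GGML_SYCL_DEVICE environment variable) may depend on the build:

<code python>
# Rough smoke test with llama-cpp-python on the SYCL build.
# Assumptions: the model path is a placeholder; main_gpu=0 is meant to pick
# "Device 0" (the A770) from the list above.
from llama_cpp import Llama

llm = Llama(
    model_path="/path/to/some-model.gguf",  # placeholder
    n_gpu_layers=-1,  # offload all layers to the GPU
    main_gpu=0,
)
out = llm("Q: Name one GPU vendor. A:", max_tokens=16)
print(out["choices"][0]["text"])
</code>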
====llama-cpp-python====
  pip install rich accelerate gradio==3.50.* markdown transformers datasets peft

Actually using requirements.txt creates a conflict. I haven't ...
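Since requirements.txt is off the table, a quick way to double-check that the hand-picked packages actually landed in the env (just a sanity check; the list is copied from the pip and conda commands above):

<code python>
# Sanity check: confirm the hand-installed packages are present in the textgen env.
# Package names are taken from the pip/conda commands above.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("torch", "intel-extension-for-pytorch", "rich", "accelerate",
            "gradio", "markdown", "transformers", "datasets", "peft"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: NOT INSTALLED")
</code>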
=====State=====
I can launch Ooba and it behaves as expected: