  
Holy crap this takes a long time. But now I can inst-- hang on, I'm not sure why this is necessary:
    conda activate textgen
    conda install intel-extension-for-pytorch=2.1.10 pytorch=2.1.0 -c intel -c conda-forge
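
In hindsight, a quick way to see what's actually importable in an env before sitting through another long solve (plain stdlib, nothing Intel-specific; just a sketch, not part of the original steps):

    # Pre-flight: is torch / IPEX importable in the active env at all?
    import importlib.util
    for mod in ("torch", "intel_extension_for_pytorch"):
        found = importlib.util.find_spec(mod) is not None
        print(f"{mod}: {'installed' if found else 'missing'}")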
  
Holy crap this takes a long time. IPEX and pytorch weren't already in the pytorch-gpu env? Ok... well, that got the A770 to show up in the sanity check anyway.
    python -c "import torch; import intel_extension_for_pytorch as ipex; print(torch.__version__); print(ipex.__version__); [print(f'[{i}]: {torch.xpu.get_device_properties(i)}') for i in range(torch.xpu.device_count())];"
    2.1.0a0+cxx11.abi
    2.1.10+xpu
    [0]: _DeviceProperties(name='Intel(R) Arc(TM) A770 Graphics', platform_name='Intel(R) Level-Zero', dev_type='gpu, support_fp64=0, total_memory=243MB, max_compute_units=512, gpu_eu_count=512)

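For reference, here is the same sanity check unpacked from the one-liner into a readable script (identical calls, nothing new):

    import torch
    import intel_extension_for_pytorch as ipex  # needed so torch.xpu exists, same as in the one-liner
    print(torch.__version__)   # 2.1.0a0+cxx11.abi here
    print(ipex.__version__)    # 2.1.10+xpu here
    # List every XPU device the runtime can see, with its properties.
    for i in range(torch.xpu.device_count()):
        print(f"[{i}]: {torch.xpu.get_device_properties(i)}")
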
Interestingly, clinfo -l shows both i915 devices in the system; I don't know if that's a problem or not. Perhaps I should disable the iGPU?
    (textgen) root@sadness:~# clinfo -l
    Platform #0: Intel(R) OpenCL
     `-- Device #0: Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz
    Platform #1: Intel(R) OpenCL Graphics
     `-- Device #0: Intel(R) Arc(TM) A770 Graphics

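Rather than disabling the iGPU in firmware, it might be enough to pin everything to one card from Python. A minimal sketch, assuming device index 0 is the A770 as the sanity check above reported (torch.xpu.set_device mirrors the torch.cuda API; I haven't verified this actually changes anything here):

    import torch
    import intel_extension_for_pytorch as ipex
    # Assumption: device index 0 is the A770, per the sanity check above.
    torch.xpu.set_device(0)
    # Allocate directly on that card instead of relying on the default device.
    x = torch.randn(4, 4, device="xpu:0")
    print(x.device)  # expect xpu:0
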
And llama.cpp... I have no explanation for this output.
    GGML_SYCL_DEBUG=0
    ggml_init_sycl: GGML_SYCL_F16:   yes
    ggml_init_sycl: SYCL_USE_XMX: yes
    found 3 SYCL devices:
      Device 0: Intel(R) Arc(TM) A770 Graphics, compute capability 1.3,
        max compute_units 512, max work group size 1024, max sub group size 32, global mem size 255012864
      Device 1: Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz, compute capability 3.0,
        max compute_units 8, max work group size 8192, max sub group size 64, global mem size 33517965312
      Device 2: Intel(R) Arc(TM) A770 Graphics, compute capability 3.0,
        max compute_units 512, max work group size 1024, max sub group size 32, global mem size 255012864
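
My guess (unverified) is that SYCL enumerates every backend it finds, so the A770 appears once through Level-Zero and again through the OpenCL runtime, with the CPU's OpenCL driver as the third device. If Intel's dpctl package happens to be in the env, something like this should show which backend each device comes from (dpctl and these attribute names are my assumption, not something from this setup):

    # List SYCL devices with their backend (requires the dpctl package).
    import dpctl
    for dev in dpctl.get_devices():
        print(dev.backend, dev.device_type, dev.name)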
  
====llama-cpp-python====
    pip install rich accelerate gradio==3.50.* markdown transformers datasets peft
  
Actually, using requirements.txt creates a dependency conflict; I haven't dug into it yet.
=====State=====
I can launch Ooba and it behaves as expected: