
Conversation

@leizhenyuan
Contributor

@leizhenyuan leizhenyuan commented Apr 15, 2025

Hi Unsloth, we are going to add Intel GPU support to Unsloth through a series of PRs.
As a first step, we are aiming to support several models with LoRA, and we will extend feature coverage (including BNB, FlashAttention, and xformers) in the future. Below is our plan for the first step:

  1. add Intel-dependent packages for PyTorch 2.6 in pyproject.toml
  2. generalize device types and refactor device-specific code in __init__.py (see the sketch after this list)
  3. refactor device-specific code in utils
  4. refactor device-specific code in kernels
  5. refactor device-specific code in unsloth-zoo
  6. refactor device-specific code in models
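
As an illustration of step 2, here is a minimal sketch of the kind of device-type helper we have in mind; the helper name `get_device_type` and the `DEVICE_TYPE` constant are placeholders for illustration, not the final API:

```python
import torch

def get_device_type() -> str:
    # Prefer CUDA when present; otherwise fall back to Intel GPUs (XPU).
    # torch.xpu is available in PyTorch >= 2.6 builds with Intel GPU support.
    if torch.cuda.is_available():
        return "cuda"
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return "xpu"
    raise RuntimeError("Unsloth requires a CUDA or Intel (XPU) GPU.")

# Downstream code can branch on this instead of assuming "cuda" everywhere,
# e.g. torch.empty(..., device = DEVICE_TYPE).
DEVICE_TYPE = get_device_type()
```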

Currently, we would like to start with Linux and torch 2.6.
To keep the review burden manageable, we strive to keep each PR simple and focused.
In this PR, we change pyproject.toml to pull in the packages required for Intel GPUs.

We have tested this series of PRs on Llama-3.2, DeepSeek-R1-Distill-Qwen, and Qwen2; below is the log from a Llama-3.2 LoRA BF16 run.

[screenshot: Llama-3.2 LoRA BF16 training log]

@leizhenyuan leizhenyuan changed the title Enable intel GPU for unsloth [1/N] [1/N] Enable intel GPU for unsloth Apr 15, 2025
@leizhenyuan leizhenyuan force-pushed the intel_zhenyuan_enable_xpu_1 branch from da6291b to b9b56b9 on April 15, 2025 02:31
@leizhenyuan leizhenyuan marked this pull request as draft April 15, 2025 05:13
@leizhenyuan leizhenyuan marked this pull request as ready for review April 15, 2025 05:23
@Sweaterdog

Does this support Intel CPU fine tuning as well?

@leizhenyuan
Contributor Author

Does this support Intel CPU fine tuning as well?

The wheel supports PyTorch's CPU functionality, but this PR is not aiming to support CPU fine-tuning, since fine-tuning an LLM is still too heavy for a CPU device.

@Sweaterdog

I figured it was too much, but good to know!

@danielhanchen
Contributor

Ok this looks good - sorry for the delay!

@danielhanchen danielhanchen merged commit 2264725 into unslothai:main May 12, 2025
@danielhanchen
Contributor

Also, would it be possible to edit https://github.com/unslothai/unsloth/blob/main/unsloth/_auto_install.py? Thanks :)

@zejun-chen

zejun-chen commented May 13, 2025

Also, would it be possible to edit https://github.com/unslothai/unsloth/blob/main/unsloth/_auto_install.py? Thanks :)

Yes, it makes sense. @leizhenyuan, in the future we can add xpu to the installation file there.
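
For reference, a rough sketch of how xpu detection could be added there; this assumes the script picks an install command based on the detected accelerator, and the `intel-gpu` extra name is only a placeholder, not the final spelling:

```python
import torch

# Hypothetical addition to _auto_install.py: detect the accelerator first,
# then emit the matching pip command. The "intel-gpu" extra is a placeholder name.
if torch.cuda.is_available():
    accelerator = "cuda"
elif hasattr(torch, "xpu") and torch.xpu.is_available():
    accelerator = "xpu"
else:
    raise RuntimeError("No supported GPU (CUDA or Intel XPU) was found.")

if accelerator == "xpu":
    print('pip install --upgrade "unsloth[intel-gpu] @ git+https://github.com/unslothai/unsloth.git"')
```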
