Commit 9ca0bba

danielhanchen, timothelaborie, eltociear, and Erland366 authored
Fix 4.47 issue (#1182)
* Fix TRL * Update mistral.py * Patch processing_class * Update tokenizer_utils.py * Update tokenizer_utils.py * Update tokenizer_utils.py * Update tokenizer_utils.py * Update tokenizer_utils.py * Update tokenizer_utils.py * Installation guide (#1165) * chore: update chat_templates.py (#1166) orginal -> original * Disable Flex Attention * Update tokenizer_utils.py * Update _utils.py * n_items * Update cross_entropy_loss.py * Fix DPO, ORPO * Update _utils.py * Update _utils.py * fix/transformers-unpack (#1180) * Fix DPO, ORPO (#1177) * Fix TRL * Update mistral.py * Patch processing_class * Update tokenizer_utils.py * Update tokenizer_utils.py * Update tokenizer_utils.py * Update tokenizer_utils.py * Update tokenizer_utils.py * Update tokenizer_utils.py * Installation guide (#1165) * chore: update chat_templates.py (#1166) orginal -> original * Disable Flex Attention * Update tokenizer_utils.py * Update _utils.py * n_items * Update cross_entropy_loss.py * Fix DPO, ORPO * Update _utils.py --------- Co-authored-by: timothelaborie <97834767+timothelaborie@users.noreply.github.com> Co-authored-by: Ikko Eltociear Ashimine <eltociear@gmail.com> * Add warning for missing Unpack and KwargsForCausalLM in older Transformers versions --------- Co-authored-by: Daniel Han <danielhanchen@gmail.com> Co-authored-by: timothelaborie <97834767+timothelaborie@users.noreply.github.com> Co-authored-by: Ikko Eltociear Ashimine <eltociear@gmail.com> * Update cross_entropy_loss.py * Update _utils.py * Update _utils.py --------- Co-authored-by: timothelaborie <97834767+timothelaborie@users.noreply.github.com> Co-authored-by: Ikko Eltociear Ashimine <eltociear@gmail.com> Co-authored-by: Edd <68678137+Erland366@users.noreply.github.com>
1 parent 4f1c474 commit 9ca0bba

File tree

2 files changed: +22 -0 lines changed


unsloth/kernels/cross_entropy_loss.py

Lines changed: 8 additions & 0 deletions
```diff
@@ -388,6 +388,14 @@ def fast_cross_entropy_loss(
     List,
     Tuple,
 )
+
+# Transformers 4.47 need Unpack, KwargsForCausalLM
+try:
+    from transformers.models.llama.modeling_llama import Unpack, KwargsForCausalLM
+except:
+    pass
+pass
+
 import inspect, re
 function = inspect.getsource(LlamaForCausalLM.forward)
 function = function.split("\n")
```
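The context lines show the patcher pulling the source of LlamaForCausalLM.forward with inspect.getsource and splitting it for text-level editing; edited source can only be re-executed if every name it references resolves in the patching namespace, which is presumably why Unpack and KwargsForCausalLM are imported first (and silently skipped on older Transformers, where the 4.47 signature that references Unpack[KwargsForCausalLM] does not exist). A minimal, self-contained sketch of the same source-patching pattern, using a hypothetical function and edit rather than the real forward:

```python
import inspect

def greet(name: str) -> str:
    return "hello " + name

# Pull the source as text, edit it, and re-exec it. Every name the
# edited source references must resolve in the namespace handed to
# exec -- the reason the hunk above imports Unpack and
# KwargsForCausalLM before touching the 4.47 forward.
source = inspect.getsource(greet)
source = source.replace("hello", "hi")
namespace = {}
exec(source, namespace)
patched_greet = namespace["greet"]

print(patched_greet("world"))  # -> "hi world"
```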

unsloth/models/_utils.py

Lines changed: 14 additions & 0 deletions
```diff
@@ -162,6 +162,20 @@ def patch_mistral_nemo_config(config):
 pass
 # =============================================

+# =============================================
+# Weird Databricks errors
+from transformers.utils import is_openai_available
+if is_openai_available():
+    try:
+        from openai import OpenAI
+    except:
+        print("Unsloth: OpenAI failed to import - ignoring for now.")
+        import transformers.utils
+        def _is_openai_available(): return False
+        transformers.utils.is_openai_available = _is_openai_available
+    pass
+pass
+
 # =============================================
 # Get Flash Attention v2 if Ampere (RTX 30xx, A100)
 import bitsandbytes as bnb
```
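The fallback rebinds is_openai_available on the transformers.utils module object, so any later call routed through the module reports False on environments where the openai package is advertised but fails to import (the "weird Databricks errors" in the comment). Note the subtlety: code that bound the function by name before the patch keeps the original; only module-attribute lookups see the replacement. A standalone sketch of that rebinding semantics, using a hypothetical stand-in module rather than transformers itself:

```python
import types

# Hypothetical stand-in for transformers.utils.
fake_utils = types.ModuleType("fake_utils")
fake_utils.is_openai_available = lambda: True

# A caller that bound the function *before* the patch keeps the
# original; only lookups through the module object see the change.
early_binding = fake_utils.is_openai_available

# The monkey-patch, mirroring the diff's assignment to
# transformers.utils.is_openai_available.
fake_utils.is_openai_available = lambda: False

print(early_binding())                   # True  (pre-patch binding)
print(fake_utils.is_openai_available())  # False (sees the patch)
```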
