Dynamic Quantization on BERT notebook - huggingface transformer version #1114
Labels: docathon-h1-2023, easy, quantization
🐛 Bug
The Dynamic Quantization on BERT notebook should pin a fixed version of huggingface/transformers. I've tried it, and it works with:

```
transformers==2.0.0
```
Without this change, section 3.2 fails with:

```
TypeError: glue_convert_examples_to_features() got an unexpected keyword argument 'pad_on_left'
```
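As a sketch of the fix, the pin could be applied in a setup cell at the top of the notebook (assuming a standard pip environment; the exact cell placement is up to the notebook authors):

```shell
# Pin transformers to a release whose glue_convert_examples_to_features
# still accepts the pad_on_left keyword used in section 3.2.
# Newer releases removed/renamed this argument, causing the TypeError.
pip install transformers==2.0.0
```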