Update modeling_phi3.py #101
by TarunSinghal - opened
Updated 'get_max_length()' to 'get_max_cache_shape()[2]'. There is no longer any need to downgrade transformers; this now runs with the latest transformers library without errors.
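The actual diff is not quoted in this thread. As a rough, hypothetical sketch of how this kind of API change can be bridged (the helper name '_max_cache_length' is illustrative and not from the PR), the modeling code could pick whichever cache-length accessor the installed transformers version provides:

```python
# A minimal compatibility sketch, not the PR's actual diff: resolve the maximum
# cache length whether the installed transformers exposes the older
# Cache.get_max_length() or the newer Cache.get_max_cache_shape().
from typing import Optional

from transformers.cache_utils import Cache


def _max_cache_length(past_key_values: Optional[Cache]) -> Optional[int]:
    """Return the maximum cache length, or None if it is unbounded/unknown."""
    if past_key_values is None:
        return None
    if hasattr(past_key_values, "get_max_cache_shape"):
        # Newer transformers releases, where get_max_length() was dropped.
        return past_key_values.get_max_cache_shape()
    # Older transformers releases still provide get_max_length().
    return past_key_values.get_max_length()
```

A helper along these lines could stand in for direct calls such as 'past_key_values.get_max_length()' in modeling_phi3.py (for example in 'prepare_inputs_for_generation'), so the same file would work on both old and new transformers releases instead of requiring a downgrade.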
TarunSinghal changed pull request status to closed
TarunSinghal changed pull request status to open
TarunSinghal changed pull request status to closed