How to use quantumaikr/falcon-180B-chat-instruct-LoRA with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model. Falcon-180B is very large, so device_map="auto"
# lets transformers shard it across available GPUs/CPU memory.
base_model = AutoModelForCausalLM.from_pretrained(
    "tiiuae/falcon-180B-chat",
    device_map="auto",
)

# Attach the LoRA adapter weights on top of the base model.
model = PeftModel.from_pretrained(base_model, "quantumaikr/falcon-180B-chat-instruct-LoRA")
```
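Once the adapter is attached, the combined model behaves like any transformers causal LM. A minimal inference sketch, assuming `model` is the PeftModel loaded above and using the base model's tokenizer (the prompt is illustrative; actually running Falcon-180B requires substantial GPU memory):

```python
from transformers import AutoTokenizer

# Tokenizer comes from the base model repo; the adapter does not change it.
tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-180B-chat")

prompt = "What is parameter-efficient fine-tuning?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate with the adapter-augmented model.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```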