---
base_model:
- SicariusSicariiStuff/Negative_LLAMA_70B
datasets:
- SicariusSicariiStuff/UBW_Tapestries
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: SicariusSicariiStuff
---

# Note

This quant was made to test extremely low bitrates. The 70B model is **usable** at 2.15 bpw. A custom quantization strategy was used; for details, check the EXL3 repo.
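To put 2.15 bpw in perspective, a quick back-of-envelope calculation gives the approximate weight storage at a given bits-per-weight. This is a rough sketch assuming a nominal 70e9 parameters; real quant files also carry metadata and layers kept at different precisions, so actual sizes will differ somewhat.

```python
def weights_size_gib(num_params: float, bpw: float) -> float:
    """Approximate weight storage in GiB for a given bits-per-weight."""
    total_bits = num_params * bpw   # every parameter stored at `bpw` bits
    return total_bits / 8 / 1024**3  # bits -> bytes -> GiB

# Nominal 70B parameters at 2.15 bpw (illustrative numbers, not exact file size)
print(f"{weights_size_gib(70e9, 2.15):.2f} GiB")  # roughly 17.5 GiB
```

For comparison, the same model at 16-bit would need around 130 GiB, which is why quants this aggressive make 70B models reachable on a single consumer GPU.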