EXAONE Easy Contract LoRA Adapter (v1.1)

A LoRA adapter trained to unpack the clauses of Korean housing lease contracts
into plain, friendly explanations that non-experts can understand.

Base Model

  • LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct

What This Adapter Does

  • Converts the meaning of contract clauses into plain-language explanations
  • Trained not to add information that is absent from the original text
  • Keeps a soft explanatory register (Korean sentence endings such as β€œ~ν•΄μ•Ό λΌμš” / ~μ—μš” / ~둜 λ˜μ–΄ μžˆμ–΄μš”β€)
  • Generates one explanatory paragraph per clause

Intended Use

  • Generating plain-language explanations of residential real-estate lease contracts
  • Contract-based Q&A or plain-language contract reports
  • Integration with FastAPI- or RAG-based services
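Because the adapter explains one clause at a time, a typical pipeline first splits the contract text into clause units. Below is a minimal pre-processing sketch; the regex assumes clauses begin with β€œμ œNμ‘°β€ headings, which is a convention of Korean lease contracts rather than anything specified by this model card.

```python
import re

def split_clauses(text: str) -> list[str]:
    """Split a Korean lease contract into clause-level chunks.

    Splits before each 'μ œβ€―Nβ€―μ‘°' heading so every chunk starts with its
    clause number and can be explained independently.
    """
    parts = re.split(r"(?=제\s*\d+\s*μ‘°)", text)
    return [p.strip() for p in parts if p.strip()]
```

Each returned chunk can then be fed to the model separately, matching the one-paragraph-per-clause output format described above.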

How to Use

from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base EXAONE model
base = AutoModelForCausalLM.from_pretrained(
    "LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct",
    device_map="auto",
    trust_remote_code=True
)

# Attach the fine-tuned LoRA adapter
model = PeftModel.from_pretrained(
    base,
    "temdy/exaone-3.5-2.4b-easycontract-qlora-v1.1"
)
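To generate an explanation with the loaded model, wrap a clause in a chat prompt and decode the response. The prompt wording below is an assumption (the card does not publish the exact instruction template used in training), and `explain_clause` is an illustrative helper, not part of the adapter's API.

```python
from transformers import AutoTokenizer

def build_prompt(clause: str) -> str:
    # Hypothetical instruction wording -- adjust to match your use case.
    return (
        "λ‹€μŒ μž„λŒ€μ°¨ κ³„μ•½μ„œ 쑰항을 일반인이 μ΄ν•΄ν•˜κΈ° 쉽도둝 "
        "λΆ€λ“œλŸ¬μš΄ μ„€λͺ…체둜 ν’€μ–΄ μ„€λͺ…ν•΄ μ£Όμ„Έμš”.\n\n" + clause
    )

def explain_clause(model, tokenizer, clause: str) -> str:
    """Generate a plain-language explanation for a single clause."""
    messages = [{"role": "user", "content": build_prompt(clause)}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    output = model.generate(input_ids.to(model.device), max_new_tokens=512)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )

# Usage, with `model` from the loading snippet above:
# tokenizer = AutoTokenizer.from_pretrained(
#     "LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct", trust_remote_code=True
# )
# print(explain_clause(model, tokenizer, clause_text))
```
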

Training Overview

  • Training method: Supervised Fine-Tuning (SFT)
  • Adaptation: LoRA / QLoRA
  • Data: clause-level explanation pairs from Korean housing lease contracts
  • Frameworks: PEFT, TRL, Transformers
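The card does not publish the training hyperparameters, but a representative QLoRA setup with these frameworks might look like the sketch below. All values (rank, alpha, dropout, target modules, quantization settings) are illustrative assumptions, not the configuration actually used for this adapter.

```python
import torch
from transformers import BitsAndBytesConfig
from peft import LoraConfig

# 4-bit quantization for QLoRA -- illustrative settings only
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# LoRA hyperparameters -- r, alpha, and target modules are assumptions
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
```

In a typical SFT run, `bnb_config` is passed to `AutoModelForCausalLM.from_pretrained(..., quantization_config=bnb_config)` and `lora_config` to TRL's `SFTTrainer` via its `peft_config` argument.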

Limitations

  • Not a substitute for legal advice
  • Actual contractual decisions require review of the original contract and consultation with a qualified professional

License
Apache-2.0