---
license: other
license_name: exaone
license_link: LICENSE
language:
- en
- ko
tags:
- lg-ai
- exaone
- exaone-3.5
pipeline_tag: text-generation
library_name: transformers
---

# Updates in EXAONE-3.5
## Key Changes
- **RoPE Scaling Parameter**: Added to support a longer `context_length`.
- **Memory Optimization**: For the 2.4B model, `tie_word_embeddings` is set to `True` for improved memory efficiency.
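
The two changes above can be illustrated with a small sketch of the relevant `config.json` fields. This is not the actual uploaded config: the `rope_scaling` type and factor shown here are placeholder values for illustration, so check the model repository's `config.json` for the real settings.

```python
import json

# Hypothetical excerpt of a Llamafied config for the 2.4B model.
# The rope_scaling values below are illustrative assumptions, not the
# real ones shipped with the model.
config = {
    # Added in EXAONE-3.5 to support a longer context_length.
    "rope_scaling": {
        "type": "linear",   # assumed scaling type (illustrative)
        "factor": 2.0,      # assumed scaling factor (illustrative)
    },
    # The 2.4B model shares its input and output embedding matrices,
    # reducing parameter count and memory use.
    "tie_word_embeddings": True,
}

print(json.dumps(config, indent=2))
```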

⚠️ Using the original [Llamafy script](https://huggingface.co/maywell/EXAONE-3.0-7.8B-Instruct-Llamafied) as-is may lead to performance degradation.

To address this, I have updated the script and uploaded the Llamafied version of the model.
## Special Thanks

- **[@maywell](https://huggingface.co/maywell)**
  For updating the code and uploading the model.

- **LG AI Research**
  For releasing the original model.
  Check out the [original release here](https://huggingface.co/collections/LGAI-EXAONE/exaone-35-674d0e1bb3dcd2ab6f39dbb4).