# Vocabulary Trimmed facebook/mbart-large-50: duongttr/japanese-trimmed-mbart-large

This model is a trimmed version of facebook/mbart-large-50, produced by vocabtrimmer, a tool that trims the vocabulary of a language model to reduce its size. The following table summarizes the trimming process.

| | facebook/mbart-large-50 | duongttr/japanese-trimmed-mbart-large |
|---|---:|---:|
| parameter_size_full | 610,879,488 | 416,319,488 |
| parameter_size_embedding | 256,055,296 | 61,495,296 |
| vocab_size | 250,054 | 60,054 |
| compression_rate_full | 100.0 | 68.15 |
| compression_rate_embedding | 100.0 | 24.02 |
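The compression rates above are simply the trimmed parameter counts expressed as a percentage of the originals. A quick sanity check of the table's numbers in Python:

```python
# Full-model compression: trimmed parameter count relative to the original.
full = round(416_319_488 / 610_879_488 * 100, 2)  # → 68.15

# Embedding-only compression: most of the savings come from shrinking
# the embedding matrix along with the vocabulary.
embedding = round(61_495_296 / 256_055_296 * 100, 2)  # → 24.02

print(full, embedding)
```

Note that the non-embedding parameters (about 354.8M) are untouched; trimming only shrinks the embedding matrix.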

The following table shows the parameters used to trim the vocabulary.

| language | dataset | dataset_column | dataset_name | dataset_split | target_vocab_size | min_frequency |
|---|---|---|---|---|---|---|
| ja | vocabtrimmer/mc4_validation | text | ja | validation | 60000 | 2 |
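To illustrate how `target_vocab_size` and `min_frequency` interact, here is a minimal sketch of frequency-based vocabulary selection. This is a hypothetical helper for illustration only, not vocabtrimmer's actual implementation, and it uses whitespace tokenization rather than the mBART subword tokenizer:

```python
from collections import Counter

def select_vocab(corpus, target_vocab_size, min_frequency):
    """Keep at most target_vocab_size of the most frequent tokens,
    dropping any token seen fewer than min_frequency times.
    (Illustrative only; whitespace tokenization stands in for subwords.)"""
    counts = Counter(tok for text in corpus for tok in text.split())
    kept = [tok for tok, c in counts.most_common() if c >= min_frequency]
    return kept[:target_vocab_size]

# Tokens below min_frequency are dropped even if the size budget allows them.
vocab = select_vocab(["a a b b c", "a b d"], target_vocab_size=3, min_frequency=2)
print(vocab)
```

In the actual trimming run, the frequencies were counted over the `text` column of the Japanese split of `vocabtrimmer/mc4_validation`, keeping the top 60,000 tokens that occur at least twice.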