merged_ko_ties_gemma_3_1b_cpt_dpo_0707_v0_20250709

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the TIES merge method, with Chang-Hoo/gemma-3-1b-cpt-dpo-0707 as the base.
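TIES merging trims each model's task vector (its difference from the base weights) to the largest-magnitude entries, elects a per-parameter sign by majority, and averages only the deltas that agree with that sign. The sketch below is a simplified illustration of that idea on plain tensors, not mergekit's implementation; the density and weight values mirror the configuration further down, and all tensor names are placeholders.

import torch

def ties_merge(base, finetuned, densities, weights):
    """Simplified TIES merge of several fine-tuned tensors onto a base tensor."""
    trimmed = []
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base                          # task vector
        k = max(1, int(density * delta.numel()))   # number of entries to keep
        # Trim: keep only the top-k entries by magnitude.
        threshold = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        mask = delta.abs() >= threshold
        trimmed.append(weight * delta * mask)

    stacked = torch.stack(trimmed)
    # Elect a sign per parameter from the sum of trimmed deltas.
    elected_sign = torch.sign(stacked.sum(dim=0))
    agrees = torch.sign(stacked) == elected_sign
    # Disjoint merge: average only the deltas whose sign agrees with the election.
    total = (stacked * agrees).sum(dim=0)
    count = agrees.sum(dim=0).clamp(min=1)
    return base + total / count

# Toy example with the density/weight values used in the YAML configuration below.
base = torch.zeros(8)
variant_a = base + torch.randn(8)  # stands in for one fine-tuned checkpoint
variant_b = base + torch.randn(8)  # stands in for the other
merged = ties_merge(base, [variant_a, variant_b], densities=[0.5, 0.5], weights=[0.5, 0.5])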

Models Merged

The following models were included in the merge:

google/gemma-3-1b-it

Configuration

The following YAML configuration was used to produce this model:

base_model: Chang-Hoo/gemma-3-1b-cpt-dpo-0707
dtype: bfloat16
merge_method: ties
models:
- model: Chang-Hoo/gemma-3-1b-cpt-dpo-0707
  parameters:
    density: 0.5
    weight: 0.5
- model: google/gemma-3-1b-it
  parameters:
    density: 0.5
    weight: 0.5
parameters:
  int8_mask: true
  normalize: true
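To reproduce the merge, the configuration above can be saved to a YAML file and passed to mergekit. Below is a minimal sketch assuming mergekit is installed and using the Python entry points shown in its README (MergeConfiguration, MergeOptions, run_merge); the file name ties_config.yaml and the output directory ./merged are placeholders.

import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the merge configuration shown above (saved as ties_config.yaml).
with open("ties_config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge and write the result to ./merged.
run_merge(
    merge_config,
    out_path="./merged",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
    ),
)

The same configuration can also be run with mergekit's command-line interface by pointing it at the YAML file and an output directory.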
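A short usage sketch, assuming the merged weights are published as gsjang/merged_ko_ties_gemma_3_1b_cpt_dpo_0707_v0_20250709 on the Hugging Face Hub and that an installed transformers release supports Gemma 3; the prompt is an arbitrary Korean example.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "gsjang/merged_ko_ties_gemma_3_1b_cpt_dpo_0707_v0_20250709"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

# Arbitrary Korean prompt: "Hello. Please introduce yourself briefly."
prompt = "안녕하세요. 간단히 자기소개를 해 주세요."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))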