Merge method paper: *Language Models are Super Mario: Absorbing Abilities from Homologous Models as a Free Lunch* (arXiv:2311.03099)
This model is intended as an intermediate model for future merges. It is a merge of four pre-trained language models based on Mistral-7B-v0.1, created with mergekit.
In combination with DiscoLM_German_7b_v1, this 'clear' model serves as the 'base' model for building VerwaltungsAnthologie_Disco_7B, the successor to my first merge, VA_Disco_7B.
This model was merged using the DARE TIES merge method, with mistralai/Mistral-7B-v0.1 as the base model; a rough sketch of the method follows the model list below.
The following models were included in the merge:
- VAGOsolutions/SauerkrautLM-7b-LaserChat
- hiig-piai/simba-01d-ftb
- DRXD1000/Phoenix
- OpenPipe/mistral-ft-optimized-1227
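For intuition, here is a minimal sketch of what DARE does per weight tensor, assuming PyTorch and omitting the TIES sign-election step: each fine-tuned model's delta from the base is randomly sparsified to the configured `density` and rescaled, then the surviving deltas are combined by `weight` and added back onto the base. The function names below are illustrative, not mergekit's actual API.

```python
import torch

def dare(delta: torch.Tensor, density: float) -> torch.Tensor:
    # Drop-And-REscale: keep each delta parameter with probability
    # `density`, then rescale the survivors by 1/density so the
    # expected magnitude of the update is preserved.
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density

def dare_merge(base: torch.Tensor,
               finetuned: list[torch.Tensor],
               density: float,
               weights: list[float]) -> torch.Tensor:
    # Weighted sum of sparsified deltas added back onto the base
    # tensor (TIES sign election omitted for brevity).
    merged = base.clone()
    for ft, w in zip(finetuned, weights):
        merged = merged + w * dare(ft - base, density)
    return merged
```

In the config below, all contributing models share `density: 0.53`, while `weight` gives simba-01d-ftb the dominant share (0.55) of the merged delta.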
The following YAML configuration was used to produce this model:
```yaml
# works but never stops
models:
  - model: mistralai/Mistral-7B-v0.1
    # No parameters necessary for base model
  - model: VAGOsolutions/SauerkrautLM-7b-LaserChat
    parameters:
      density: 0.53
      weight: 0.15
  - model: hiig-piai/simba-01d-ftb
    parameters:
      density: 0.53
      weight: 0.55
  - model: DRXD1000/Phoenix
    parameters:
      density: 0.53
      weight: 0.15
  - model: OpenPipe/mistral-ft-optimized-1227
    parameters:
      density: 0.53
      weight: 0.15
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
name: VerwaltungsAnthologie_clear_simbad_7B
```
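To reproduce the merge, this config can be passed to mergekit's CLI, e.g. `mergekit-yaml config.yml ./output-model-directory`. The sketch below shows one way to load and prompt the resulting model with transformers; the repository id is hypothetical, and the explicit `max_new_tokens` cap is a workaround for the "works but never stops" behaviour noted in the config comment.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repository id -- replace with the actual location
# of VerwaltungsAnthologie_clear_simbad_7B.
model_id = "VerwaltungsAnthologie_clear_simbad_7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",
)

prompt = "Erkläre kurz, was ein Verwaltungsakt ist."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=256,  # hard cap, since generation may not stop on its own
    eos_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```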