user23980166

Reputation: 1

Memory cost of training an LLM on a Mac Studio with mlx_lm

I am training LLMs with mlx_lm on a Mac Studio with an M1 Ultra and 128 GB of unified memory. The models I chose to fine-tune are CodeLlama-13b-Instruct-hf and CodeQwen1.5-7B-Chat. No matter how I change the config file, when I use top to monitor the memory cost of the training process, it is always around 90 GB. I have tried changing parameters such as the number of LoRA layers and the batch size, but with lora_layers set to 8, the memory cost stays at 90 GB regardless of the batch size.
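For reference, here is roughly the config I am using, invoked with something along the lines of `python -m mlx_lm.lora --config lora_config.yaml`. Paths are placeholders, and the key names follow the example `lora_config.yaml` shipped with mlx_lm, but may differ between versions:

```yaml
# Sketch of my LoRA fine-tuning config for mlx_lm.lora.
# Paths are placeholders; key names are assumptions based on the
# example lora_config.yaml in mlx_lm and may vary by version.
model: "codellama/CodeLlama-13b-Instruct-hf"
train: true
data: "/path/to/data"    # directory containing train.jsonl / valid.jsonl
lora_layers: 8           # number of layers to apply LoRA adapters to
batch_size: 1            # varying this does not change memory use for me
iters: 1000
max_seq_length: 2048
```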

Why is the memory cost always around 90 GB, and how can I reduce it?

Upvotes: 0

Views: 136

Answers (0)
