Fine-tuning all the weights of a model
Is it possible to fine-tune all the weights of a model when training?
TLDR; Yes, but it's rarely worth the cost.
Instead, you can use QLoRA to fine-tune only a small set of added low-rank adapter weights while the base model stays frozen and quantized, as seen in Using QLora to fine-tune a model. This costs a fraction of the GPU memory and compute while keeping decent performance.
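As a minimal sketch of what that looks like (assuming the Hugging Face transformers, peft, and bitsandbytes libraries; the model name and target module names are placeholders that depend on the architecture):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# Load the base model quantized to 4-bit NF4; its weights stay frozen.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "some-base-model",  # placeholder: any causal LM checkpoint
    quantization_config=bnb_config,
)
model = prepare_model_for_kbit_training(model)

# Attach small trainable low-rank adapters to the attention projections.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # module names vary by model
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all weights
```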
According to the LoRA paper, LoRA reaches on-par or better model quality than fully fine-tuning all weights, while training far fewer parameters, which gives higher training throughput and a much smaller memory footprint.
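To see why so few weights are trained: LoRA freezes the base weight matrix W and learns only a low-rank update ΔW = BA. A hypothetical, self-contained sketch of such a layer in plain PyTorch:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen base linear layer plus a trainable low-rank update: W x + (alpha/r) * B A x."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # the original weights stay frozen
        # A is initialized with small random values, B with zeros,
        # so the update starts out as a no-op (as in the LoRA paper).
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x):
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scaling
```

With r = 8 on a 4096x4096 projection, only 8 * (4096 + 4096) = 65,536 parameters are trainable instead of the ~16.8M in the full weight matrix.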