nip.language_model_server.types.LmLoraAdapterConfig#
- class nip.language_model_server.types.LmLoraAdapterConfig(*, r: int, lora_alpha: int, lora_dropout: float)[source]#
Configuration for a LoRA adapter to be applied on top of a base model.
See Hu et al. [HSW+21] for the original LoRA paper.
Attributes
- __fields_set__
- model_computed_fields
- model_config: Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- model_extra: Get extra fields set during validation.
- model_fields
- model_fields_set: Returns the set of fields that have been explicitly set on this model instance.
- r: The rank of the LoRA adapter, controlling the number of trainable parameters.
- lora_alpha: The scaling factor for the LoRA adapter, controlling the strength of the adapter.
- lora_dropout: The dropout rate applied within the LoRA layers.
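To illustrate how these three fields interact, the sketch below uses a plain dataclass stand-in for LmLoraAdapterConfig (the real class is a Pydantic model; the dataclass, its field defaults, and the `scaling` helper are assumptions for illustration). In standard LoRA, the weight update is W + (lora_alpha / r) * B @ A, so the effective adapter strength is the ratio lora_alpha / r:

```python
from dataclasses import dataclass


# Hypothetical stand-in mirroring LmLoraAdapterConfig's fields; the real
# class is a Pydantic model, but the field semantics are the same.
@dataclass(frozen=True)
class LoraAdapterConfig:
    r: int              # rank of the low-rank update matrices A and B
    lora_alpha: int     # scaling numerator; effective scale is lora_alpha / r
    lora_dropout: float # dropout rate applied to the adapter input

    @property
    def scaling(self) -> float:
        # Standard LoRA applies the update as W + (lora_alpha / r) * B @ A,
        # so doubling r at fixed lora_alpha halves the adapter's strength.
        return self.lora_alpha / self.r


cfg = LoraAdapterConfig(r=16, lora_alpha=32, lora_dropout=0.05)
print(cfg.scaling)  # → 2.0
```

Choosing lora_alpha proportional to r keeps the update magnitude roughly stable when sweeping over ranks, which is why the two are typically tuned together.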
Methods