
Possible typo on Ch 12 page 370 #82

@Nick243

Description

Thank you for writing this excellent and extremely concise yet informative book. I thoroughly enjoyed it.

If at all helpful, I wondered if there might be a typo on page 370. The text seems to describe setting the lora_alpha parameter to roughly twice the size of r, but in the code r = 64 and lora_alpha = 32, so alpha is half the rank rather than double it.

I was not sure if this was intended (or if I missed the rationale for deviating from the recommendation in this example).

Prepare LoRA Configuration

from peft import LoraConfig

peft_config = LoraConfig(
    r=64,                 # Rank of the low-rank update matrices
    lora_alpha=32,        # LoRA scaling factor
    lora_dropout=0.1,     # Dropout for the LoRA layers
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=[      # Layers to target
        'k_proj', 'gate_proj', 'v_proj', 'up_proj',
        'q_proj', 'o_proj', 'down_proj'
    ],
)
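
For reference, a minimal sketch of what the configuration would look like if the twice-the-rank guideline described in the text were applied. Setting lora_alpha = 2 * r (i.e. 128) here is my assumption to illustrate that guideline, not the book's actual code:

from peft import LoraConfig

r = 64
peft_config_alt = LoraConfig(
    r=r,
    lora_alpha=2 * r,     # 128, i.e. alpha = 2 x rank (assumed, per the text's guideline)
    lora_dropout=0.1,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=['k_proj', 'gate_proj', 'v_proj', 'up_proj',
                    'q_proj', 'o_proj', 'down_proj'],
)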
