390 convert optimizers class to num power #391
Conversation
…pplied consistent style across optimizer tests and Parameter class.
Reviewed diff context:

```php
++$this->t;
```

```php
return NumPower::multiply($gradient, $rate);
```
Is there an API to do `$gradient->multiply($rate)`?
No, all operations are currently implemented only on the NumPower class, not on NDArray.
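For context, a minimal sketch of how this static style reads in an optimizer step; the function name and signature here are illustrative, not the PR's actual code, and only `NumPower::multiply()` is taken from the diff above:

```php
<?php

declare(strict_types = 1);

// Illustrative only: scale a gradient by the learning rate using the
// static NumPower facade, since NDArray exposes no fluent multiply().
function scaleGradient(NDArray $gradient, float $rate) : NDArray
{
    return NumPower::multiply($gradient, $rate);
}
```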
Pull request overview
This PR converts optimizer classes to use the NumPower library, refactoring existing optimizers and adding several new optimizer implementations with comprehensive test coverage and mathematical documentation.
Key changes:
- Refactored existing Stochastic optimizer to improve code formatting and test structure
- Added seven new optimizer implementations: StepDecay, RMSProp, Momentum, Cyclical, Adam, AdaMax, and AdaGrad
- Introduced a new Adaptive interface for optimizers that require cache warming (a minimal sketch of such an interface follows this list)
- Enhanced all optimizer documentation with detailed mathematical formulations in LaTeX
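As a rough illustration of what such a cache-warming interface can look like, here is a minimal sketch. The `warm()` method name, the namespaces, and the `Parameter` type hint are assumptions modeled on the existing Rubix ML API, not necessarily the exact signature added in src/NeuralNet/Optimizers/Base/Adaptive.php:

```php
<?php

declare(strict_types = 1);

namespace Rubix\ML\NeuralNet\Optimizers\Base;

use Rubix\ML\NeuralNet\Parameters\Parameter;

/**
 * Sketch of a cache-warming interface. Adaptive optimizers such as Adam,
 * RMSProp, and AdaGrad keep per-parameter state (velocities, squared-gradient
 * accumulators) that must be initialized before the first update step.
 */
interface Adaptive
{
    /**
     * Initialize ("warm") the cache entry for the given parameter.
     */
    public function warm(Parameter $param) : void;
}
```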
Reviewed changes
Copilot reviewed 26 out of 26 changed files in this pull request and generated 10 comments.
| File | Description |
|---|---|
| tests/NeuralNet/Optimizers/Stochastic/StochasticTest.php | Refactored tests to use data providers and added missing #[Test] attributes |
| tests/NeuralNet/Optimizers/StepDecay/StepDecayTest.php | New comprehensive test suite for StepDecay optimizer |
| tests/NeuralNet/Optimizers/RMSProp/RMSPropTest.php | New test suite including cache initialization tests |
| tests/NeuralNet/Optimizers/Momentum/MomentumTest.php | New test suite with lookahead behavior tests |
| tests/NeuralNet/Optimizers/Cyclical/CyclicalTest.php | New test suite for cyclical learning rate optimizer |
| tests/NeuralNet/Optimizers/Adam/AdamTest.php | New test suite for Adam optimizer with dual-cache tests |
| tests/NeuralNet/Optimizers/AdaMax/AdaMaxTest.php | New test suite for AdaMax variant |
| tests/NeuralNet/Optimizers/AdaGrad/AdaGradTest.php | New test suite for AdaGrad optimizer |
| src/NeuralNet/Parameters/Parameter.php | Code formatting improvements (spacing in return types) |
| src/NeuralNet/Optimizers/Stochastic/Stochastic.php | Added mathematical documentation and improved error message formatting |
| src/NeuralNet/Optimizers/StepDecay/StepDecay.php | New optimizer with step-based learning rate decay |
| src/NeuralNet/Optimizers/RMSProp/RMSProp.php | New adaptive optimizer with RMS gradient scaling |
| src/NeuralNet/Optimizers/Momentum/Momentum.php | New optimizer with velocity accumulation and optional Nesterov lookahead |
| src/NeuralNet/Optimizers/Cyclical/Cyclical.php | New optimizer with cyclical learning rate schedule |
| src/NeuralNet/Optimizers/Base/Adaptive.php | New interface for optimizers requiring parameter cache warming |
| src/NeuralNet/Optimizers/Adam/Adam.php | New adaptive optimizer combining momentum and RMS properties |
| src/NeuralNet/Optimizers/AdaMax/AdaMax.php | New optimizer extending Adam with infinity norm |
| src/NeuralNet/Optimizers/AdaGrad/AdaGrad.php | New adaptive optimizer with accumulated gradient scaling |
| docs/neural-network/optimizers/stochastic.md | Added mathematical formulation documentation |
| docs/neural-network/optimizers/step-decay.md | New documentation with mathematical formulation |
| docs/neural-network/optimizers/rms-prop.md | Updated documentation with mathematical formulation |
| docs/neural-network/optimizers/momentum.md | Updated documentation with mathematical formulation |
| docs/neural-network/optimizers/cyclical.md | New documentation with mathematical formulation |
| docs/neural-network/optimizers/adamax.md | Updated documentation with mathematical formulation |
| docs/neural-network/optimizers/adam.md | Updated documentation with mathematical formulation |
| docs/neural-network/optimizers/adagrad.md | Updated documentation with mathematical formulation |
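As one example of the kind of formulation these documentation pages now carry (standard textbook notation; the exact symbols in the PR's docs may differ), the classical Momentum update can be written as:

$$
\begin{aligned}
v_t &= \beta\, v_{t-1} + \eta\, \nabla_\theta J(\theta_{t-1}) \\
\theta_t &= \theta_{t-1} - v_t
\end{aligned}
$$

where $\eta$ is the learning rate and $\beta$ is the momentum decay coefficient.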
```diff
@@ -0,0 +1,126 @@
+<?php
+
+declare(strict_types=1);
```

Copilot AI · Dec 30, 2025

The spacing in the strict_types declaration is inconsistent with the other optimizer test files in this PR. Most files use `declare(strict_types = 1);` with spaces around the equals sign, while this file uses `declare(strict_types=1);` without spaces.

Suggested change:

```diff
-declare(strict_types=1);
+declare(strict_types = 1);
```
```diff
@@ -0,0 +1,141 @@
+<?php
+
+declare(strict_types=1);
```

Copilot AI · Dec 30, 2025

Same strict_types spacing inconsistency here, with the same suggested change (`declare(strict_types = 1);`).
```diff
@@ -0,0 +1,94 @@
+<?php
+
+declare(strict_types=1);
```

Copilot AI · Dec 30, 2025

Same strict_types spacing inconsistency here, with the same suggested change (`declare(strict_types = 1);`).
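Rather than adjusting each file by hand, this spacing can be enforced automatically by a coding-standard tool. A minimal sketch, assuming the project uses (or is willing to add) PHP-CS-Fixer; this is not something the PR itself configures:

```php
<?php

// Hypothetical .php-cs-fixer.dist.php fragment. PHP-CS-Fixer's
// declare_equal_normalize rule normalizes spacing around the equals
// sign inside declare() statements across src/ and tests/.
$finder = PhpCsFixer\Finder::create()
    ->in(__DIR__ . '/src')
    ->in(__DIR__ . '/tests');

return (new PhpCsFixer\Config())
    ->setRules([
        // 'single' produces declare(strict_types = 1); use 'none' for declare(strict_types=1);
        'declare_equal_normalize' => ['space' => 'single'],
    ])
    ->setFinder($finder);
```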
andrewdalpino left a comment

Looks good. I just wish we could change the API of NumPower to look like `$a->multiply($b);` instead of `NumPower::multiply($a, $b);`.
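Purely as a sketch of how a fluent style could be layered on top today, a thin userland wrapper could forward to the static NumPower calls. The `Tensor` class and its methods below are hypothetical and not part of NumPower or this PR:

```php
<?php

declare(strict_types = 1);

// Hypothetical fluent wrapper: each method delegates to the existing
// static NumPower API and re-wraps the resulting NDArray.
final class Tensor
{
    public function __construct(private NDArray $data)
    {
    }

    public function multiply(NDArray|float $other) : self
    {
        return new self(NumPower::multiply($this->data, $other));
    }

    public function unwrap() : NDArray
    {
        return $this->data;
    }
}

// Usage (with a hypothetical NDArray $gradient and float $rate):
// $update = (new Tensor($gradient))->multiply($rate)->unwrap();
```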