392 convert layers to numpower #393
base: 3.0
Conversation
No description provided.

Commits:
- …pplied consistent style across optimizer tests and Parameter class.
- …neural network layers.
- …hensive unit tests and fixed broken source file link in the documentation
- …d updated documentation with fixed source file link. Added `Parametric` interface to define parameterized layers.
- …ith `NumPower` utilities
- …ical stability during inference, and gradient computation logic
- …itional tests and updated shape handling
- …interface definition for output layers.
- …entation and unit tests
- …d/backward passes
- …ence/backward passes, unit tests, and documentation updates
- …ith `NumPower` utilities
- …rd/inference/backward passes, unit tests
- …ference/backward passes, unit tests
- …rward/inference/backward passes, unit tests
Pull request overview
This pull request converts neural network layers and optimizers to use the NumPower library for numerical operations. The changes include implementing new optimizer classes (StepDecay, RMSProp, Momentum, Cyclical, Adam, AdaMax, AdaGrad), new layer implementations (Swish, Placeholder1D, PReLU, Noise, Multiclass, Dropout, Dense, Continuous, Binary, BatchNorm, Activation), updating tests to use PHPUnit attributes, and reorganizing code into namespaced subdirectories with corresponding documentation updates.
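To make the scope concrete, here is a minimal sketch of what an optimizer step looks like when the math is carried by NumPower's `NDArray` rather than the old tensor objects. This is an illustration under stated assumptions (scalar broadcasting in the overloaded arithmetic operators, a static `NDArray::sqrt()`), not the PR's actual `Adam` implementation; bias correction is omitted for brevity.

```php
<?php
// Hypothetical sketch (not the PR's code): one Adam-style update using
// NumPower's \NDArray. Assumes the overloaded arithmetic operators
// broadcast scalars and that the static \NDArray::sqrt() exists, per
// NumPower's documentation.

function adamStep(
    \NDArray $gradient,
    ?\NDArray &$velocity,
    ?\NDArray &$norm,
    float $rate = 0.001,
    float $beta1 = 0.9,
    float $beta2 = 0.999,
    float $epsilon = 1e-8
) : \NDArray {
    // Lazily initialize the moment estimates to zero tensors of the
    // gradient's shape (multiplying by 0.0 sidesteps any shape API).
    $velocity ??= $gradient * 0.0;
    $norm ??= $gradient * 0.0;

    // m_t = β1·m_{t-1} + (1 - β1)·g_t
    $velocity = $velocity * $beta1 + $gradient * (1.0 - $beta1);

    // v_t = β2·v_{t-1} + (1 - β2)·g_t²
    $norm = $norm * $beta2 + $gradient * $gradient * (1.0 - $beta2);

    // Δθ = η·m_t / (√v_t + ε), bias correction omitted for brevity.
    return $velocity * $rate / (\NDArray::sqrt($norm) + $epsilon);
}
```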
Key Changes
- Migration from the tensor library to NumPower for all neural network operations
- Reorganization of optimizers and layers into subdirectory namespaces (e.g., `Adam/Adam.php` instead of `Adam.php`)
- Conversion of PHPUnit test annotations to attributes (`#[Test]`, `#[TestDox]`, etc.), as shown in the sketch below
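The annotation-to-attribute change follows PHPUnit 10's native attribute classes. A schematic example (the class and test body here are placeholders, not the PR's actual tests):

```php
<?php

use PHPUnit\Framework\Attributes\Test;
use PHPUnit\Framework\Attributes\TestDox;
use PHPUnit\Framework\TestCase;

final class AdamTest extends TestCase
{
    // Before: /** @test @testdox Updates a parameter with a gradient step */
    #[Test]
    #[TestDox('Updates a parameter with a gradient step')]
    public function step() : void
    {
        $this->assertTrue(true); // placeholder body for illustration
    }
}
```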
Reviewed changes
Copilot reviewed 71 out of 71 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| tests/Specifications/SamplesAreCompatibleWithDistanceTest.php | Updated to use PHPUnit attributes for test annotations |
| tests/NeuralNet/Optimizers/Stochastic/StochasticTest.php | Refactored test with data providers and PHPUnit attributes |
| tests/NeuralNet/Optimizers/StepDecay/StepDecayTest.php | New comprehensive test suite for StepDecay optimizer |
| tests/NeuralNet/Optimizers/RMSProp/RMSPropTest.php | New test suite covering RMSProp optimizer with NumPower |
| tests/NeuralNet/Optimizers/Momentum/MomentumTest.php | New test suite for Momentum optimizer |
| tests/NeuralNet/Optimizers/Cyclical/CyclicalTest.php | New test suite for Cyclical learning rate optimizer |
| tests/NeuralNet/Optimizers/Adam/AdamTest.php | New comprehensive Adam optimizer tests |
| tests/NeuralNet/Optimizers/AdaMax/AdaMaxTest.php | New AdaMax optimizer test suite |
| tests/NeuralNet/Optimizers/AdaGrad/AdaGradTest.php | New AdaGrad optimizer tests |
| tests/NeuralNet/Layers/*/Test.php (multiple) | New test suites for all layer types using NumPower |
| tests/NeuralNet/FeedForwardTest.php | Updated to use PHPUnit attributes |
| tests/Helpers/GraphvizTest.php | Updated to use PHPUnit attributes |
| src/Traits/AssertsShapes.php | Updated exception import and error message |
| src/NeuralNet/Parameters/Parameter.php | Formatting improvements and method signature updates |
| src/NeuralNet/Optimizers/*/Optimizers.php (multiple) | New optimizer implementations using NumPower |
| src/NeuralNet/Layers/*/Layers.php (multiple) | New layer implementations using NumPower |
| src/NeuralNet/Initializers/*/Initializers.php (multiple) | Added explicit loc parameter to NumPower calls |
| phpunit.xml | Increased memory limit to 256M |
| docs/neural-network/**/*.md (multiple) | Updated file paths and added mathematical formulations |
```php
if ($lower > $upper) {
    throw new InvalidArgumentException(
        'Lower bound cannot be reater than the upper bound.'
```
Copilot (AI) commented on Dec 30, 2025:
Typo in error message: "reater" should be "greater".
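A suggestion applying that fix would read:

```diff
- 'Lower bound cannot be reater than the upper bound.'
+ 'Lower bound cannot be greater than the upper bound.'
```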
```latex
\begin{aligned}
\text{cycle} &= \left\lfloor 1 + \frac{t}{2\,\text{steps}} \right\rfloor \\
x &= \left| \frac{t}{\text{steps}} - 2\,\text{cycle} + 1 \right| \\
\text{scale} &= \text{decay}^{\,t} \\
\eta_t &= \text{lower} + (\text{upper} - \text{lower})\,\max\bigl(0\,1 - x\bigr)\,\text{scale}
\end{aligned}
```
Copilot (AI) commented on Dec 30, 2025:
Typo in mathematical formulation: missing comma after "0" in max function. Should be "max(0, 1 - x)" not "max(0,1 - x)".
```diff
- \eta_t &= \text{lower} + (\text{upper} - \text{lower})\,\max\bigl(0\,1 - x\bigr)\,\text{scale} \\
+ \eta_t &= \text{lower} + (\text{upper} - \text{lower})\,\max\bigl(0, 1 - x\bigr)\,\text{scale} \\
```
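For reference, the schedule with the corrected `max(0, 1 - x)` term transcribes directly to PHP. This is a minimal sketch; the function name is illustrative, not the library's actual API:

```php
<?php
// Cyclical learning rate at step $t, transcribed from the formula above
// (with the corrected max(0, 1 - x) term). Sketch only.
function cyclicalRate(int $t, int $steps, float $lower, float $upper, float $decay) : float
{
    $cycle = floor(1 + $t / (2 * $steps));  // which half-cycle we are in
    $x = abs($t / $steps - 2 * $cycle + 1); // position within the cycle
    $scale = $decay ** $t;                  // exponential decay of the amplitude

    return $lower + ($upper - $lower) * max(0, 1 - $x) * $scale;
}
```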