
Conversation


@apphp apphp commented Nov 7, 2025

No description provided.

@apphp apphp self-assigned this Nov 7, 2025
@apphp apphp changed the base branch from master to 3.0 November 7, 2025 15:40

++$this->t;

return NumPower::multiply($gradient, $rate);

Member:

Is there an API to do $gradient->multiply($rate)?

Author:

No, all operations are currently implemented only in the NumPower class, not on NDArray.
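
For illustration only, a minimal sketch of the two call styles being discussed. NumPower, NDArray, and the multiply call are taken from the diff above; the function name and the float type of $rate are assumptions for the sketch.

<?php

declare(strict_types = 1);

// Static style, as the library exposes it today (matches the diff above).
function scaleGradient(NDArray $gradient, float $rate) : NDArray
{
    return NumPower::multiply($gradient, $rate);
}

// A fluent style such as $gradient->multiply($rate) would require the method
// to live on NDArray itself, which, per the reply above, is not implemented.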

@andrewdalpino andrewdalpino requested review from a team and Copilot and removed request for SkibidiProduction December 30, 2025 00:57

Copilot AI left a comment

Pull request overview

This PR converts optimizer classes to use the NumPower library, refactoring existing optimizers and adding several new optimizer implementations with comprehensive test coverage and mathematical documentation.

Key changes:

  • Refactored the existing Stochastic optimizer to improve code formatting and test structure
  • Added seven new optimizer implementations: StepDecay, RMSProp, Momentum, Cyclical, Adam, AdaMax, and AdaGrad
  • Introduced a new Adaptive interface for optimizers that require cache warming (see the sketch after this list)
  • Enhanced all optimizer documentation with detailed mathematical formulations in LaTeX
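
A minimal sketch of what a cache-warming interface of this kind typically looks like. The warm() method name, the Optimizer parent interface, and the Parameter type hint follow the pre-existing Rubix ML API and are assumptions here, not lines taken from this PR.

<?php

declare(strict_types = 1);

// Hypothetical sketch of an adaptive (cache-warming) optimizer interface.
// Method name and type hints are assumptions, not copied from the PR.
interface Adaptive extends Optimizer
{
    /**
     * Warm the parameter cache (e.g. allocate zeroed velocity or norm
     * buffers) before the first update step.
     */
    public function warm(Parameter $param) : void;
}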

Reviewed changes

Copilot reviewed 26 out of 26 changed files in this pull request and generated 10 comments.

Summary per file:

tests/NeuralNet/Optimizers/Stochastic/StochasticTest.php – Refactored tests to use data providers and added missing #[Test] attributes
tests/NeuralNet/Optimizers/StepDecay/StepDecayTest.php – New comprehensive test suite for StepDecay optimizer
tests/NeuralNet/Optimizers/RMSProp/RMSPropTest.php – New test suite including cache initialization tests
tests/NeuralNet/Optimizers/Momentum/MomentumTest.php – New test suite with lookahead behavior tests
tests/NeuralNet/Optimizers/Cyclical/CyclicalTest.php – New test suite for cyclical learning rate optimizer
tests/NeuralNet/Optimizers/Adam/AdamTest.php – New test suite for Adam optimizer with dual-cache tests
tests/NeuralNet/Optimizers/AdaMax/AdaMaxTest.php – New test suite for AdaMax variant
tests/NeuralNet/Optimizers/AdaGrad/AdaGradTest.php – New test suite for AdaGrad optimizer
src/NeuralNet/Parameters/Parameter.php – Code formatting improvements (spacing in return types)
src/NeuralNet/Optimizers/Stochastic/Stochastic.php – Added mathematical documentation and improved error message formatting
src/NeuralNet/Optimizers/StepDecay/StepDecay.php – New optimizer with step-based learning rate decay
src/NeuralNet/Optimizers/RMSProp/RMSProp.php – New adaptive optimizer with RMS gradient scaling
src/NeuralNet/Optimizers/Momentum/Momentum.php – New optimizer with velocity accumulation and optional Nesterov lookahead
src/NeuralNet/Optimizers/Cyclical/Cyclical.php – New optimizer with cyclical learning rate schedule
src/NeuralNet/Optimizers/Base/Adaptive.php – New interface for optimizers requiring parameter cache warming
src/NeuralNet/Optimizers/Adam/Adam.php – New adaptive optimizer combining momentum and RMS properties
src/NeuralNet/Optimizers/AdaMax/AdaMax.php – New optimizer extending Adam with infinity norm
src/NeuralNet/Optimizers/AdaGrad/AdaGrad.php – New adaptive optimizer with accumulated gradient scaling
docs/neural-network/optimizers/stochastic.md – Added mathematical formulation documentation
docs/neural-network/optimizers/step-decay.md – New documentation with mathematical formulation
docs/neural-network/optimizers/rms-prop.md – Updated documentation with mathematical formulation
docs/neural-network/optimizers/momentum.md – Updated documentation with mathematical formulation
docs/neural-network/optimizers/cyclical.md – New documentation with mathematical formulation
docs/neural-network/optimizers/adamax.md – Updated documentation with mathematical formulation
docs/neural-network/optimizers/adam.md – Updated documentation with mathematical formulation
docs/neural-network/optimizers/adagrad.md – Updated documentation with mathematical formulation
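
For reference, the standard Adam update rule that documentation of this kind formalizes (this is the textbook formulation; the PR's exact notation may differ):

% Standard Adam update (Kingma & Ba); notation may differ from the PR docs.
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t \\
v_t &= \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2 \\
\hat{m}_t &= \frac{m_t}{1 - \beta_1^t}, \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t} \\
\theta_t &= \theta_{t-1} - \frac{\alpha\, \hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
\end{aligned}

AdaMax replaces the second-moment estimate with an infinity norm, u_t = \max(\beta_2 u_{t-1}, |g_t|), and divides by u_t instead of \sqrt{\hat{v}_t} + \epsilon.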


@@ -0,0 +1,126 @@
<?php

declare(strict_types=1);

Copilot AI commented Dec 30, 2025:

The spacing in the strict_types declaration is inconsistent with other optimizer test files in this PR. Most files use declare(strict_types = 1); with spaces around the equals sign, while this file uses declare(strict_types=1); without spaces.

Suggested change:
- declare(strict_types=1);
+ declare(strict_types = 1);

@@ -0,0 +1,141 @@
<?php

declare(strict_types=1);

Copilot AI commented Dec 30, 2025:

The spacing in the strict_types declaration is inconsistent with other optimizer test files in this PR. Most files use declare(strict_types = 1); with spaces around the equals sign, while this file uses declare(strict_types=1); without spaces.

Suggested change:
- declare(strict_types=1);
+ declare(strict_types = 1);

@@ -0,0 +1,94 @@
<?php

declare(strict_types=1);

Copilot AI commented Dec 30, 2025:

The spacing in the strict_types declaration is inconsistent with other optimizer test files in this PR. Most files use declare(strict_types = 1); with spaces around the equals sign, while this file uses declare(strict_types=1); without spaces.

Suggested change:
- declare(strict_types=1);
+ declare(strict_types = 1);

@apphp apphp requested a review from andrewdalpino January 4, 2026 17:56

@andrewdalpino andrewdalpino (Member) left a comment:

Looks good. I just wish we could change the API of NumPower to look like

$a->multiply($b);

instead of

NumPower::multiply($a, $b);
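
For what it's worth, a fluent style could be layered on top of the existing static API without changing NumPower itself. The wrapper below is purely hypothetical (the FluentTensor name and the NDArray|float signature are assumptions); it only delegates to the NumPower::multiply call already used in the optimizers.

<?php

declare(strict_types = 1);

// Hypothetical fluent wrapper -- not part of this PR or of NumPower itself.
final class FluentTensor
{
    public function __construct(private NDArray $data)
    {
    }

    // Enables $tensor->multiply($rate) by delegating to the static API.
    public function multiply(NDArray|float $other) : self
    {
        return new self(NumPower::multiply($this->data, $other));
    }

    // Return the underlying NDArray for use with the rest of the library.
    public function unwrap() : NDArray
    {
        return $this->data;
    }
}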

