
[GSoC 2026] QMLHEP7: Add Quantum Attention baseline & repository maintenance#46

Open
r-karra wants to merge 3 commits into ML4SCI:main from r-karra:contribution/quantum-attention-optimization

Conversation


@r-karra r-karra commented Feb 26, 2026

Summary of Contributions
This PR introduces a baseline for the GSoC 2026 QMLHEP7 project and provides maintenance updates for existing repository modules.

1. New Research Baseline:

Added a JAX+Flax Quantum Attention implementation to support the Particle Transformer research.

2. Reproducibility Updates:

Added requirements.txt and setup cells to the EQNN_Cosmos_Dong folder to ensure consistent execution across environments (Colab/Codespaces).

3. Documentation:

Created comprehensive README.md files for the Quantum_AE and dev_notebooks subfolders to improve project navigability.
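For context on what such a baseline computes: the actual implementation is in the PR's commits, but a minimal JAX sketch of the classical scaled dot-product attention it starts from is below. The idea that a quantum variant swaps the query-key similarity for a parameterized-circuit kernel is an assumption for illustration, not a description of this PR's code.

```python
import jax
import jax.numpy as jnp


def scaled_dot_product_attention(q, k, v):
    """Classical scaled dot-product attention over token embeddings.

    A quantum-attention variant would (hypothetically) replace the
    q @ k^T similarity with a quantum kernel; everything else stays
    the same.
    """
    d_k = q.shape[-1]
    # Pairwise similarity scores between queries and keys, scaled by sqrt(d_k)
    scores = q @ k.swapaxes(-2, -1) / jnp.sqrt(d_k)
    # Normalize each query's scores into attention weights
    weights = jax.nn.softmax(scores, axis=-1)
    # Weighted sum of value vectors
    return weights @ v


# Toy input: 4 particles, each with an 8-dimensional feature vector
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

In a Flax module this function would sit behind learned Dense projections for q, k, and v; the self-attention case above simply reuses the same tensor for all three.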

I have verified all notebooks execute correctly in a GitHub Codespace environment.

The new README files cover:

- Quark-Gluon: physics goal, dataset info, notebook overview
- Electron-Photon: EM calorimeter classification, model descriptions
- MNIST: benchmark suite with classical and quantum autoencoders
- Pennylane: advanced QAE variants, optimizers (QNG), physics applications

These additions improve visibility and documentation for researchers exploring quantum autoencoders in high-energy physics.
@r-karra r-karra changed the title [GSoC 2026] Add JAX+Flax Quantum Attention baseline for QMLHEP7 project [GSoC 2026] QMLHEP7: Add Quantum Attention baseline & repository maintenance Feb 26, 2026