
low-rank-factorization

Here are 21 public repositories matching this topic...

🧠 A neuromorphic-inspired neural architecture. Attention Neuron rethinks dense layers as dynamically modulated low-rank systems. Instead of training millions of weights, it trains neuron-centric modulation vectors over a frozen random substrate, achieving competitive accuracy with a fraction of the parameters.

  • Updated Apr 25, 2026
  • Python
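The idea in the description above — keeping a large random weight matrix frozen and training only small, low-rank modulation terms — can be sketched in a few lines of NumPy. This is a minimal illustration of the general technique, not the repository's actual implementation; the names (`W_frozen`, `U`, `V`, `forward`) and the specific modulation form `W_frozen * (1 + U @ V)` are assumptions chosen as one plausible reading of "modulation vectors over a frozen random substrate."

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 64, 32, 4

# Frozen random substrate: initialized once and never trained
# (assumption based on the description above).
W_frozen = rng.standard_normal((d_in, d_out)) / np.sqrt(d_in)

# Trainable low-rank modulation factors: d_in*rank + rank*d_out
# parameters instead of the full d_in*d_out.
U = rng.standard_normal((d_in, rank)) * 0.1   # trainable
V = rng.standard_normal((rank, d_out)) * 0.1  # trainable

def forward(x):
    # Effective weights = frozen substrate modulated elementwise by a
    # rank-`rank` matrix U @ V (hypothetical modulation scheme).
    W_eff = W_frozen * (1.0 + U @ V)
    return x @ W_eff

x = rng.standard_normal((8, d_in))
y = forward(x)
print(y.shape)  # → (8, 32)

# Trainable parameter count vs. a full dense layer:
print(U.size + V.size, "vs", W_frozen.size)  # → 384 vs 2048
```

With these toy dimensions the trainable modulation uses 384 parameters against 2048 in the frozen matrix, which is the parameter-efficiency trade-off the description claims.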
