Discussion
Dan Goodman
@neuralreckoning@neuromatch.social · 3 days ago

Psst - #neuromorphic folks. Did you know that you can solve the SHD dataset with 90% accuracy using only 22 kB of parameter memory by quantising weights and delays? Check out our preprint with Pengfei Sun and Danyal Akarca:

https://arxiv.org/abs/2510.27434

Or check out the TLDR thread on Bsky:

https://bsky.app/profile/did:plc:niqde7rkzo7ua3scet2rzyt7/post/3m5jpksani22m

#SpikingNeuralNetworks #ComputationalNeuroscience #Neuroscience
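
A quick back-of-envelope check on the 22 kB figure (my own arithmetic, not taken from the preprint): at log2(3) ≈ 1.58 bits per ternary weight, that budget holds on the order of 10^5 weights before counting delay storage.

    import math

    # Back-of-envelope only: ternary weights cost log2(3) ≈ 1.58 bits each.
    budget_bits = 22 * 1024 * 8                # 22 kB expressed in bits
    bits_per_weight = math.log2(3)             # {-1, 0, +1} per synapse
    print(int(budget_bits / bits_per_weight))  # ≈ 114,000 weights (ignores delay bits)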

arXiv.org

Exploiting heterogeneous delays for efficient computation in low-bit neural networks

Neural networks rely on learning synaptic weights. However, this overlooks other neural parameters that can also be learned and may be utilized by the brain. One such parameter is the delay: the brain exhibits complex temporal dynamics with heterogeneous delays, where signals are transmitted asynchronously between neurons. It has been theorized that this delay heterogeneity, rather than being a cost to be minimized, can be exploited in embodied contexts where task-relevant information naturally sits in the time domain. We test this hypothesis by training spiking neural networks to modify not only their weights but also their delays at different levels of precision. We find that delay heterogeneity enables state-of-the-art performance on temporally complex neuromorphic problems and can be achieved even when weights are extremely imprecise (1.58-bit ternary precision: just positive, negative, or absent). By enabling high performance with extremely low-precision weights, delay heterogeneity allows memory-efficient solutions that maintain state-of-the-art accuracy even when weights are compressed over an order of magnitude more aggressively than in typically studied weight-only networks. We show how delays and time constants adaptively trade off, and reveal through ablation that task performance depends on task-appropriate delay distributions, with temporally complex tasks requiring longer delays. Our results suggest temporal heterogeneity is an important principle for efficient computation, particularly when task-relevant information is temporal (as in the physical world), with implications for embodied intelligent systems and neuromorphic hardware.
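
To make the weights-plus-delays idea concrete, here is a minimal forward-pass sketch of the general technique (illustrative only, not the authors' code; all sizes and names are assumptions). Each synapse carries a ternary weight and an integer delay, and the delayed spike trains drive a leaky integrate-and-fire readout. In the paper both weights and delays are learned; here they are random for brevity.

    import numpy as np

    # Minimal illustrative sketch (not the paper's code): each synapse has a
    # ternary weight in {-1, 0, +1} and an integer delay in timesteps.
    rng = np.random.default_rng(0)
    T, n_in, n_out = 100, 8, 4                               # illustrative sizes

    spikes_in = (rng.random((T, n_in)) < 0.1).astype(float)  # input spike trains
    w = rng.choice([-1.0, 0.0, 1.0], size=(n_in, n_out))     # 1.58-bit weights
    d = rng.integers(0, 20, size=(n_in, n_out))              # per-synapse delays

    # Each synapse delivers the presynaptic spike train shifted by its own
    # delay, scaled by its ternary weight.
    current = np.zeros((T, n_out))
    for i in range(n_in):
        for j in range(n_out):
            if w[i, j] != 0.0:
                k = int(d[i, j])
                current[k:, j] += w[i, j] * spikes_in[:T - k, i]

    # Leaky integrate-and-fire readout with a shared membrane time constant.
    decay, v_th = np.exp(-1.0 / 20.0), 1.0
    v = np.zeros(n_out)
    spikes_out = np.zeros((T, n_out))
    for t in range(T):
        v = v * decay + current[t]
        fired = v >= v_th
        spikes_out[t] = fired
        v[fired] = 0.0                                       # reset after a spike
    print(spikes_out.sum(axis=0))                            # output spike counts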
ggdupont
@gdupont@framapiaf.org replied · 3 days ago

@neuralreckoning
I haven't looked at spiking nets for a while. Theoretically they should have an edge on time-dependent/streaming inputs like audio and video.

Not sure where the academic state of the art is on this.

Dan Goodman
@neuralreckoning@neuromatch.social replied · 3 days ago

@gdupont for small networks I think they probably do, but it has proven hard to scale their training, so the best performance still goes to ANNs. Working on it!

ggdupont
@gdupont@framapiaf.org replied · 3 days ago

@neuralreckoning
It's always the hard-to-beat transformer on GPU training efficiency... I can't wait for it to be displaced by something new.
