Here, you’ll find some of my projects on:
- 🔬 Neural Network Compression (NNC)
  Preprocessing (parameter / precision reduction), encoding, decoding, and transmission
- 🔢 Quantization- and Explainability-Aware NN Training
  Using XAI and information theory within quantization-aware training to build efficient 2–4 bit neural networks
- 📑 Research Papers, Challenges, and Demos
  Direct links to code, resources, and contributions
- 🔜 Future Work
  - Extensions for transformer-based models
  - Neural codecs and general-purpose compressors with language models
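To give a flavor of the quantization-aware training mentioned above, here is a minimal sketch of the core building block: a symmetric uniform "fake quantizer" that rounds weights to a low-bit grid during the forward pass. This is an illustrative toy (the function name and details are my own, not tied to any specific repository here):

```python
def fake_quantize(weights, num_bits=4):
    """Simulate low-bit weight quantization in the forward pass.

    Symmetric uniform quantizer: the largest |weight| is mapped onto the
    largest representable signed integer, weights are rounded to that grid
    and immediately de-quantized. Illustrative sketch only.
    """
    qmax = 2 ** (num_bits - 1) - 1            # e.g. 7 grid steps per side for 4 bits
    scale = max(abs(w) for w in weights) / qmax
    if scale == 0.0:                          # all-zero input: nothing to quantize
        return list(weights)
    clamp = lambda q: max(-qmax, min(qmax, q))
    return [clamp(round(w / scale)) * scale for w in weights]
```

In actual quantization-aware training, the rounding step is made differentiable with a straight-through estimator so gradients flow through it during backpropagation.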
I've long been fascinated by neurophysiological processes and stimulus processing in the human brain. Translating these concepts into artificial neurons and synapses, together with foundational principles from information theory, forms a research area that continues to inspire me.
Beyond my work, you might find tools related to my record collection 💽, playlist organization 📻, and synthesizers 🎹.