Saved October 29, 2025
The FGFP framework introduces a novel method for compressing deep neural networks using fractional Gaussian filters and adaptive unstructured pruning. By pruning redundant parameters and leveraging Grünwald-Letnikov fractional derivatives, it achieves significant reductions in model size with minimal accuracy loss, as demonstrated on benchmarks such as CIFAR-10 and ImageNet2012.
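The two ingredients named above can be illustrated in isolation. The sketch below is hypothetical and not the paper's actual algorithm: `gl_coefficients` computes the standard Grünwald-Letnikov fractional-derivative weights w_k = (-1)^k C(α, k) via their well-known recursion, and `unstructured_prune` shows generic magnitude-based unstructured pruning (zeroing the smallest-magnitude fraction of weights); how FGFP combines fractional Gaussian filtering with its adaptive pruning criterion is specified in the paper itself.

```python
import numpy as np

def gl_coefficients(alpha: float, n: int) -> np.ndarray:
    """Grünwald-Letnikov weights w_k = (-1)^k * C(alpha, k).

    Uses the recursion w_k = w_{k-1} * (1 - (alpha + 1) / k),
    which follows from the binomial-coefficient ratio.
    """
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

def unstructured_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a boolean keep-mask that zeros out the smallest-magnitude
    `sparsity` fraction of individual weights (generic magnitude pruning,
    shown only as a baseline for unstructured pruning)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return np.ones(weights.shape, dtype=bool)
    thresh = np.partition(flat, k - 1)[k - 1]
    return np.abs(weights) > thresh
```

For α = 1 the GL weights reduce to the ordinary first-difference stencil (1, -1, 0, …), a quick sanity check that the recursion is correct; fractional α values between 0 and 1 interpolate smoothly between the identity and that difference operator.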