# ELMGS: Enhancing memory and computation scaLability through coMpression for 3D Gaussian Splatting

3D models have recently been popularized by the potential for end-to-end training offered first by Neural Radiance Fields and, most recently, by 3D Gaussian Splatting models. The latter has the big advantage of naturally providing fast training convergence and high editability. However, as research around these models is still in its infancy, there is a gap in the literature regarding their scalability. In this work, we propose an approach enabling both memory and computation scalability of such models. More specifically, we propose an iterative pruning strategy that removes redundant information encoded in the model. We also enhance the model's compressibility by including a differentiable quantization and entropy coding estimator in the optimization strategy. Our results on popular benchmarks showcase the effectiveness of the proposed approach and open the road to the broad deployment of such a solution even on resource-constrained devices.
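The abstract names the two ingredients, iterative pruning and jointly optimized quantization with an entropy estimator, without giving details. As a rough illustration of those generic ideas only, here is a minimal PyTorch sketch; it is not the paper's implementation. The function names (`prune_gaussians`, `quantize_ste`, `entropy_bits`), the opacity-as-importance heuristic, the straight-through estimator, and the soft-histogram entropy proxy are all assumptions made for this example.

```python
import torch

def prune_gaussians(params, importance, keep_ratio=0.9):
    """Keep the top keep_ratio fraction of Gaussians ranked by importance."""
    k = max(1, int(keep_ratio * importance.numel()))
    keep = torch.topk(importance, k).indices
    return {name: p[keep] for name, p in params.items()}

def quantize_ste(x, step=0.05):
    """Uniform quantization with a straight-through estimator:
    forward rounds to the grid, backward behaves like the identity."""
    q = torch.round(x / step) * step
    return x + (q - x).detach()

def entropy_bits(x, step=0.05):
    """Differentiable proxy for the entropy-coding cost (in bits) of the
    quantized values, via a soft histogram over the quantization bins."""
    centers = torch.arange(float(x.min()), float(x.max()) + step, step)
    # Soft (triangular-kernel) assignment of each value to each bin.
    w = torch.clamp(1 - (x.unsqueeze(-1) - centers).abs() / step, min=0)
    p = w.sum(dim=0) / w.sum()
    p = p[p > 1e-12]
    return -(p * torch.log2(p)).sum()

# Toy usage: prune 10% of the Gaussians, quantize opacities in the forward
# pass, and penalize their estimated coding cost as a rate term in the loss.
N = 1000
opacity = torch.rand(N, requires_grad=True)
params = {"xyz": torch.randn(N, 3), "opacity": opacity}
params = prune_gaussians(params, importance=opacity.detach(), keep_ratio=0.9)
q_opacity = quantize_ste(params["opacity"])
rate = entropy_bits(q_opacity)   # in practice: loss = render_loss + lam * rate
rate.backward()                  # gradients flow back through the STE
print("kept", params["opacity"].numel(), "Gaussians, rate bits:", rate.item())
```

In a real 3DGS pipeline, pruning would run periodically during training (hence "iterative"), and the rate term would be weighted against the rendering loss so the optimizer trades visual quality for compressibility.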
