Gopi Raju Matta*, Reddypalli Trisha, Kaushik Mitra
This paper was accepted at the 1st Workshop on "Event-based Vision in the Era of Generative AI - Transforming Perception and Visual Innovation", held at WACV 2025.
We explore the possibility of recovering a sharp radiance field (Gaussian splats) and the camera motion trajectory from a single motion-blurred image and its corresponding event stream. This allows BeSplat to decode both the underlying sharp scene representation and the video from a single blurred image and its event stream.
- Project homepage is now live! Check it out here.
- Training, testing, and evaluation codes, along with datasets, are now available.
Follow the setup instructions of 3D Gaussian Splatting for environment requirements and installation.
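A minimal sketch of a typical setup, assuming the repository ships an environment file following the original 3D Gaussian Splatting codebase (the file name, environment name, and recursive submodules below are assumptions, not confirmed here):

git clone <repository url> --recursive  # assumed: CUDA rasterizer submodules are pulled recursively, as in 3D Gaussian Splatting
conda env create --file environment.yml # assumed environment file name
conda activate <environment name>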
We use real-world datasets from E2NeRF, captured using the DAVIS346 color event camera, and synthetic datasets from BeNeRF for evaluations.
- The real-world datasets contain five scenes: letter, lego, camera, plant, and toys.
- The synthetic datasets from BeNeRF include three sequences from Unreal Engine: livingroom, whiteroom, and pinkcastle, and two sequences from Blender: tanabata and outdoorpool.
You can download the datasets from the following links:
python train.py -s <path to dataset> --eval --deblur # Train with train/test split
Additional Command Line Arguments for train.py
- blur_sample_num : number of key frames used for trajectory time sampling
- deblur : enable the deblurring mode
- mode : camera motion trajectory model (i.e., Linear, Spline, or Bezier)
- bezier_order : order of the Bézier curve when the Bézier trajectory model is used (see the example invocation below)
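For example, a possible training invocation combining these options (the flag spellings and values below are illustrative assumptions; check train.py for the exact argument names and defaults):

python train.py -s <path to dataset> --eval --deblur --blur_sample_num 6 --mode Bezier --bezier_order 7 # example flag syntax and values are assumptions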
python train.py -s <path to dataset> --eval # Train with train/test split
python render.py -m <path to trained model> # Generate renderings
python metrics.py -m <path to trained model> # Compute error metrics on renderings
Additional Command Line Arguments for render.py
- optim_pose : optimize the camera poses to align with the dataset (see the example below)
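For example (the flag syntax below is an assumption based on the argument name above):

python render.py -m <path to trained model> --optim_pose # flag syntax assumed; optimizes camera poses to align renderings with the dataset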
python render_video.py -m <path to trained model> # Render a video from the trained model
You can check our results at the following link.
If you find this repository useful, please consider citing our paper:
@InProceedings{Matta_2025_WACV,
author = {Matta, Gopi Raju and Reddypalli, Trisha and Mitra, Kaushik},
title = {BeSplat: Gaussian Splatting from a Single Blurry Image and Event Stream},
booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV) Workshops},
month = {February},
year = {2025},
pages = {917-927}
}
In our work, the camera trajectory optimization was inspired by Deblur-GS, and the event stream integration into Gaussian Splatting was inspired by the methodology used in BeNeRF. The overall code framework is based on both repositories. We appreciate the efforts of the contributors to these amazing projects.