Commit e2663f5: "first commit"
nnanhuang committed May 30, 2024 (1 parent: 40f0994)
1,626 changed files with 284,457 additions and 3 deletions.
LICENSE.md (new file, 83 additions)
Gaussian-Splatting License
===========================

**Inria** and **the Max-Planck-Institut für Informatik (MPII)** hold all the ownership rights on the *Software* named **gaussian-splatting**.
The *Software* is in the process of being registered with the Agence pour la Protection des Programmes (APP).

The *Software* is still being developed by the *Licensor*.

*Licensor*'s goal is to allow the research community to use, test and evaluate
the *Software*.

## 1. Definitions

*Licensee* means any person or entity that uses the *Software* and distributes
its *Work*.

*Licensor* means the owners of the *Software*, i.e., Inria and MPII.

*Software* means the original work of authorship made available under this
License, i.e., gaussian-splatting.

*Work* means the *Software* and any additions to or derivative works of the
*Software* that are made available under this License.


## 2. Purpose
This license is intended to define the rights granted to the *Licensee* by
Licensors over the *Software*.

## 3. Rights granted

For the above reasons Licensors have decided to distribute the *Software*.
Licensors grant non-exclusive rights to use the *Software* for research purposes
to research users (both academic and industrial), free of charge, without the right
to sublicense. The *Software* may be used "non-commercially", i.e., for research
and/or evaluation purposes only.

Subject to the terms and conditions of this License, you are granted a
non-exclusive, royalty-free, license to reproduce, prepare derivative works of,
publicly display, publicly perform and distribute its *Work* and any resulting
derivative works in any form.

## 4. Limitations

**4.1 Redistribution.** You may reproduce or distribute the *Work* only if (a) you do
so under this License, (b) you include a complete copy of this License with
your distribution, and (c) you retain without modification any copyright,
patent, trademark, or attribution notices that are present in the *Work*.

**4.2 Derivative Works.** You may specify that additional or different terms apply
to the use, reproduction, and distribution of your derivative works of the *Work*
("Your Terms") only if (a) Your Terms provide that the use limitation in
Section 3 applies to your derivative works, and (b) you identify the specific
derivative works that are subject to Your Terms. Notwithstanding Your Terms,
this License (including the redistribution requirements in Section 4.1) will
continue to apply to the *Work* itself.

**4.3** Any other use without prior consent of Licensors is prohibited. Research
users explicitly acknowledge having received from Licensors all information
needed to assess the adequacy of the *Software* for their needs and
to undertake all necessary precautions for its execution and use.

**4.4** The *Software* is provided both as a compiled library file and as source
code. In case of using the *Software* for a publication or other results obtained
through the use of the *Software*, users are strongly encouraged to cite the
corresponding publications as explained in the documentation of the *Software*.

## 5. Disclaimer

THE USER CANNOT USE, EXPLOIT OR DISTRIBUTE THE *SOFTWARE* FOR COMMERCIAL PURPOSES
WITHOUT PRIOR AND EXPLICIT CONSENT OF LICENSORS. YOU MUST CONTACT INRIA FOR ANY
UNAUTHORIZED USE: [email protected] . ANY SUCH ACTION WILL
CONSTITUTE A FORGERY. THIS *SOFTWARE* IS PROVIDED "AS IS" WITHOUT ANY WARRANTIES
OF ANY NATURE AND ANY EXPRESS OR IMPLIED WARRANTIES, WITH REGARD TO COMMERCIAL
USE, PROFESSIONAL USE, LEGAL OR NOT, OR OTHER, OR COMMERCIALISATION OR
ADAPTATION. UNLESS EXPLICITLY PROVIDED BY LAW, IN NO EVENT, SHALL INRIA OR THE
AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
GOODS OR SERVICES, LOSS OF USE, DATA, OR PROFITS OR BUSINESS INTERRUPTION)
HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING FROM, OUT OF OR
IN CONNECTION WITH THE *SOFTWARE* OR THE USE OR OTHER DEALINGS IN THE *SOFTWARE*.
README.md (101 additions, 3 deletions)
# $\textit{S}^3$Gaussian: Self-Supervised Street Gaussians for Autonomous Driving
### [Paper]() | [Project Page]()

<!-- > SelfOcc: Self-Supervised Vision-Based 3D Occupancy Prediction, CVPR 2024 -->

> [Nan Huang](https://github.com/nnanhuang)\*,
[Xiaobao Wei](https://ucwxb.github.io/), [Wenzhao Zheng](https://wzzheng.net/) $\dagger$, Pengju An, [Ming Lu](https://lu-m13.github.io/), [Wei Zhan](https://zhanwei.site/), [Masayoshi Tomizuka](https://me.berkeley.edu/people/masayoshi-tomizuka/),
[Kurt Keutzer](https://people.eecs.berkeley.edu/~keutzer/), [Shanghang Zhang](https://www.shanghangzhang.com/) $\ddagger$

\* Work done while interning at UC Berkeley $\dagger$ Project leader $\ddagger$ Corresponding author

<p align="center">
<img src="./assets/compare.jpg" width=800>
</p>

## News
- **[2024/5/30]** Training code release.
- **[2024/5/30]** Evaluation code release.
- **[2024/5/30]** Paper released on [arXiv](https://arxiv.org/abs/2311.12754).
- **[2024/5/30]** Demo release.

## Demo

### Trained results compared to the baseline:

![demo](./assets/visual.gif)

<!-- ### More demo videos can be downloaded [here](https://cloud.tsinghua.edu.cn/d/640283b528f7436193a4/). -->

## Overview
![overview](./assets/pipeline.png)

- To tackle the challenges of self-supervised street scene decomposition, our method consists of a Multi-resolution Hexplane Structure Encoder that encodes the 4D grid into feature planes and a multi-head Gaussian Decoder that decodes them into deformed 4D Gaussians. The entire pipeline is optimized in a self-supervised manner without extra annotations, leading to superior scene decomposition ability and rendering quality.
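As a conceptual illustration only (this is not the repo's implementation, and all names here are hypothetical), the hexplane-encoder idea of factorizing a 4D grid into six axis-aligned feature planes, queried per point and decoded by small heads, can be sketched as:

```python
# Toy sketch of hexplane feature lookup + multi-head decoding (single resolution,
# nearest-neighbor lookup for brevity; real encoders interpolate and use multiple scales).
import numpy as np

rng = np.random.default_rng(0)
R, F = 8, 4  # plane resolution and feature dimension (arbitrary toy values)

# Six axis-aligned planes covering all pairs of (x, y, z, t).
planes = {pair: rng.normal(size=(R, R, F))
          for pair in ("xy", "xz", "yz", "xt", "yt", "zt")}

def query_hexplane(p):
    """Fuse (here: elementwise product) the nearest-cell features of each
    plane for a 4D point p = (x, y, z, t) with coordinates in [0, 1)."""
    coords = dict(zip("xyzt", p))
    feat = np.ones(F)
    for pair, grid in planes.items():
        i = min(int(coords[pair[0]] * R), R - 1)
        j = min(int(coords[pair[1]] * R), R - 1)
        feat *= grid[i, j]  # Hadamard fusion of per-plane features
    return feat

# A toy "decoder head": a linear map from fused features to a position offset.
W_dx = rng.normal(size=(F, 3))
feat = query_hexplane((0.2, 0.5, 0.7, 0.1))
dx = feat @ W_dx  # per-Gaussian position deformation for this (x, y, z, t)
```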

## Getting Started

### Environment Setup
Our code is developed on Ubuntu 22.04 with Python 3.9 and PyTorch 1.13.1+cu116; we have also tested PyTorch 2.2.1+cu118. We recommend using conda to install the dependencies.

```bash
git clone https://github.com/nnanhuang/S3Gaussian.git --recursive
cd S3Gaussian
conda create -n S3Gaussian python=3.9
conda activate S3Gaussian

pip install -r requirements.txt
pip install -e submodules/depth-diff-gaussian-rasterization
pip install -e submodules/simple-knn
```
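After installation, a quick sanity check can catch version or CUDA problems before a long training run. This is a hypothetical helper, not part of the repo:

```python
# Hypothetical environment check (not part of the repo): reports any
# obvious problems with the Python/PyTorch setup before training.
import sys

def check_env(min_python=(3, 9)):
    """Return a list of human-readable problems with the current environment."""
    problems = []
    if sys.version_info[:2] < min_python:
        problems.append(f"Python {min_python[0]}.{min_python[1]}+ required")
    try:
        import torch
        if not torch.cuda.is_available():
            problems.append("CUDA is not available to PyTorch")
    except ImportError:
        problems.append("PyTorch is not installed")
    return problems

if __name__ == "__main__":
    for p in check_env():
        print("WARNING:", p)
```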



### Preparing Dataset

Follow detailed instructions in [Prepare Dataset](docs/prepare_data.md).


### Training

To train the first clip (e.g., frames 0-50), run

```bash
python train.py -s $data_dir --port 6017 --expname "waymo" --model_path $model_path
```
If you want to try novel view synthesis, add
```bash
--configs "arguments/nvs.py"
```

To train the next clip (e.g., frames 51-100), run
```bash
python train.py -s $data_dir --port 6017 --expname "waymo" --model_path $model_path \
    --prior_checkpoint "$prior_dir/chkpnt_fine_50000.pth"
```
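The two commands above can be generated programmatically when a sequence spans several clips. The helper below is a hypothetical sketch (not part of the repo); it assumes each clip trains for 50,000 iterations and that checkpoints land in per-clip model directories:

```python
# Hypothetical helper (not part of the repo): build train.py commands for each
# 50-frame clip, chaining --prior_checkpoint to the previous clip's final weights.
def clip_commands(data_dir, model_root, n_frames, clip_len=50, iters=50000):
    cmds = []
    for k, _start in enumerate(range(0, n_frames, clip_len)):
        cmd = (f'python train.py -s {data_dir} --port 6017 --expname "waymo" '
               f'--model_path {model_root}/clip_{k}')
        if k > 0:  # warm-start from the previous clip's final checkpoint
            cmd += (f' --prior_checkpoint '
                    f'"{model_root}/clip_{k - 1}/chkpnt_fine_{iters}.pth"')
        cmds.append(cmd)
    return cmds
```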
Also, you can load an existing checkpoint with:

```bash
python train.py -s $data_dir --port 6017 --expname "waymo" --start_checkpoint "$ckpt_dir/chkpnt_fine_30000.pth"
```
For more script examples, please check [here](scripts).

### Evaluation and Visualization

You can visualize and evaluate a checkpoint as follows:
```bash
python train.py -s $data_dir --port 6017 --expname "waymo" --start_checkpoint "$ckpt_dir/chkpnt_fine_50000.pth" \
    --eval_only
```
This produces rendered RGB, ground-truth RGB, depth, dynamic RGB, and static RGB videos.

## Related Projects

Our code is based on [4D Gaussian Splatting for Real-Time Dynamic Scene Rendering](https://github.com/hustvl/4DGaussians/tree/master) and [EmerNeRF](https://github.com/NVlabs/EmerNeRF?tab=readme-ov-file).

Thanks to these excellent open-sourced repos!


## Citation

If you find this project helpful, please consider citing the following paper:
```
@article{
}
```