⌛ Warning ⌛ This code has been in the fridge for a while. We are cleaning it up and documenting it little by little. Please be patient.
🔧 Developed at the ENS Paris-Saclay, Centre Borelli and accepted at the CVPR EarthVision Workshop 2023.
This project follows our previous work Sat-NeRF (2022) and was recently used in 🔥 S-EO (2025) 🔥 to further leverage shadow predictions for improved 3D reconstructions from satellite images.
Roger Marí, Gabriele Facciolo, Thibaud Ehret
Abstract: We introduce Earth Observation NeRF (EO-NeRF), a new method for digital surface modeling and novel view synthesis from collections of multi-date remote sensing images. In contrast to previous NeRF variants proposed in the literature for satellite images, EO-NeRF outperforms advanced pipelines for 3D reconstruction from multiple satellite images, including classic and learned stereovision methods, in terms of altitude accuracy. This is largely due to a rendering of building shadows that is strictly consistent with the scene geometry and independent from other transient phenomena. In addition, we propose a number of strategies aimed at exploiting raw satellite images. We add model parameters to circumvent usual pre-processing steps, such as the relative radiometric normalization of the input images and the bundle adjustment used to refine the camera models. We evaluate our method on different areas of interest using sets of 10-20 pre-processed and raw pansharpened WorldView-3 images.
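To give a rough idea of the geometry-consistent shadow rendering mentioned above, here is a minimal sketch (not the code of this repository: the density field, scene bounds and sampling scheme are placeholder assumptions). A surface point is considered lit if the transmittance accumulated along a secondary ray marched toward the sun stays close to 1, i.e. nothing in the volume density occludes it:

```python
# Hedged sketch: sun visibility from a volume density field (placeholder assumptions).
import numpy as np

def density_fn(xyz):
    # Stand-in density: a single box "building" occupying part of the scene.
    inside = np.all(np.abs(xyz - np.array([0.0, 0.0, 0.25])) < 0.25, axis=-1)
    return np.where(inside, 50.0, 0.0)

def sun_visibility(points, sun_dir, n_samples=64, t_max=2.0):
    """Transmittance from each 3D point toward the sun (1 = fully lit, 0 = in shadow)."""
    ts = np.linspace(0.0, t_max, n_samples + 1)
    deltas = np.diff(ts)                             # step lengths along the sun ray
    mids = 0.5 * (ts[:-1] + ts[1:])                  # sample depths along the sun ray
    xyz = points[:, None, :] + mids[None, :, None] * sun_dir[None, None, :]
    sigma = density_fn(xyz.reshape(-1, 3)).reshape(points.shape[0], n_samples)
    # Volume-rendering transmittance: T = exp(-sum_i sigma_i * delta_i)
    return np.exp(-(sigma * deltas[None, :]).sum(axis=1))

if __name__ == "__main__":
    grid = np.stack(np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5)), -1).reshape(-1, 2)
    pts = np.concatenate([grid, np.zeros((grid.shape[0], 1))], axis=1)  # z = 0 ground plane
    sun = np.array([0.5, 0.0, 1.0]); sun /= np.linalg.norm(sun)         # unit vector toward the sun
    vis = sun_visibility(pts, sun)
    print(np.round(vis.reshape(5, 5), 2))  # ground points occluded by the box get vis ~ 0 (shadowed)
```

Because the shadow test only depends on the learned geometry and the sun direction, it cannot be "explained away" by transient appearance factors, which is the intuition behind the geometry-consistent shadows of EO-NeRF.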
If you find this code or work helpful, please cite:
@inproceedings{mari2023multi,
  title={Multi-date earth observation nerf: The detail is in the shadows},
  author={Mar{\'\i}, Roger and Facciolo, Gabriele and Ehret, Thibaud},
  booktitle={2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)},
  pages={2035--2045},
  year={2023}
}
To create the conda environment, you can use the setup script, e.g.
conda init && bash -i setup_env.sh
Warning: If some libraries are not found, it may be necessary to update the environment variable LD_LIBRARY_PATH before launching the code:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$CONDA_PREFIX/lib
where $CONDA_PREFIX is the path to your conda or miniconda environment (e.g. /home/roger/miniconda3/envs/eonerf).
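As a quick sanity check of the environment (a minimal sketch, assuming the eonerf environment ships PyTorch with CUDA support), you can verify that the GPU stack loads correctly once the environment is activated:

```python
# Hedged sanity check: assumes the eonerf environment provides PyTorch built with CUDA.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```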
Data download is available here. Please check the data description.
Example command to train EO-NeRF on the area of interest JAX_068 using the DFC2019 RGB crops:
(eonerf) $ bash run_JAX_RGB.sh JAX_068
Remember to update run_JAX_RGB.sh with your own data paths. run_JAX_NEW.sh and run_IARPA.sh can be used in the same way.
Use the eval_eonerf.py script to generate the outputs of a pretrained EO-NeRF model, i.e. the output DSM and the RGB/shadow renderings.
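Once the outputs are generated, a typical follow-up is to score the DSM against a reference lidar DSM. The snippet below is a minimal sketch, not part of the repository: it assumes both rasters are GeoTIFFs resampled onto the same grid, and the file names are placeholders.

```python
# Hedged sketch: compare an output DSM against a reference lidar DSM (placeholder paths).
import numpy as np
import rasterio

pred = rasterio.open("out/JAX_068_dsm.tif").read(1).astype(np.float64)
gt = rasterio.open("data/JAX_068_lidar_dsm.tif").read(1).astype(np.float64)

valid = np.isfinite(pred) & np.isfinite(gt)
diff = pred[valid] - gt[valid]
diff -= np.median(diff)  # remove any constant vertical offset before scoring

print("MAE  (m):", np.abs(diff).mean())
print("RMSE (m):", np.sqrt((diff ** 2).mean()))
```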