Bokehlicious: Photorealistic Bokeh Rendering with Controllable Apertures

International Conference on Computer Vision (ICCV 2025) ✨

Tim Seizinger, Florin-Alexandru Vasluianu, Marcos V. Conde, Zongwei Wu, Radu Timofte
University of Würzburg
Use the slider to set the aperture! We can generate Bokeh for any given aperture up to a maximum of f/2.0.

Abstract

Bokeh rendering methods play a key role in creating the visually appealing, softly blurred backgrounds seen in professional photography. While recent learning-based approaches show promising results, generating realistic Bokeh remains challenging. Existing methods require additional inputs and suffer from unrealistic Bokeh reproduction due to reliance on synthetic data.

In this work, we propose Bokehlicious, a highly efficient network that provides intuitive control over Bokeh strength through an Aperture-Aware Attention mechanism, mimicking the physical lens aperture. To further address the lack of high-quality real-world data, we present RealBokeh, a novel dataset featuring 23,000 high-resolution (24-MP) images captured by professional photographers, covering diverse scenes with varied aperture and focal length settings.

Evaluations on both our new RealBokeh and established Bokeh rendering benchmarks show that Bokehlicious consistently outperforms SOTA methods while significantly reducing computational cost and exhibiting strong zero-shot generalization. Our method and dataset further extend to defocus deblurring, achieving competitive results on the RealDOF benchmark.

Examples of Bokehlicious rendering results; swipe to see more!

Dataset

RealBokeh was collected in the wild with a Canon R6 II camera and a Canon RF 28-70mm f/2.0 lens.
To obtain good alignment between image samples, we implemented remote camera operation and an automated capture procedure.
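As a purely illustrative sketch (not the pipeline actually used to collect RealBokeh), an automated aperture sweep over a tethered camera might look like the following. It assumes the body is driven through the gphoto2 command-line tool; the config key name ("aperture"), the list of f-stops, and the file naming are assumptions made for the example.

    # Hypothetical aperture-sweep capture loop via a gphoto2-tethered camera.
    # Not the authors' capture procedure; config names are camera-dependent.
    import subprocess

    F_STOPS = ["2.0", "2.8", "4", "5.6", "8", "11", "16"]

    def capture_sweep(scene_id: str) -> None:
        for f in F_STOPS:
            # Set the aperture remotely so the camera is never touched,
            # which helps keep the image pairs aligned.
            subprocess.run(["gphoto2", "--set-config", f"aperture={f}"], check=True)
            # Trigger the shutter and download the frame to disk.
            subprocess.run(
                ["gphoto2", "--capture-image-and-download",
                 "--filename", f"{scene_id}_f{f}.cr3"],
                check=True,
            )

    if __name__ == "__main__":
        capture_sweep("scene_0001")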

Method

Overview of our Network Architecture
To increase efficiency, we embed our Transformer into a residual U-Net.
Overview of our Attention Mechanism
Our Aperture-Aware Attention dynamically adjusts to the desired output f-stop.
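As a rough illustration of the idea (not the actual Bokehlicious implementation), the sketch below conditions a standard multi-head self-attention block on the target f-stop: a small MLP maps the scalar aperture value to a per-channel gate that scales the attention output. All module names, shapes, and the gating scheme are assumptions made for the example.

    # Illustrative sketch only -- not the authors' Aperture-Aware Attention.
    import torch
    import torch.nn as nn

    class ApertureConditionedAttention(nn.Module):
        """Self-attention whose residual is gated by an embedding of the f-stop."""

        def __init__(self, dim: int, heads: int = 4):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
            # Hypothetical conditioning: scalar f-stop -> per-channel gate in (0, 1).
            self.aperture_mlp = nn.Sequential(
                nn.Linear(1, dim), nn.GELU(), nn.Linear(dim, dim), nn.Sigmoid()
            )

        def forward(self, tokens: torch.Tensor, f_stop: torch.Tensor) -> torch.Tensor:
            # tokens: (B, N, C) flattened feature map; f_stop: (B, 1) target aperture.
            attended, _ = self.attn(tokens, tokens, tokens)
            gate = self.aperture_mlp(f_stop).unsqueeze(1)  # (B, 1, C)
            # Intuition: the learned gate modulates how much attended context
            # (i.e. blur) is mixed back in, mimicking opening or closing the aperture.
            return tokens + gate * attended

    if __name__ == "__main__":
        x = torch.randn(2, 64, 32)        # two images, 64 tokens, 32 channels
        f = torch.tensor([[2.0], [8.0]])  # per-image target f-stops
        out = ApertureConditionedAttention(32)(x, f)
        print(out.shape)                  # torch.Size([2, 64, 32])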

Related Links

There are interesting concurrent works on Bokeh by other researchers; feel free to check them out!

BokehMe++: Harmonious Fusion of Classical and Neural Rendering for Versatile Bokeh Creation improves BokehMe with a better classical rendering algorithm for a more complex and natural Bokeh.

Neural Bokeh: Learning Lens Blur for Computational Videography and Out-of-Focus Mixed Reality applies the idea of learning a photorealistic Bokeh blur to mixed reality image composition.

BokehDiff: Neural Lens Blur with One-Step Diffusion proposes an improved diffusion-based framework for generating synthetic training data and similarly employs a physics-inspired attention mechanism to achieve fine-grained subject segmentation for Bokeh rendering.

BibTeX

If you find our dataset or method useful, please cite us!

    @inproceedings{seizinger2025bokehlicious,
      author    = {Seizinger, Tim and Vasluianu, Florin-Alexandru and Conde, Marcos V. and Wu, Zongwei and Timofte, Radu},
      title     = {Bokehlicious: Photorealistic Bokeh Rendering with Controllable Apertures},
      booktitle = {ICCV},
      year      = {2025},
    }