Motion-free Exposure Fusion based on Inter-consistency and Intra-consistency



Abstract

Exposure fusion often suffers from ghosting artifacts caused by object movement when capturing a dynamic scene. In this paper, two consistency concepts are introduced to enforce the guidance of the reference image for motion detection and ghost removal. The inter-consistency, which represents the similarity of pixel intensities across different exposures, is weak because of the differing exposure settings. Histogram matching is employed to recover the inter-consistency, so that motion can be detected easily, as the remaining pixel differences are mostly caused by content changes due to object movement. To further suppress outlier weights in the fusion, motion detection is performed at the super-pixel level so that pixels with similar intensities and structures share similar fusion weights, which is referred to as intra-consistency. Experiments on various dynamic scenes show that the proposed algorithm detects motion more effectively than existing methods and produces pleasing fusion results free of ghosting artifacts.

Code available soon
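Until the code is released, the sketch below outlines how the two consistency ideas from the abstract could fit together. It is a minimal illustration, not the authors' implementation: it assumes scikit-image (0.19 or newer, for match_histograms and SLIC super-pixels) and NumPy, takes the middle exposure as the reference, and replaces the Mertens-style multi-resolution blending with a naive weighted average. The function name motion_free_fusion, the threshold diff_thresh, the super-pixel count, and the per-pixel well-exposedness weight are illustrative choices, not details from the paper.

```python
import numpy as np
from skimage.exposure import match_histograms
from skimage.segmentation import slic


def motion_free_fusion(images, ref_idx=None, diff_thresh=0.10, n_segments=800):
    """Fuse differently exposed frames of a dynamic scene while masking motion.

    images: list of same-size uint8 color images (an exposure bracket).
    """
    if ref_idx is None:
        ref_idx = len(images) // 2                 # middle exposure as reference
    ref = images[ref_idx].astype(np.float32) / 255.0

    weights = []
    for k, img in enumerate(images):
        im = img.astype(np.float32) / 255.0

        # Inter-consistency: histogram-match this exposure to the reference so
        # that the remaining pixel differences are mainly due to object motion.
        matched = match_histograms(im, ref, channel_axis=-1)
        diff = np.abs(matched - ref).mean(axis=-1)
        motion = diff > diff_thresh                # coarse per-pixel motion mask

        # Intra-consistency: vote inside super-pixels so that pixels with
        # similar intensity and structure share the same motion decision.
        segments = slic(im, n_segments=n_segments, start_label=0)
        labels = segments.ravel()
        motion_votes = np.bincount(labels, weights=motion.ravel().astype(np.float64))
        pixel_count = np.bincount(labels).astype(np.float64)
        sp_is_motion = motion_votes / np.maximum(pixel_count, 1.0) > 0.5
        motion = sp_is_motion[segments]            # super-pixel refined mask

        # Simple well-exposedness weight (a stand-in for the Mertens measures);
        # weights of non-reference exposures are suppressed in motion regions.
        w = np.exp(-0.5 * ((im.mean(axis=-1) - 0.5) / 0.2) ** 2)
        if k != ref_idx:
            w = np.where(motion, 0.0, w)
        weights.append(w + 1e-12)

    # Normalize the weight maps and blend. The paper builds on multi-resolution
    # blending as in Mertens et al.; a plain weighted average is used here only
    # to keep the sketch short.
    W = np.stack(weights)
    W /= W.sum(axis=0, keepdims=True)
    fused = np.zeros_like(ref)
    for k, img in enumerate(images):
        fused += W[k][..., None] * (img.astype(np.float32) / 255.0)
    return np.clip(fused * 255.0, 0.0, 255.0).astype(np.uint8)


if __name__ == "__main__":
    import cv2  # used only for image I/O in this example

    # Hypothetical file names for a three-exposure bracket.
    bracket = [cv2.imread(name) for name in ("under.jpg", "mid.jpg", "over.jpg")]
    cv2.imwrite("fused.jpg", motion_free_fusion(bracket))
```

The hard motion mask and 0.5 voting ratio are deliberately simple; in practice the per-super-pixel decision would be softened or combined with the full set of Mertens quality measures before pyramid blending.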


Publication:

Wei Zhang et al., Patch-Based Correlation for Deghosting in Exposure Fusion, Information Sciences, 2017.

Wei Zhang et al., Motion-free Exposure Fusion based on Inter-consistency and Intra-consistency, Information Sciences, Nov. 2016.

Wei Zhang et al., Exploiting Patch-based Correlation for Ghost Removal in Exposure Fusion, Proc. of International Conference on Multimedia & Expo (ICME), 2017.


Patent:

  1. "一种基于直方图归一化与超像素分割的HDR重建算法" 发明人:张伟中国发明专利 (专利号:ZL201610191289.9)

  2. "高动态多曝光图像融合软件" 发明人:张伟软件著作权 (登记号:2014SR113413)


Result & Comparison:

Eleven result sets (#1-#11) comparing the proposed method with existing deghosting approaches; the comparison figures are not included in this text version.

Reference:

  1. [Mertens et al.] T. Mertens, J. Kautz, F. Van Reeth, Exposure fusion: a simple and practical alternative to high dynamic range photography, Computer Graphics Forum 28 (2009) 161-171.
  2. [Zhang and Cham] W. Zhang, W.-K. Cham, Reference-guided exposure fusion in dynamic scenes, Journal of Visual Communication and Image Representation 23 (3) (2012) 467-475.
  3. [Sen et al.] P. Sen, N. K. Kalantari, M. Yaesoubi, S. Darabi, D. B. Goldman, E. Shechtman, Robust patch-based HDR reconstruction of dynamic scenes, ACM Transactions on Graphics 31 (6) (2012) 203.
  4. [Hu et al.] J. Hu, O. Gallo, K. Pulli, X. Sun, HDR deghosting: how to deal with saturation?, in: IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2013, pp. 1163-1170.
  5. [Gallo et al.] O. Gallo, N. Gelfand, W.-C. Chen, M. Tico, K. Pulli, Artifact-free high dynamic range imaging, in: IEEE International Conference on Computational Photography (ICCP), 2009, pp. 1-7.
  6. [Kang et al.] S. B. Kang, M. Uyttendaele, S. Winder, R. Szeliski, High dynamic range video, ACM Transactions on Graphics 22 (3) (2003) 319-325.

