Method

MoCha-Stereo: Motif Channel Attention Network for Stereo Matching [MoCha-Stereo]
https://github.com/ZYangChen/MoCha-Stereo

Submitted on 30 May 2023 03:05 by
Ziyang Chen (Guizhou University)

Running time: 0.34 s
Environment: NVIDIA Tesla A6000 (PyTorch)

Method Description:
Learning-based stereo matching techniques have made significant progress. However, existing methods inevitably lose geometrical structure information during the feature channel generation process, resulting in edge detail mismatches. In this paper, the Motif Channel Attention Stereo Matching Network (MoCha-Stereo) is designed to address this problem. We provide the Motif Channel Correlation Volume (MCCV) to determine more accurate edge matching costs. MCCV is achieved by projecting motif channels, which capture common geometric structures in feature channels, onto feature maps and cost volumes. In addition, since edge variations in the reconstruction error map also affect detail matching, we propose the Reconstruction Error Motif Penalty (REMP) module to further refine the full-resolution disparity estimation. REMP integrates the frequency information of typical channel features from the reconstruction error.
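
As a rough illustration of the MCCV idea (a hedged sketch under my own assumptions, not the authors' implementation), the snippet below builds a plain left-right correlation volume after reweighting the feature channels with simple channel-attention statistics; the attention function is a squeeze-and-excitation style stand-in for the motif channel projection described above, and all tensor shapes are illustrative.

# Hedged sketch (an assumption, not MoCha-Stereo's code): channel-reweighted
# left-right correlation volume. Shapes and the weighting scheme are illustrative.
import torch

def channel_attention(feat):
    # feat: (B, C, H, W) -> (B, C, 1, 1) weights from global channel statistics
    weights = torch.sigmoid(feat.mean(dim=(2, 3)))
    return weights[:, :, None, None]

def correlation_volume(feat_l, feat_r, max_disp):
    # Returns a (B, max_disp, H, W) matching-cost volume: left features are
    # correlated with right features shifted by each candidate disparity.
    feat_l = feat_l * channel_attention(feat_l)
    feat_r = feat_r * channel_attention(feat_r)
    b, c, h, w = feat_l.shape
    volume = feat_l.new_zeros(b, max_disp, h, w)
    for d in range(max_disp):
        if d == 0:
            volume[:, d] = (feat_l * feat_r).mean(dim=1)
        else:
            volume[:, d, :, d:] = (feat_l[:, :, :, d:] * feat_r[:, :, :, :-d]).mean(dim=1)
    return volume

# Example with random 1/4-resolution features.
cost = correlation_volume(torch.randn(1, 128, 80, 160),
                          torch.randn(1, 128, 80, 160), max_disp=48)
print(cost.shape)  # torch.Size([1, 48, 80, 160])

Along the same lines, REMP is described as acting on the reconstruction error; a common way to obtain such an error map (again an assumption about the general recipe, not the REMP module itself) is to warp the right image into the left view with the estimated disparity and take the photometric difference.

# Hedged sketch: left-view reconstruction error via disparity-based warping.
import torch
import torch.nn.functional as F

def reconstruction_error(img_left, img_right, disp):
    # img_left, img_right: (B, 3, H, W); disp: (B, 1, H, W) left-view disparity.
    # Warps the right image to the left view and returns a (B, 1, H, W) error map.
    b, _, h, w = img_left.shape
    xs = torch.arange(w, device=disp.device).view(1, 1, 1, w).expand(b, 1, h, w)
    ys = torch.arange(h, device=disp.device).view(1, 1, h, 1).expand(b, 1, h, w)
    x_src = xs - disp  # matching x-coordinate in the right image
    grid = torch.stack([2.0 * x_src / (w - 1) - 1.0,
                        2.0 * ys / (h - 1) - 1.0], dim=-1).squeeze(1)
    warped = F.grid_sample(img_right, grid, align_corners=True)
    return (img_left - warped).abs().mean(dim=1, keepdim=True)

err = reconstruction_error(torch.rand(1, 3, 64, 128),
                           torch.rand(1, 3, 64, 128),
                           torch.rand(1, 1, 64, 128) * 10)
print(err.shape)  # torch.Size([1, 1, 64, 128])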
Parameters:
Iters=32
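
The Iters parameter presumably counts recurrent disparity-update steps at test time, in the iterative-refinement style this family of methods uses. The toy loop below (with a made-up residual operator) only illustrates what running 32 such updates means, not the actual update block.

# Hedged sketch: generic iterative disparity refinement with a toy residual.
import torch

def iterative_refine(disp, residual_fn, iters=32):
    # Apply `iters` residual corrections to an initial disparity map.
    for _ in range(iters):
        disp = disp + residual_fn(disp)
    return disp

# Toy residual: damped pull toward a target map; 32 updates nearly converge.
target = torch.full((1, 1, 8, 16), 20.0)
refined = iterative_refine(torch.zeros(1, 1, 8, 16),
                           lambda d: 0.2 * (target - d), iters=32)
print(refined.mean())  # ~20 after 32 damped updates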
Latex Bibtex:
@inproceedings{mocha,
  title={MoCha-Stereo: Motif Channel Attention Network for Stereo Matching},
  author={Chen, Ziyang and Long, Wei and Yao, He and Zhang, Yongjun and Wang, Bingshu and Qin, Yongbin and Wu, Jia},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2024}
}

Detailed Results

This page provides detailed results for the method(s) selected. For the first 20 test images, the percentage of erroneous pixels is depicted in the table. We use the error metric described in Object Scene Flow for Autonomous Vehicles (CVPR 2015), which considers a pixel to be correctly estimated if the disparity or flow end-point error is <3px or <5% (for scene flow this criterion needs to be fulfilled for both disparity maps and the flow map). Underneath, the left input image, the estimated results and the error maps are shown (for disp_0/disp_1/flow/scene_flow, respectively). The error map uses the log-color scale described in Object Scene Flow for Autonomous Vehicles (CVPR 2015), depicting correct estimates (<3px or <5% error) in blue and wrong estimates in red color tones. Dark regions in the error images denote the occluded pixels which fall outside the image boundaries. The false color maps of the results are scaled to the largest ground truth disparity values / flow magnitudes.
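
For reference, the 3 px / 5 % criterion above can be written out as a small function. This is a hedged re-implementation of the stated rule, not the official KITTI devkit code; treating zero ground-truth disparity as "no ground truth" is the usual KITTI convention and is assumed here.

# Hedged sketch of the D1 outlier metric described above.
import numpy as np

def d1_error(disp_est, disp_gt):
    # Percentage of erroneous pixels among valid ground-truth pixels: a pixel is
    # wrong only if its end-point error is >= 3 px AND >= 5% of the ground truth.
    valid = disp_gt > 0  # pixels without ground truth are encoded as 0
    err = np.abs(disp_est - disp_gt)
    bad = (err >= 3.0) & (err >= 0.05 * np.abs(disp_gt))
    return 100.0 * bad[valid].mean()

# Toy example: one of five valid pixels exceeds both thresholds -> 20.0
gt = np.array([[10.0, 40.0], [80.0, 0.0], [20.0, 20.0]])
est = np.array([[10.5, 47.0], [81.0, 5.0], [20.0, 19.0]])
print(d1_error(est, gt))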

Test Set Average

Error D1-bg D1-fg D1-all
All / All 1.36 2.43 1.53
All / Est 1.36 2.43 1.53
Noc / All 1.24 2.42 1.44
Noc / Est 1.24 2.42 1.44

Test Image 0

Error D1-bg D1-fg D1-all
All / All 1.48 1.54 1.49
All / Est 1.48 1.54 1.49
Noc / All 1.48 1.54 1.49
Noc / Est 1.48 1.54 1.49
[Images: left input image, D1 result, D1 error map]


Test Image 1

Error D1-bg D1-fg D1-all
All / All 1.50 3.20 1.69
All / Est 1.50 3.20 1.69
Noc / All 1.44 3.20 1.64
Noc / Est 1.44 3.20 1.64
[Images: left input image, D1 result, D1 error map]


Test Image 2

Error D1-bg D1-fg D1-all
All / All 2.72 8.43 3.00
All / Est 2.72 8.43 3.00
Noc / All 2.65 8.43 2.94
Noc / Est 2.65 8.43 2.94
[Images: left input image, D1 result, D1 error map]


Test Image 3

Error D1-bg D1-fg D1-all
All / All 1.60 1.37 1.58
All / Est 1.60 1.37 1.58
Noc / All 1.56 1.37 1.55
Noc / Est 1.56 1.37 1.55
[Images: left input image, D1 result, D1 error map]


Test Image 4

Error D1-bg D1-fg D1-all
All / All 0.50 0.77 0.54
All / Est 0.50 0.77 0.54
Noc / All 0.48 0.77 0.53
Noc / Est 0.48 0.77 0.53
[Images: left input image, D1 result, D1 error map]


Test Image 5

Error D1-bg D1-fg D1-all
All / All 1.68 2.21 1.72
All / Est 1.68 2.21 1.72
Noc / All 1.51 2.21 1.58
Noc / Est 1.51 2.21 1.58
[Images: left input image, D1 result, D1 error map]


Test Image 6

Error D1-bg D1-fg D1-all
All / All 2.17 0.82 2.03
All / Est 2.17 0.82 2.03
Noc / All 2.21 0.82 2.07
Noc / Est 2.21 0.82 2.07
[Images: left input image, D1 result, D1 error map]


Test Image 7

Error D1-bg D1-fg D1-all
All / All 0.22 2.32 0.63
All / Est 0.22 2.32 0.63
Noc / All 0.22 2.32 0.64
Noc / Est 0.22 2.32 0.64
[Images: left input image, D1 result, D1 error map]


Test Image 8

Error D1-bg D1-fg D1-all
All / All 0.25 2.05 0.58
All / Est 0.25 2.05 0.58
Noc / All 0.24 2.05 0.58
Noc / Est 0.24 2.05 0.58
[Images: left input image, D1 result, D1 error map]


Test Image 9

Error D1-bg D1-fg D1-all
All / All 0.22 1.37 0.51
All / Est 0.22 1.37 0.51
Noc / All 0.22 1.44 0.53
Noc / Est 0.22 1.44 0.53
[Images: left input image, D1 result, D1 error map]


Test Image 10

Error D1-bg D1-fg D1-all
All / All 1.13 1.63 1.25
All / Est 1.13 1.63 1.25
Noc / All 1.14 1.63 1.26
Noc / Est 1.14 1.63 1.26
[Images: left input image, D1 result, D1 error map]


Test Image 11

Error D1-bg D1-fg D1-all
All / All 0.66 0.53 0.64
All / Est 0.66 0.53 0.64
Noc / All 0.66 0.53 0.64
Noc / Est 0.66 0.53 0.64
[Images: left input image, D1 result, D1 error map]


Test Image 12

Error D1-bg D1-fg D1-all
All / All 0.74 0.60 0.73
All / Est 0.74 0.60 0.73
Noc / All 0.57 0.60 0.58
Noc / Est 0.57 0.60 0.58
[Images: left input image, D1 result, D1 error map]


Test Image 13

Error D1-bg D1-fg D1-all
All / All 0.44 0.31 0.42
All / Est 0.44 0.31 0.42
Noc / All 0.43 0.31 0.41
Noc / Est 0.43 0.31 0.41
[Images: left input image, D1 result, D1 error map]


Test Image 14

Error D1-bg D1-fg D1-all
All / All 1.42 0.06 1.40
All / Est 1.42 0.06 1.40
Noc / All 1.25 0.06 1.23
Noc / Est 1.25 0.06 1.23
[Images: left input image, D1 result, D1 error map]


Test Image 15

Error D1-bg D1-fg D1-all
All / All 2.47 0.53 2.30
All / Est 2.47 0.53 2.30
Noc / All 2.52 0.53 2.34
Noc / Est 2.52 0.53 2.34
[Images: left input image, D1 result, D1 error map]


Test Image 16

Error D1-bg D1-fg D1-all
All / All 3.67 0.09 3.14
All / Est 3.67 0.09 3.14
Noc / All 3.48 0.09 2.98
Noc / Est 3.48 0.09 2.98
[Images: left input image, D1 result, D1 error map]


Test Image 17

Error D1-bg D1-fg D1-all
All / All 0.88 0.22 0.81
All / Est 0.88 0.22 0.81
Noc / All 0.87 0.22 0.80
Noc / Est 0.87 0.22 0.80
[Images: left input image, D1 result, D1 error map]


Test Image 18

Error D1-bg D1-fg D1-all
All / All 4.81 0.89 2.95
All / Est 4.81 0.89 2.95
Noc / All 4.76 0.89 2.90
Noc / Est 4.76 0.89 2.90
[Images: left input image, D1 result, D1 error map]


Test Image 19

Error D1-bg D1-fg D1-all
All / All 0.76 0.47 0.72
All / Est 0.76 0.47 0.72
Noc / All 0.77 0.47 0.73
Noc / Est 0.77 0.47 0.73
[Images: left input image, D1 result, D1 error map]



