Second LPT Challenge Case 2: Time-Resolved results

This page presents the published results for Case 2 (Time-Resolved) of the 2nd LPT Challenge. The link for result submission, together with details about the test case and the rules for result submission, is given there.

Tables can be sorted by each of the reported quantities. Clicking on an algorithm name in a table opens a page gathering its results, together with information on the submitter.

Definition of performance metrics

Evaluation pertains both to particle localization, at the two pulse times t0 and t1, and to the estimation of the displacement vectors. For this specific test case, where a subset of the particles is viewed by only 2 or 3 cameras at a low I/U ratio, the performance metrics are computed on the set of particles viewed by all 4 cameras. For either particles or vectors, we count:

  • a True Positive (TP) if a ground-truth (GT) particle is found in the neighborhood of a detection (maximum componentwise distance less than 1 voxel). For vectors, both detections must be TPs associated with the same GT particle.
  • a False Negative (FN, a.k.a. missed particle) if there is no detection in the neighborhood of a GT particle. An FN vector is a GT vector that is not associated with a TP vector.
  • a False Positive (FP, a.k.a. ghost particle/vector) for any particle/vector that is not a TP.
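The counting rule above can be sketched as a small matching routine. This is a hypothetical illustration, not the challenge's evaluation code; in particular, the greedy one-to-one association order is an assumption.

```python
# Hypothetical sketch of TP/FP/FN counting for particles:
# a detection is a TP if an unmatched ground-truth (GT) particle
# lies within 1 voxel in every coordinate (max componentwise distance).

def match(gt, det, tol=1.0):
    """Greedy one-to-one matching of detections to GT particles."""
    unmatched_gt = set(range(len(gt)))
    tp = 0
    for d in det:
        hit = next((i for i in unmatched_gt
                    if max(abs(a - b) for a, b in zip(gt[i], d)) < tol), None)
        if hit is not None:
            unmatched_gt.discard(hit)  # each GT particle matched at most once
            tp += 1
    fp = len(det) - tp   # detections with no GT neighbour: ghost particles
    fn = len(gt) - tp    # GT particles never matched: missed particles
    return tp, fp, fn

gt  = [(10.0, 10.0, 10.0), (20.0, 20.0, 20.0)]
det = [(10.3, 9.8, 10.1), (55.0, 55.0, 55.0)]
print(match(gt, det))  # (1, 1, 1): one TP, one ghost, one miss
```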

Precision and Recall are derived quantities reflecting the detection performance, defined as:

  Precision = #TP / (#TP + #FP),    Recall = #TP / (#TP + #FN)
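Assuming the standard definitions Precision = #TP / (#TP + #FP) and Recall = #TP / (#TP + #FN), the counts in the tables below reproduce the tabulated scores, e.g. for the STB@DaVis row at ppp = 0.025:

```python
# Consistency check using a published row (STB@DaVis, ppp = 0.025):
# #TP = 49945, #FP = 1123, #FN = 1162.
tp, fp, fn = 49945, 1123, 1162
precision = tp / (tp + fp)
recall = tp / (tp + fn)
print(round(precision, 3), round(recall, 3))  # 0.978 0.977, as tabulated
```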

Root-mean-square (rms) errors, whether on particle position, velocity or acceleration, are computed over the TPs within the result only. The corresponding units are as follows:

  • Position errors (raw and fitted) are given in voxels, with a voxel size here equal to 30.5 µm
  • Velocity errors are dimensionless, with corresponding reference velocity Vinf = 10 m/s
  • Acceleration errors are dimensionless, with corresponding reference acceleration Ainf = Vinf²/D = 10,000 m/s², where D = 0.01 m is the cylinder diameter (see there for more information on the physical setup).
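As a worked example of these normalizations, the snippet below converts the dimensionless errors of one row (STB/DLR at ppp = 0.025) back to physical units; the variable names are illustrative only.

```python
# Reference scales stated above.
voxel_um = 30.5               # voxel size [micrometres]
v_ref = 10.0                  # reference velocity Vinf [m/s]
d_cyl = 0.01                  # cylinder diameter D [m]
a_ref = v_ref**2 / d_cyl      # reference acceleration Ainf = Vinf^2/D [m/s^2]

# Errors from the STB/DLR row at ppp = 0.025 (see tables below).
pos_err_vox = 0.0354          # fitted position rms error [voxel]
vel_err = 0.00235             # dimensionless velocity rms error
acc_err = 0.058               # dimensionless acceleration rms error

print(a_ref)                  # 10000.0, matching Ainf above
print(pos_err_vox * voxel_um) # position error in micrometres
print(vel_err * v_ref)        # velocity error in m/s
print(acc_err * a_ref)        # acceleration error in m/s^2
```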

Below, we provide tables showing these quantities for all test points. Then, we show figures comparing these quantities among participants, as well as groups of error slices for velocity and acceleration at a given test point.

ppp: 0.025
| Algorithm | Institution | I/U ratio | Mie | Precision | Recall | #TP | #FP | #FN | Raw pos. rms | Pos. rms | X pos. rms | Y pos. rms | Z pos. rms | Vel. rms | X vel. rms | Y vel. rms | Z vel. rms | Acc. rms | X acc. rms | Y acc. rms | Z acc. rms |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| STB | DLR | 0.99 | Off | 1 | 0.986 | 50395 | 3 | 712 | 0.0334 | 0.0354 | 0.018 | 0.0191 | 0.0238 | 0.00235 | 0.00113 | 0.00121 | 0.00167 | 0.058 | 0.0301 | 0.037 | 0.033 |
| STB@DaVis | LaVision GmbH | 0.99 | Off | 0.978 | 0.977 | 49945 | 1123 | 1162 | 0.0741 | 0.129 | 0.0733 | 0.0761 | 0.0735 | 0.0151 | 0.00808 | 0.00888 | 0.00907 | 0.122 | 0.0637 | 0.0725 | 0.0743 |
ppp: 0.05
| Algorithm | Institution | I/U ratio | Mie | Precision | Recall | #TP | #FP | #FN | Raw pos. rms | Pos. rms | X pos. rms | Y pos. rms | Z pos. rms | Vel. rms | X vel. rms | Y vel. rms | Z vel. rms | Acc. rms | X acc. rms | Y acc. rms | Z acc. rms |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| STB | DLR | 0.99 | Off | 1 | 0.986 | 100853 | 0 | 1385 | 0.0342 | 0.0365 | 0.0189 | 0.0191 | 0.0248 | 0.00229 | 0.00119 | 0.00123 | 0.00152 | 0.0571 | 0.032 | 0.0329 | 0.034 |
| STB@DaVis | LaVision GmbH | 0.99 | Off | 0.982 | 0.981 | 100277 | 1834 | 1961 | 0.0784 | 0.119 | 0.067 | 0.07 | 0.0687 | 0.0146 | 0.00808 | 0.0089 | 0.00821 | 0.134 | 0.0711 | 0.0765 | 0.0835 |
ppp: 0.08
| Algorithm | Institution | I/U ratio | Mie | Precision | Recall | #TP | #FP | #FN | Raw pos. rms | Pos. rms | X pos. rms | Y pos. rms | Z pos. rms | Vel. rms | X vel. rms | Y vel. rms | Z vel. rms | Acc. rms | X acc. rms | Y acc. rms | Z acc. rms |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| STB | DLR | 0.99 | Off | 1 | 0.986 | 161394 | 8 | 2216 | 0.0356 | 0.0367 | 0.0193 | 0.0199 | 0.0241 | 0.00225 | 0.00116 | 0.00125 | 0.00148 | 0.055 | 0.0299 | 0.0334 | 0.0319 |
| STB | DLR | 0.99 | On | 0.997 | 0.992 | 162082 | 537 | 1261 | 0.163 | 0.161 | 0.086 | 0.0872 | 0.104 | 0.0073 | 0.00387 | 0.0041 | 0.00464 | 0.127 | 0.0719 | 0.0728 | 0.0751 |
| STB | DLR | 0.46 | On | 0.981 | 0.818 | 175483 | 3481 | 39034 | 0.213 | 0.205 | 0.111 | 0.112 | 0.131 | 0.0116 | 0.00649 | 0.00642 | 0.00708 | 0.221 | 0.13 | 0.128 | 0.126 |
| STB | DLR | 0.46 | Off | 0.999 | 0.998 | 214352 | 117 | 466 | 0.0848 | 0.0798 | 0.0432 | 0.0434 | 0.0512 | 0.0038 | 0.00213 | 0.00225 | 0.0022 | 0.0837 | 0.0497 | 0.0505 | 0.0446 |
| STB@DaVis | LaVision GmbH | 0.46 | On | 0.987 | 0.793 | 170052 | 2301 | 44465 | 0.299 | 0.202 | 0.108 | 0.111 | 0.13 | 0.016 | 0.00865 | 0.00888 | 0.0101 | 0.282 | 0.153 | 0.157 | 0.178 |
| STB@DaVis | LaVision GmbH | 0.46 | Off | 0.996 | 0.994 | 213560 | 764 | 1258 | 0.152 | 0.0966 | 0.0526 | 0.0537 | 0.0608 | 0.00892 | 0.00507 | 0.00524 | 0.00515 | 0.119 | 0.0692 | 0.0683 | 0.0691 |
| STB@DaVis | LaVision GmbH | 0.99 | Off | 0.985 | 0.983 | 160846 | 2395 | 2764 | 0.0844 | 0.115 | 0.0642 | 0.0665 | 0.0687 | 0.0132 | 0.00712 | 0.00799 | 0.00771 | 0.123 | 0.0633 | 0.0743 | 0.0746 |
| STB@DaVis | LaVision GmbH | 0.99 | On | 0.986 | 0.942 | 153797 | 2165 | 9546 | 0.248 | 0.18 | 0.0977 | 0.0994 | 0.114 | 0.0154 | 0.0085 | 0.00897 | 0.00912 | 0.199 | 0.108 | 0.116 | 0.121 |
ppp: 0.12
| Algorithm | Institution | I/U ratio | Mie | Precision | Recall | #TP | #FP | #FN | Raw pos. rms | Pos. rms | X pos. rms | Y pos. rms | Z pos. rms | Vel. rms | X vel. rms | Y vel. rms | Z vel. rms | Acc. rms | X acc. rms | Y acc. rms | Z acc. rms |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| STB | DLR | 0.99 | Off | 1 | 0.999 | 245210 | 7 | 214 | 0.0401 | 0.043 | 0.0222 | 0.0231 | 0.0286 | 0.00291 | 0.00153 | 0.00154 | 0.00194 | 0.0709 | 0.0403 | 0.0408 | 0.0416 |
| STB@DaVis | LaVision GmbH | 0.99 | Off | 0.981 | 0.977 | 239780 | 4552 | 5644 | 0.0961 | 0.123 | 0.0689 | 0.0721 | 0.072 | 0.0156 | 0.0083 | 0.00953 | 0.00918 | 0.169 | 0.0883 | 0.102 | 0.101 |
ppp: 0.16
| Algorithm | Institution | I/U ratio | Mie | Precision | Recall | #TP | #FP | #FN | Raw pos. rms | Pos. rms | X pos. rms | Y pos. rms | Z pos. rms | Vel. rms | X vel. rms | Y vel. rms | Z vel. rms | Acc. rms | X acc. rms | Y acc. rms | Z acc. rms |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| STB | DLR | 0.99 | Off | 1 | 0.999 | 326308 | 75 | 424 | 0.0545 | 0.0581 | 0.0308 | 0.0323 | 0.0371 | 0.0048 | 0.00246 | 0.0029 | 0.00293 | 0.113 | 0.0591 | 0.07 | 0.0665 |
| STB@DaVis | LaVision GmbH | 0.99 | Off | 0.98 | 0.978 | 319411 | 6432 | 7321 | 0.0812 | 0.127 | 0.0717 | 0.0743 | 0.0738 | 0.0151 | 0.0083 | 0.00904 | 0.00875 | 0.145 | 0.075 | 0.0898 | 0.0857 |
ppp: 0.2
| Algorithm | Institution | I/U ratio | Mie | Precision | Recall | #TP | #FP | #FN | Raw pos. rms | Pos. rms | X pos. rms | Y pos. rms | Z pos. rms | Vel. rms | X vel. rms | Y vel. rms | Z vel. rms | Acc. rms | X acc. rms | Y acc. rms | Z acc. rms |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| STB | DLR | 0.99 | Off | 1 | 0.998 | 408071 | 54 | 620 | 0.0555 | 0.0568 | 0.0305 | 0.0312 | 0.0364 | 0.00413 | 0.00207 | 0.0024 | 0.00265 | 0.0899 | 0.046 | 0.0529 | 0.0563 |
| STB@DaVis | LaVision GmbH | 0.99 | Off | 0.988 | 0.977 | 399494 | 4717 | 9197 | 0.131 | 0.126 | 0.0705 | 0.0725 | 0.0753 | 0.0126 | 0.00691 | 0.00761 | 0.00729 | 0.127 | 0.0658 | 0.0751 | 0.0778 |
ppp: 0.25
| Algorithm | Institution | I/U ratio | Mie | Precision | Recall | #TP | #FP | #FN | Raw pos. rms | Pos. rms | X pos. rms | Y pos. rms | Z pos. rms | Vel. rms | X vel. rms | Y vel. rms | Z vel. rms | Acc. rms | X acc. rms | Y acc. rms | Z acc. rms |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| STB | DLR | 0.99 | Off | 1 | 0.998 | 508476 | 95 | 1170 | 0.0793 | 0.077 | 0.0412 | 0.042 | 0.0498 | 0.00551 | 0.00284 | 0.00302 | 0.00363 | 0.119 | 0.0636 | 0.067 | 0.0758 |
