Publisher: Information and Media Technologies Editorial Board
Abstract: The best-known method for optimally computing parameters from noisy data subject to geometric constraints is maximum likelihood (ML). This paper reinvestigates "hyperaccurate correction" for further improving the accuracy of ML. In the past, only the case of a single scalar constraint was studied; in this paper, we extend it to multiple constraints given in the form of vector equations. Through detailed error analysis, we reveal the existence of a term that has been overlooked in the past. Simulation experiments on ellipse fitting, fundamental matrix computation, and homography computation show that this new term has practically no effect on the final solution. However, our hyperaccurate correction is still superior to hyper-renormalization, the latest method regarded as the best fitting method, although the iterations of the ML computation do not necessarily converge in the presence of large noise.
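To make the problem setting concrete, the sketch below illustrates the kind of constraint fitting the abstract refers to, using ellipse fitting as the example. It implements only the simple algebraic least-squares baseline for the conic constraint ξ(x, y)·θ = 0 with ‖θ‖ = 1, not the paper's ML or hyperaccurate correction; the function names, the scale constant f0, and the test data are assumptions for illustration.

```python
# Minimal sketch (assumed setup, not the paper's method): fit an ellipse to
# noisy points by ordinary least squares on the conic constraint
#   A x^2 + 2 B x y + C y^2 + 2 f0 (D x + E y) + f0^2 F = 0,
# written as xi(x, y) . theta = 0 with ||theta|| = 1. ML and hyperaccurate
# correction would refine this kind of baseline estimate.

import numpy as np

def conic_carrier(x, y, f0=1.0):
    """Carrier vector xi(x, y) of the quadratic (conic) constraint."""
    return np.array([x * x, 2 * x * y, y * y,
                     2 * f0 * x, 2 * f0 * y, f0 * f0])

def fit_conic_lsq(points, f0=1.0):
    """Algebraic least-squares fit: minimize sum (xi . theta)^2 over unit theta.
    The minimizer is the unit eigenvector of M = sum xi xi^T belonging to
    the smallest eigenvalue."""
    M = np.zeros((6, 6))
    for x, y in points:
        xi = conic_carrier(x, y, f0)
        M += np.outer(xi, xi)
    eigvals, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
    return eigvecs[:, 0]                   # eigenvector of the smallest one

if __name__ == "__main__":
    # Hypothetical noisy samples from the ellipse x^2/4 + y^2 = 1.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
    pts = np.c_[2.0 * np.cos(t), np.sin(t)] + 0.01 * rng.standard_normal((60, 2))
    print("fitted conic parameters:", np.round(fit_conic_lsq(pts), 4))
```

The same carrier-vector formulation extends to fundamental matrix and homography estimation, where the constraints become vector equations rather than a single scalar one, which is the extension the paper addresses.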