I'm looking for a way to remove jitter from a project that tracks two ArUco marker boards. The camera is static and the two boards move relative to one another. I have been reading about extended Kalman filters, and it appears a multiplicative EKF (MEKF) could be the solution. The link below summarises a quaternion-based MEKF that estimates orientation from an IMU.
https://matthewhampsey.github.io/blog/2020/07/18/mekf
My source transformations are the rotation vectors and translation vectors from OpenCV's solvePnP function.
I have recorded the rvecs and tvecs from the two markers as lists of arrays stored in the attached pickle files (p1 = probe 1, p2 = probe 2).
Data can be downloaded here:
https://drive.google.com/drive/folders/1DL8rnc3rn4UqJUYZVHuUk2Rd4FzvYgCX?usp=sharing
The jitter is present in both the rvec and tvec and I am unsure how to implement the MEKF.
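As far as I can tell, the first step towards the quaternion-based MEKF is converting solvePnP's rotation vectors to quaternions and back. If I've understood correctly, scipy's Rotation class handles this directly (assuming scipy is acceptable as a dependency; the function names below are just my own):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def rvec_to_quat(rvec):
    # solvePnP's rvec is an axis-angle rotation vector; scipy maps it
    # straight to a unit quaternion in (x, y, z, w) order
    return Rotation.from_rotvec(np.asarray(rvec).reshape(3)).as_quat()

def quat_to_rvec(q):
    # inverse mapping, back to the rotation-vector form solvePnP emits
    return Rotation.from_quat(q).as_rotvec()
```

That would let the filter work in quaternion space while keeping rvecs at the input and output.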
The gif below is a mockup of the two objects moving around, with very visible jitter. (The rvecs and tvecs come from a factory-calibrated machine vision camera streaming 4K, so the input image for OpenCV is about as good as it gets.)
Below is some boilerplate code to experiment with the dataset and see the jitter.
import numpy as np
import open3d as o3d
import cv2
import time

def convert_to_open3d_affine(r, t):
    # takes rvec and tvec and builds a 4x4 affine transform
    # combining rotation and translation
    R, _ = cv2.Rodrigues(r)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t.reshape(3)
    # flip the y and z rows to go from the OpenCV camera frame to Open3D's
    T[1:3] = -T[1:3]
    return T
# load saved OpenCV tracking data
import pickle

with open('p1_rvec_store.pickle', 'rb') as handle:
    probe_1_rvec_store = pickle.load(handle)
with open('p1_tvec_store.pickle', 'rb') as handle:
    probe_1_tvec_store = pickle.load(handle)
with open('p2_rvec_store.pickle', 'rb') as handle:
    probe_2_rvec_store = pickle.load(handle)
with open('p2_tvec_store.pickle', 'rb') as handle:
    probe_2_tvec_store = pickle.load(handle)
with open('probe_1_vertices.pickle', 'rb') as handle:
    probe_1_vertices = pickle.load(handle)
with open('probe_2_vertices.pickle', 'rb') as handle:
    probe_2_vertices = pickle.load(handle)
# load point clouds of the tracked objects
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(probe_1_vertices)
pcd2 = o3d.geometry.PointCloud()
pcd2.points = o3d.utility.Vector3dVector(probe_2_vertices)

# set up the visualiser
vis = o3d.visualization.Visualizer()
vis.create_window(height=480, width=640)
vis.add_geometry(pcd)
vis.add_geometry(pcd2)
while True:
    for pr, pt, sr, st in zip(probe_1_rvec_store, probe_1_tvec_store,
                              probe_2_rvec_store, probe_2_tvec_store):
        time.sleep(0.05)
        # reset to the untransformed vertices each frame
        pcd.points = o3d.utility.Vector3dVector(probe_1_vertices)
        pcd2.points = o3d.utility.Vector3dVector(probe_2_vertices)
        # need to insert the Kalman filter here, taking pr, pt, sr and st as input
        # Open3D's transform takes a 4x4 affine and modifies the geometry in place
        pcd.transform(convert_to_open3d_affine(pr, pt))
        pcd2.transform(convert_to_open3d_affine(sr, st))
        vis.update_geometry(pcd)
        vis.update_geometry(pcd2)
        vis.poll_events()
        vis.update_renderer()
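For what it's worth, the crudest smoothing I can think of to drop in at that comment is not a Kalman filter at all: exponential smoothing of the tvec plus a normalised quaternion lerp towards each new rotation. It is only a baseline to compare a proper MEKF answer against, and the class/parameter names are my own placeholders, not from any library:

```python
import numpy as np
from scipy.spatial.transform import Rotation

class PoseSmoother:
    """Naive baseline, not an MEKF: EMA on the translation and a
    normalised quaternion lerp (fine for small inter-frame rotations)."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # blend weight given to each new measurement
        self.rot = None     # smoothed rotation (scipy Rotation)
        self.t = None       # smoothed translation

    def update(self, rvec, tvec):
        r_new = Rotation.from_rotvec(np.asarray(rvec, dtype=float).reshape(3))
        t_new = np.asarray(tvec, dtype=float).reshape(3)
        if self.rot is None:
            self.rot, self.t = r_new, t_new
        else:
            q_old = self.rot.as_quat()
            q_new = r_new.as_quat()
            if np.dot(q_old, q_new) < 0:
                q_new = -q_new  # take the short way round
            q = (1 - self.alpha) * q_old + self.alpha * q_new
            self.rot = Rotation.from_quat(q / np.linalg.norm(q))
            self.t = (1 - self.alpha) * self.t + self.alpha * t_new
        return self.rot.as_rotvec(), self.t
```

In the render loop that would be something like `pr, pt = smoother_1.update(pr, pt)` just before building the affine, with one smoother per probe. It lags behind fast motion, which is exactly why I'd like a proper MEKF instead.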
Any help, ideally with some Python code to build on, would be most welcome, as my knowledge of code is rather less weak than my knowledge of the maths behind this!
