RQ Decomposition In Practice

How to use RQ Decomposition to recover your camera’s K, R and C matrices.
Computer Vision
Linear Algebra
Published March 13, 2020

Let’s keep things short and sweet.

Given a camera projection matrix \(P\), we can decompose it into \(K\) (the camera matrix), \(R\) (the rotation matrix) and \(C\) (the camera centre).

That is, given \(P = K[R|-RC]\), we want to find \(K\), \(R\) and \(C\).

The method is described in Multiple View Geometry in Computer Vision (Second Edition), on page 163; however, let’s turn it into a practical Python implementation.

Let’s follow along with an example from the book.

import numpy as np
import scipy.linalg
np.set_printoptions(precision=2)
P = np.array([[3.53553e+2,  3.39645e+2, 2.77744e+2, -1.44946e+6],
              [-1.03528e+2, 2.33212e+1, 4.59607e+2, -6.3252e+5],
              [7.07107e-1, -3.53553e-1, 6.12372e-1, -9.18559e+2]])

So we have \(P = [M | -MC]\), where \(M\) is the left \(3 \times 3\) block of \(P\).

\(M\) can be decomposed as \(M = KR\) using the RQ decomposition, which factors it into an upper-triangular matrix \(K\) (the camera matrix) and an orthogonal matrix \(R\) (the rotation).

M = P[0:3,0:3]
K, R = scipy.linalg.rq(M)
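
As a quick sanity check (continuing the example above), \(K\) should come back upper triangular and the product \(KR\) should reproduce \(M\):

# K should be upper triangular and K @ R should reproduce M
assert np.allclose(K, np.triu(K))
assert np.allclose(K @ R, M)
# R should be orthogonal
assert np.allclose(R @ R.T, np.eye(3))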

So far, so good.

Now things get a little more complex.

We want a camera matrix \(K\) with a positive diagonal, which corresponds to positive focal lengths.

The RQ decomposition doesn't guarantee this, so we build a diagonal sign matrix \(T\) from the signs of \(K\)'s diagonal. Since \(TT = I\), we can write \(KR = (KT)(TR)\), which flips the offending columns of \(K\) and the corresponding rows of \(R\). The determinant check below keeps \(\det(T) = +1\), so the handedness of \(R\) is preserved.

T = np.diag(np.sign(np.diag(K)))
if scipy.linalg.det(T) < 0:
    T[1,1] *= -1

K = np.dot(K,T)
R = np.dot(T,R)
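
Because \(T\) is its own inverse, this sign fix leaves the product unchanged. A quick check, continuing the example:

# The sign flips must not change the product: (K T)(T R) == K R == M
assert np.allclose(K @ R, M)
# R is still orthogonal after the row flips
assert np.allclose(R @ R.T, np.eye(3))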

Finally, we can find the camera centre \(C\).

Let \(P_4\) be the 4th column of \(P\). Then:

\(P_4 = -MC\)

From this, we can find \(C = -M^{-1} P_4\)
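
In code this is a single linear solve. Here is a minimal sketch using np.linalg.solve rather than forming the explicit inverse (the full function below sticks to the book's formula via scipy.linalg.inv):

# Solve -M C = P_4 for the camera centre
C = np.linalg.solve(-M, P[:,3])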

def factorize(P):
    # Left 3x3 block of P: M = K R
    M = P[:,0:3]

    # RQ decomposition: K is upper triangular, R is orthogonal
    K,R = scipy.linalg.rq(M)

    # Flip signs so the diagonal of K is positive,
    # keeping det(T) = +1 so the handedness of R is preserved
    T = np.diag(np.sign(np.diag(K)))

    if scipy.linalg.det(T) < 0:
        T[1,1] *= -1

    K = np.dot(K,T)
    R = np.dot(T,R)

    # Camera centre: C = -M^{-1} P_4
    C = np.dot(scipy.linalg.inv(-M),P[:,3])
    return K, R, C

K,R,C = factorize(P)

print('K')
print(K)

print('R')
print(R)

print('C')
print(C)
K
[[468.16  91.23 300.  ]
 [  0.   427.2  200.  ]
 [  0.     0.     1.  ]]
R
[[ 0.41  0.91  0.05]
 [-0.57  0.22  0.79]
 [ 0.71 -0.35  0.61]]
C
[1000.01 2000.   1499.99]
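
As a final check, reassembling \(K[R|-RC]\) should reproduce \(P\). In general a projection matrix is only defined up to scale, but here the last row of \(M\) is already a unit vector, so the rebuilt matrix should match \(P\) directly (with a loose tolerance, since the published \(P\) is only given to a few significant figures):

# Rebuild P = K [R | -R C] and compare against the original
P_rebuilt = K @ np.hstack([R, (-R @ C).reshape(3,1)])
print(np.allclose(P_rebuilt, P, rtol=1e-3))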

Voilà!

This presentation is a great read and provides a good overview of the RQ and QR decompositions.
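
If you only have NumPy available, here is a minimal sketch of the usual row-reversal trick for building an RQ decomposition out of np.linalg.qr; rq_from_qr is just a name made up for illustration, and its sign conventions may differ from scipy.linalg.rq:

def rq_from_qr(M):
    # Reverse the rows of M, QR-decompose the transpose,
    # then undo the reversal to get M = R_upper @ Q_orth
    J = np.flipud(np.eye(M.shape[0]))   # row-reversal permutation
    Q_, R_ = np.linalg.qr((J @ M).T)
    R_upper = J @ R_.T @ J              # upper-triangular factor
    Q_orth = J @ Q_.T                   # orthogonal factor
    return R_upper, Q_orth

K2, R2 = rq_from_qr(P[:,0:3])
print(np.allclose(K2 @ R2, P[:,0:3]))   # True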