# NumPy's pinv seems slow?


## Problem Description

I was playing around with NumPy's `pinv`, and it seems quite slow compared to a homebrewed function:

```python
import numpy as np

def moore_penrose_inverse(data: np.ndarray) -> np.ndarray:
    if data.shape[0] > data.shape[1]:
        results = np.linalg.inv(np.dot(data.T, data)) @ data.T
    elif data.shape[0] < data.shape[1]:
        results = data.T @ np.linalg.inv(np.dot(data, data.T))
    else:
        results = np.linalg.inv(data)
    return results
```

It seems to be way slower:

```python
data = np.random.random((500, 1000))

%timeit np.linalg.pinv(data)
335 ms ± 57.1 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)

%timeit moore_penrose_inverse(data)
42.6 ms ± 1.64 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
```

And they do return the same result:

```python
np.allclose(np.linalg.pinv(data), moore_penrose_inverse(data))
True
np.allclose(np.linalg.pinv(data.T), moore_penrose_inverse(data.T))
True
```

For larger datasets, `pinv` also seemed to be around a factor of 10 slower than the homebrewed version.

I know NumPy uses the SVD to calculate the pseudoinverse, but what are the benefits of this, and is it worth the loss of speed?
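For context, the SVD-based computation that `pinv` performs can be sketched by hand: decompose, invert the singular values above a small relative cutoff, and recombine. The `1e-15` cutoff below is an assumption mirroring `pinv`'s traditional default `rcond`, not an exact reproduction of its internals:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.random((500, 1000))

# pinv computes an SVD and inverts only the singular values above a cutoff
u, s, vt = np.linalg.svd(data, full_matrices=False)
cutoff = 1e-15 * s.max()                      # sketch of the default relative cutoff
s_inv = np.where(s > cutoff, 1.0 / s, 0.0)    # small singular values are zeroed, not inverted
manual_pinv = (vt.T * s_inv) @ u.T            # V @ diag(1/s) @ U^T

# For this full-rank random matrix, nothing is cut off,
# so the result matches np.linalg.pinv:
print(np.allclose(manual_pinv, np.linalg.pinv(data)))
```

The zeroing step is the point of the SVD route: it keeps the result finite and meaningful even when some singular values are (numerically) zero, which an explicit `inv` cannot do.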

## Solution – 1

Based on my understanding of the Moore–Penrose inverse, NumPy's code for both `inv` and `pinv`, and profiling results, I can say that the performance gap comes from the computation of the singular value decomposition, which takes about 96% of the time (at least on my machine).
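A rough way to check this yourself (a cruder measurement than real profiling, and the exact ratio will vary by machine and BLAS/LAPACK build) is to time the SVD alone against the full `pinv` call:

```python
import numpy as np
from timeit import timeit

data = np.random.default_rng(0).random((500, 1000))

# If the SVD dominates pinv's runtime, these two timings should be close.
t_svd = timeit(lambda: np.linalg.svd(data, full_matrices=False), number=5) / 5
t_pinv = timeit(lambda: np.linalg.pinv(data), number=5) / 5
print(f"svd alone: {t_svd:.4f} s per call, full pinv: {t_pinv:.4f} s per call")
```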

I do not have a strong mathematical background here, so please correct me if I am wrong: AFAIK, the SVD is what lets `pinv` produce correct, numerically stable results in exactly the cases where a generalized inverse is actually needed. The random matrix in the example is not such a pathological case, so a basic inverse can be used instead. This comes down to the rank of the input: random matrices are full-rank with overwhelming probability, while the SVD-based method is useful precisely when the matrix is rank-deficient.
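To illustrate the rank argument with a constructed (hypothetical) rank-deficient input: the SVD-based `pinv` still returns a valid pseudoinverse, while the homebrew shortcut depends on `inv(A @ A.T)`, and that Gram matrix is singular here, so `inv` either raises `LinAlgError` or returns numerically meaningless values:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical rank-deficient case: a 200x400 matrix of rank 20
low_rank = rng.random((200, 20)) @ rng.random((20, 400))

# The SVD-based pinv still returns a valid Moore-Penrose inverse;
# the defining property A @ A+ @ A == A holds:
p = np.linalg.pinv(low_rank)
print(np.allclose(low_rank @ p @ low_rank, low_rank))

# The shortcut needs inv(A @ A.T), but this 200x200 Gram matrix
# only has rank 20, so it is singular and inv() cannot be trusted:
gram = low_rank @ low_rank.T
print(np.linalg.matrix_rank(gram))  # 20, far below full rank 200
```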
