I am trying to use the Python OpenCV function cv.Moments() on a grayscale image, but I receive the following TypeError:
>>> img = cv.LoadImage('example_image.jpg', cv.CV_LOAD_IMAGE_GRAYSCALE)
>>> moments = cv.Moments(img)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: Argument '' must be CvSeq, CvArr, or a sequence of numbers
>>>
I am confident this usage is correct, as it is demonstrated in the OpenCV docs here, where cv.GetHuMoments() uses the result of cv.Moments().
I believe OpenCV and NumPy are installed correctly, as I have been using them successfully for many other tasks, and I encounter this error on both OS X 10.6 and Red Hat 6.
The same question is posed in the OpenCV user group, but I don't want to convert the image to a contour first, as the reply there suggests.
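For context, what I ultimately need are the raw image moments m_pq = Σ_x Σ_y x^p y^q I(x, y) that cv.Moments() computes over the whole image (not over a contour). As a sanity check of what I expect, here is a minimal NumPy sketch; raw_moment is a hypothetical helper for illustration, not part of OpenCV's API:

```python
import numpy as np

def raw_moment(img, p, q):
    """Raw image moment m_pq = sum over pixels of x**p * y**q * I(x, y).

    Hypothetical helper for illustration only; not an OpenCV function.
    """
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w]  # per-pixel coordinate grids
    return float((x**p * y**q * img).sum())

# Tiny grayscale test image: a single bright pixel at (x=2, y=1).
img = np.zeros((3, 4))
img[1, 2] = 5.0

m00 = raw_moment(img, 0, 0)    # total intensity
m10 = raw_moment(img, 1, 0)    # x-weighted sum
m01 = raw_moment(img, 0, 1)    # y-weighted sum
cx, cy = m10 / m00, m01 / m00  # intensity centroid -> (2.0, 1.0)
```

This is the behavior I expect from cv.Moments() on the full grayscale image, which is why converting to a contour first is not what I want.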