I'm trying to allocate a Java char array from Jython, which will then be populated by a Java library. I want to do the equivalent of this Java declaration from Jython:
char[] charBuffer = new char[charCount];
I've read the documentation for both the array and jarray modules (I think they're essentially the same), but I'm not entirely sure which type code I want to use. The two documents seem slightly contradictory, but the newer array module documentation seems more authoritative.
According to the Java documentation, a char is a "16-bit Unicode character", i.e. 2 bytes.
So if I check the following type codes:
>>> array.array('c').itemsize # C char, Python character
1
>>> array.array('b').itemsize # C signed char, Python int
1
>>> array.array('B').itemsize # C unsigned char, Python int
2
>>> array.array('u').itemsize # C Py_UNICODE, Python unicode character
4
>>> array.array('h').itemsize # C signed short, Python int
2
>>> array.array('H').itemsize # C unsigned short, Python int
4
It seems odd to me that the sizes of B and H are twice those of their signed counterparts b and h. Can I safely and reliably use the 16-bit B (unsigned char) or h (signed short) for a Java char? Or, if using the array module is completely wrong for this, please let me know.
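For context, here's the kind of allocation I'm considering. The jarray route is what I'd expect to use if the array type codes don't pan out; I'm assuming jarray's 'c' type code maps to Java char, based on my reading of the jarray docs. The CPython fallback branch is only there so the snippet runs outside Jython for comparison:

```python
char_count = 256

try:
    # Under Jython: allocate a real Java char[] (assuming jarray's
    # 'c' type code maps to Java char, per the jarray docs).
    import jarray
    char_buffer = jarray.zeros(char_count, 'c')
except ImportError:
    # CPython fallback for comparison: 16-bit signed slots via 'h'.
    import array
    char_buffer = array.array('h', [0] * char_count)

print(len(char_buffer))  # 256
```

Either way I end up with a zero-filled, fixed-length buffer of char_count elements; the open question is which of these the Java library will actually accept as a char[].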