I tried to convert a byte array in order to use its bit representation.
Input:
byte[] values = new byte[]{21, 117, 41, -1};
I would like to create one BitSet object from the whole byte array, but to investigate the issue I split it up and created a separate BitSet object from each element of the array.
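For reference, creating a single BitSet from the full array (which is what I ultimately want) would just be something like this:

BitSet allValues = BitSet.valueOf(values);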
The per-element BitSet objects are created in the following way:
BitSet bitSetTwentyOne = BitSet.valueOf(new byte[]{21});
BitSet bitSetOneHundredSeventeen = BitSet.valueOf(new byte[]{117});
BitSet bitSetFortyOne = BitSet.valueOf(new byte[]{41});
BitSet bitSetMinusOne = BitSet.valueOf(new byte[]{-1});
The bits are printed out by using the following method:
private String getBits(BitSet bitSet) {
    StringBuilder bitsBuilder = new StringBuilder();
    for (int i = 0; i < bitSet.length(); i++) {
        bitsBuilder.append(bitSet.get(i) ? 1 : 0);
    }
    return bitsBuilder.toString();
}
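It is called for each BitSet roughly like this (shown here only to make the output below reproducible):

System.out.println("bitSetTwentyOne: " + getBits(bitSetTwentyOne));
System.out.println("bitSetOneHundredSeventeen: " + getBits(bitSetOneHundredSeventeen));
System.out.println("bitSetFortyOne: " + getBits(bitSetFortyOne));
System.out.println("bitSetMinusOne: " + getBits(bitSetMinusOne));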
Output:
bitSetTwentyOne: 10101 but expected -> 00010101 (BitSet::length = 5)
bitSetOneHundredSeventeen: 1010111 but expected -> 01110101 (BitSet::length = 7)
bitSetFortyOne: 100101 but expected -> 00101001 (BitSet::length = 6)
bitSetMinusOne: 11111111, which at least is as expected (BitSet::length = 8)
I want all values to be 8 bits wide, even if that means padding with leading zeros. I do not understand why the binary values come out wrong when converting 117 and 41.
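To make the target format concrete, this is the kind of output I am after (just a sketch using Integer.toBinaryString to show the desired 8-bit, most-significant-bit-first form, not necessarily how it should be done with BitSet):

// "& 0xFF" avoids sign extension for negative bytes such as -1
String bits = String.format("%8s", Integer.toBinaryString(21 & 0xFF)).replace(' ', '0');
// produces "00010101"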
Also, what exactly does BitSet::length() do?