
I tried to convert a byte array in order to use its bit representation.

Input:

byte[] values = new byte[]{21, 117, 41, -1};

I would like to create one BitSet object from the whole byte array, but to investigate the issue I split it up and created a separate BitSet object from each element of the array.

The BitSet objects are created in the following way:

BitSet bitSetTwentyOne = BitSet.valueOf(new byte[]{21});
BitSet bitSetOneHundredSevenTeen = BitSet.valueOf(new byte[]{117});
BitSet bitSetFourtyOne = BitSet.valueOf(new byte[]{41});
BitSet bitSetMinusOne = BitSet.valueOf(new byte[]{-1});

The bits are printed out by using the following method:

private String getBits(BitSet bitSet) {
    StringBuilder bitsBuilder = new StringBuilder();

    for (int i = 0; i < bitSet.length(); i++) {
        bitsBuilder.append(bitSet.get(i) ? 1 : 0);
    }

    return bitsBuilder.toString();
}

Output:

bitSetTwentyOne: 10101 but expected -> 00010101 (BitSet::length = 5)
bitSetOneHundredSevenTeen: 1010111 but expected -> 01110101 (BitSet::length = 7)
bitSetFourtyOne: 100101 but expected -> 00101001 (BitSet::length = 6)
bitSetMinusOne: 11111111 but it is as expected at least (BitSet::length = 8)

I want all values to be 8 bits wide, even if that means padding with zeros. I do not understand why it gives wrong binary values when converting 117 and 41.

2 Comments
  • What does BitSet::length() do? Commented Feb 12, 2019 at 21:12
  • Stored as little-endian but printed big-endian - that's why. Commented Feb 12, 2019 at 21:19

2 Answers

  • length() returns the logical length: the index of the highest set bit, plus one.
  • size() returns the capacity of the internal storage in bits, which is always a multiple of 64 (64 here, since BitSet stores bits in 64-bit words), not 8.
  • cardinality() returns the number of set bits.

None of these gives you a fixed width of 8, so for an 8-bit view you should loop over indices 0 to 7 yourself.
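A quick check of the three methods on the 21 case (the class name BitSetMethods is just for illustration; the values follow from how java.util.BitSet stores bits in 64-bit words):

```java
import java.util.BitSet;

public class BitSetMethods {
    public static void main(String[] args) {
        BitSet bits = BitSet.valueOf(new byte[]{21}); // 21 = 0b10101 -> bits 0, 2, 4 set

        System.out.println(bits.length());      // 5  -> highest set bit is index 4
        System.out.println(bits.size());        // 64 -> one 64-bit word of storage
        System.out.println(bits.cardinality()); // 3  -> three bits are set
    }
}
```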

And then the bits are stored in little-endian order (bit 0 is the least significant bit), and your loop reversed them by printing bit 0 first.

private String getBits(BitSet bitSet) {
    StringBuilder bitsBuilder = new StringBuilder();
    for (int i = 0; i < 8; ++i) { // fixed 8-bit width; unset bits read as false
        bitsBuilder.insert(0, bitSet.get(i) ? '1' : '0'); // prepend, so bit 0 ends up last
    }
    return bitsBuilder.toString();
}

2 Comments

insert(0, ...) has poor performance because all the elements have to be shifted every time. It is better to allocate bitSet.size() and then use StringBuilder.setCharAt.
@Clashsoft thanks for mentioning that; implementation matters. In fact I originally intended to change the code to use a new char[8] instead of a StringBuilder.
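A sketch of that char[8] variant (the class and helper names Bits and getBits8 are illustrative): fill a fixed-size array with '0' and set position 7 - i for each set bit i, so bit 0 lands on the right.

```java
import java.util.Arrays;
import java.util.BitSet;

public class Bits {
    // Render the low 8 bits of a BitSet, most significant bit first.
    static String getBits8(BitSet bitSet) {
        char[] bits = new char[8];
        Arrays.fill(bits, '0'); // leading zeros come for free
        for (int i = 0; i < 8; i++) {
            if (bitSet.get(i)) {
                bits[7 - i] = '1'; // bit 0 is least significant -> rightmost
            }
        }
        return new String(bits);
    }

    public static void main(String[] args) {
        System.out.println(getBits8(BitSet.valueOf(new byte[]{21})));  // 00010101
        System.out.println(getBits8(BitSet.valueOf(new byte[]{117}))); // 01110101
        System.out.println(getBits8(BitSet.valueOf(new byte[]{41})));  // 00101001
        System.out.println(getBits8(BitSet.valueOf(new byte[]{-1})));  // 11111111
    }
}
```

This writes each character exactly once, avoiding the repeated shifting that insert(0, ...) causes.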

It appears that you are printing the bits backwards. Notice how your output (except for the desired leading zeros) is a mirror image of your expected output.

Using your code, passing in 4, which should be 100, I get 001. It appears that bit 0 is the least significant bit, not the most significant bit. To correct this, loop over the indices backwards. The reason that -1 and 21 came out correct apart from the leading zeros is that their bit representations (11111111 and 10101) are palindromes, so reversing them changes nothing.

To prepend leading zeros, subtract the length() from 8 and print that many zeros. The length() method only returns the number of bits necessary to represent the number. You need to code this yourself; there is no BitSet functionality for leading zeros.
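Putting both steps together, a sketch (the class name LeadingZeros is illustrative): first emit 8 - length() zeros, then walk the remaining indices backwards so the most significant bit is printed first.

```java
import java.util.BitSet;

public class LeadingZeros {
    // Print 8 bits, most significant first.
    static String getBits(BitSet bitSet) {
        StringBuilder sb = new StringBuilder();
        for (int i = bitSet.length(); i < 8; i++) {
            sb.append('0'); // leading zeros that length() does not cover
        }
        for (int i = bitSet.length() - 1; i >= 0; i--) {
            sb.append(bitSet.get(i) ? '1' : '0'); // backwards: bit 0 printed last
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(getBits(BitSet.valueOf(new byte[]{117}))); // 01110101
        System.out.println(getBits(BitSet.valueOf(new byte[]{41})));  // 00101001
    }
}
```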
