
I have a function that converts a string to hex, like this:

public static string ConvertToHex(string asciiString)
{
    string hex = "";
    foreach (char c in asciiString)
    {
         int tmp = c;
         hex += String.Format("{0:x2}", (uint)System.Convert.ToUInt32(tmp.ToString()));
    }
    return hex;
}

Could you please help me write a similar string-to-binary function based on my sample function?

public static string ConvertToBin(string asciiString)
{
    string bin = "";
    foreach (char c in asciiString)
    {
        int tmp = c;
        bin += String.Format("{0:x2}", (uint)System.Convert.????(tmp.ToString()));
    }
    return bin;
}
  • char => int => string => uint => uint (again?) … whoa! You’ve lost me there. Commented Apr 14, 2011 at 13:59
  • You seem to think that it is ToUInt32 that is doing the conversion to hex, but it is actually the x2 format specifier passed to String.Format. Unfortunately, I don't think there is a b8 format specifier. Commented Apr 14, 2011 at 14:02
  • You can implement ICustomFormatter, as shown in the MSDN link. Commented Oct 12, 2016 at 6:54
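
To make the b8 point above concrete: since String.Format has no binary format specifier, one way to complete the asker's ConvertToBin is Convert.ToString(value, 2) padded to eight digits. A minimal sketch, assuming (like the hex version) that all characters fit in the 0-255 range:

public static string ConvertToBin(string asciiString)
{
    string bin = "";
    foreach (char c in asciiString)
    {
        int tmp = c;
        // Convert.ToString(value, 2) renders the integer in base 2;
        // PadLeft keeps each byte a fixed eight characters wide.
        // Assumes characters stay in the 0-255 range, matching the x2 hex version.
        bin += Convert.ToString(tmp, 2).PadLeft(8, '0');
    }
    return bin;
}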

4 Answers


Here you go:

using System;
using System.Linq;
using System.Text;

public static byte[] ConvertToByteArray(string str, Encoding encoding)
{
    return encoding.GetBytes(str);
}

public static string ToBinary(byte[] data)
{
    // Eight binary digits per byte, space-separated.
    return string.Join(" ", data.Select(byt => Convert.ToString(byt, 2).PadLeft(8, '0')));
}

// Use any sort of encoding you like.
var binaryString = ToBinary(ConvertToByteArray("Welcome, World!", Encoding.ASCII));

7 Comments

  • Please use System.Text.UTF8Encoding.
  • His example uses asciiString as the parameter name. I don't know what format the binary array should be either. But you can change the encoding on demand.
  • @JSBangs The OP does seem to want to use ASCII. But you’re right, that’s not what the original code does, and it probably wouldn’t work either. But using UTF8 does something different yet. The equivalent to the OP’s code would be to use Unicode.
  • The OP used the variable name "asciiString", but this does not change the fact that the string is UTF-16 LE (because that's what string always is). In my opinion, the only reason to ever use a non-Unicode encoding is in the thin interface layer to a legacy system that cannot be changed. And even then, only until the legacy system can be replaced. Now the OP may be saying that the characters in asciiString are restricted to the ASCII range (7-bit values). If that is the case, the UTF-8 solution will be identical to the ASCII solution, so UTF-8 should be used anyway.
  • @JSBangs Right, I totally agree with that. My comment was more in the direction that UTF8 is probably also wrong, or at least not what the code currently does for any codepoint > 127.
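
To make the encoding discussion above concrete, here is a small sketch (the sample character is chosen for illustration) showing how the encoding choice changes the bytes, and therefore the binary output, for any codepoint above 127:

using System;
using System.Text;

// 'é' (U+00E9) encodes differently under each encoding:
string s = "é";
Console.WriteLine(BitConverter.ToString(Encoding.UTF8.GetBytes(s)));    // C3-A9 (two bytes)
Console.WriteLine(BitConverter.ToString(Encoding.Unicode.GetBytes(s))); // E9-00 (UTF-16 LE code unit)
Console.WriteLine(BitConverter.ToString(Encoding.ASCII.GetBytes(s)));   // 3F, i.e. '?' (unmappable)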

It sounds like you basically want to convert an ASCII string, or preferably a byte[] (since you can encode your string to a byte[] using your preferred encoding), into a string of ones and zeros? i.e. 101010010010100100100101001010010100101001010010101000010111101101010

This will do that for you...

//Formats a byte[] into a binary string (010010010010100101010)
public string Format(byte[] data)
{
    //storage for the resulting string
    string result = string.Empty;
    //iterate through the byte[]
    foreach(byte value in data)
    {
        //storage for the individual byte
        string binarybyte = Convert.ToString(value, 2);
        //if binarybyte is not 8 characters long, it isn't a full byte yet
        while(binarybyte.Length < 8)
        {
            //prepend the value with a 0
            binarybyte = "0" + binarybyte;
        }
        //append the binarybyte to the result
        result += binarybyte;
    }
    //return the result
    return result;
}
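
A quick usage sketch, assuming Format is in scope (it is an instance method here) and using an example input:

// "Hi" in ASCII is 0x48 0x69.
byte[] data = System.Text.Encoding.ASCII.GetBytes("Hi");
string bits = Format(data); // "0100100001101001"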



Here's an extension function:

// Note: extension methods must be declared in a static class.
public static string ToBinary(this string data, bool formatBits = false)
{
    // Eight output chars per input char, plus separators if requested.
    char[] buffer = new char[(data.Length * 8) + (formatBits ? (data.Length - 1) : 0)];
    int index = 0;
    for (int i = 0; i < data.Length; i++)
    {
        string binary = Convert.ToString(data[i], 2).PadLeft(8, '0');
        for (int j = 0; j < 8; j++)
        {
            buffer[index] = binary[j];
            index++;
        }
        if (formatBits && i < (data.Length - 1))
        {
            buffer[index] = ' ';
            index++;
        }
    }
    return new string(buffer);
}

You can use it like:

Console.WriteLine("Testing".ToBinary());

and if you pass true as the formatBits argument, it will automatically separate each byte's binary sequence with a space.
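
For instance, with an example input (output shown for illustration):

// 'H' = 01001000, 'i' = 01101001
Console.WriteLine("Hi".ToBinary());     // 0100100001101001
Console.WriteLine("Hi".ToBinary(true)); // 01001000 01101001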



The following will give you the hex encoding for the low byte of each character, which looks like what you're asking for:

using System.Text;

// Wrapped in a method so the snippet compiles; the method name is illustrative.
public static string ToHexLowBytes(string asciiString)
{
    StringBuilder sb = new StringBuilder();
    foreach (char c in asciiString)
    {
        uint i = (uint)c;
        sb.AppendFormat("{0:X2}", (i & 0xff));
    }
    return sb.ToString();
}
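
A usage sketch, using the method name from the wrapper above:

// 'H' = 0x48, 'e' = 0x65, 'l' = 0x6C, 'o' = 0x6F
string hex = ToHexLowBytes("Hello"); // "48656C6C6F"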

