Convert Unicode character to an array of bytes.
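BitConverter.GetBytes(char) returns a two-byte array holding the character's UTF-16 code unit, ordered according to the machine's endianness.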

 
using System;

class GetBytesCharDemo
{
    // Right-align the input character and its byte representation in columns.
    const string formatter = "{0,10}{1,16}";

    // Convert a char to its two-byte representation and print both.
    public static void GetBytesChar(char argument)
    {
        byte[] byteArray = BitConverter.GetBytes(argument);
        Console.WriteLine(formatter, argument, BitConverter.ToString(byteArray));
    }

    public static void Main()
    {
        GetBytesChar('\0');   // 00-00
        GetBytesChar('A');    // 41-00 on a little-endian system
    }
}
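The conversion also works in reverse: BitConverter.ToChar reads two bytes at a given offset and reassembles the character, and BitConverter.IsLittleEndian reports the byte order that GetBytes used. A minimal round-trip sketch (the class name here is illustrative; the BitConverter calls are standard .NET):

using System;

class CharRoundTripDemo
{
    public static void Main()
    {
        char original = 'A';

        // GetBytes(char) emits the UTF-16 code unit in native byte order.
        byte[] bytes = BitConverter.GetBytes(original);
        Console.WriteLine("IsLittleEndian: {0}", BitConverter.IsLittleEndian);
        Console.WriteLine("Bytes:          {0}", BitConverter.ToString(bytes));

        // ToChar reverses the conversion, reading two bytes at offset 0.
        char restored = BitConverter.ToChar(bytes, 0);
        Console.WriteLine("Restored:       {0}", restored);
    }
}

On a little-endian machine this prints 41-00 and restores 'A'; on a big-endian machine the byte string would be 00-41, but the round trip still succeeds because ToChar reads with the same native order.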

Related examples in the same category

1. Convert byte array to String with BitConverter
2. Convert byte array to Int32
3. BitConverter converts base data types to an array of bytes, and an array of bytes to base data types.
4. Convert bytes to UInt32.
5. Convert various data types to string with BitConverter
6. Convert a double number to a 64-bit signed integer.
7. Convert Boolean value to an array of bytes.
8. Convert double to an array of bytes.
9. Convert 16-bit signed integer value to an array of bytes.
10. Convert 32-bit signed integer value to an array of bytes.
11. Convert 64-bit signed integer value to an array of bytes.
12. Convert single-precision floating point value to an array of bytes.
13. Convert specified 16-bit unsigned integer value to an array of bytes.
14. Convert specified 32-bit unsigned integer value to an array of bytes.
15. Convert specified 64-bit unsigned integer value to an array of bytes.
16. Convert 64-bit signed integer to a double-precision floating point number.
17. Indicates the byte order ("endianness")
18. Convert one byte at a specified position in a byte array to bool
19. Convert two bytes at a specified position in a byte array to a Unicode character
20. Convert two bytes at a specified position in a byte array to a 16-bit signed integer
21. Convert four bytes at a specified position in a byte array to a 32-bit signed integer