Encoding.GetByteCount Method

Definition

When overridden in a derived class, calculates the number of bytes produced by encoding a set of characters.

Overloads

GetByteCount(Char[], Int32, Int32)

When overridden in a derived class, calculates the number of bytes produced by encoding a set of characters from the specified character array.

GetByteCount(String, Int32, Int32)

When overridden in a derived class, calculates the number of bytes produced by encoding a set of characters from the specified string.

GetByteCount(Char*, Int32)

When overridden in a derived class, calculates the number of bytes produced by encoding a set of characters starting at the specified character pointer.

GetByteCount(ReadOnlySpan<Char>)

When overridden in a derived class, calculates the number of bytes produced by encoding the characters in the specified character span.

GetByteCount(Char[])

When overridden in a derived class, calculates the number of bytes produced by encoding all the characters in the specified character array.

GetByteCount(String)

When overridden in a derived class, calculates the number of bytes produced by encoding the characters in the specified string.

GetByteCount(Char[], Int32, Int32)

Source:
Encoding.cs

When overridden in a derived class, calculates the number of bytes produced by encoding a set of characters from the specified character array.

public abstract int GetByteCount (char[] chars, int index, int count);

Parameters

chars
Char[]

The character array containing the set of characters to encode.

index
Int32

The index of the first character to encode.

count
Int32

The number of characters to encode.

Returns

The number of bytes produced by encoding the specified characters.

Exceptions

ArgumentNullException

chars is null.

ArgumentOutOfRangeException

index or count is less than zero.

-or-

index and count do not denote a valid range in chars.

EncoderFallbackException

A fallback occurred (for more information, see Character Encoding in .NET).

-and-

EncoderFallback is set to EncoderExceptionFallback.

Examples

The following example determines the number of bytes required to encode three characters from a character array, encodes the characters, and displays the resulting bytes.

using System;
using System.Text;

public class SamplesEncoding  {

   public static void Main()  {

      // The characters to encode:
      //    Latin Small Letter Z (U+007A)
      //    Latin Small Letter A (U+0061)
      //    Combining Breve (U+0306)
      //    Latin Small Letter AE With Acute (U+01FD)
      //    Greek Small Letter Beta (U+03B2)
      //    a high-surrogate value (U+D8FF)
      //    a low-surrogate value (U+DCFF)
      char[] myChars = new char[] { 'z', 'a', '\u0306', '\u01FD', '\u03B2', '\uD8FF', '\uDCFF' };

      // Get different encodings.
      Encoding  u7    = Encoding.UTF7;
      Encoding  u8    = Encoding.UTF8;
      Encoding  u16LE = Encoding.Unicode;
      Encoding  u16BE = Encoding.BigEndianUnicode;
      Encoding  u32   = Encoding.UTF32;

      // Encode three characters starting at index 4, and print out the counts and the resulting bytes.
      PrintCountsAndBytes( myChars, 4, 3, u7 );
      PrintCountsAndBytes( myChars, 4, 3, u8 );
      PrintCountsAndBytes( myChars, 4, 3, u16LE );
      PrintCountsAndBytes( myChars, 4, 3, u16BE );
      PrintCountsAndBytes( myChars, 4, 3, u32 );
   }

   public static void PrintCountsAndBytes( char[] chars, int index, int count, Encoding enc )  {

      // Display the name of the encoding used.
      Console.Write( "{0,-30} :", enc.ToString() );

      // Display the exact byte count.
      int iBC  = enc.GetByteCount( chars, index, count );
      Console.Write( " {0,-3}", iBC );

      // Display the maximum byte count.
      int iMBC = enc.GetMaxByteCount( count );
      Console.Write( " {0,-3} :", iMBC );

      // Encode the array of chars.
      byte[] bytes = enc.GetBytes( chars, index, count );

      // The following is an alternative way to encode the array of chars:
      // byte[] bytes = new byte[iBC];
      // enc.GetBytes( chars, index, count, bytes, bytes.GetLowerBound(0) );

      // Display all the encoded bytes.
      PrintHexBytes( bytes );
   }

   public static void PrintHexBytes( byte[] bytes )  {

      if (( bytes == null ) || ( bytes.Length == 0 ))  {
         Console.WriteLine( "<none>" );
      }
      else  {
         for ( int i = 0; i < bytes.Length; i++ )
            Console.Write( "{0:X2} ", bytes[i] );
         Console.WriteLine();
      }
   }
}


/* 
This code produces the following output.

System.Text.UTF7Encoding       : 10  11  :2B 41 37 4C 59 2F 39 7A 2F 2D
System.Text.UTF8Encoding       : 6   12  :CE B2 F1 8F B3 BF
System.Text.UnicodeEncoding    : 6   8   :B2 03 FF D8 FF DC
System.Text.UnicodeEncoding    : 6   8   :03 B2 D8 FF DC FF
System.Text.UTF32Encoding      : 8   16  :B2 03 00 00 FF FC 04 00

*/

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, call the GetByteCount method. To calculate the maximum array size, call the GetMaxByteCount method. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.
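
The following minimal sketch (not part of the original sample) contrasts the two sizing strategies for this overload; the character values are borrowed from the example below.

using System;
using System.Text;

public class ByteCountSizingSketch  {

   public static void Main()  {

      char[] chars = new char[] { 'z', 'a', '\u0306', '\u01FD', '\u03B2' };
      Encoding utf8 = Encoding.UTF8;

      // Exact size: requires scanning the input, but wastes no memory.
      byte[] exact = new byte[utf8.GetByteCount( chars, 1, 3 )];
      int written = utf8.GetBytes( chars, 1, 3, exact, 0 );

      // Worst case: no scan of the input, but the buffer may be larger than needed.
      byte[] worstCase = new byte[utf8.GetMaxByteCount( 3 )];

      Console.WriteLine( "exact = {0} bytes, worst case = {1} bytes, written = {2}",
                         exact.Length, worstCase.Length, written );
   }
}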

The GetByteCount method determines how many bytes result from encoding a set of Unicode characters, and the GetBytes method performs the actual encoding. The GetBytes method expects discrete conversions, in contrast to the Encoder.GetBytes method, which handles multiple conversions on a single input stream.

Several versions of GetByteCount and GetBytes are supported. The following are some programming considerations for use of these methods:

  • Your app might need to encode many input characters to a code page and process the characters using multiple calls. In this case, you probably need to maintain state between calls, taking into account the state that is persisted by the Encoder object being used.

  • If your app handles string inputs, the string version of GetBytes is recommended.

  • The Unicode character buffer version of GetBytes(Char*, Int32, Byte*, Int32) allows some fast techniques, particularly with multiple calls using the Encoder object or inserting into existing buffers. Bear in mind, however, that this method version is sometimes unsafe, since pointers are required.

  • If your app must convert a large amount of data, it should reuse the output buffer. In this case, the GetBytes version that supports byte arrays is the best choice.

  • Consider using the Encoder.Convert method instead of GetByteCount. The conversion method converts as much data as possible, and throws an exception if the output buffer is too small. For continuous encoding of a stream, this method is often the best choice; a sketch of this pattern follows this list.
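
The following minimal sketch (not from the original documentation) illustrates the Encoder.Convert pattern mentioned in the last consideration: a deliberately small output buffer is filled repeatedly until the whole input has been encoded.

using System;
using System.Text;

public class EncoderConvertSketch  {

   public static void Main()  {

      char[] input = "za\u0306\u01FD\u03B2".ToCharArray();
      byte[] output = new byte[4];               // deliberately small output buffer
      Encoder encoder = Encoding.UTF8.GetEncoder();

      int charPos = 0;
      bool completed = false;
      while (!completed)  {
         encoder.Convert( input, charPos, input.Length - charPos,
                          output, 0, output.Length, true /* flush */,
                          out int charsUsed, out int bytesUsed, out completed );

         // In a real application the bytes in output[0..bytesUsed) would be
         // written to a stream here before the next iteration overwrites them.
         Console.WriteLine( "Converted {0} chars into {1} bytes", charsUsed, bytesUsed );
         charPos += charsUsed;
      }
   }
}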


Applies to

.NET 9 and other versions
Product Versions
.NET Core 1.0, Core 1.1, Core 2.0, Core 2.1, Core 2.2, Core 3.0, Core 3.1, 5, 6, 7, 8, 9
.NET Framework 1.1, 2.0, 3.0, 3.5, 4.0, 4.5, 4.5.1, 4.5.2, 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2, 4.8, 4.8.1
.NET Standard 1.0, 1.1, 1.2, 1.3, 1.4, 1.6, 2.0, 2.1
UWP 10.0

GetByteCount(String, Int32, Int32)

Source:
Encoding.cs

When overridden in a derived class, calculates the number of bytes produced by encoding a set of characters from the specified string.

public int GetByteCount (string s, int index, int count);

Parameters

s
String

The string containing the set of characters to encode.

index
Int32

The index of the first character to encode.

count
Int32

The number of characters to encode.

Returns

The number of bytes produced by encoding the string.

Examples

The following example determines the number of bytes required to encode three characters from a character array, encodes the characters, and displays the resulting bytes.

using System;
using System.Text;

public class SamplesEncoding  {

   public static void Main()  {

      // The characters to encode:
      //    Latin Small Letter Z (U+007A)
      //    Latin Small Letter A (U+0061)
      //    Combining Breve (U+0306)
      //    Latin Small Letter AE With Acute (U+01FD)
      //    Greek Small Letter Beta (U+03B2)
      //    a high-surrogate value (U+D8FF)
      //    a low-surrogate value (U+DCFF)
      char[] myChars = new char[] { 'z', 'a', '\u0306', '\u01FD', '\u03B2', '\uD8FF', '\uDCFF' };

      // Get different encodings.
      Encoding  u7    = Encoding.UTF7;
      Encoding  u8    = Encoding.UTF8;
      Encoding  u16LE = Encoding.Unicode;
      Encoding  u16BE = Encoding.BigEndianUnicode;
      Encoding  u32   = Encoding.UTF32;

      // Encode three characters starting at index 4, and print out the counts and the resulting bytes.
      PrintCountsAndBytes( myChars, 4, 3, u7 );
      PrintCountsAndBytes( myChars, 4, 3, u8 );
      PrintCountsAndBytes( myChars, 4, 3, u16LE );
      PrintCountsAndBytes( myChars, 4, 3, u16BE );
      PrintCountsAndBytes( myChars, 4, 3, u32 );
   }

   public static void PrintCountsAndBytes( char[] chars, int index, int count, Encoding enc )  {

      // Display the name of the encoding used.
      Console.Write( "{0,-30} :", enc.ToString() );

      // Display the exact byte count.
      int iBC  = enc.GetByteCount( chars, index, count );
      Console.Write( " {0,-3}", iBC );

      // Display the maximum byte count.
      int iMBC = enc.GetMaxByteCount( count );
      Console.Write( " {0,-3} :", iMBC );

      // Encode the array of chars.
      byte[] bytes = enc.GetBytes( chars, index, count );

      // The following is an alternative way to encode the array of chars:
      // byte[] bytes = new byte[iBC];
      // enc.GetBytes( chars, index, count, bytes, bytes.GetLowerBound(0) );

      // Display all the encoded bytes.
      PrintHexBytes( bytes );
   }

   public static void PrintHexBytes( byte[] bytes )  {

      if (( bytes == null ) || ( bytes.Length == 0 ))  {
         Console.WriteLine( "<none>" );
      }
      else  {
         for ( int i = 0; i < bytes.Length; i++ )
            Console.Write( "{0:X2} ", bytes[i] );
         Console.WriteLine();
      }
   }
}


/* 
This code produces the following output.

System.Text.UTF7Encoding       : 10  11  :2B 41 37 4C 59 2F 39 7A 2F 2D
System.Text.UTF8Encoding       : 6   12  :CE B2 F1 8F B3 BF
System.Text.UnicodeEncoding    : 6   8   :B2 03 FF D8 FF DC
System.Text.UnicodeEncoding    : 6   8   :03 B2 D8 FF DC FF
System.Text.UTF32Encoding      : 8   16  :B2 03 00 00 FF FC 04 00

*/

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, call the GetByteCount method. To calculate the maximum array size, call the GetMaxByteCount method. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.
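
The example above passes a character array; as a minimal sketch (not from the original documentation), this overload lets you count a range of a string directly, without first copying it into a char array.

using System;
using System.Text;

public class StringRangeByteCountSketch  {

   public static void Main()  {

      string myStr = "za\u0306\u01FD\u03B2\uD8FF\uDCFF";

      // Count the bytes for the three characters starting at index 4.
      int byteCount = Encoding.UTF8.GetByteCount( myStr, 4, 3 );

      Console.WriteLine( byteCount );   // 6 bytes for UTF-8: beta plus one surrogate pair
   }
}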

The GetByteCount method determines how many bytes result from encoding a set of Unicode characters, and the GetBytes method performs the actual encoding. The GetBytes method expects discrete conversions, in contrast to the Encoder.GetBytes method, which handles multiple conversions on a single input stream.

Several versions of GetByteCount and GetBytes are supported. The following are some programming considerations for use of these methods:

  • Your app might need to encode many input characters to a code page and process the characters using multiple calls. In this case, you probably need to maintain state between calls, taking into account the state that is persisted by the Encoder object being used.

  • If your app handles string inputs, the string version of GetBytes is recommended.

  • The Unicode character buffer version of GetBytes(Char*, Int32, Byte*, Int32) allows some fast techniques, particularly with multiple calls using the Encoder object or inserting into existing buffers. Bear in mind, however, that this method version is sometimes unsafe, since pointers are required.

  • If your app must convert a large amount of data, it should reuse the output buffer. In this case, the GetBytes version that supports byte arrays is the best choice.

  • Consider using the Encoder.Convert method instead of GetByteCount. The conversion method converts as much data as possible, and throws an exception if the output buffer is too small. For continuous encoding of a stream, this method is often the best choice.

Applies to

.NET 9 and other versions
Product Versions
.NET Core 2.0, Core 2.1, Core 2.2, Core 3.0, Core 3.1, 5, 6, 7, 8, 9
.NET Standard 2.1

GetByteCount(Char*, Int32)

Source:
Encoding.cs

Important

This API is not CLS-compliant.

When overridden in a derived class, calculates the number of bytes produced by encoding a set of characters starting at the specified character pointer.

[System.CLSCompliant(false)]
[System.Security.SecurityCritical]
[System.Runtime.InteropServices.ComVisible(false)]
public virtual int GetByteCount (char* chars, int count);

(The SecurityCritical and ComVisible(false) attributes are applied only on some target frameworks; CLSCompliant(false) is applied on all of them.)

Parameters

chars
Char*

A pointer to the first character to encode.

count
Int32

The number of characters to encode.

Returns

The number of bytes produced by encoding the specified characters.

Attributes

CLSCompliantAttribute, SecurityCriticalAttribute, ComVisibleAttribute

Exceptions

ArgumentNullException

chars is null.

ArgumentOutOfRangeException

count is less than zero.

EncoderFallbackException

A fallback occurred (for more information, see Character Encoding in .NET).

-and-

EncoderFallback is set to EncoderExceptionFallback.

Remarks

To calculate the exact array size that GetBytes requires to store the resulting bytes, you should call the GetByteCount method. To calculate the maximum array size, call the GetMaxByteCount method. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.

The GetByteCount(Char*, Int32) method determines how many bytes result from encoding a set of Unicode characters, and the GetBytes(Char*, Int32, Byte*, Int32) method performs the actual encoding. The GetBytes method expects discrete conversions, in contrast to the Encoder.GetBytes method, which handles multiple conversions on a single input stream.
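
As a minimal sketch (not part of the original documentation), the pointer overload is typically used inside a fixed block in an unsafe context; the project must allow unsafe code.

using System;
using System.Text;

public class PointerByteCountSketch  {

   public static unsafe void Main()  {

      char[] myChars = new char[] { 'z', 'a', '\u0306', '\u01FD', '\u03B2' };

      fixed ( char* pChars = myChars )  {
         // Count the bytes, then encode into an exactly sized buffer.
         int byteCount = Encoding.UTF8.GetByteCount( pChars, myChars.Length );

         byte[] bytes = new byte[byteCount];
         fixed ( byte* pBytes = bytes )  {
            Encoding.UTF8.GetBytes( pChars, myChars.Length, pBytes, byteCount );
         }

         Console.WriteLine( BitConverter.ToString( bytes ) );
      }
   }
}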

Several versions of GetByteCount and GetBytes are supported. The following are some considerations for using these methods:

  • Your app may need to encode many input characters to a code page and process the characters using multiple calls. In this case, you probably need to maintain state between calls, taking into account the state that is persisted by the Encoder object being used.

  • If your app handles string inputs, you should use the string version of the GetBytes method.

  • The Unicode character buffer version of GetBytes allows some fast techniques, particularly with multiple calls using the Encoder object or inserting into existing buffers. Bear in mind, however, that this method version is sometimes unsafe, since pointers are required.

  • If your app must convert a large amount of data, it should reuse the output buffer. In this case, the GetBytes version that supports byte arrays is the best choice.

  • Consider using the Encoder.Convert method instead of GetByteCount. The conversion method converts as much data as possible, and throws an exception if the output buffer is too small. For continuous encoding of a stream, this method is often the best choice.


Applies to

.NET 9 and other versions
Product Versions
.NET Core 1.0, Core 1.1, Core 2.0, Core 2.1, Core 2.2, Core 3.0, Core 3.1, 5, 6, 7, 8, 9
.NET Framework 2.0, 3.0, 3.5, 4.0, 4.5, 4.5.1, 4.5.2, 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2, 4.8, 4.8.1
.NET Standard 1.3, 1.4, 1.6, 2.0, 2.1
UWP 10.0

GetByteCount(ReadOnlySpan<Char>)

Source:
Encoding.cs

When overridden in a derived class, calculates the number of bytes produced by encoding the characters in the specified character span.

public virtual int GetByteCount (ReadOnlySpan<char> chars);

Parameters

chars
ReadOnlySpan<Char>

The span of characters to encode.

Returns

The number of bytes produced by encoding the specified character span.

Remarks

To calculate the exact span size required by GetBytes to store the resulting bytes, call the GetByteCount method. To calculate the maximum span size, call the GetMaxByteCount method. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.
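
A minimal sketch (not part of the original documentation) of the typical span pattern: measure with GetByteCount, then encode into a stack-allocated destination of exactly that size.

using System;
using System.Text;

public class SpanByteCountSketch  {

   public static void Main()  {

      ReadOnlySpan<char> chars = "za\u0306\u01FD\u03B2".AsSpan();

      int byteCount = Encoding.UTF8.GetByteCount( chars );

      // The input here is small, so a stack-allocated destination is safe;
      // larger inputs should use a pooled or heap-allocated buffer instead.
      Span<byte> bytes = stackalloc byte[byteCount];
      int written = Encoding.UTF8.GetBytes( chars, bytes );

      Console.WriteLine( "{0} bytes expected, {1} bytes written", byteCount, written );
   }
}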

The GetByteCount method determines how many bytes result from encoding a set of Unicode characters, and the GetBytes method performs the actual encoding. The GetBytes method expects discrete conversions, in contrast to the Encoder.GetBytes method, which handles multiple conversions on a single input stream.

Several versions of GetByteCount and GetBytes are supported. The following are some programming considerations for use of these methods:

  • Your app might need to encode many input characters to a code page and process the characters using multiple calls. In this case, you probably need to maintain state between calls, taking into account the state that is persisted by the Encoder object being used.

  • If your app handles character span inputs, the span version of GetBytes is recommended.

  • Consider using the Encoder.Convert method instead of GetByteCount. The conversion method converts as much data as possible, and throws an exception if the output span is too small. For continuous encoding of a stream, this method is often the best choice.

Applies to

.NET 9 and other versions
Product Versions
.NET Core 2.1, Core 2.2, Core 3.0, Core 3.1, 5, 6, 7, 8, 9
.NET Standard 2.1

GetByteCount(Char[])

Source:
Encoding.cs

When overridden in a derived class, calculates the number of bytes produced by encoding all the characters in the specified character array.

public virtual int GetByteCount (char[] chars);

Parameters

chars
Char[]

The character array containing the characters to encode.

Returns

The number of bytes produced by encoding all the characters in the specified character array.

Exceptions

ArgumentNullException

chars is null.

EncoderFallbackException

A fallback occurred (for more information, see Character Encoding in .NET).

-and-

EncoderFallback is set to EncoderExceptionFallback.

Examples

The following example determines the number of bytes required to encode a character array, encodes the characters, and displays the resulting bytes.

using System;
using System.Text;

public class SamplesEncoding  {

   public static void Main()  {

      // The characters to encode:
      //    Latin Small Letter Z (U+007A)
      //    Latin Small Letter A (U+0061)
      //    Combining Breve (U+0306)
      //    Latin Small Letter AE With Acute (U+01FD)
      //    Greek Small Letter Beta (U+03B2)
      //    a high-surrogate value (U+D8FF)
      //    a low-surrogate value (U+DCFF)
      char[] myChars = new char[] { 'z', 'a', '\u0306', '\u01FD', '\u03B2', '\uD8FF', '\uDCFF' };

      // Get different encodings.
      Encoding  u7    = Encoding.UTF7;
      Encoding  u8    = Encoding.UTF8;
      Encoding  u16LE = Encoding.Unicode;
      Encoding  u16BE = Encoding.BigEndianUnicode;
      Encoding  u32   = Encoding.UTF32;

      // Encode the entire array, and print out the counts and the resulting bytes.
      PrintCountsAndBytes( myChars, u7 );
      PrintCountsAndBytes( myChars, u8 );
      PrintCountsAndBytes( myChars, u16LE );
      PrintCountsAndBytes( myChars, u16BE );
      PrintCountsAndBytes( myChars, u32 );
   }

   public static void PrintCountsAndBytes( char[] chars, Encoding enc )  {

      // Display the name of the encoding used.
      Console.Write( "{0,-30} :", enc.ToString() );

      // Display the exact byte count.
      int iBC  = enc.GetByteCount( chars );
      Console.Write( " {0,-3}", iBC );

      // Display the maximum byte count.
      int iMBC = enc.GetMaxByteCount( chars.Length );
      Console.Write( " {0,-3} :", iMBC );

      // Encode the array of chars.
      byte[] bytes = enc.GetBytes( chars );

      // Display all the encoded bytes.
      PrintHexBytes( bytes );
   }

   public static void PrintHexBytes( byte[] bytes )  {

      if (( bytes == null ) || ( bytes.Length == 0 ))  {
         Console.WriteLine( "<none>" );
      }
      else  {
         for ( int i = 0; i < bytes.Length; i++ )
            Console.Write( "{0:X2} ", bytes[i] );
         Console.WriteLine();
      }
   }
}


/* 
This code produces the following output.

System.Text.UTF7Encoding       : 18  23  :7A 61 2B 41 77 59 42 2F 51 4F 79 32 50 2F 63 2F 77 2D
System.Text.UTF8Encoding       : 12  24  :7A 61 CC 86 C7 BD CE B2 F1 8F B3 BF
System.Text.UnicodeEncoding    : 14  16  :7A 00 61 00 06 03 FD 01 B2 03 FF D8 FF DC
System.Text.UnicodeEncoding    : 14  16  :00 7A 00 61 03 06 01 FD 03 B2 D8 FF DC FF
System.Text.UTF32Encoding      : 24  32  :7A 00 00 00 61 00 00 00 06 03 00 00 FD 01 00 00 B2 03 00 00 FF FC 04 00

*/

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, call the GetByteCount method. To calculate the maximum array size, call the GetMaxByteCount method. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.

The GetByteCount method determines how many bytes result from encoding a set of Unicode characters, and the GetBytes method performs the actual encoding. The GetBytes method expects discrete conversions, in contrast to the Encoder.GetBytes method, which handles multiple conversions on a single input stream.
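
As a minimal sketch (not part of the original documentation), the difference matters when a surrogate pair is split across two input chunks: an Encoder carries the dangling high surrogate between calls, while two discrete GetBytes calls cannot pair it.

using System;
using System.Text;

public class EncoderStateSketch  {

   public static void Main()  {

      // The surrogate pair U+D8FF / U+DCFF is split across the two chunks.
      char[] first  = new char[] { 'z', 'a', '\uD8FF' };
      char[] second = new char[] { '\uDCFF' };

      // The Encoder keeps the dangling high surrogate between calls (flush = false),
      // so the pair is encoded as one 4-byte sequence once the second chunk arrives.
      Encoder encoder = Encoding.UTF8.GetEncoder();
      byte[] buffer = new byte[16];
      int n1 = encoder.GetBytes( first, 0, first.Length, buffer, 0, false );
      int n2 = encoder.GetBytes( second, 0, second.Length, buffer, n1, true );
      Console.WriteLine( "Encoder across two calls: {0} bytes", n1 + n2 );   // 6

      // Two discrete Encoding.GetBytes calls cannot pair the surrogates,
      // so each half is replaced by the 3-byte replacement character.
      int discrete = Encoding.UTF8.GetBytes( first ).Length + Encoding.UTF8.GetBytes( second ).Length;
      Console.WriteLine( "Discrete calls: {0} bytes", discrete );            // 8
   }
}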

Several versions of GetByteCount and GetBytes are supported. The following are some programming considerations for use of these methods:

  • Your app might need to encode many input characters to a code page and process the characters using multiple calls. In this case, you probably need to maintain state between calls, taking into account the state that is persisted by the Encoder object being used.

  • If your app handles string inputs, you should use the string versions of the GetBytes method.

  • The Unicode character buffer version of GetBytes(Char*, Int32, Byte*, Int32) allows some fast techniques, particularly with multiple calls using the Encoder object or inserting into existing buffers. Bear in mind, however, that this method version is sometimes unsafe, since pointers are required.

  • If your app must convert a large amount of data, you should reuse the output buffer. In this case, the GetBytes version that supports byte arrays is the best choice.

  • Consider using the Encoder.Convert method instead of GetByteCount. The conversion method converts as much data as possible, and throws an exception if the output buffer is too small. For continuous encoding of a stream, this method is often the best choice.


Applies to

.NET 9 and other versions
Product Versions
.NET Core 1.0, Core 1.1, Core 2.0, Core 2.1, Core 2.2, Core 3.0, Core 3.1, 5, 6, 7, 8, 9
.NET Framework 1.1, 2.0, 3.0, 3.5, 4.0, 4.5, 4.5.1, 4.5.2, 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2, 4.8, 4.8.1
.NET Standard 1.0, 1.1, 1.2, 1.3, 1.4, 1.6, 2.0, 2.1
UWP 10.0

GetByteCount(String)

Source:
Encoding.cs

When overridden in a derived class, calculates the number of bytes produced by encoding the characters in the specified string.

public virtual int GetByteCount (string s);

Parameters

s
String

The string containing the set of characters to encode.

Returns

The number of bytes produced by encoding the specified characters.

Exceptions

EncoderFallbackException

A fallback occurred (for more information, see Character Encoding in .NET).

-and-

EncoderFallback is set to EncoderExceptionFallback.

Examples

The following example determines the number of bytes required to encode a string or a range in the string, encodes the characters, and displays the resulting bytes.

using System;
using System.Text;

public class SamplesEncoding  {

   public static void Main()  {

      // The characters to encode:
      //    Latin Small Letter Z (U+007A)
      //    Latin Small Letter A (U+0061)
      //    Combining Breve (U+0306)
      //    Latin Small Letter AE With Acute (U+01FD)
      //    Greek Small Letter Beta (U+03B2)
      //    a high-surrogate value (U+D8FF)
      //    a low-surrogate value (U+DCFF)
      String myStr = "za\u0306\u01FD\u03B2\uD8FF\uDCFF";

      // Get different encodings.
      Encoding  u7    = Encoding.UTF7;
      Encoding  u8    = Encoding.UTF8;
      Encoding  u16LE = Encoding.Unicode;
      Encoding  u16BE = Encoding.BigEndianUnicode;
      Encoding  u32   = Encoding.UTF32;

      // Encode the entire string, and print out the counts and the resulting bytes.
      Console.WriteLine( "Encoding the entire string:" );
      PrintCountsAndBytes( myStr, u7 );
      PrintCountsAndBytes( myStr, u8 );
      PrintCountsAndBytes( myStr, u16LE );
      PrintCountsAndBytes( myStr, u16BE );
      PrintCountsAndBytes( myStr, u32 );

      Console.WriteLine();

      // Encode three characters starting at index 4, and print out the counts and the resulting bytes.
      Console.WriteLine( "Encoding the characters from index 4 through 6:" );
      PrintCountsAndBytes( myStr, 4, 3, u7 );
      PrintCountsAndBytes( myStr, 4, 3, u8 );
      PrintCountsAndBytes( myStr, 4, 3, u16LE );
      PrintCountsAndBytes( myStr, 4, 3, u16BE );
      PrintCountsAndBytes( myStr, 4, 3, u32 );
   }

   public static void PrintCountsAndBytes( String s, Encoding enc )  {

      // Display the name of the encoding used.
      Console.Write( "{0,-30} :", enc.ToString() );

      // Display the exact byte count.
      int iBC  = enc.GetByteCount( s );
      Console.Write( " {0,-3}", iBC );

      // Display the maximum byte count.
      int iMBC = enc.GetMaxByteCount( s.Length );
      Console.Write( " {0,-3} :", iMBC );

      // Encode the entire string.
      byte[] bytes = enc.GetBytes( s );

      // Display all the encoded bytes.
      PrintHexBytes( bytes );
   }

   public static void PrintCountsAndBytes( String s, int index, int count, Encoding enc )  {

      // Display the name of the encoding used.
      Console.Write( "{0,-30} :", enc.ToString() );

      // Display the exact byte count.
      int iBC  = enc.GetByteCount( s.ToCharArray(), index, count );
      Console.Write( " {0,-3}", iBC );

      // Display the maximum byte count.
      int iMBC = enc.GetMaxByteCount( count );
      Console.Write( " {0,-3} :", iMBC );

      // Encode a range of characters in the string.
      byte[] bytes = new byte[iBC];
      enc.GetBytes( s, index, count, bytes, bytes.GetLowerBound(0) );

      // Display all the encoded bytes.
      PrintHexBytes( bytes );
   }

   public static void PrintHexBytes( byte[] bytes )  {

      if (( bytes == null ) || ( bytes.Length == 0 ))  {
         Console.WriteLine( "<none>" );
      }
      else  {
         for ( int i = 0; i < bytes.Length; i++ )
            Console.Write( "{0:X2} ", bytes[i] );
         Console.WriteLine();
      }
   }
}


/* 
This code produces the following output.

Encoding the entire string:
System.Text.UTF7Encoding       : 18  23  :7A 61 2B 41 77 59 42 2F 51 4F 79 32 50 2F 63 2F 77 2D
System.Text.UTF8Encoding       : 12  24  :7A 61 CC 86 C7 BD CE B2 F1 8F B3 BF
System.Text.UnicodeEncoding    : 14  16  :7A 00 61 00 06 03 FD 01 B2 03 FF D8 FF DC
System.Text.UnicodeEncoding    : 14  16  :00 7A 00 61 03 06 01 FD 03 B2 D8 FF DC FF
System.Text.UTF32Encoding      : 24  32  :7A 00 00 00 61 00 00 00 06 03 00 00 FD 01 00 00 B2 03 00 00 FF FC 04 00

Encoding the characters from index 4 through 6:
System.Text.UTF7Encoding       : 10  11  :2B 41 37 4C 59 2F 39 7A 2F 2D
System.Text.UTF8Encoding       : 6   12  :CE B2 F1 8F B3 BF
System.Text.UnicodeEncoding    : 6   8   :B2 03 FF D8 FF DC
System.Text.UnicodeEncoding    : 6   8   :03 B2 D8 FF DC FF
System.Text.UTF32Encoding      : 8   16  :B2 03 00 00 FF FC 04 00

*/

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, call the GetByteCount method. To calculate the maximum array size, call the GetMaxByteCount method. The GetByteCount method generally allows allocation of less memory, while the GetMaxByteCount method generally executes faster.
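
As a minimal sketch (not part of the original documentation), note that the byte count reported by this overload generally differs from String.Length as soon as the string contains non-ASCII characters.

using System;
using System.Text;

public class StringByteCountSketch  {

   public static void Main()  {

      string myStr = "za\u0306\u01FD\u03B2";

      Console.WriteLine( myStr.Length );                            // 5 UTF-16 code units
      Console.WriteLine( Encoding.UTF8.GetByteCount( myStr ) );     // 8 bytes
      Console.WriteLine( Encoding.Unicode.GetByteCount( myStr ) );  // 10 bytes
      Console.WriteLine( Encoding.UTF32.GetByteCount( myStr ) );    // 20 bytes
   }
}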

The GetByteCount method determines how many bytes result from encoding a set of Unicode characters, and the GetBytes method performs the actual encoding. The GetBytes method expects discrete conversions, in contrast to the Encoder.GetBytes method, which handles multiple conversions on a single input stream.

Several versions of GetByteCount and GetBytes are supported. The following are some programming considerations for use of these methods:

  • Your app might need to encode many input characters to a code page and process the characters using multiple calls. In this case, you probably need to maintain state between calls, taking into account the state that is persisted by the Encoder object being used.

  • If your app handles string inputs, the string version of GetBytes is recommended.

  • The Unicode character buffer version of GetBytes(Char*, Int32, Byte*, Int32) allows some fast techniques, particularly with multiple calls using the Encoder object or inserting into existing buffers. Bear in mind, however, that this method version is sometimes unsafe, since pointers are required.

  • If your app must convert a large amount of data, it should reuse the output buffer. In this case, the GetBytes version that supports byte arrays is the best choice.

  • Consider using the Encoder.Convert method instead of GetByteCount. The conversion method converts as much data as possible, and throws an exception if the output buffer is too small. For continuous encoding of a stream, this method is often the best choice.


Applies to

.NET 9 and other versions
Product Versions
.NET Core 1.0, Core 1.1, Core 2.0, Core 2.1, Core 2.2, Core 3.0, Core 3.1, 5, 6, 7, 8, 9
.NET Framework 1.1, 2.0, 3.0, 3.5, 4.0, 4.5, 4.5.1, 4.5.2, 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2, 4.8, 4.8.1
.NET Standard 1.0, 1.1, 1.2, 1.3, 1.4, 1.6, 2.0, 2.1
UWP 10.0