UTF32Encoding.GetByteCount Method

Definition

Calculates the number of bytes produced by encoding a set of characters.

Overloads

GetByteCount(String)

Calculates the number of bytes produced by encoding the characters in the specified String.

GetByteCount(Char*, Int32)

Calculates the number of bytes produced by encoding a set of characters starting at the specified character pointer.

GetByteCount(Char[], Int32, Int32)

Calculates the number of bytes produced by encoding a set of characters from the specified character array.

GetByteCount(String)

Source:
UTF32Encoding.cs

Calculates the number of bytes produced by encoding the characters in the specified String.

public:
 override int GetByteCount(System::String ^ s);
public override int GetByteCount (string s);
override this.GetByteCount : string -> int
Public Overrides Function GetByteCount (s As String) As Integer

Parameters

s
String

The String containing the set of characters to encode.

Returns

The number of bytes produced by encoding the specified characters.

Exceptions

ArgumentOutOfRangeException

The resulting number of bytes is greater than the maximum number that can be returned as an integer.

ArgumentException

Error detection is enabled, and s contains an invalid sequence of characters.

EncoderFallbackException

A fallback occurred (for more information, see Character Encoding in .NET)

-and-

EncoderFallback is set to EncoderExceptionFallback.

Examples

The following example calls the GetMaxByteCount and GetByteCount(String) methods to calculate the maximum and actual number of bytes required to encode a string. It also displays the actual number of bytes required to store a byte stream with a byte order mark.

using System;
using System.Text;

class UTF32EncodingExample {
    public static void Main() {
        String chars = "UTF-32 Encoding Example";
        Encoding enc = Encoding.UTF32;

        Console.WriteLine("Bytes needed to encode '{0}':", chars);
        Console.WriteLine("   Maximum:         {0}",
                          enc.GetMaxByteCount(chars.Length));
        Console.WriteLine("   Actual:          {0}",
                          enc.GetByteCount(chars));
        Console.WriteLine("   Actual with BOM: {0}",
                          enc.GetByteCount(chars) + enc.GetPreamble().Length);
    }
}
// The example displays the following output:
//       Bytes needed to encode 'UTF-32 Encoding Example':
//          Maximum:         96
//          Actual:          92
//          Actual with BOM: 96
Imports System.Text

Module Example
    Public Sub Main()
        Dim chars As String = "UTF-32 Encoding Example"
        Dim enc As Encoding = Encoding.UTF32

        Console.WriteLine("Bytes needed to encode '{0}':", chars)
        Console.WriteLine("   Maximum:         {0}",
                          enc.GetMaxByteCount(chars.Length))
        Console.WriteLine("   Actual:          {0}",
                          enc.GetByteCount(chars))
        Console.WriteLine("   Actual with BOM: {0}",
                          enc.GetByteCount(chars) + enc.GetPreamble().Length)
    End Sub
End Module
' The example displays the following output:
'       Bytes needed to encode 'UTF-32 Encoding Example':
'          Maximum:         96
'          Actual:          92
'          Actual with BOM: 96

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, you call the GetByteCount method. To calculate the maximum array size, you call the GetMaxByteCount method. The GetByteCount method generally allocates less memory, while the GetMaxByteCount method generally executes faster.
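
The example above compares the two counts; as an additional sketch (not part of the original example set, using an arbitrary sample string), the counts can also be used to size the buffer passed to GetBytes:

using System;
using System.Text;

class ExactVersusMaxExample
{
    static void Main()
    {
        Encoding enc = Encoding.UTF32;
        string s = "UTF-32";   // arbitrary sample text

        // GetByteCount returns the exact number of bytes that GetBytes will
        // produce, so this buffer is sized exactly.
        byte[] exact = new byte[enc.GetByteCount(s)];
        enc.GetBytes(s, 0, s.Length, exact, 0);

        // GetMaxByteCount returns a worst-case size based only on the character
        // count; it avoids an extra pass over the data but may over-allocate.
        byte[] worstCase = new byte[enc.GetMaxByteCount(s.Length)];
        int written = enc.GetBytes(s, 0, s.Length, worstCase, 0);

        Console.WriteLine("Exact buffer: {0} bytes; worst-case buffer: {1} bytes; bytes written: {2}",
                          exact.Length, worstCase.Length, written);
    }
}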

With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
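
The following minimal sketch (not from the original page) illustrates that difference. It constructs UTF32Encoding directly so that the throwOnInvalidCharacters constructor parameter can be set, and it uses an unpaired high surrogate as the invalid sequence.

using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        // An unpaired high surrogate is an invalid UTF-16 sequence.
        string invalid = "a\uD800b";

        // Error detection enabled (third constructor argument is true):
        // GetByteCount throws for the invalid sequence.
        UTF32Encoding strict = new UTF32Encoding(false, true, true);
        try
        {
            strict.GetByteCount(invalid);
        }
        catch (ArgumentException e)
        {
            Console.WriteLine("With error detection: {0}", e.GetType().Name);
        }

        // Error detection disabled: the invalid character is handled by the
        // encoder fallback and no exception is thrown.
        UTF32Encoding lenient = new UTF32Encoding(false, true, false);
        Console.WriteLine("Without error detection: {0} bytes", lenient.GetByteCount(invalid));
    }
}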

To ensure that the encoded bytes are decoded properly when they are saved as a file or as a stream, you can prefix a stream of encoded bytes with a preamble. Inserting the preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility, and the number of bytes in the preamble is not reflected in the value returned by the GetByteCount method.
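
As an illustrative sketch only (the file name is hypothetical and not from the original page), the preamble might be written ahead of the encoded bytes like this:

using System;
using System.IO;
using System.Text;

class PreambleExample
{
    static void Main()
    {
        Encoding enc = Encoding.UTF32;
        string s = "UTF-32 Encoding Example";

        byte[] preamble = enc.GetPreamble();            // the UTF-32 byte order mark
        byte[] payload = new byte[enc.GetByteCount(s)];
        enc.GetBytes(s, 0, s.Length, payload, 0);

        // Write the preamble first, then the encoded characters.
        using (FileStream fs = File.Create("example.txt"))   // hypothetical file name
        {
            fs.Write(preamble, 0, preamble.Length);
            fs.Write(payload, 0, payload.Length);
        }

        Console.WriteLine("Wrote {0} bytes ({1} preamble + {2} payload).",
                          preamble.Length + payload.Length, preamble.Length, payload.Length);
    }
}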


GetByteCount(Char*, Int32)

Source:
UTF32Encoding.cs

Important

This API is not CLS-compliant.

Calculates the number of bytes produced by encoding a set of characters starting at the specified character pointer.

public:
 override int GetByteCount(char* chars, int count);
[System.CLSCompliant(false)]
[System.Security.SecurityCritical]
public override int GetByteCount (char* chars, int count);
[System.CLSCompliant(false)]
public override int GetByteCount (char* chars, int count);
[<System.CLSCompliant(false)>]
[<System.Security.SecurityCritical>]
override this.GetByteCount : nativeptr<char> * int -> int
[<System.CLSCompliant(false)>]
override this.GetByteCount : nativeptr<char> * int -> int

Parameters

chars
Char*

A pointer to the first character to encode.

count
Int32

The number of characters to encode.

Returns

The number of bytes produced by encoding the specified characters.

Attributes

CLSCompliantAttribute, SecurityCriticalAttribute

Exceptions

ArgumentNullException

chars is null.

ArgumentOutOfRangeException

count is less than zero.

-or-

The resulting number of bytes is greater than the maximum number that can be returned as an integer.

ArgumentException

Error detection is enabled, and chars contains an invalid sequence of characters.

EncoderFallbackException

A fallback occurred (for more information, see Character Encoding in .NET)

-and-

EncoderFallback is set to EncoderExceptionFallback.

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, you call the GetByteCount method. To calculate the maximum array size, you call the GetMaxByteCount method. The GetByteCount method generally allocates less memory, while the GetMaxByteCount method generally executes faster.

With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.

To ensure that the encoded bytes are decoded properly when they are saved as a file or a stream, you can prefix a stream of encoded bytes with a preamble. Inserting a preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility, and the number of bytes in the preamble is not reflected in the value returned by the GetByteCount method.
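
The original page does not include an example for this overload. The following minimal sketch, which assumes a project that allows unsafe code, shows one way to obtain a character pointer by pinning a string with fixed.

using System;
using System.Text;

class PointerOverloadExample
{
    // Requires unsafe code to be enabled (for example, the /unsafe compiler
    // option or <AllowUnsafeBlocks>true</AllowUnsafeBlocks> in the project file).
    unsafe static void Main()
    {
        string s = "UTF-32 Encoding Example";
        Encoding enc = Encoding.UTF32;

        // Pin the string so its characters cannot move, then pass a pointer
        // to the first character along with the number of characters to encode.
        fixed (char* pChars = s)
        {
            int byteCount = enc.GetByteCount(pChars, s.Length);
            Console.WriteLine("{0} characters require {1} bytes.", s.Length, byteCount);
        }
    }
}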


GetByteCount(Char[], Int32, Int32)

Source:
UTF32Encoding.cs

Calculates the number of bytes produced by encoding a set of characters from the specified character array.

public:
 override int GetByteCount(cli::array <char> ^ chars, int index, int count);
public override int GetByteCount (char[] chars, int index, int count);
override this.GetByteCount : char[] * int * int -> int
Public Overrides Function GetByteCount (chars As Char(), index As Integer, count As Integer) As Integer

Parameters

chars
Char[]

The character array containing the set of characters to encode.

index
Int32

The index of the first character to encode.

count
Int32

The number of characters to encode.

Returns

The number of bytes produced by encoding the specified characters.

Exceptions

ArgumentNullException

chars is null.

ArgumentOutOfRangeException

index or count is less than zero.

-or-

index and count do not denote a valid range in chars.

-or-

The resulting number of bytes is greater than the maximum number that can be returned as an integer.

ArgumentException

Error detection is enabled, and chars contains an invalid sequence of characters.

EncoderFallbackException

A fallback occurred (for more information, see Character Encoding in .NET)

-and-

EncoderFallback is set to EncoderExceptionFallback.

Examples

The following example populates an array with Latin uppercase and lowercase characters and calls the GetByteCount(Char[], Int32, Int32) method to determine the number of bytes needed to encode the Latin lowercase characters. It then displays this information along with the total number of bytes needed if a byte order mark is added, and compares that number with the value returned by the GetMaxByteCount method, which indicates the maximum number of bytes needed to encode the Latin lowercase characters.

using System;
using System.Text;

public class Example
{
   public static void Main()
   {
      int uppercaseStart = 0x0041;
      int uppercaseEnd = 0x005a;
      int lowercaseStart = 0x0061;
      int lowercaseEnd = 0x007a;
      // Instantiate a UTF-32 encoding object with BOM support.
      Encoding enc = Encoding.UTF32;

      // Populate array with characters.
      char[] chars = new char[lowercaseEnd - lowercaseStart + uppercaseEnd - uppercaseStart + 2];
      int index = 0;
      for (int ctr = uppercaseStart; ctr <= uppercaseEnd; ctr++) {
         chars[index] = (char)ctr;
         index++;
      }
      for (int ctr = lowercaseStart; ctr <= lowercaseEnd; ctr++) {
         chars[index] = (char)ctr;
         index++;
      }

      // Display the bytes needed for the lowercase characters.
      Console.WriteLine("Bytes needed for lowercase Latin characters:");
      Console.WriteLine("   Maximum:         {0,5:N0}",
                        enc.GetMaxByteCount(lowercaseEnd - lowercaseStart + 1));
      Console.WriteLine("   Actual:          {0,5:N0}",
                        enc.GetByteCount(chars, uppercaseEnd - uppercaseStart + 1,
                                          lowercaseEnd - lowercaseStart + 1));
      Console.WriteLine("   Actual with BOM: {0,5:N0}",
                        enc.GetByteCount(chars, uppercaseEnd - uppercaseStart + 1,
                                          lowercaseEnd - lowercaseStart + 1) +
                                          enc.GetPreamble().Length);
   }
}
// The example displays the following output:
//       Bytes needed for lowercase Latin characters:
//          Maximum:           108
//          Actual:            104
//          Actual with BOM:   108
Imports System.Text

Module Example
   Public Sub Main()
      Dim uppercaseStart As Integer = &h0041
      Dim uppercaseEnd As Integer = &h005a
      Dim lowercaseStart As Integer = &h0061
      Dim lowercaseEnd As Integer = &h007a
      ' Instantiate a UTF-32 encoding object with BOM support.
      Dim enc As Encoding = Encoding.UTF32
      
      ' Populate array with characters.
      Dim chars(lowercaseEnd - lowercaseStart + uppercaseEnd - uppercaseStart + 1) As Char
      Dim index As Integer = 0
      For ctr As Integer = uppercaseStart To uppercaseEnd
         chars(index) = ChrW(ctr)
         index += 1
      Next
      For ctr As Integer = lowercaseStart To lowercaseEnd
         chars(index) = ChrW(ctr)
         index += 1
      Next

      ' Display the bytes needed for the lowercase characters.
      Console.WriteLine("Bytes needed for lowercase Latin characters:")
      Console.WriteLine("   Maximum:         {0,5:N0}",
                        enc.GetMaxByteCount(lowercaseEnd - lowercaseStart + 1))
      Console.WriteLine("   Actual:          {0,5:N0}",
                        enc.GetByteCount(chars, uppercaseEnd - uppercaseStart + 1,
                                         lowercaseEnd - lowercaseStart + 1))
      Console.WriteLine("   Actual with BOM: {0,5:N0}",
                        enc.GetByteCount(chars, uppercaseEnd - uppercaseStart + 1,
                                         lowercaseEnd - lowercaseStart + 1) +
                        enc.GetPreamble().Length)
   End Sub
End Module
' The example displays the following output:
'       Bytes needed for lowercase Latin characters:
'          Maximum:           108
'          Actual:            104
'          Actual with BOM:   108

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, you call the GetByteCount method. To calculate the maximum array size, you call the GetMaxByteCount method. The GetByteCount method generally allocates less memory, while the GetMaxByteCount method generally executes faster.

With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.

To ensure that the encoded bytes are decoded properly when they are saved as a file or a stream, you can prefix a stream of encoded bytes with a preamble. Inserting a preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility, and the number of bytes in the preamble is not reflected in the value returned by the GetByteCount method.
