UnicodeEncoding.GetByteCount Method

Definition

Calculates the number of bytes produced by encoding a set of characters.

Overloads

GetByteCount(String)

Calculates the number of bytes produced by encoding the characters in the specified string.

GetByteCount(Char*, Int32)

Calculates the number of bytes produced by encoding a set of characters starting at the specified character pointer.

GetByteCount(Char[], Int32, Int32)

Calculates the number of bytes produced by encoding a set of characters from the specified character array.

GetByteCount(String)

Source:
UnicodeEncoding.cs

Calculates the number of bytes produced by encoding the characters in the specified string.

public:
 override int GetByteCount(System::String ^ s);
public override int GetByteCount (string s);
override this.GetByteCount : string -> int
Public Overrides Function GetByteCount (s As String) As Integer

Parameters

s
String

The string that contains the set of characters to encode.

Returns

The number of bytes produced by encoding the specified characters.

Exceptions

ArgumentOutOfRangeException

The resulting number of bytes is greater than the maximum number that can be returned as an integer.

ArgumentException

Error detection is enabled, and s contains an invalid sequence of characters.

-or-

A fallback occurred (for more information, see Character Encoding in .NET)

-and-

EncoderFallback is set to EncoderExceptionFallback.

Examples

The following example calls the GetMaxByteCount and GetByteCount(String) methods to calculate the maximum and actual number of bytes required to encode a string. It also displays the actual number of bytes required to store a byte stream with a byte order mark.

using System;
using System.Text;

class UnicodeEncodingExample {
    public static void Main() {
        String chars = "UTF-16 Encoding Example";
        Encoding unicode = Encoding.Unicode;

        Console.WriteLine("Bytes needed to encode '{0}':", chars);
        Console.WriteLine("   Maximum:         {0}",
                          unicode.GetMaxByteCount(chars.Length));
        Console.WriteLine("   Actual:          {0}",
                          unicode.GetByteCount(chars));
        Console.WriteLine("   Actual with BOM: {0}",
                          unicode.GetByteCount(chars) + unicode.GetPreamble().Length);
    }
}
// The example displays the following output:
//       Bytes needed to encode 'UTF-16 Encoding Example':
//          Maximum:         48
//          Actual:          46
//          Actual with BOM: 48
Imports System.Text

Module Example
    Public Sub Main()
        Dim chars As String = "UTF-16 Encoding Example"
        Dim unicode As Encoding = Encoding.Unicode

        Console.WriteLine("Bytes needed to encode '{0}':", chars)
        Console.WriteLine("   Maximum:         {0}",
                          unicode.GetMaxByteCount(chars.Length))
        Console.WriteLine("   Actual:          {0}",
                          unicode.GetByteCount(chars))
        Console.WriteLine("   Actual with BOM: {0}",
                          unicode.GetByteCount(chars) + unicode.GetPreamble().Length)
    End Sub
End Module
' The example displays the following output:
'       Bytes needed to encode 'UTF-16 Encoding Example':
'          Maximum:         48
'          Actual:          46
'          Actual with BOM: 48

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, you call the GetByteCount method. To calculate the maximum array size, you call the GetMaxByteCount method. The GetByteCount method generally lets you allocate less memory, while the GetMaxByteCount method generally executes faster.
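As a minimal sketch of this guidance (not one of the reference examples; the string value is an arbitrary sample), the following C# code allocates a buffer using the exact count returned by GetByteCount(String) and then fills it with GetBytes:

using System;
using System.Text;

class ExactBufferSizing
{
    static void Main()
    {
        Encoding unicode = Encoding.Unicode;
        string s = "Exact sizing example";   // arbitrary sample text

        // Ask the encoder exactly how many bytes the string needs, then allocate that many.
        int byteCount = unicode.GetByteCount(s);
        byte[] buffer = new byte[byteCount];

        // GetBytes fills the buffer; the count it returns equals byteCount.
        int bytesWritten = unicode.GetBytes(s, 0, s.Length, buffer, 0);
        Console.WriteLine("Allocated {0} bytes, wrote {1} bytes.", byteCount, bytesWritten);
    }
}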

With error detection enabled, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.

Important

To ensure that the encoded bytes are decoded properly when they are saved as a file or as a stream, you can prefix a stream of encoded bytes with a preamble. Inserting the preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility, and the number of bytes in the preamble is not reflected in the value returned by the GetByteCount(String) method.
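As a minimal sketch of this note (not one of the reference examples; the file name is a placeholder), the following C# code writes the preamble returned by GetPreamble, followed by the encoded bytes, so that the byte order mark precedes the data in the file:

using System;
using System.IO;
using System.Text;

class WriteWithPreamble
{
    static void Main()
    {
        Encoding unicode = Encoding.Unicode;
        string s = "Text saved with a byte order mark";   // arbitrary sample text

        using (FileStream fs = File.Create("sample-utf16.txt"))   // placeholder file name
        {
            // The preamble (byte order mark) must be written by the caller;
            // GetByteCount(String) does not include it in its return value.
            byte[] preamble = unicode.GetPreamble();
            byte[] bytes = unicode.GetBytes(s);

            fs.Write(preamble, 0, preamble.Length);
            fs.Write(bytes, 0, bytes.Length);
        }
    }
}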


GetByteCount(Char*, Int32)

Source:
UnicodeEncoding.cs

Important

This API is not CLS-compliant.

Calculates the number of bytes produced by encoding a set of characters starting at the specified character pointer.

public:
 override int GetByteCount(char* chars, int count);
[System.CLSCompliant(false)]
public override int GetByteCount (char* chars, int count);
[System.CLSCompliant(false)]
[System.Security.SecurityCritical]
public override int GetByteCount (char* chars, int count);
[System.CLSCompliant(false)]
[System.Runtime.InteropServices.ComVisible(false)]
public override int GetByteCount (char* chars, int count);
[System.CLSCompliant(false)]
[System.Security.SecurityCritical]
[System.Runtime.InteropServices.ComVisible(false)]
public override int GetByteCount (char* chars, int count);
[<System.CLSCompliant(false)>]
override this.GetByteCount : nativeptr<char> * int -> int
[<System.CLSCompliant(false)>]
[<System.Security.SecurityCritical>]
override this.GetByteCount : nativeptr<char> * int -> int
[<System.CLSCompliant(false)>]
[<System.Runtime.InteropServices.ComVisible(false)>]
override this.GetByteCount : nativeptr<char> * int -> int
[<System.CLSCompliant(false)>]
[<System.Security.SecurityCritical>]
[<System.Runtime.InteropServices.ComVisible(false)>]
override this.GetByteCount : nativeptr<char> * int -> int

Parameters

chars
Char*

A pointer to the first character to encode.

count
Int32

The number of characters to encode.

Returns

The number of bytes produced by encoding the specified characters.

Attributes

CLSCompliantAttribute, SecurityCriticalAttribute, ComVisibleAttribute

Exceptions

ArgumentNullException

chars is null.

ArgumentOutOfRangeException

count is less than zero.

-or-

The resulting number of bytes is greater than the maximum number that can be returned as an integer.

ArgumentException

Error detection is enabled, and chars contains an invalid sequence of characters.

-or-

A fallback occurred (for more information, see Character Encoding in .NET)

-and-

EncoderFallback is set to EncoderExceptionFallback.

Remarks

To calculate the exact array size that GetBytes requires to store the resulting bytes, you call the GetByteCount method. To calculate the maximum array size, you call the GetMaxByteCount method. The GetByteCount method generally lets you allocate less memory, while the GetMaxByteCount method generally executes faster.

With error detection enabled, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.

Important

To ensure that the encoded bytes are decoded properly when they are saved as a file or as a stream, you can prefix a stream of encoded bytes with a preamble. Inserting the preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility, and the number of bytes in the preamble is not reflected in the value returned by the GetByteCount method.
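Because this overload has no example above, the following sketch (an illustration only, not from the reference; it must be compiled with unsafe code enabled, and the string value is an arbitrary sample) shows how a fixed statement can supply the character pointer and count:

using System;
using System.Text;

class PointerByteCount
{
    // Requires compiling with unsafe code enabled (for example, /unsafe).
    unsafe static void Main()
    {
        Encoding unicode = Encoding.Unicode;
        string s = "Pointer-based count";   // arbitrary sample text

        fixed (char* pChars = s)
        {
            // Count the bytes needed to encode all characters starting at the pointer.
            int byteCount = unicode.GetByteCount(pChars, s.Length);
            Console.WriteLine("{0} characters need {1} bytes.", s.Length, byteCount);
        }
    }
}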


GetByteCount(Char[], Int32, Int32)

Source:
UnicodeEncoding.cs

Calculates the number of bytes produced by encoding a set of characters from the specified character array.

public:
 override int GetByteCount(cli::array <char> ^ chars, int index, int count);
public override int GetByteCount (char[] chars, int index, int count);
override this.GetByteCount : char[] * int * int -> int
Public Overrides Function GetByteCount (chars As Char(), index As Integer, count As Integer) As Integer

Parameters

chars
Char[]

The character array containing the set of characters to encode.

index
Int32

The index of the first character to encode.

count
Int32

The number of characters to encode.

Returns

The number of bytes produced by encoding the specified characters.

Exceptions

ArgumentNullException

chars is null (Nothing).

ArgumentOutOfRangeException

index or count is less than zero.

-or-

index and count do not denote a valid range in chars.

-or-

The resulting number of bytes is greater than the maximum number that can be returned as an integer.

ArgumentException

Error detection is enabled, and chars contains an invalid sequence of characters.

-or-

A fallback occurred (for more information, see Character Encoding in .NET)

-and-

EncoderFallback is set to EncoderExceptionFallback.

Examples

The following example populates an array with uppercase and lowercase Latin characters and calls the GetByteCount(Char[], Int32, Int32) method to determine the number of bytes needed to encode the lowercase Latin characters. It then displays this information, along with the total number of bytes needed if a byte order mark is added, and compares these values with the value returned by the GetMaxByteCount method, which indicates the maximum number of bytes needed to encode the lowercase characters.

using System;
using System.Text;

public class Example
{
   public static void Main()
   {
      int uppercaseStart = 0x0041;
      int uppercaseEnd = 0x005a;
      int lowercaseStart = 0x0061;
      int lowercaseEnd = 0x007a;
      // Get a UTF-16 encoding object that provides a byte order mark.
      Encoding unicode = Encoding.Unicode;

      // Populate array with characters.
      char[] chars = new char[lowercaseEnd - lowercaseStart + uppercaseEnd - uppercaseStart + 2];
      int index = 0;
      for (int ctr = uppercaseStart; ctr <= uppercaseEnd; ctr++) {
         chars[index] = (char)ctr;
         index++;
      }
      for (int ctr = lowercaseStart; ctr <= lowercaseEnd; ctr++) {
         chars[index] = (char)ctr;
         index++;
      }

      // Display the bytes needed for the lowercase characters.
      Console.WriteLine("Bytes needed for lowercase Latin characters:");
      Console.WriteLine("   Maximum:         {0,5:N0}",
                        unicode.GetMaxByteCount(lowercaseEnd - lowercaseStart + 1));
      Console.WriteLine("   Actual:          {0,5:N0}",
                        unicode.GetByteCount(chars, uppercaseEnd - uppercaseStart + 1,
                                          lowercaseEnd - lowercaseStart + 1));
      Console.WriteLine("   Actual with BOM: {0,5:N0}",
                        unicode.GetByteCount(chars, uppercaseEnd - uppercaseStart + 1,
                                          lowercaseEnd - lowercaseStart + 1) +
                                          unicode.GetPreamble().Length);
   }
}
// The example displays the following output:
//       Bytes needed for lowercase Latin characters:
//          Maximum:            54
//          Actual:             52
//          Actual with BOM:    54
Imports System.Text

Module Example
   Public Sub Main()
      Dim uppercaseStart As Integer = &h0041
      Dim uppercaseEnd As Integer = &h005a
      Dim lowercaseStart As Integer = &h0061
      Dim lowercaseEnd As Integer = &h007a
      ' Get a UTF-16 encoding object that provides a byte order mark.
      Dim unicode As Encoding = Encoding.Unicode
      
      ' Populate array with characters.
      Dim chars(lowercaseEnd - lowercaseStart + uppercaseEnd - uppercaseStart + 1) As Char
      Dim index As Integer = 0
      For ctr As Integer = uppercaseStart To uppercaseEnd
         chars(index) = ChrW(ctr)
         index += 1
      Next
      For ctr As Integer = lowercaseStart To lowercaseEnd
         chars(index) = ChrW(ctr)
         index += 1
      Next

      ' Display the bytes needed for the lowercase characters.
      Console.WriteLine("Bytes needed for lowercase Latin characters:")
      Console.WriteLine("   Maximum:         {0,5:N0}",
                        unicode.GetMaxByteCount(lowercaseEnd - lowercaseStart + 1))
      Console.WriteLine("   Actual:          {0,5:N0}",
                        unicode.GetByteCount(chars, uppercaseEnd - uppercaseStart + 1,
                                          lowercaseEnd - lowercaseStart + 1))
      Console.WriteLine("   Actual with BOM: {0,5:N0}",
                        unicode.GetByteCount(chars, uppercaseEnd - uppercaseStart + 1,
                                          lowercaseEnd - lowercaseStart + 1) +
                                          unicode.GetPreamble().Length)
   End Sub
End Module
' The example displays the following output:
'       Bytes needed for lowercase Latin characters:
'          Maximum:            54
'          Actual:             52
'          Actual with BOM:    54

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, you call the GetByteCount method. To calculate the maximum array size, you call the GetMaxByteCount method. The GetByteCount method generally lets you allocate less memory, while the GetMaxByteCount method generally executes faster.
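As a minimal sketch of this guidance for the array overload (not one of the reference examples; the array contents, index, and count are arbitrary samples), the following C# code counts the bytes for a sub-range of a character array and then encodes just that range into an exactly sized buffer:

using System;
using System.Text;

class ArrayRangeByteCount
{
    static void Main()
    {
        Encoding unicode = Encoding.Unicode;
        char[] chars = "ABCDEFabcdef".ToCharArray();   // arbitrary sample data

        int index = 6;   // start of the lowercase range in this sample
        int count = 6;   // number of characters to encode

        // Size the buffer exactly for the chosen range, then encode that range.
        int byteCount = unicode.GetByteCount(chars, index, count);
        byte[] buffer = new byte[byteCount];
        int bytesWritten = unicode.GetBytes(chars, index, count, buffer, 0);

        Console.WriteLine("Range of {0} characters encodes to {1} bytes.", count, bytesWritten);
    }
}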

With error detection enabled, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
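The following sketch (an illustration, not one of the reference examples) contrasts the two behaviors by constructing one UnicodeEncoding with error detection enabled and one without, and passing both an array that contains an unpaired surrogate; per the remarks above, the error-detecting instance is expected to throw an ArgumentException, while the other returns a count without throwing:

using System;
using System.Text;

class ErrorDetectionSketch
{
    static void Main()
    {
        // Little-endian, byte order mark, error detection enabled.
        UnicodeEncoding strict = new UnicodeEncoding(false, true, true);
        // Same settings, but without error detection.
        UnicodeEncoding lenient = new UnicodeEncoding(false, true, false);

        char[] chars = { 'A', '\uD800', 'B' };   // '\uD800' is an unpaired surrogate

        Console.WriteLine("Without error detection: {0} bytes.",
                          lenient.GetByteCount(chars, 0, chars.Length));

        try
        {
            strict.GetByteCount(chars, 0, chars.Length);
        }
        catch (ArgumentException e)
        {
            Console.WriteLine("With error detection: {0} thrown.", e.GetType().Name);
        }
    }
}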

Important

To ensure that the encoded bytes are decoded properly when they are saved as a file or as a stream, you can prefix a stream of encoded bytes with a preamble. Inserting the preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility, and the number of bytes in the preamble is not reflected in the value returned by the GetByteCount(Char[], Int32, Int32) method.
