UnicodeEncoding.GetMaxCharCount(Int32) Method

Definition

Calculates the maximum number of characters produced by decoding the specified number of bytes.

public:
 override int GetMaxCharCount(int byteCount);
public override int GetMaxCharCount (int byteCount);
override this.GetMaxCharCount : int -> int
Public Overrides Function GetMaxCharCount (byteCount As Integer) As Integer

Parameters

byteCount
Int32

The number of bytes to decode.

Returns

The maximum number of characters produced by decoding the specified number of bytes.

Exceptions

ArgumentOutOfRangeException

byteCount is less than zero.

-or-

The resulting number of characters is greater than the maximum number that can be returned as an integer.

ArgumentException

A fallback occurred (for more information, see Character Encoding in .NET).

-and-

DecoderFallback is set to DecoderExceptionFallback.

Examples

The following example demonstrates how to use the GetMaxCharCount method to return the maximum number of characters produced by decoding a specified number of bytes.

using namespace System;
using namespace System::Text;
int main()
{
   UnicodeEncoding^ Unicode = gcnew UnicodeEncoding;
   int byteCount = 8;
   int maxCharCount = Unicode->GetMaxCharCount( byteCount );
   Console::WriteLine( "Maximum of {0} characters needed to decode {1} bytes.", maxCharCount, byteCount );
}

using System;
using System.Text;

class UnicodeEncodingExample {
    public static void Main() {
        UnicodeEncoding Unicode = new UnicodeEncoding();
        int byteCount = 8;
        int maxCharCount = Unicode.GetMaxCharCount(byteCount);
        Console.WriteLine(
            "Maximum of {0} characters needed to decode {1} bytes.",
            maxCharCount,
            byteCount
        );
    }
}

Imports System.Text

Class UnicodeEncodingExample
    
    Public Shared Sub Main()
        Dim uni As New UnicodeEncoding()
        Dim byteCount As Integer = 8
        Dim maxCharCount As Integer = uni.GetMaxCharCount(byteCount)
        Console.WriteLine("Maximum of {0} characters needed to decode {1} bytes.", maxCharCount, byteCount)
    End Sub
End Class

Remarks

To calculate the exact array size required by GetChars to store the resulting characters, the application uses GetCharCount. To calculate the maximum array size, the application should use GetMaxCharCount. The GetCharCount method generally allocates less memory, while the GetMaxCharCount method generally executes faster.
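As a brief added illustration (not part of the original sample set), the following C# sketch sizes a character buffer both ways before calling GetChars: the exact count requires examining the byte data, while the maximum count is computed from the byte count alone.

using System;
using System.Text;

class BufferSizingExample
{
    public static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();
        byte[] bytes = unicode.GetBytes("Pi (\u03a0) and Sigma (\u03a3)");

        // Exact size: requires examining the actual byte data.
        int exactCharCount = unicode.GetCharCount(bytes, 0, bytes.Length);

        // Worst-case size: computed from the byte count alone.
        int maxCharCount = unicode.GetMaxCharCount(bytes.Length);

        // Either value is large enough to size the buffer passed to GetChars;
        // the exact value wastes no memory, the maximum value skips the scan.
        char[] chars = new char[maxCharCount];
        int charsWritten = unicode.GetChars(bytes, 0, bytes.Length, chars, 0);

        Console.WriteLine("Exact: {0}  Maximum: {1}  Written: {2}",
            exactCharCount, maxCharCount, charsWritten);
    }
}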

GetMaxCharCount retrieves a worst-case number, including the worst case for the currently selected DecoderFallback. If a fallback is chosen with a potentially large string, GetMaxCharCount retrieves large values.
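For example, the following added sketch (the class name and the 16-character replacement string are arbitrary choices for illustration) shows how a DecoderReplacementFallback with a long replacement string inflates the value that GetMaxCharCount returns for the same byte count.

using System;
using System.Text;

class FallbackWorstCaseExample
{
    public static void Main()
    {
        int byteCount = 8;

        // Default UTF-16 encoding: the replacement fallback emits a single character.
        Encoding defaultUnicode = new UnicodeEncoding();

        // The same encoding, but with a deliberately long (16-character)
        // replacement string for invalid byte sequences.
        Encoding paddedUnicode = Encoding.GetEncoding(
            "utf-16",
            new EncoderReplacementFallback("?"),
            new DecoderReplacementFallback(new string('?', 16)));

        Console.WriteLine("Default fallback:          {0}",
            defaultUnicode.GetMaxCharCount(byteCount));
        Console.WriteLine("Long replacement fallback: {0}",
            paddedUnicode.GetMaxCharCount(byteCount));
    }
}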

In most cases, this method retrieves reasonable numbers for small strings. For large strings, you might have to choose between using very large buffers and catching errors in the rare case that a more reasonable buffer is exceeded. You might also want to consider a different approach using GetCharCount or Convert.
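One such alternative is sketched below (an added illustration, not taken from the original page): a Decoder obtained from the encoding processes the input in fixed-size chunks with Decoder.Convert, so no worst-case buffer is ever allocated.

using System;
using System.Text;

class ChunkedDecodeExample
{
    public static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();
        byte[] bytes = unicode.GetBytes(new string('x', 10000));

        // A modest fixed-size buffer instead of one sized by GetMaxCharCount.
        Decoder decoder = unicode.GetDecoder();
        char[] buffer = new char[256];
        int totalChars = 0;
        int position = 0;

        while (position < bytes.Length)
        {
            decoder.Convert(bytes, position, bytes.Length - position,
                buffer, 0, buffer.Length, false,
                out int bytesUsed, out int charsUsed, out bool completed);

            position += bytesUsed;
            totalChars += charsUsed;
            // In a real program, buffer[0..charsUsed) would be consumed here.
        }

        // A final call with flush = true drains any state the decoder still holds.
        decoder.Convert(bytes, 0, 0, buffer, 0, buffer.Length, true,
            out _, out int flushedChars, out _);
        totalChars += flushedChars;

        Console.WriteLine("Decoded {0} characters using a {1}-character buffer.",
            totalChars, buffer.Length);
    }
}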

GetMaxCharCount has no relation to GetBytes. If your application needs a similar function to use with GetBytes, it should use GetMaxByteCount.
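A minimal sketch of the encoding direction (added here for contrast) follows; it sizes the byte buffer with GetMaxByteCount before calling GetBytes.

using System;
using System.Text;

class MaxByteCountExample
{
    public static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();
        string text = "Hello";

        // Sizing a byte buffer for GetBytes is the job of GetMaxByteCount.
        byte[] bytes = new byte[unicode.GetMaxByteCount(text.Length)];
        int bytesWritten = unicode.GetBytes(text, 0, text.Length, bytes, 0);

        Console.WriteLine("Maximum of {0} bytes needed to encode {1} characters; {2} written.",
            bytes.Length, text.Length, bytesWritten);
    }
}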

Note

GetMaxCharCount(N) is not necessarily the same value as N * GetMaxCharCount(1).
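The following added sketch simply prints both quantities for a default UnicodeEncoding so the difference can be observed; the exact values depend on the encoding's internal worst-case formula and on the current DecoderFallback.

using System;
using System.Text;

class MaxCharCountScalingExample
{
    public static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();
        int n = 8;

        // The worst case for one block of n bytes is not n times the worst case
        // for a single byte: per-call padding (for example, allowance for a
        // leftover partial code unit from a previous decoder call) is counted once.
        Console.WriteLine("GetMaxCharCount({0})     = {1}", n, unicode.GetMaxCharCount(n));
        Console.WriteLine("{0} * GetMaxCharCount(1) = {1}", n, n * unicode.GetMaxCharCount(1));
    }
}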
