UTF8Encoding.GetBytes Method

Definition

Encodes a set of characters into a sequence of bytes.

Overloads

GetBytes(String)

Encodes the characters in a specified String object into a sequence of bytes.

GetBytes(ReadOnlySpan<Char>, Span<Byte>)

Encodes the specified character span into the specified byte span.

GetBytes(Char*, Int32, Byte*, Int32)

Encodes a set of characters starting at the specified character pointer into a sequence of bytes that are stored starting at the specified byte pointer.

GetBytes(Char[], Int32, Int32, Byte[], Int32)

Encodes a set of characters from the specified character array into the specified byte array.

GetBytes(String, Int32, Int32, Byte[], Int32)

Encodes a set of characters from the specified String into the specified byte array.

GetBytes(String)

Encodes the characters in a specified String object into a sequence of bytes.

public:
 override cli::array <System::Byte> ^ GetBytes(System::String ^ s);
public override byte[] GetBytes (string s);
override this.GetBytes : string -> byte[]
Public Overrides Function GetBytes (s As String) As Byte()

Parameters

s
String

The character string to encode.

Returns

Byte[]

A byte array that contains the encoded characters in the string specified by the s parameter.
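
For example, a minimal sketch (not part of the original reference; the sample string and class name are illustrative) that encodes an entire string with this overload:

using System;
using System.Text;

class GetBytesStringExample
{
    public static void Main()
    {
        // Encode an entire string in one call; the returned array is sized exactly.
        UTF8Encoding utf8 = new UTF8Encoding();
        byte[] bytes = utf8.GetBytes("UTF8 Encoding Example");

        Console.WriteLine("{0} bytes used to encode string.", bytes.Length);
        foreach (byte b in bytes)
        {
            Console.Write("[{0}]", b);
        }
        Console.WriteLine();
    }
}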

Applies to

GetBytes(ReadOnlySpan<Char>, Span<Byte>)

Source:
UTF8Encoding.cs

Encodes the specified character span into the specified byte span.

public:
 override int GetBytes(ReadOnlySpan<char> chars, Span<System::Byte> bytes);
public override int GetBytes (ReadOnlySpan<char> chars, Span<byte> bytes);
override this.GetBytes : ReadOnlySpan<char> * Span<byte> -> int
Public Overrides Function GetBytes (chars As ReadOnlySpan(Of Char), bytes As Span(Of Byte)) As Integer

Parameters

chars
ReadOnlySpan<Char>

The character span to encode.

bytes
Span<Byte>

The span to contain the resulting set of bytes.

Returns

Int32

The actual number of bytes written into bytes.

Remarks

To calculate the exact size required by GetBytes to store the resulting bytes, you call the GetByteCount method. To calculate the maximum size, you call the GetMaxByteCount method. The GetByteCount method generally allocates less memory, while the GetMaxByteCount method generally executes faster.

With error detection, an invalid sequence causes this method to throw an ArgumentException exception. Without error detection, invalid sequences are ignored, and no exception is thrown.

Data to be converted, such as data read from a stream, might be available only in sequential blocks. In this case, or if the amount of data is so large that it needs to be divided into smaller blocks, use the Decoder or the Encoder returned by the GetDecoder method or the GetEncoder method, respectively.

To ensure that the encoded bytes are decoded properly when they are saved as a file or as a stream, you can prefix a stream of encoded bytes with a preamble. Inserting the preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility. The GetBytes method does not prepend a preamble to the beginning of a sequence of encoded bytes.
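
As a minimal sketch of the sizing guidance above (the sample text and names are illustrative, not part of the original reference), the destination can be sized with GetByteCount and then filled with the span-based overload:

using System;
using System.Text;

class GetBytesSpanExample
{
    public static void Main()
    {
        UTF8Encoding utf8 = new UTF8Encoding();
        ReadOnlySpan<char> chars = "UTF8 Encoding Example".AsSpan();

        // GetByteCount returns the exact size needed; GetMaxByteCount would return
        // a larger upper bound without scanning the input.
        int byteCount = utf8.GetByteCount(chars);
        Span<byte> bytes = new byte[byteCount];

        int bytesWritten = utf8.GetBytes(chars, bytes);
        Console.WriteLine("{0} bytes written.", bytesWritten);
    }
}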

Applies to

GetBytes(Char*, Int32, Byte*, Int32)

Source:
UTF8Encoding.cs

Important

This API is not CLS-compliant.

Encodes a set of characters starting at the specified character pointer into a sequence of bytes that are stored starting at the specified byte pointer.

public:
 override int GetBytes(char* chars, int charCount, System::Byte* bytes, int byteCount);
[System.CLSCompliant(false)]
[System.Security.SecurityCritical]
public override int GetBytes (char* chars, int charCount, byte* bytes, int byteCount);
[System.CLSCompliant(false)]
public override int GetBytes (char* chars, int charCount, byte* bytes, int byteCount);
[System.CLSCompliant(false)]
[System.Runtime.InteropServices.ComVisible(false)]
public override int GetBytes (char* chars, int charCount, byte* bytes, int byteCount);
[System.CLSCompliant(false)]
[System.Security.SecurityCritical]
[System.Runtime.InteropServices.ComVisible(false)]
public override int GetBytes (char* chars, int charCount, byte* bytes, int byteCount);
[<System.CLSCompliant(false)>]
[<System.Security.SecurityCritical>]
override this.GetBytes : nativeptr<char> * int * nativeptr<byte> * int -> int
[<System.CLSCompliant(false)>]
override this.GetBytes : nativeptr<char> * int * nativeptr<byte> * int -> int
[<System.CLSCompliant(false)>]
[<System.Runtime.InteropServices.ComVisible(false)>]
override this.GetBytes : nativeptr<char> * int * nativeptr<byte> * int -> int
[<System.CLSCompliant(false)>]
[<System.Security.SecurityCritical>]
[<System.Runtime.InteropServices.ComVisible(false)>]
override this.GetBytes : nativeptr<char> * int * nativeptr<byte> * int -> int

Parameters

chars
Char*

A pointer to the first character to encode.

charCount
Int32

The number of characters to encode.

bytes
Byte*

A pointer to the location at which to start writing the resulting sequence of bytes.

byteCount
Int32

The maximum number of bytes to write.

Returns

Int32

The actual number of bytes written at the location indicated by bytes.

Attributes

CLSCompliantAttribute
SecurityCriticalAttribute
ComVisibleAttribute

Exceptions

ArgumentNullException

chars is null.

-or-

bytes is null.

ArgumentOutOfRangeException

charCount or byteCount is less than zero.

ArgumentException

Error detection is enabled, and chars contains an invalid sequence of characters.

-or-

byteCount is less than the resulting number of bytes.

EncoderFallbackException

A fallback occurred (for more information, see Character Encoding in .NET).

-and-

EncoderFallback is set to EncoderExceptionFallback.

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, you call the GetByteCount method. To calculate the maximum array size, you call the GetMaxByteCount method. The GetByteCount method generally allocates less memory, while the GetMaxByteCount method generally executes faster.

With error detection, an invalid sequence causes this method to throw an ArgumentException exception. Without error detection, invalid sequences are ignored, and no exception is thrown.

Data to be converted, such as data read from a stream, might be available only in sequential blocks. In this case, or if the amount of data is so large that it needs to be divided into smaller blocks, use the Decoder or the Encoder returned by the GetDecoder method or the GetEncoder method, respectively.

To ensure that the encoded bytes are decoded properly when they are saved as a file or as a stream, you can prefix a stream of encoded bytes with a preamble. Inserting the preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility. The GetBytes method does not prepend a preamble to the beginning of a sequence of encoded bytes.
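
A minimal sketch of the pointer-based overload (illustrative only; it assumes a project compiled with unsafe code enabled) sizes the destination with GetByteCount(Char*, Int32) and then encodes at the pinned locations:

using System;
using System.Text;

class GetBytesPointerExample
{
    public static unsafe void Main()
    {
        UTF8Encoding utf8 = new UTF8Encoding();
        char[] chars = "UTF8 Encoding Example".ToCharArray();

        fixed (char* pChars = chars)
        {
            // Size the destination first, then encode into the pinned buffer.
            int byteCount = utf8.GetByteCount(pChars, chars.Length);
            byte[] bytes = new byte[byteCount];

            fixed (byte* pBytes = bytes)
            {
                int bytesWritten = utf8.GetBytes(pChars, chars.Length, pBytes, bytes.Length);
                Console.WriteLine("{0} bytes written.", bytesWritten);
            }
        }
    }
}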

See also

Applies to

GetBytes(Char[], Int32, Int32, Byte[], Int32)

Source:
UTF8Encoding.cs

Encodes a set of characters from the specified character array into the specified byte array.

public:
 override int GetBytes(cli::array <char> ^ chars, int charIndex, int charCount, cli::array <System::Byte> ^ bytes, int byteIndex);
public override int GetBytes (char[] chars, int charIndex, int charCount, byte[] bytes, int byteIndex);
override this.GetBytes : char[] * int * int * byte[] * int -> int
Public Overrides Function GetBytes (chars As Char(), charIndex As Integer, charCount As Integer, bytes As Byte(), byteIndex As Integer) As Integer

Parameters

chars
Char[]

The character array containing the set of characters to encode.

charIndex
Int32

The index of the first character to encode.

charCount
Int32

The number of characters to encode.

bytes
Byte[]

The byte array to contain the resulting sequence of bytes.

byteIndex
Int32

The index at which to start writing the resulting sequence of bytes.

Returns

Int32

The actual number of bytes written into bytes.

Exceptions

ArgumentNullException

chars is null.

-or-

bytes is null.

ArgumentOutOfRangeException

charIndex or charCount or byteIndex is less than zero.

-or-

charIndex and charCount do not denote a valid range in chars.

-or-

byteIndex is not a valid index in bytes.

ArgumentException

Error detection is enabled, and chars contains an invalid sequence of characters.

-or-

bytes does not have enough capacity from byteIndex to the end of the array to accommodate the resulting bytes.

EncoderFallbackException

A fallback occurred (for more information, see Character Encoding in .NET).

-and-

EncoderFallback is set to EncoderExceptionFallback.

Examples

The following example uses the GetBytes method to encode a range of elements from a Unicode character array and store the encoded bytes in a range of elements in a byte array.

using namespace System;
using namespace System::Text;
using namespace System::Collections;
int main()
{
   array<Byte>^bytes;
   
   // Unicode characters: Pi, Sigma, Phi, Omega.
   array<Char>^chars = {L'\u03a0',L'\u03a3',L'\u03a6',L'\u03a9'};
   UTF8Encoding^ utf8 = gcnew UTF8Encoding;
   int byteCount = utf8->GetByteCount( chars, 1, 2 );
   bytes = gcnew array<Byte>(byteCount);
   int bytesEncodedCount = utf8->GetBytes( chars, 1, 2, bytes, 0 );
   Console::WriteLine( "{0} bytes used to encode characters.", bytesEncodedCount );
   Console::Write( "Encoded bytes: " );
   IEnumerator^ myEnum = bytes->GetEnumerator();
   while ( myEnum->MoveNext() )
   {
      Byte b = safe_cast<Byte>(myEnum->Current);
      Console::Write( "[{0}]", b );
   }

   Console::WriteLine();
}
using System;
using System.Text;

class UTF8EncodingExample {
    public static void Main() {
        Byte[] bytes;
        // Unicode characters.
        Char[] chars = new Char[] {
            '\u0023', // #
            '\u0025', // %
            '\u03a0', // Pi
            '\u03a3'  // Sigma
        };
        
        UTF8Encoding utf8 = new UTF8Encoding();
        
        int byteCount = utf8.GetByteCount(chars, 1, 2);
        bytes = new Byte[byteCount];
        int bytesEncodedCount = utf8.GetBytes(chars, 1, 2, bytes, 0);
        
        Console.WriteLine(
            "{0} bytes used to encode characters.", bytesEncodedCount
        );

        Console.Write("Encoded bytes: ");
        foreach (Byte b in bytes) {
            Console.Write("[{0}]", b);
        }
        Console.WriteLine();
    }
}
Imports System.Text
Imports Microsoft.VisualBasic.Strings

Class UTF8EncodingExample
    
    Public Shared Sub Main()
        Dim bytes() As Byte
        ' Unicode characters.
        ' ChrW(35)  = #
        ' ChrW(37)  = %
        ' ChrW(928) = Pi
        ' ChrW(931) = Sigma
        Dim chars() As Char = {ChrW(35), ChrW(37), ChrW(928), ChrW(931)}
        
        Dim utf8 As New UTF8Encoding()
        
        Dim byteCount As Integer = utf8.GetByteCount(chars, 1, 2)
        bytes = New Byte(byteCount - 1) {}
        Dim bytesEncodedCount As Integer = utf8.GetBytes(chars, 1, 2, bytes, 0)
        
        Console.WriteLine("{0} bytes used to encode characters.", bytesEncodedCount)
        
        Console.Write("Encoded bytes: ")
        Dim b As Byte
        For Each b In bytes
            Console.Write("[{0}]", b)
        Next b
        Console.WriteLine()
    End Sub
End Class

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, you call the GetByteCount method. To calculate the maximum array size, you call the GetMaxByteCount method. The GetByteCount method generally allocates less memory, while the GetMaxByteCount method generally executes faster.

With error detection, an invalid sequence causes this method to throw an ArgumentException exception. Without error detection, invalid sequences are ignored, and no exception is thrown.

Data to be converted, such as data read from a stream, might be available only in sequential blocks. In this case, or if the amount of data is so large that it needs to be divided into smaller blocks, use the Decoder or the Encoder provided by the GetDecoder method or the GetEncoder method, respectively.

To ensure that the encoded bytes are decoded properly when they are saved as a file or as a stream, you can prefix a stream of encoded bytes with a preamble. Inserting the preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility. The GetBytes method does not prepend a preamble to the beginning of a sequence of encoded bytes.
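
The preamble guidance above can be illustrated with a short sketch (the file name and sample string are illustrative, not part of the original reference); GetPreamble supplies the UTF-8 byte order mark, and the caller writes it ahead of the encoded bytes:

using System;
using System.IO;
using System.Text;

class PreambleExample
{
    public static void Main()
    {
        // GetBytes never writes the preamble itself; prepend it manually when needed.
        UTF8Encoding utf8 = new UTF8Encoding(encoderShouldEmitUTF8Identifier: true);
        byte[] preamble = utf8.GetPreamble();   // 0xEF 0xBB 0xBF
        byte[] bytes = utf8.GetBytes("UTF8 Encoding Example");

        using (FileStream fs = File.Create("example.txt"))
        {
            fs.Write(preamble, 0, preamble.Length);
            fs.Write(bytes, 0, bytes.Length);
        }
    }
}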

See also

Applies to

GetBytes(String, Int32, Int32, Byte[], Int32)

Source:
UTF8Encoding.cs

Encodes a set of characters from the specified String into the specified byte array.

public:
 override int GetBytes(System::String ^ s, int charIndex, int charCount, cli::array <System::Byte> ^ bytes, int byteIndex);
public override int GetBytes (string s, int charIndex, int charCount, byte[] bytes, int byteIndex);
override this.GetBytes : string * int * int * byte[] * int -> int
Public Overrides Function GetBytes (s As String, charIndex As Integer, charCount As Integer, bytes As Byte(), byteIndex As Integer) As Integer

Parameters

s
String

The String containing the set of characters to encode.

charIndex
Int32

The index of the first character to encode.

charCount
Int32

The number of characters to encode.

bytes
Byte[]

The byte array to contain the resulting sequence of bytes.

byteIndex
Int32

The index at which to start writing the resulting sequence of bytes.

Returns

Int32

The actual number of bytes written into bytes.

Exceptions

ArgumentNullException

s is null.

-or-

bytes is null.

ArgumentOutOfRangeException

charIndex or charCount or byteIndex is less than zero.

-or-

charIndex and charCount do not denote a valid range in s.

-or-

byteIndex is not a valid index in bytes.

ArgumentException

Error detection is enabled, and s contains an invalid sequence of characters.

-or-

bytes does not have enough capacity from byteIndex to the end of the array to accommodate the resulting bytes.

EncoderFallbackException

A fallback occurred (for more information, see Character Encoding in .NET).

-and-

EncoderFallback is set to EncoderExceptionFallback.

Examples

The following example uses the GetBytes method to encode a range of characters from a string and store the encoded bytes in a range of elements in a byte array.

using namespace System;
using namespace System::Text;
using namespace System::Collections;
int main()
{
   array<Byte>^bytes;
   String^ chars = "UTF8 Encoding Example";
   UTF8Encoding^ utf8 = gcnew UTF8Encoding;
   int byteCount = utf8->GetByteCount( chars->ToCharArray(), 0, 13 );
   bytes = gcnew array<Byte>(byteCount);
   int bytesEncodedCount = utf8->GetBytes( chars, 0, 13, bytes, 0 );
   Console::WriteLine( "{0} bytes used to encode string.", bytesEncodedCount );
   Console::Write( "Encoded bytes: " );
   IEnumerator^ myEnum = bytes->GetEnumerator();
   while ( myEnum->MoveNext() )
   {
      Byte b = safe_cast<Byte>(myEnum->Current);
      Console::Write( "[{0}]", b );
   }

   Console::WriteLine();
}
using System;
using System.Text;

class UTF8EncodingExample {
    public static void Main() {
        Byte[] bytes;
        String chars = "UTF8 Encoding Example";
        
        UTF8Encoding utf8 = new UTF8Encoding();
        
        int byteCount = utf8.GetByteCount(chars.ToCharArray(), 0, 13);
        bytes = new Byte[byteCount];
        int bytesEncodedCount = utf8.GetBytes(chars, 0, 13, bytes, 0);
        
        Console.WriteLine(
            "{0} bytes used to encode string.", bytesEncodedCount
        );

        Console.Write("Encoded bytes: ");
        foreach (Byte b in bytes) {
            Console.Write("[{0}]", b);
        }
        Console.WriteLine();
    }
}
Imports System.Text

Class UTF8EncodingExample
    
    Public Shared Sub Main()
        Dim bytes() As Byte
        Dim chars As String = "UTF8 Encoding Example"
        
        Dim utf8 As New UTF8Encoding()
        
        Dim byteCount As Integer = utf8.GetByteCount(chars.ToCharArray(), 0, 13)
        bytes = New Byte(byteCount - 1) {}
        Dim bytesEncodedCount As Integer = utf8.GetBytes(chars, 0, 13, bytes, 0)
        
        Console.WriteLine("{0} bytes used to encode string.", bytesEncodedCount)
        
        Console.Write("Encoded bytes: ")
        Dim b As Byte
        For Each b In bytes
            Console.Write("[{0}]", b)
        Next b
        Console.WriteLine()
    End Sub
End Class

Remarks

To calculate the exact array size required by GetBytes to store the resulting bytes, you call the GetByteCount method. To calculate the maximum array size, you call the GetMaxByteCount method. The GetByteCount method generally allocates less memory, while the GetMaxByteCount method generally executes faster.

With error detection, an invalid sequence causes this method to throw an ArgumentException exception. Without error detection, invalid sequences are ignored, and no exception is thrown.

Data to be converted, such as data read from a stream, might be available only in sequential blocks. In this case, or if the amount of data is so large that it needs to be divided into smaller blocks, use the Decoder or the Encoder provided by the GetDecoder method or the GetEncoder method, respectively.

To ensure that the encoded bytes are decoded properly when they are saved as a file or as a stream, you can prefix a stream of encoded bytes with a preamble. Inserting the preamble at the beginning of a byte stream (such as at the beginning of a series of bytes to be written to a file) is the developer's responsibility. The GetBytes method does not prepend a preamble to the beginning of a sequence of encoded bytes.
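
To illustrate the error-detection behavior described above, the following sketch (illustrative only; not part of the original reference) constructs a UTF8Encoding with throwOnInvalidBytes set to true and encodes a string containing an unpaired surrogate, which triggers an ArgumentException-derived exception:

using System;
using System.Text;

class ErrorDetectionExample
{
    public static void Main()
    {
        // With error detection enabled, an invalid sequence such as a lone surrogate
        // causes GetBytes to throw instead of silently substituting a fallback.
        UTF8Encoding strictUtf8 = new UTF8Encoding(encoderShouldEmitUTF8Identifier: false,
                                                   throwOnInvalidBytes: true);
        string invalid = "valid text " + '\uD800';   // unpaired high surrogate

        byte[] bytes = new byte[strictUtf8.GetMaxByteCount(invalid.Length)];
        try
        {
            strictUtf8.GetBytes(invalid, 0, invalid.Length, bytes, 0);
        }
        catch (ArgumentException e)   // EncoderFallbackException derives from ArgumentException
        {
            Console.WriteLine("Encoding failed: {0}", e.Message);
        }
    }
}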

See also

Applies to