
What's new in .NET libraries for .NET 11

This article describes new features in the .NET libraries for .NET 11. It was last updated for Preview 4.

Diagnostics and process execution

Process API expansion

Process has a substantial set of new APIs that cover common scenarios where you previously had to wire up OutputDataReceived/ErrorDataReceived events manually or use P/Invoke.

Run-and-capture helpers

New one-shot APIs let you launch a process and get its result without manual setup:

// One-shot capture: stdout and stderr together, plus exit code.
ProcessTextOutput result = await Process.RunAndCaptureTextAsync(
    "git", ["status", "--porcelain"]);

Console.WriteLine(result.StandardOutput);
Console.WriteLine($"Exit code: {result.ExitStatus.ExitCode}");

The full set of helpers includes:

Fire-and-forget launches

SafeProcessHandle lifecycle methods

SafeProcessHandle gains lifecycle methods for advanced scenarios.

Tighter handle control

Console FORCE_COLOR support

.NET console output now honors the FORCE_COLOR convention alongside the existing NO_COLOR support. When FORCE_COLOR is set, ANSI escape codes are emitted even when Console.IsOutputRedirected is true. This is useful when you pipe dotnet run output through tee, into a CI log viewer, or through less -R:

FORCE_COLOR=1 dotnet run | tee build.log

Text, serialization, and data handling

String and character enhancements

.NET 11 introduces significant enhancements to string and character manipulation APIs, making it easier to work with Unicode characters and runes.

Rune support in String methods

The String class now includes methods that accept Rune parameters, enabling you to search, replace, and manipulate strings using Unicode scalar values directly. Many of these methods include overloads that accept a StringComparison parameter for culture-aware comparisons.
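As an illustrative sketch, assuming Rune-accepting overloads with names such as Contains(Rune) and IndexOf(Rune) (the article doesn't list the exact method names, so treat these as assumptions):

```csharp
using System;
using System.Text;

// The 👍 emoji is one Unicode scalar (U+1F44D) but two UTF-16 chars,
// so char-based APIs can't address it directly; Rune overloads can.
Rune thumbsUp = new Rune(0x1F44D);
string s = "ship it \U0001F44D";

// Hypothetical Rune-accepting overloads; names assumed for illustration.
bool found = s.Contains(thumbsUp);
int index = s.IndexOf(thumbsUp);

Console.WriteLine($"{found} at {index}");
```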

Char.Equals with StringComparison

The Char struct now includes a Char.Equals(Char, StringComparison) method that accepts a StringComparison parameter, allowing you to compare characters using culture-aware or ordinal comparisons.
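A minimal example, assuming the overload behaves like its string counterpart:

```csharp
using System;

char a = 'a';

// Ordinal case-insensitive comparison without allocating strings.
bool ignoreCase = a.Equals('A', StringComparison.OrdinalIgnoreCase);
bool ordinal = a.Equals('A', StringComparison.Ordinal);

Console.WriteLine($"{ignoreCase} {ordinal}");
```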

Rune support in TextInfo

The TextInfo class now provides TextInfo.ToLower(Rune) and TextInfo.ToUpper(Rune) methods that accept Rune parameters, enabling you to perform case conversions on individual Unicode scalar values.
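For example, case-mapping a single scalar through the invariant culture (a sketch based on the method names above):

```csharp
using System;
using System.Globalization;
using System.Text;

TextInfo invariant = CultureInfo.InvariantCulture.TextInfo;

// Case-map a single Unicode scalar: U+00E9 (é) to U+00C9 (É).
Rune lower = new Rune('\u00E9');
Rune upper = invariant.ToUpper(lower);

Console.WriteLine(upper);
```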

Base64 encoding improvements

.NET 11 adds new APIs and overloads to the existing Base64 type, providing comprehensive support for Base64 encoding and decoding. These additions offer improved performance and flexibility compared to existing methods.

New Base64 APIs

The new APIs support encoding and decoding operations with various input and output formats. They include both high-level convenience methods (that allocate and return arrays or strings) and low-level span-based methods (for zero-allocation scenarios).

System.Text.Unicode has two new complementary features. Utf16.IsValid(ReadOnlySpan<Char>) reports whether a sequence is well-formed UTF-16 in a single pass, and Utf8.IndexOfInvalidSubsequence(ReadOnlySpan<Byte>) / Utf16.IndexOfInvalidSubsequence(ReadOnlySpan<Char>) return the position of the first ill-formed code-unit sequence (or -1 for valid input). Together, these methods let parsers, validators, and serializers report precise error locations instead of generic encoding-error messages.

ReadOnlySpan<byte> bytes = [0xC3, 0x28]; // invalid UTF-8
int badIndex = Utf8.IndexOfInvalidSubsequence(bytes); // 0

ReadOnlySpan<char> chars = "valid \uD83D\uDC4D end"; // valid UTF-16 (👍 emoji)
bool ok = Utf16.IsValid(chars); // true

System.Text.Json improvements

Generic type info retrieval

A common pattern when working with System.Text.Json type metadata is to retrieve a JsonTypeInfo<T> from JsonSerializerOptions. Previously, you had to manually downcast the result of the non-generic GetTypeInfo(Type) method. New generic JsonSerializerOptions.GetTypeInfo<T>() and JsonSerializerOptions.TryGetTypeInfo<T>(out JsonTypeInfo<T>?) methods return strongly typed metadata directly, eliminating the cast.

JsonSerializerOptions options = new(JsonSerializerDefaults.Web);
options.MakeReadOnly();

// Before: manual downcast required
JsonTypeInfo<MyRecord> info1 = (JsonTypeInfo<MyRecord>)options.GetTypeInfo(typeof(MyRecord));

// After: generic method returns the right type directly
JsonTypeInfo<MyRecord> info2 = options.GetTypeInfo<MyRecord>();

// TryGetTypeInfo variant for cases where the type may not be registered
if (options.TryGetTypeInfo<MyRecord>(out JsonTypeInfo<MyRecord>? typeInfo))
{
    // Use typeInfo
    _ = typeInfo;
}

This is particularly useful when working with source generation, NativeAOT, and polymorphic serialization scenarios where type metadata access is common.

Naming and ignore defaults

System.Text.Json expands its naming and ignore options: JsonIgnore can now be applied at the type level to set a default ignore condition for all members, and the naming policy can be overridden per member:

// Type-level JsonIgnore: all members use WhenWritingNull by default
// Per-member JsonNamingPolicy: EventName uses camelCase even though the
// serializer options use PascalCase
var options = new JsonSerializerOptions
{
    PropertyNamingPolicy = JsonNamingPolicy.PascalCase
};

var data = new EventData { EventName = "Launch", Notes = null };
string json = JsonSerializer.Serialize(data, options);
Console.WriteLine(json);
// {"eventName":"Launch"}  -- Notes omitted (null), EventName camel-cased

F# discriminated union support

The serializer now understands F# discriminated unions out of the box. Apps that share types between F# producers and C# consumers no longer need a custom converter for the most common shapes:

type Shape =
    | Circle of radius: float
    | Square of side: float

let json = System.Text.Json.JsonSerializer.Serialize(Circle 1.5)
// {"$type":"Circle","radius":1.5}

Utf8JsonWriter.Reset with options

Reset now accepts a JsonWriterOptions parameter, so pooled writer instances can be reused with different options without allocating a new writer:

using var stream = new MemoryStream();
using var writer = new Utf8JsonWriter(stream, new JsonWriterOptions { Indented = true });
writer.WriteStartObject();
writer.WriteString("name", "example");
writer.WriteEndObject();
writer.Flush();

// Reset with different options for next use — no new allocation needed
stream.SetLength(0);
writer.Reset(stream, new JsonWriterOptions { Indented = false });

Regular expression improvements

AnyNewLine option

A new RegexOptions.AnyNewLine flag makes ^, $, and . treat the full set of Unicode newline characters as line terminators—not just \n. This helps when parsing text that mixes Windows (\r\n), Unix (\n), and Unicode-specific (\u0085, \u2028, \u2029) line endings.

string text = "line1\r\nline2\u0085line3\u2028line4";

// RegexOptions.AnyNewLine makes ^, $, and . treat all Unicode newline
// sequences as line terminators, not just \n.
MatchCollection matches = Regex.Matches(
    text,
    @"^line\d$",
    RegexOptions.Multiline | RegexOptions.AnyNewLine);

Console.WriteLine(matches.Count); // 4

Regex engine and source generator fixes

.NET 11 includes several regex correctness and code-quality fixes:

  • The non-backtracking engine no longer takes super-linear time on certain nested-loop patterns and produces correct results for cases that previously diverged.
  • The regex compiler and source generator handle resumeAt correctly when a conditional appears inside a loop body.
  • The SYSLIB1045 code fixer no longer creates duplicate class names when applied across multiple partial declarations of the same class.

Compression and archive formats

Compression enhancements

.NET 11 includes several improvements to compression APIs.

ZIP archive entry access modes

The ZipArchiveEntry class now supports opening entries with specific file access modes through new overloads: ZipArchiveEntry.Open(FileAccess) and ZipArchiveEntry.OpenAsync(FileAccess, CancellationToken). These overloads accept a FileAccess parameter and allow you to open ZIP entries for read, write, or read-write access.

Additionally, a new CompressionMethod property exposes the compression method used for an entry through the ZipCompressionMethod enum, which includes values for Stored, Deflate, and Deflate64.
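A self-contained sketch that builds a small archive in memory and then uses both additions (the CompressionMethod value in the comment is an assumption based on Deflate being the default):

```csharp
using System;
using System.IO;
using System.IO.Compression;

// Build a small archive in memory so the example is self-contained.
using var ms = new MemoryStream();
using (var write = new ZipArchive(ms, ZipArchiveMode.Create, leaveOpen: true))
{
    ZipArchiveEntry created = write.CreateEntry("hello.txt");
    using Stream s = created.Open();
    s.Write("hi"u8);
}

ms.Position = 0;
using var read = new ZipArchive(ms, ZipArchiveMode.Read);
ZipArchiveEntry entry = read.Entries[0];

// New in .NET 11: inspect the compression method and open read-only.
Console.WriteLine(entry.CompressionMethod); // e.g. Deflate
using Stream reader = entry.Open(FileAccess.Read);
Console.WriteLine(new StreamReader(reader).ReadToEnd());
```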

ZIP CRC32 validation

ZipArchive now validates the CRC32 checksum when reading ZIP entries. Corrupted or truncated archives that previously passed without error now throw InvalidDataException, helping you detect data integrity issues early.

DeflateStream and GZipStream behavior change

Starting in .NET 11, DeflateStream and GZipStream always write format headers and footers to the output stream, even when no data is written. This ensures the output is a valid compressed stream according to the Deflate and GZip specifications.

Previously, these streams didn't produce any output if no data was written, resulting in an empty output stream. This change ensures compatibility with tools that expect properly formatted compressed streams.

For more information, see DeflateStream and GZipStream write headers and footers for empty payload.
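The new behavior can be observed with a compression stream that's disposed without any writes; under the change described above, the output still contains a valid header and footer:

```csharp
using System;
using System.IO;
using System.IO.Compression;

using var output = new MemoryStream();
using (var gzip = new GZipStream(output, CompressionMode.Compress, leaveOpen: true))
{
    // Intentionally write nothing.
}

// Starting in .NET 11, the stream still holds a valid (empty) GZip
// payload: the header and footer are always emitted.
Console.WriteLine(output.Length > 0);
```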

Span-based Deflate, ZLib, and GZip APIs

System.IO.Compression now offers Span<byte>/ReadOnlySpan<byte> encode and decode entry points for the Deflate, ZLib, and GZip formats. The new APIs, on types such as DeflateEncoder, ZLibEncoder, and GZipEncoder, mirror the shape of BrotliEncoder/BrotliDecoder and the Zstandard primitives. You can compress and decompress buffers without allocating a Stream. This is useful for high-throughput scenarios such as protocol parsers, log shippers, and middleware that already operate on spans.

ReadOnlySpan<byte> source = [0x48, 0x65, 0x6C, 0x6C, 0x6F]; // "Hello"
byte[] buffer = new byte[source.Length + 32];
Span<byte> destination = buffer;

using ZLibEncoder encoder = new();
OperationStatus status = encoder.Compress(
    source, destination, out int bytesConsumed, out int bytesWritten,
    isFinalBlock: true);

Console.WriteLine($"Compressed {bytesConsumed} bytes into {bytesWritten} bytes. Status: {status}");

Zstandard compression

The Zstandard compression APIs, for example, ZstandardStream and ZstandardEncoder, are now part of the System.IO.Compression namespace, alongside DeflateStream, GZipStream, and BrotliStream. The API surface is otherwise unchanged.

Tar archive format selection

New overloads on CreateFromDirectory and CreateFromDirectoryAsync accept a TarEntryFormat parameter, giving you direct control over the archive format. Previously, CreateFromDirectory always produced Pax archives. The new overloads support all four tar formats—Pax, Ustar, GNU, and V7—for compatibility with specific tools and environments.

// Create a GNU format tar archive for Linux compatibility
TarFile.CreateFromDirectory("/source/dir", "/dest/archive.tar",
    includeBaseDirectory: true, format: TarEntryFormat.Gnu);

// Create a Ustar format archive for broader compatibility
using Stream outputStream = File.OpenWrite("/dest/ustar.tar");
TarFile.CreateFromDirectory("/source/dir", outputStream,
    includeBaseDirectory: false, format: TarEntryFormat.Ustar);

// Async version
CancellationToken cancellationToken = CancellationToken.None;
await TarFile.CreateFromDirectoryAsync("/source/dir", "/dest/archive.tar",
    includeBaseDirectory: true, format: TarEntryFormat.Pax,
    cancellationToken: cancellationToken);

TarReader can now also read entries that use the GNU sparse format 1.0 (PAX) representation. The earlier 0.1 representation was already supported. With 1.0 support in place, TarReader matches what modern tar implementations write by default for sparse files.

Collections, numerics, and low-level I/O

BFloat16 support in BitConverter

The BitConverter class now includes methods for converting between BFloat16 values and byte arrays or bit representations.

BFloat16 (Brain Floating Point) is a 16-bit floating-point format that's commonly used in machine learning and scientific computing.
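A round-trip sketch, assuming overloads that mirror the existing Half support (BitConverter.GetBytes(Half) / BitConverter.ToHalf); the exact method names aren't listed above, so treat them as assumptions:

```csharp
using System;
using System.Numerics;

// BFloat16 keeps float32's 8-bit exponent but truncates the fraction
// to 7 bits, so a value round-trips through just 2 bytes.
BFloat16 value = (BFloat16)1.5f;

// Assumed overloads mirroring BitConverter's Half support.
byte[] bytes = BitConverter.GetBytes(value);
BFloat16 roundTripped = BitConverter.ToBFloat16(bytes, 0);

Console.WriteLine(bytes.Length);
Console.WriteLine(roundTripped == value);
```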

Floating-point hex formatting and parsing

double, float, and Half can now be formatted and parsed in their hexadecimal IEEE-754 form. The hex form preserves every bit of the underlying value, making it the right choice for golden-file tests, cross-language interop with C/C++ printf("%a", ...), and any scenario where round-tripping a double through decimal text is too lossy.

double value = Math.PI;

// Format as hexadecimal IEEE-754: preserves all bits exactly
string hex = value.ToString("X"); // e.g., "0X1.921FB54442D18P+1"
double roundTripped = double.Parse(hex, NumberStyles.HexFloat);

Console.WriteLine(roundTripped == value); // True — exact round-trip

Numerics improvements

Matrix4x4.GetDeterminant() now uses an SSE-vectorized implementation, improving performance by approximately 15%.

Low-level I/O improvements

SafeFileHandle pipe support

SafeFileHandle gains two new members:

SafeFileHandle.CreateAnonymousPipe(
    out SafeFileHandle readEnd,
    out SafeFileHandle writeEnd,
    asyncRead: true,
    asyncWrite: false);

using (readEnd)
using (writeEnd)
{
    // SafeFileHandle.Type reports the kind of OS object the handle refers to
    Console.WriteLine(readEnd.Type);   // Pipe
    Console.WriteLine(writeEnd.Type);  // Pipe
}

RandomAccess pipe support

RandomAccess.Read and RandomAccess.Write now work with non-seekable handles such as pipes, in addition to regular file handles.
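As a sketch, the two pipe-related additions compose: create an anonymous pipe with the new SafeFileHandle method, then drive it with RandomAccess. The fileOffset argument is assumed to be ignored for non-seekable handles:

```csharp
using System;
using System.IO;
using System.Text;
using Microsoft.Win32.SafeHandles;

SafeFileHandle.CreateAnonymousPipe(
    out SafeFileHandle readEnd, out SafeFileHandle writeEnd,
    asyncRead: false, asyncWrite: false);

using (readEnd)
using (writeEnd)
{
    // Offsets are presumably ignored for pipes (non-seekable handles).
    RandomAccess.Write(writeEnd, "ping"u8, fileOffset: 0);

    byte[] buffer = new byte[4];
    int read = RandomAccess.Read(readEnd, buffer, fileOffset: 0);
    Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, read));
}
```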

On Windows, Process now uses overlapped I/O for redirected stdout/stderr, which reduces thread-pool blocking in process-heavy applications.

Collections improvements

BitArray.PopCount

The BitArray class now includes a BitArray.PopCount() method that returns the number of bits set to true in the array. This provides an efficient way to count set bits without manually iterating through the array.
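For example:

```csharp
using System;
using System.Collections;

var bits = new BitArray(8);
bits[1] = true;
bits[3] = true;
bits[7] = true;

// Count set bits without looping over the array manually.
Console.WriteLine(bits.PopCount());
```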

IReadOnlySet support in JSON serialization

The JsonMetadataServices class now includes a JsonMetadataServices.CreateIReadOnlySetInfo method, enabling JSON serialization support for IReadOnlySet<T> collections.

Extensions and developer platform

Discriminated-union scaffolding

Note

This is a preview feature in .NET 11.

.NET 11 introduces System.Runtime.CompilerServices.UnionAttribute and System.Runtime.CompilerServices.IUnion. These types are the runtime side of the C# discriminated-union design. They aren't directly user-facing yet (the C# compiler and source generators are the expected producers), but they ship in the framework so libraries can author against the surface now.

For the language-side design, see the C# unions proposal.

MetadataLoadContext additions

MetadataLoadContext.GetLoadContext(Assembly) returns the load context that produced a given Assembly, mirroring the long-existing API on AssemblyLoadContext. This closes a gap for tooling that reflects over assemblies in an isolated MetadataLoadContext and needs to walk back from an Assembly reference to the context that owns it:

using System.Reflection;
using System.Reflection.Metadata;

string[] paths = [typeof(object).Assembly.Location];
using var mlc = new MetadataLoadContext(new PathAssemblyResolver(paths));
Assembly asm = mlc.LoadFromAssemblyPath(typeof(object).Assembly.Location);

MetadataLoadContext owner = MetadataLoadContext.GetLoadContext(asm)!;
Console.WriteLine(ReferenceEquals(owner, mlc)); // true

URI data scheme constant

A new Uri.UriSchemeData constant has been added, representing the data: URI scheme. This constant provides a standardized way to reference data URIs.
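For example, checking a URI's scheme against the constant instead of a hard-coded string:

```csharp
using System;

var uri = new Uri("data:text/plain;base64,aGVsbG8=");

// Compare against the new constant rather than the literal "data".
bool isData = uri.Scheme == Uri.UriSchemeData;
Console.WriteLine(isData);
```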

StringSyntax attribute enhancements

The StringSyntaxAttribute class now includes constants for common programming languages. These constants can be used with the StringSyntax attribute to provide better tooling support for string literals containing code in those languages.

Caching and configuration

Configuration binding

Microsoft.Extensions.Configuration adds Microsoft.Extensions.Configuration.ConfigurationIgnoreAttribute, so models can opt individual properties out of binding declaratively without relying on BindNonPublicProperties toggles or custom converters:

public sealed class AppOptions
{
    public string Endpoint { get; set; } = "";

    [ConfigurationIgnore]
    public string ComputedKey => Endpoint + ":default";
}

ConfigurationBinder now also binds an empty array to a constructor parameter instead of throwing.

PhysicalFilesWatcher no longer throws when its root directory doesn't yet exist, and InMemoryDirectoryInfo resolves .. and other relative segments consistently with the physical provider.

MemoryCache OpenTelemetry metrics

MemoryCache now emits a built-in set of OpenTelemetry (OTel)-compatible metrics without an extra adapter package. To opt in, set MemoryCacheOptions.TrackStatistics to true:

var cache = new MemoryCache(new MemoryCacheOptions
{
    TrackStatistics = true
});

The new Microsoft.Extensions.Caching.Memory.MemoryCache meter publishes four observable instruments:

  • dotnet.cache.requests (with a dotnet.cache.request.type tag that distinguishes hit from miss)
  • dotnet.cache.evictions
  • dotnet.cache.entries
  • dotnet.cache.estimated_size

Pass a System.Diagnostics.Metrics.IMeterFactory to the new MemoryCache.MemoryCache(IOptions<MemoryCacheOptions>, ILoggerFactory, IMeterFactory) constructor overload for per-instance metrics. Without one, the instruments are aggregated process-wide on a shared meter.

Networking and transport security

TLS handshake hardening

Two System.Net.Security items improve TLS (Transport Layer Security) reliability:

  • SslStream server-side handshake bounds-checking fixes in TlsFrameHelper close several edge cases that could surface as IOException on malformed ClientHello records.
  • On Linux, certificate-validation failures now surface as standard TLS alerts to the peer, matching Windows behavior. Connecting clients receive an actionable handshake error instead of a connection drop.

HTTP/2 automatic downgrade for Windows authentication

HttpClient now automatically downgrades to HTTP/1.1 when a request requires Windows authentication (NTLM/Negotiate) over HTTP/2. The HTTP/2 specification disallows the connection-bound authentication schemes that NTLM and Kerberos rely on, so these requests previously failed. With the downgrade in place, applications in mixed-authentication environments (common in enterprise intranets) work without explicit HttpRequestMessage.Version overrides.

See also