I would create an empty replica of the big table, copy the 2 million rows you want to keep into it, drop the original table, and finally rename the replica to the original table's name.
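A minimal sketch of that copy-and-swap approach, assuming the table is `dbo.big_table` and `<condition_for_rows_to_keep>` is a placeholder for your filter (note that `SELECT ... INTO` does not copy indexes, constraints, or triggers, so those would need to be recreated on the new table):

    -- Copy only the rows to keep into a brand-new table
    SELECT *
    INTO dbo.big_table_new
    FROM dbo.big_table
    WHERE <condition_for_rows_to_keep>;

    -- Drop the original and swap the new table into its place
    DROP TABLE dbo.big_table;
    EXEC sp_rename 'dbo.big_table_new', 'big_table';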
I would also try to delete the rows in batches of 100,000 or less. You can even add a delay after each batch to reduce pressure on the transaction log and I/O.
WHILE EXISTS (SELECT 1 FROM your_table WHERE <your_condition>)
BEGIN
    DELETE TOP (100000) FROM your_table
    WHERE <your_condition>;

    WAITFOR DELAY '00:00:05';  -- optional pause between batches
END
You can also copy data from one database to another using an elastic query.
-- 1. Create a master key and a credential for connecting to the remote database
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<password>';

CREATE DATABASE SCOPED CREDENTIAL SQL_Credential
WITH IDENTITY = '<username>',
     SECRET = '<password>';

-- 2. Register the remote Azure SQL database as an external data source
CREATE EXTERNAL DATA SOURCE RemoteReferenceData
WITH
(
    TYPE = RDBMS,
    LOCATION = '<server>.database.windows.net',
    DATABASE_NAME = '<db>',
    CREDENTIAL = SQL_Credential
);
-- 3. Create an external table matching the schema of the remote source table
CREATE EXTERNAL TABLE [dbo].[source_table] (
    [Id] BIGINT NOT NULL,
    ...
)
WITH
(
    DATA_SOURCE = RemoteReferenceData
);

-- 4. Copy the remote rows into a local table
SELECT *
INTO target_table
FROM source_table;