300 million rows would be a bit more than a mouthful, so you should probably copy the rows in batches of maybe five million rows at a time. The optimal batch size depends on a lot of things, though. How wide are the rows? Are there LOB columns? How much memory is in the server, etc.
It is extremely important that the batch follows the clustered index. Here is a pattern you can follow:
DECLARE @minID     int,
        @maxID     int,
        @batchsize int = 5000000;

SELECT @minID = MIN(ID) FROM src;

WHILE @minID IS NOT NULL
BEGIN
   -- Find the upper bound of this batch by walking the clustered index on ID.
   SELECT @maxID = MAX(b.ID)
   FROM   (SELECT TOP (@batchsize) ID
           FROM   src
           WHERE  ID >= @minID
             AND  (otherconditions)
           ORDER  BY ID) AS b;

   INSERT target (...)
      SELECT ....
      FROM   src
      WHERE  ID BETWEEN @minID AND @maxID
        AND  (otherconditions);

   -- Move on to the first ID beyond this batch; NULL ends the loop.
   SELECT @minID = (SELECT MIN(ID) FROM src WHERE ID > @maxID);
END
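
To make the placeholders concrete, here is a minimal sketch of the same pattern with hypothetical names (dbo.OrdersSource, dbo.OrdersTarget, clustered on OrderID); substitute your own tables, columns and conditions. Since each INSERT runs as its own autocommit transaction, no single transaction has to span all 300 million rows.

DECLARE @minID     int,
        @maxID     int,
        @batchsize int = 5000000;

SELECT @minID = MIN(OrderID) FROM dbo.OrdersSource;

WHILE @minID IS NOT NULL
BEGIN
   -- Upper bound of this batch, following the clustered index on OrderID.
   SELECT @maxID = MAX(b.OrderID)
   FROM   (SELECT TOP (@batchsize) OrderID
           FROM   dbo.OrdersSource
           WHERE  OrderID >= @minID
           ORDER  BY OrderID) AS b;

   -- Copy one contiguous range of the clustered key; commits on its own.
   INSERT dbo.OrdersTarget (OrderID, OrderDate, Amount)
      SELECT OrderID, OrderDate, Amount
      FROM   dbo.OrdersSource
      WHERE  OrderID BETWEEN @minID AND @maxID;

   SELECT @minID = (SELECT MIN(OrderID) FROM dbo.OrdersSource
                    WHERE OrderID > @maxID);
END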