For the first scenario you can use the following tools/options:
- Azure SQL Data Sync (not available for Azure SQL Managed Instance or Hyperscale, but available for the Azure SQL DTU and vCore purchasing models). You can schedule syncs to run monthly, weekly, daily, hourly, and more. It is a free tool.
- Azure Data Factory can help you upload incremental data using watermarks and "slices".
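To illustrate the watermark pattern mentioned above, here is a minimal sketch simulated with SQLite; the table, column, and watermark names are illustrative assumptions, not the exact objects Azure Data Factory would create:

```python
import sqlite3

# Simulated source and destination databases. In Azure Data Factory the
# source would be on-premises SQL Server and the destination Azure SQL.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE Orders (Id INTEGER, ModifiedDate TEXT)")
src.executemany("INSERT INTO Orders VALUES (?, ?)", [
    (1, "2023-01-01"), (2, "2023-01-15"), (3, "2023-02-01"),
])

dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE Orders (Id INTEGER, ModifiedDate TEXT)")
dst.execute("CREATE TABLE Watermark (LastLoad TEXT)")
dst.execute("INSERT INTO Watermark VALUES ('2023-01-10')")

# 1. Read the last watermark recorded at the destination.
(old_wm,) = dst.execute("SELECT LastLoad FROM Watermark").fetchone()

# 2. Copy only the rows modified after the watermark (the "slice").
rows = src.execute(
    "SELECT Id, ModifiedDate FROM Orders WHERE ModifiedDate > ?", (old_wm,)
).fetchall()
dst.executemany("INSERT INTO Orders VALUES (?, ?)", rows)

# 3. Advance the watermark to the newest value just loaded, so the next
#    run picks up only rows modified after this point.
new_wm = max(r[1] for r in rows)
dst.execute("UPDATE Watermark SET LastLoad = ?", (new_wm,))

print(len(rows), new_wm)  # 2 rows copied, watermark now 2023-02-01
```

Each scheduled run repeats the same three steps, so only the delta since the previous run crosses the wire.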
- SQL Server replication can involve Azure SQL databases: Azure SQL Database can be a push subscriber of a SQL Server publisher and distributor.
You can definitely schedule scale up/down of the Azure SQL resources depending on when ETL processes run or users consume data. If you can identify time windows when resource usage drops, you can schedule a scale-down using Azure Automation. Note that Azure SQL databases cannot be paused (the equivalent of shutting down a VM).
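The scheduling decision an Azure Automation runbook would make can be sketched as follows; this is not an Azure API call, and the tier names and the quiet window are assumptions for the example:

```python
# Pick a target service objective for an Azure SQL database based on the
# hour of day. An Azure Automation runbook would run this logic on a
# schedule and then apply the chosen tier via the Azure management API.
BUSY_TIER = "S3"   # assumed daytime tier
QUIET_TIER = "S0"  # assumed overnight tier

def target_tier(hour: int) -> str:
    """Return the tier to scale to; quiet tier overnight (22:00-06:00)."""
    if hour >= 22 or hour < 6:
        return QUIET_TIER
    return BUSY_TIER

print(target_tier(3), target_tier(14))  # S0 S3
```

The key design point is separating the "when to scale" policy from the scale operation itself, so the window can be tuned without touching the deployment code.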
For the second scenario, you will need to use Azure Synapse (formerly Azure SQL Data Warehouse) SQL pools and use Azure Data Factory to transfer those 200 GB during each ETL run. If you use columnstore indexes or in-memory tables, this can be done fairly quickly. Avoid loading incrementally into clustered indexes with Azure Data Factory. Azure Synapse can be paused and scaled as needed.
You may need to install a gateway (the self-hosted integration runtime) on-premises so that Azure Data Factory can access your on-premises data sources.