What you are describing above will directly involve your data engineering pipelines.
Solution A:
If, for example, your on-premises SQL DB has its data stored in AWS S3, you can use Python or Data Factory to copy the files from your AWS S3 bucket to your Data Lake location in Azure. Since you're copying the files from A to B, the data will remain the same, and you'll then just load the data from your DL storage into your Azure SQL Database using a Data Factory pipeline. Here is a link to their tutorial on how to do it: https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-dot-net
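As a rough sketch of that copy step in Python (the bucket, account, and container names below are placeholders, and it assumes the `boto3` and `azure-storage-blob` packages; Data Factory's built-in S3 connector does the same thing without code):

```python
def adls_blob_name(s3_key: str, prefix: str = "landing") -> str:
    """Map an S3 object key to a destination blob path (the naming is up to you)."""
    return f"{prefix}/{s3_key}"

def copy_s3_to_adls(bucket: str, account_url: str, container: str, credential) -> int:
    """Stream every object in an S3 bucket into an Azure storage container.
    Assumes boto3 and azure-storage-blob are installed and credentials configured."""
    import boto3  # third-party: pip install boto3
    from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

    s3 = boto3.client("s3")
    blobs = BlobServiceClient(account_url=account_url, credential=credential)
    copied = 0
    # Page through the bucket so large buckets aren't listed in one call
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"]
            blobs.get_blob_client(container, adls_blob_name(obj["Key"])).upload_blob(
                body, overwrite=True
            )
            copied += 1
    return copied
```

Since the files are copied byte-for-byte, nothing about the data itself changes in transit.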
The problem with this solution is that you will have to recreate all your tables in the new DB, so depending on how large the database is, it could end up being a long job.
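Much of that table recreation can be scripted rather than done by hand. A minimal sketch, assuming you pull column metadata (e.g. from `INFORMATION_SCHEMA.COLUMNS` on the old DB) into `(name, sql_type, nullable)` tuples; the table and column names below are made up for illustration:

```python
def create_table_ddl(table: str, columns) -> str:
    """Build a CREATE TABLE statement from (name, sql_type, nullable) tuples,
    e.g. rows queried from INFORMATION_SCHEMA.COLUMNS on the source database."""
    col_lines = ",\n    ".join(
        f"[{name}] {sql_type}" + ("" if nullable else " NOT NULL")
        for name, sql_type, nullable in columns
    )
    return f"CREATE TABLE {table} (\n    {col_lines}\n);"

# Example with hypothetical columns:
ddl = create_table_ddl(
    "dbo.Orders",
    [("OrderId", "INT", False), ("Amount", "DECIMAL(10,2)", True)],
)
```

You would run the generated statements against the new Azure SQL DB before kicking off the Data Factory load (this sketch doesn't cover keys, indexes, or constraints, which need the same treatment).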
Solution B:
Use the DMA (Data Migration Assistant) or DMS (Data Migration Service). This is essentially a GUI solution, and provided you have full access to your DB(s), you should be able to set up the necessary flow to migrate directly to an Azure SQL instance. The drawback is that the data won't be stored in your DL Storage account. If you have SSIS packages synced up to the old DB, you'll need to work with everyone involved to make sure all connections are redirected towards the new instance before turning off the old ones. Here is some documentation on this: https://learn.microsoft.com/en-us/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-guide?view=azuresql
Another option, which I don't recommend, is lift and shift; all that does is move everything from your current on-prem SQL instance to one hosted inside a VM: https://learn.microsoft.com/en-us/azure/azure-sql/migration-guides/virtual-machines/sql-server-to-sql-on-azure-vm-migration-overview?view=azuresql
So there are ways of doing it; it just depends on the method you want to use.
Hopefully this helps you out.