Hi @DCAK ,
Welcome to Microsoft Q&A Platform. Thanks for posting the query.
Azure Data Lake Storage is a storage service, so any modification to the files it holds must be done through ETL tools, scripts, or programming. One approach is to use the Azure ETL tool, Azure Data Factory, to build pipelines that implement this functionality. A Data Flow with a Filter transformation can filter the data dynamically and load it back into the same file. Please find below a GIF demonstrating this for a CSV file, where I filtered out rows whose "col6" value is "test1". The filter condition can be modified to suit your requirement, for example filtering based on year as in your scenario.
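If you prefer the script route mentioned above instead of a Data Factory pipeline, the same filter logic can be sketched in Python. This is a minimal illustration using the standard `csv` module on in-memory data; the column name `col6`, the value `test1`, and the sample rows are taken from the GIF example and would be replaced by your own file (read from and written back to ADLS) and your year-based condition.

```python
import csv
import io

# Sample CSV standing in for a file stored in ADLS (contents are assumed for illustration)
source = io.StringIO("col1,col6\na,test1\nb,test2\nc,test1\n")

reader = csv.DictReader(source)
# Keep only rows whose col6 is not "test1" (same condition as the dataflow filter);
# swap in a year-based check, e.g. int(row["year"]) >= 2023, as your scenario requires
rows = [row for row in reader if row["col6"] != "test1"]

# Write the filtered data back out; in practice you would overwrite the file in ADLS
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["col1", "col6"])
writer.writeheader()
writer.writerows(rows)
print(out.getvalue().strip())
```

For real ADLS files you would read and write through an SDK or mounted storage rather than `io.StringIO`, but the filtering step itself stays the same.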
Hope this helps! Please let us know if this does not align with your requirement or if you have further queries, and we will be glad to assist.