I'm glad that you were able to resolve your issue, and thank you for posting your solution so that others experiencing the same thing can easily reference this! Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others," I'll repost your solution in case you'd like to accept the answer.
Issue:
I have multiple integrations running, but one of them extracts a zip file of about 2-3 GB and uploads each extracted file to our storage. That integration fails with Negsignal.SIGKILL. It was running fine a month ago, but now it keeps hitting this error, and it always fails on the same file.
I am using Airflow in Azure Data Factory. If I run the DAG one more time, it usually succeeds. It seems to be a resource issue, but I don't know how many resources Airflow is using or how to increase them.
Is there a way to increase the resources available to a DAG, or would adding multiple workers solve the issue? And if I add multiple workers, will I also need to configure concurrency?
The DAG code is simple; we are using a PythonOperator.
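For readers hitting the same error, here is a minimal sketch of what a PythonOperator-based DAG for this kind of workload might look like. The DAG id, task id, paths, schedule, and the `upload_to_storage` helper are illustrative assumptions, not the original code. The idea it shows is to extract and upload one archive member at a time rather than unpacking the whole 2-3 GB zip at once, which keeps worker memory usage low; Negsignal.SIGKILL generally means the task process was killed by the operating system, most often because it ran out of memory.

```python
# Minimal sketch only -- DAG id, task id, paths, and the upload helper
# are illustrative assumptions, not the original DAG code.
import zipfile
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def upload_to_storage(file_name, file_obj):
    """Placeholder for the real upload logic (e.g. an Azure Blob Storage client)."""
    raise NotImplementedError("Replace with your actual storage upload call")


def extract_and_upload(zip_path="/tmp/archive.zip", **context):
    # Stream one member at a time instead of extracting the whole archive,
    # so the full 2-3 GB never has to sit in the worker's memory at once.
    with zipfile.ZipFile(zip_path) as archive:
        for member in archive.namelist():
            with archive.open(member) as fh:
                upload_to_storage(member, fh)


with DAG(
    dag_id="zip_extract_upload",       # assumed name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",        # assumed schedule
    catchup=False,
    max_active_tasks=1,                # limit parallel tasks on the worker
) as dag:
    PythonOperator(
        task_id="extract_and_upload",
        python_callable=extract_and_upload,
    )
```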
Solution:
For now, I have solved the issue by changing the schedule time of my integration, and it seems to be working fine. There may be higher resource usage at the original time of day, which caused the integration to fail.
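In practice, the change described above amounts to moving the DAG to a quieter time of day. A minimal illustration, assuming the DAG is defined in Python and picking an arbitrary off-peak hour (03:00 UTC), would be:

```python
from datetime import datetime

from airflow import DAG

# Assumed example: run at 03:00 UTC instead of the original, busier slot.
with DAG(
    dag_id="zip_extract_upload",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 3 * * *",  # cron expression for the off-peak time
    catchup=False,
) as dag:
    ...
```

Note that rescheduling avoids contention with other integrations but does not change the worker's memory limits, so streaming the extraction as in the earlier sketch is still a reasonable safeguard if the failures return.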
If I missed anything please let me know and I'd be happy to add it to my answer, or feel free to comment below with any additional information.
Hope this helps. If this answers your query, do click "Accept Answer" and "Yes" for "Was this answer helpful". And if you have any further queries, do let us know.