I had this basic example DAG that was working when my executor was set to SequentialExecutor, but now that I have it set to LocalExecutor it never runs. All of the tasks in the DAG are colored white on the graph view with no status, when the first one should be in the running state while it waits for the S3 file to appear. I've already cleared all of its PAST, FUTURE, and UPSTREAM history on the UI, and I have the DAG turned on, so that's not the issue. Also, the scheduler is currently running. I've tried using a Stack Overflow post on the same topic as well, but to no avail.

Here is the code I have:

```python
from airflow import DAG
from airflow.operators import SimpleHttpOperator, HttpSensor, EmailOperator, S3KeySensor
from airflow.operators.bash_operator import BashOperator

dag = DAG('myDag', default_args=default_args, schedule_interval=...)

... = BashOperator(
    bash_command='echo "Dag Ran Successfully!" > /home/ec2-user/output.txt',
    ...
)
```

And if needed, here is my airflow.cfg file (note the only lines I changed were `executor = LocalExecutor` and `sql_alchemy_conn = ...`):

```
# The home folder for airflow, default is ~/airflow

# The folder where your airflow pipelines live, most likely a
dags_folder = /home/ec2-user/airflow/dags

# The folder where airflow should store its log files
base_log_folder = /home/ec2-user/airflow/logs

# Airflow can store logs remotely in AWS S3 or Google Cloud Storage. Users
# must supply an Airflow connection id that provides access to the storage

# Specify the class that will specify the logging configuration
# This class has to be on the python classpath
# logging_config_class = my.fault_local_settings.LOGGING_CONFIG

log_format = %%(levelname)s - %%(message)s
simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s

# The executor class that airflow should use.
# SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor
```
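Since the problem hinges on exactly two config values, one quick sanity check is to parse airflow.cfg directly and confirm what the scheduler will actually read. This is a minimal sketch, not an official Airflow tool: the path and the `[core]` section match the question's older-Airflow setup (in recent Airflow releases `sql_alchemy_conn` moved to a `[database]` section), and the LocalExecutor-needs-a-non-SQLite-database check reflects the documented requirement that SQLite only supports SequentialExecutor.

```python
# Sketch: verify the two airflow.cfg settings this question changed.
# The default path and the [core] section are assumptions from the question.
from configparser import ConfigParser

def check_airflow_cfg(path="/home/ec2-user/airflow/airflow.cfg"):
    cfg = ConfigParser()
    cfg.read(path)  # silently yields an empty config if the file is missing
    executor = cfg.get("core", "executor", fallback="")
    conn = cfg.get("core", "sql_alchemy_conn", fallback="")
    problems = []
    if executor != "LocalExecutor":
        problems.append(f"executor is {executor!r}, expected 'LocalExecutor'")
    if conn.startswith("sqlite"):
        # LocalExecutor needs a backend that supports parallel connections
        # (MySQL/Postgres); SQLite only works with SequentialExecutor.
        problems.append("sql_alchemy_conn still points at SQLite")
    return problems

print(check_airflow_cfg())
```

An empty list means both values look consistent with a LocalExecutor setup; anything else is a candidate explanation for DAGs sitting with no status.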
For background, I just went through the process of configuring my Airflow setup to be capable of parallel processing by following this article and using this article. Everything seems to be working fine, in the sense that I was able to run all of the commands from the articles without any errors, warnings, or exceptions. I was able to start up the airflow webserver and airflow scheduler, and I'm able to go on the UI and view all my DAGs, but now none of my DAGs that were previously working are starting.

A second, related question: I have a simple Airflow setup where I ran the Airflow helm chart on a local kind/Kubernetes cluster with the CeleryKubernetesExecutor. The scheduler pod logs show the warning and traceback below. What does this error mean?

```
scheduler WARNING - Failed to get task '...
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/__main__.py", line 38, in main
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 51, in command
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/cli.py", line 99, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/commands/scheduler_command.py", line 75, in scheduler
    _run_scheduler_job(args=args)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/commands/scheduler_command.py", line 46, in _run_schedule
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/base_job.py", line 244, in run
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 736, in _execute
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 836, in _run_scheduler_loop
    next_event = timers.run(blocking=False)
  File "/usr/local/lib/python3.7/sched.py", line 151, in run
    action(*argument, **kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/event_scheduler.py", line 36, in repeat
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 71, in wrapper
    return func(*args, session=session, **kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1356, in _find_zombies
```