Running with gitlab-runner 17.7.0 (3153ccc6)
  on customer-churn-runner hYXFyj2eh, system ID: s_b5ec90919e6b

Preparing the "docker" executor
Using Docker executor with image apache/airflow:2.7.1-python3.11 ...
Starting service postgres:13...
Pulling docker image postgres:13 ...
Using docker image sha256:78db9e1a6ba0d59443ff621071dd7836a013c4b5066b1f7d54b96f163c03ac64 for postgres:13 with digest postgres@sha256:6161de79c8ea9fd11b5c103ef1d6ebabe0595a8dba5dd3ed14facc809e157780 ...
WARNING: Service postgres:13 is already created. Ignoring.
Waiting for services to be up and running (timeout 30 seconds)...
Pulling docker image apache/airflow:2.7.1-python3.11 ...
Using docker image sha256:4fb802b917bcf57622c5c7f12d484c17f31338587d630786df3272285a1d85df for apache/airflow:2.7.1-python3.11 with digest apache/airflow@sha256:d3fcbd26a961c3e3a88419760b693d7ecc1765a82d080512b1526c6a0411f2a9 ...

Preparing environment
Running on runner-hyxfyj2eh-project-14162-concurrent-0 via LAPTOP-96G58TFM...

Getting source from Git repository
Fetching changes with git depth set to 20...
Reinitialized existing Git repository in /builds/rayhanp1402/xops-gamimir/.git/
Checking out 40f9972f as detached HEAD (ref is main)...
Skipping Git submodules setup

Executing "step_script" stage of the job script
Using docker image sha256:4fb802b917bcf57622c5c7f12d484c17f31338587d630786df3272285a1d85df for apache/airflow:2.7.1-python3.11 with digest apache/airflow@sha256:d3fcbd26a961c3e3a88419760b693d7ecc1765a82d080512b1526c6a0411f2a9 ...

airflow command error: argument GROUP_OR_COMMAND: invalid choice: 'sh' (choose from 'cheat-sheet', 'config', 'connections', 'dag-processor', 'dags', 'db', 'info', 'jobs', 'kerberos', 'plugins', 'pools', 'providers', 'roles', 'rotate-fernet-key', 'scheduler', 'standalone', 'sync-perm', 'tasks', 'triggerer', 'users', 'variables', 'version', 'webserver'), see help above.

Usage: airflow [-h] GROUP_OR_COMMAND ...
Positional Arguments:
  GROUP_OR_COMMAND

    Groups:
      config             View configuration
      connections        Manage connections
      dags               Manage DAGs
      db                 Database operations
      jobs               Manage jobs
      pools              Manage pools
      providers          Display providers
      roles              Manage roles
      tasks              Manage tasks
      users              Manage users
      variables          Manage variables

    Commands:
      cheat-sheet        Display cheat sheet
      dag-processor      Start a standalone Dag Processor instance
      info               Show information about current Airflow and environment
      kerberos           Start a kerberos ticket renewer
      plugins            Dump information about loaded plugins
      rotate-fernet-key  Rotate encrypted connection credentials and variables
      scheduler          Start a scheduler instance
      standalone         Run an all-in-one copy of Airflow
      sync-perm          Update permissions for existing roles and optionally DAGs
      triggerer          Start a triggerer instance
      version            Show the version
      webserver          Start a Airflow webserver instance

Options:
  -h, --help             show this help message and exit

Cleaning up project directory and file based variables

ERROR: Job failed: exit code 2
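Editor's note on the failure: exit code 2 comes from the Airflow CLI itself, not from the job's commands. The apache/airflow image ships an entrypoint that forwards any unrecognized first argument to the airflow CLI, and GitLab's Docker executor starts the build container with a shell (sh) as the command, so the container effectively runs "airflow sh", which the CLI rejects as an invalid GROUP_OR_COMMAND. A common remedy is to clear the image entrypoint in .gitlab-ci.yml so the runner's shell wrapper executes the job script directly. The sketch below assumes a job defined roughly as follows; the job name and script steps are hypothetical, and only the image and service names are taken from the log above.

  # Hypothetical .gitlab-ci.yml job; adjust the name and script to the real pipeline.
  airflow_checks:
    image:
      name: apache/airflow:2.7.1-python3.11
      entrypoint: [""]     # bypass the image's airflow entrypoint so the runner's sh runs the script
    services:
      - postgres:13
    script:
      - airflow version    # illustrative commands only
      - airflow dags list

With entrypoint: [""], the container no longer interprets the runner's shell invocation as an Airflow subcommand, and the script: commands run under a normal shell inside the image. An alternative is to use a plain python base image and install Airflow in before_script, but overriding the entrypoint is the more direct fix for this particular error.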