Displaying Airflow console logs in the Kubernetes executor container
Problem description
When using the Kubernetes executor in Airflow, for DAGs that use the PythonOperator rather than the KubernetesPodOperator, the container's console log ends at Running %s on host %s <TaskInstance: and the task logs are not redirected to stdout.
According to the Airflow documentation, a custom logging configuration class needs to be set up:
custom_log_settings
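The logging settings module itself is referenced later as config.airflow_custom_log_settings.CONFIG but is not reproduced in full here. A minimal sketch of what it could contain, assuming the handler edited in the next step is exposed as a class named TeeFileTaskHandler inside tee_file_task_handler.py (the class name and module path are assumptions):

# config/airflow_custom_log_settings.py -- minimal sketch, not the full module.
from copy import deepcopy

from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

# Route task logs through the tee handler so they go to the per-task log file
# and to the container's stdout.
CONFIG['handlers']['task']['class'] = 'config.tee_file_task_handler.TeeFileTaskHandler'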
Change the custom log settings: comment out the related code and enable writing to the console.
In tee_file_task_handler.py, edit the following code snippet.
- Comment out the following content:
if self.write_stdout:
    # Task has finished running, write entire contents of log to stdout
    # self.write_task_to_stdout()
    pass  # keep the if block valid now that the call above is commented out

# def write_task_to_stdout(self):
#     try:
#         absolute_path = self.local_base + '/' + self.log_relative_path
#         log_file = open(absolute_path, "r")
#         contents = log_file.read()
#     except IOError:
#         pass
#
#     self.stream_handler.emit(logging.makeLogRecord({
#         'msg': "*TASK_LOG*\n\n" + contents,
#         'log_path': absolute_path
#     }))
- Override the emit method so each record also goes to the stream handler:
def emit(self, record):
    if self.handler:
        self.handler.emit(record)
    if self.stream_handler:
        self.stream_handler.emit(record)
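For context, this emit override lives in a handler derived from Airflow's FileTaskHandler. A rough sketch of how such a tee handler could be assembled, assuming the class is named TeeFileTaskHandler and that AIRFLOW__STDOUT__WRITE_STDOUT is read straight from the environment (the real tee_file_task_handler.py also handles the JSON formatting options shown later):

# tee_file_task_handler.py -- rough sketch only, not the exact original file.
import logging
import os

from airflow.utils.log.file_task_handler import FileTaskHandler


class TeeFileTaskHandler(FileTaskHandler):
    def __init__(self, base_log_folder, filename_template):
        super().__init__(base_log_folder, filename_template)
        # Enabled through AIRFLOW__STDOUT__WRITE_STDOUT in the executor_config envs.
        self.write_stdout = os.environ.get('AIRFLOW__STDOUT__WRITE_STDOUT', '').lower() == 'true'
        self.stream_handler = None

    def set_context(self, ti):
        super().set_context(ti)
        self.log_relative_path = self._render_filename(ti, ti.try_number)
        if self.write_stdout:
            # Mirror every record to the container's console as it is emitted.
            self.stream_handler = logging.StreamHandler()
            self.stream_handler.setFormatter(self.formatter)
            self.stream_handler.setLevel(self.level)

    def emit(self, record):
        # Write to the per-task log file and, if enabled, to the console too.
        if self.handler:
            self.handler.emit(record)
        if self.stream_handler:
            self.stream_handler.emit(record)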
The Airflow Dockerfile
FROM apache/airflow:1.10.12-python3.6
COPY ./requirements.txt ./requirements.txt
RUN pip3 install -r requirements.txt --user
#COPY --chown=airflow:airflow ./webserver_config.py /opt/airflow/
USER root
RUN apt-get update -y

# Define en_US.
ENV LANGUAGE en_US.UTF-8
ENV LANG en_US.UTF-8
ENV LC_ALL en_US.UTF-8
ENV LC_CTYPE en_US.UTF-8
ENV LC_MESSAGES en_US.UTF-8

# Disable noisy "Handling signal" log messages:
# ENV GUNICORN_CMD_ARGS --log-level WARNING

#RUN sed -i 's#http://deb.debian.org#https://mirrors.aliyun.com#g' /etc/apt/sources.list
RUN apt-get install -y apt-transport-https

# JVM installation needs man folder
RUN mkdir -p /usr/share/man/man1

RUN rm -rf /var/lib/apt/lists/* \
    && apt-get clean \
    && apt-get update -yqq \
    && apt-get upgrade -yqq \
    && apt-get install software-properties-common -y \
    && apt-get install zip unzip -y \
    && apt-get install -yqq --no-install-recommends \
        curl \
        locales \
    && sed -i 's/^# en_US.UTF-8 UTF-8$/en_US.UTF-8 UTF-8/g' /etc/locale.gen \
    && locale-gen \
    && update-locale LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 \
#    && apt-get purge --auto-remove -yqq $buildDeps \
    && apt-get autoremove -yqq --purge \
    && apt-get clean

RUN mkdir -p /opt/airflow/config
COPY ./pymysql_config.py /opt/airflow/config/
ENV AIRFLOW__CORE__SQL_ALCHEMY_CONNECT_ARGS=pymysql_config.CONNECT_ARGS

USER airflow

COPY config ${AIRFLOW_USER_HOME_DIR}/config
ENV PYTHONPATH "${PYTHONPATH}:${AIRFLOW_USER_HOME_DIR}"
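The pymysql_config.py copied into /opt/airflow/config only needs to expose the CONNECT_ARGS object named in AIRFLOW__CORE__SQL_ALCHEMY_CONNECT_ARGS; its keys are passed through SQLAlchemy to PyMySQL. A hypothetical example (the actual arguments depend on your MySQL server):

# pymysql_config.py -- hypothetical contents; adjust to your MySQL setup
# (charset, SSL certificates, timeouts, ...).
CONNECT_ARGS = {
    'charset': 'utf8mb4',
    'connect_timeout': 30,
}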
Configure PythonOperator environment parameters
Specify the environment variables that select the custom logging class and format:
executor_config={
    "KubernetesExecutor": {
        "labels": {
            "task_id": task["task_id"]
        },
        "image": image,
        "resources": pod_resources,
        "envs": {
            "TZ": time_zone,
            "AIRFLOW__CORE__LOGGING_CONFIG_CLASS": "config.airflow_custom_log_settings.CONFIG",
            "AIRFLOW__STDOUT__JSON_FORMAT": "true",
            "AIRFLOW__STDOUT__WRITE_STDOUT": "true",
            "AIRFLOW__STDOUT__JSON_FIELDS": "log_path, message"
        }
    }
}
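As a usage sketch, the executor_config above is passed straight to the PythonOperator. A minimal, self-contained example (the DAG id, schedule and callable are placeholders, not taken from the setup above):

import logging
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def log_to_stdout(**context):
    # Records on the airflow.task logger go through the handler edited above,
    # so they land in the task log file and on the worker pod's stdout.
    logging.getLogger("airflow.task").info("hello from the kubernetes executor pod")


with DAG(dag_id="stdout_logging_example",
         start_date=datetime(2021, 1, 1),
         schedule_interval=None) as dag:
    PythonOperator(
        task_id="log_to_stdout",
        python_callable=log_to_stdout,
        provide_context=True,
        executor_config={
            "KubernetesExecutor": {
                "envs": {
                    "AIRFLOW__CORE__LOGGING_CONFIG_CLASS": "config.airflow_custom_log_settings.CONFIG",
                    "AIRFLOW__STDOUT__WRITE_STDOUT": "true",
                },
            },
        },
    )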