Error while running docker-compose up #1

Open
aziz-ab opened this issue Mar 28, 2023 · 1 comment

aziz-ab commented Mar 28, 2023

Hi, thank you for sharing this repo.

I'm trying to run docker-compose up, but I get this error:

[root@svr-fon-app-d02 docker-airflow-pdi-02]# docker-compose up
WARNING: The PENTAHO_DI_JAVA_OPTIONS variable is not set. Defaulting to a blank string.
ERROR: The Compose file './docker-compose.yaml' is invalid because:
Invalid top-level property "x-airflow-common". Valid top-level sections for this Compose file are: version, services, networks, volumes, and extensions starting with "x-".

You might be seeing this error because you're using the wrong Compose file version. Either specify a supported version (e.g "2.2" or "3.3") and place your service definitions under the `services` key, or omit the `version` key and place your service definitions at the root of the file to use version 1.
For more on the Compose file format versions, see https://docs.docker.com/compose/compose-file/
services.airflow-scheduler.depends_on contains an invalid type, it should be an array
services.airflow-init.depends_on contains an invalid type, it should be an array
services.airflow-webserver.depends_on contains an invalid type, it should be an array
services.airflow-init.volumes contains an invalid type, it should be a string
services.airflow-scheduler.volumes contains an invalid type, it should be a string
services.airflow-webserver.volumes contains an invalid type, it should be a string
services.airflow-worker.volumes contains an invalid type, it should be a string
[root@svr-fon-app-d02 docker-airflow-pdi-02]#

Why does this happen?

Thank you a lot.

sarit-si (Owner) commented Mar 31, 2023

Hi,

With the code as it is in this repo, I ran it in Gitpod and it worked.
Check that your docker-compose version is compatible; I used Compose file format version 3, and different versions have different schemas.
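
For example, a quick sanity check of what is installed (assuming docker-compose is on your PATH; the exact output will differ on your machine):

# Old docker-compose releases may reject top-level "x-" extension fields and the
# long-form volumes/depends_on syntax used in this file.
docker-compose --version

# If the Compose V2 plugin is installed, this reports its version too:
docker compose version
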
Check that the services mentioned in the depends_on section of each service are present in the compose file.
Try the command docker-compose config to check whether the file parses; it will fail if anything is invalid or missing. You can also remove the bind mounts for the logs and plugins folders, as I did below.
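
For reference, running that check from the project root might look like this (directory name taken from your prompt above):

cd docker-airflow-pdi-02
# Prints the fully resolved compose file, or stops at the first validation error.
docker-compose config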

version: '3'

x-airflow-common:
  &airflow-common
  image: apache/airflow:2.0.1
  environment:
    &airflow-common-env
    HOST_ENV: ${HOST_ENV:-localhost}
    AIRFLOW_UID: ${AIRFLOW_UID}
    AIRFLOW_GID: ${AIRFLOW_GID}
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@airflow-database/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@airflow-database/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@airflow-broker:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
  volumes:
    # Airflow mounts
    - /var/run/docker.sock:/var/run/docker.sock
    - type: bind
      source: ./source-code/dags
      target: /opt/airflow/dags
    # - type: bind
    #   source: ./setup-airflow-pdi/plugins
    #   target: /opt/airflow/plugins
    # - type: bind
    #   source: ./setup-airflow-pdi/logs/airflow
    #   target: /opt/airflow/logs
    - type: bind
      source: ./setup-airflow-pdi/airflow.cfg
      target: /opt/airflow/airflow.cfg
    # Pdi mounts
    - type: bind
      source: ./source-code/ktrs
      target: /opt/airflow/ktrs
    - type: bind
      source: ./setup-airflow-pdi/logs/pdi
      target: /opt/airflow/data-integration/logs
    - type: bind
      source: ./setup-airflow-pdi/kettle-properties/${HOST_ENV:-localhost}-kettle.properties
      target: /opt/airflow/data-integration/.kettle/kettle.properties
    - type: bind
      source: ./setup-airflow-pdi/simple-jndi
      target: /opt/airflow/data-integration/simple-jndi


services:
# Airflow-DB
  airflow-database:
    image: postgres:13
    container_name: airflow-database
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5
    restart: always

# Airflow-messenger
  airflow-broker:
    image: redis:latest
    container_name: airflow-broker
    ports:
      - 6379:6379
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50
    restart: always

# Airflow-webserver
  airflow-webserver:
    <<: *airflow-common
    container_name: airflow-webserver
    depends_on:
      airflow-database:
        condition: service_healthy
    ports:
      - ${AIRFLOW_HOST_PORT:-8080}:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:${AIRFLOW_HOST_PORT:-8080}/health"]
      interval: 10s
      timeout: 10s
      retries: 5
    command: webserver
    restart: always

# Airflow-scheduler
  airflow-scheduler:
    <<: *airflow-common
    container_name: airflow-scheduler
    depends_on:
      airflow-broker:
        condition: service_healthy
      airflow-database:
        condition: service_healthy
    command: scheduler
    restart: always

# Airflow-DB-initialize
  airflow-init:
    <<: *airflow-common
    container_name: airflow-init
    environment:
      <<: *airflow-common-env
      _AIRFLOW_DB_UPGRADE: 'true'
      _AIRFLOW_WWW_USER_CREATE: 'true'
      _AIRFLOW_WWW_USER_USERNAME: ${AIRFLOW_ADMIN_USER:-airflow}
      _AIRFLOW_WWW_USER_PASSWORD: ${AIRFLOW_ADMIN_PASSWORD:-airflow}
      _AIRFLOW_WWW_USER_EMAIL: ${AIRFLOW_ADMIN_EMAIL:[email protected]}
    depends_on:
      airflow-database:
        condition: service_healthy
    command: version

# Airflow-worker
  airflow-worker:
    <<: *airflow-common
    build:
      context: ./setup-airflow-pdi
    image: airflow-pdi
    environment:
      <<: *airflow-common-env
      PENTAHO_DI_JAVA_OPTIONS: ${PENTAHO_DI_JAVA_OPTIONS}
    command: celery worker
    restart: always

volumes:
  postgres-db-volume:
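
If upgrading docker-compose is not an option, another workaround is to fall back to the older short syntax, which old releases also accept. This is only a sketch against the same paths as above, not something I have tested in this repo, and the plain depends_on list waits for containers to start but not for their healthchecks to pass.

# 1) In the x-airflow-common block, short "source:target" strings instead of the
#    long "type: bind" mappings (image and environment unchanged):
  volumes:
    - /var/run/docker.sock:/var/run/docker.sock
    - ./source-code/dags:/opt/airflow/dags
    - ./setup-airflow-pdi/airflow.cfg:/opt/airflow/airflow.cfg
    - ./source-code/ktrs:/opt/airflow/ktrs
    - ./setup-airflow-pdi/logs/pdi:/opt/airflow/data-integration/logs
    - ./setup-airflow-pdi/kettle-properties/${HOST_ENV:-localhost}-kettle.properties:/opt/airflow/data-integration/.kettle/kettle.properties
    - ./setup-airflow-pdi/simple-jndi:/opt/airflow/data-integration/simple-jndi

# 2) In each service, a plain list for depends_on, for example:
  airflow-scheduler:
    <<: *airflow-common
    container_name: airflow-scheduler
    depends_on:
      - airflow-broker
      - airflow-database
    command: scheduler
    restart: always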
