SparkKubernetesOperator fails after upgrade from 2.8.1 to 2.8.2 #38017
Comments
Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise a PR to address this issue, please do so; no need to wait for approval.
cc: @hamedhsn, can you please take a look? It seems related to your change. As a workaround, @gbloisi-openaire, you can easily downgrade. If you are using a container image, you can build your own image and downgrade the providers there: https://airflow.apache.org/docs/docker-stack/build.html#example-of-customizing-airflow-provider-packages
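For the container-image route described in the linked docs, a minimal customized image that pins the provider might look like the sketch below. The pinned version is an example (7.13.0 is the version a commenter in this thread reports working); pick whichever release predates the regression in your setup:

```dockerfile
# Hypothetical customization, following the docker-stack customization docs.
FROM apache/airflow:2.8.2
# Pin the cncf-kubernetes provider to a release without the
# application_file templating regression (version is an example).
RUN pip install --no-cache-dir "apache-airflow-providers-cncf-kubernetes==7.13.0"
```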
@gbloisi-openaire can you please post the content of your application file?
Please find it below. It is an adaptation of tests/system/providers/cncf/kubernetes/example_spark_kubernetes_spark_pi.yaml. I removed the license comments after a few tries, to confirm that the error was reporting the content of this file. My DAG is likewise an adaptation of the provided tests/system/providers/cncf/kubernetes/example_spark_kubernetes.py.
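(The manifest itself was lost in this page capture. For readers without the referenced example at hand, a typical spark-pi SparkApplication manifest along those lines looks roughly like this; the image, versions, and service account are illustrative, not the poster's actual values:)

```yaml
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: apache/spark:3.5.0
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.5.0.jar
  sparkVersion: "3.5.0"
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark
  executor:
    cores: 1
    instances: 1
    memory: 512m
```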
This is caused by multiple changes, some of which are mutually exclusive or conflict with Airflow core.
If the value is expected to be a non-templated file, I would recommend using the LiteralValue wrapper instead (see #35017; this is not well documented).
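To illustrate the mechanism behind that recommendation (this is a simplified stand-in, not Airflow's actual implementation): when a templated field's value is a string ending in a registered template extension such as `.yaml`, Airflow reads that file and renders its content into the field, while a `LiteralValue`-style wrapper (the one added in #35017) opts a value out of that resolution. A minimal self-contained sketch of the two behaviors:

```python
import tempfile
from dataclasses import dataclass
from pathlib import Path

TEMPLATE_EXT = (".yaml", ".yml")  # extensions treated as template files

@dataclass
class LiteralValue:
    """Wrapper marking a value that must NOT be resolved as a template file
    (modeled on the wrapper added in apache/airflow#35017)."""
    value: str

def resolve_template_field(value):
    """Simplified stand-in for Airflow's template-field resolution."""
    if isinstance(value, LiteralValue):
        return value.value                      # passed through untouched
    if isinstance(value, str) and value.endswith(TEMPLATE_EXT):
        return Path(value).read_text()          # file CONTENT replaces the path
    return value

# Demo: a plain path ending in .yaml gets slurped; a wrapped one does not.
spec = tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False)
spec.write("apiVersion: sparkoperator.k8s.io/v1beta2\n")
spec.close()

print(resolve_template_field(spec.name))                # file content
print(resolve_template_field(LiteralValue(spec.name)))  # the path itself
```

This is why `self.application_file` ends up holding the manifest's content rather than the path: the field is templated and its value matches a template extension.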
I had the same issue described in this post after upgrading from 2.8.1 to 2.8.2. Will it be fixed in version 2.8.3?
Well, the merge happened 6 hours ago and 2.8.3 was released yesterday, so obviously no. But this is a provider fix, so you can both downgrade the provider to an older version that had no problems and upgrade it again once the fixed version is released. You do not have to wait: you can downgrade the provider now, and then when we release an RC candidate for the new providers in a few days, you can take the RC for a spin and test it. Can we count on your help testing it when the RC is out, @renanxx1?
If anyone finds this: I was able to run Airflow 2.8.3 with apache-airflow-providers-cncf-kubernetes 7.13.0. Our container does custom pip installs, and I pinned the provider requirement to that version.
I still have the same issue on Airflow 2.9.0 |
The error still persists in v2.8.3.
I created my own operator based on the previous version, and now I use it without fear that a new release will change or break the operator.
Don't forget to change
Use the operator as follows:
Or you can use the new SparkKubernetesOperator, following this example:
You can build a Spark image using the following Dockerfile:
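(The code blocks from this comment were lost in the page capture.) The general pattern it describes, pinning your own operator so provider updates cannot break it, can be sketched as follows. `StubSparkKubernetesOperator` is a stand-in so the snippet runs on its own; in a real DAG you would subclass `airflow.providers.cncf.kubernetes.operators.spark_kubernetes.SparkKubernetesOperator` and, for this particular regression, drop `application_file` from `template_fields` so the path is no longer resolved as a template:

```python
# Minimal sketch of the "pin your own operator" workaround.
# StubSparkKubernetesOperator stands in for the real provider class.

class StubSparkKubernetesOperator:
    # In the real provider, listing 'application_file' here is what makes
    # Airflow render the file's content into the attribute.
    template_fields = ("application_file", "namespace")

    def __init__(self, *, application_file, namespace="default"):
        self.application_file = application_file
        self.namespace = namespace

class MySparkKubernetesOperator(StubSparkKubernetesOperator):
    # Drop application_file from the templated fields so the path is
    # passed through verbatim instead of being read and rendered.
    template_fields = tuple(
        f for f in StubSparkKubernetesOperator.template_fields
        if f != "application_file"
    )

op = MySparkKubernetesOperator(application_file="spark-pi.yaml")
print(op.template_fields)  # -> ('namespace',)
```

Whether removing the field from `template_fields` is sufficient depends on the provider version you subclass; treat this as a starting point, not a drop-in fix.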
Apache Airflow version
2.8.2
If "Other Airflow 2 version" selected, which one?
2.8.3rc1
What happened?
I'm running a spark-pi example using the SparkKubernetesOperator:
It was running fine on 2.8.1. After upgrading to Airflow 2.8.2, I got the following error:
It looks like self.application_file ends up containing the content of the file it points to.
I suspect this was caused by changes introduced in PR-22253. I'm quite new to Airflow and Python, but my guess is that the application_file property should no longer be handled as a template property, since template representations were moved to template_body.
What you think should happen instead?
No response
How to reproduce
Given my understanding of the issue, a very simple example of the SparkKubernetesOperator using the application_file property should reproduce it.
Operating System
kind kubernetes
Versions of Apache Airflow Providers
No response
Deployment
Official Apache Airflow Helm Chart
Deployment details
No response
Anything else?
No response
Are you willing to submit PR?
Code of Conduct