add clarifying language to google backport readme #12416
Conversation
The README is automatically generated from this template: https://github.com/apache/airflow/blob/master/dev/provider_packages/PROVIDER_README_TEMPLATE.md.jinja2, and it will be overwritten the next time we generate it (which is most likely tomorrow). It's next to impossible to maintain those READMEs separately for the 60 providers we have, so we always generate them before releasing a new set of packages. However, the template has one place where you can add per-provider specific information that will be carried into all future READMEs -> {{ ADDITIONAL_INFO }}. It is taken from an ADDITIONAL_INFO.md file placed in the appropriate provider folder (in this case "airflow/providers/google/ADDITIONAL_INFO.md"). So this explanation should be added there in order to include it the next time we regenerate the README.
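For illustration, a minimal sketch of this generation step, assuming a plain jinja2 render; the paths and the rendering code are assumptions for this example, and only the {{ ADDITIONAL_INFO }} placeholder and the airflow/providers/google/ADDITIONAL_INFO.md location come from the comment above.

```python
# Minimal sketch (not the actual Airflow release scripts): render a provider
# README from the jinja2 template, injecting the per-provider additional info.
# Paths and surrounding logic are illustrative assumptions.
from pathlib import Path

from jinja2 import Template

TEMPLATE_PATH = Path("dev/provider_packages/PROVIDER_README_TEMPLATE.md.jinja2")
ADDITIONAL_INFO_PATH = Path("airflow/providers/google/ADDITIONAL_INFO.md")
README_PATH = Path("airflow/providers/google/README.md")

template = Template(TEMPLATE_PATH.read_text())

# Providers without an ADDITIONAL_INFO.md simply get an empty section.
additional_info = (
    ADDITIONAL_INFO_PATH.read_text() if ADDITIONAL_INFO_PATH.exists() else ""
)

# The template contains the {{ ADDITIONAL_INFO }} placeholder mentioned above.
README_PATH.write_text(template.render(ADDITIONAL_INFO=additional_info))
```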
potiuk left a comment
As explained, it should be added in the ADDITIONAL_INFO.md file.
Excellent - makes sense, I'll make that change!
plz enjoy the comic relief of me mistakenly adding and removing blank lines all over the place as evidenced by my commits 😆
CI seems to be failing because of a missing license header - I was poking around and it seems like the jinja templates have headers in them, but the generated .md files here do not. Should this new ADDITIONAL_INFO.md comply with the demands of the linter, or should it match the generated ones?
It should contain the licence. The licence comment gets removed when the content is read. You can also see the other ADDITIONAL_INFO.md files we have (for example the kubernetes one). The rule of thumb is that if a file is generated from other sources, even if it is committed, it does not have to have a licence. It's not per se documented in the ASF rules, but this is the information that we get when we run the official RAT tool from Apache.
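As a rough illustration of what "the licence comment gets removed when the content is read" could mean mechanically, assuming the licence header is an HTML comment at the top of the markdown file; this is a sketch, not the actual code in the Airflow dev scripts.

```python
# Sketch only (an assumption about the mechanism, not the actual Airflow code):
# strip a leading ASF licence header, written as an HTML comment, from the
# ADDITIONAL_INFO.md content before it is injected into the generated README.
import re
from pathlib import Path


def strip_leading_license_comment(markdown_text: str) -> str:
    """Remove an HTML-comment licence block at the very top of the file."""
    return re.sub(r"\A\s*<!--.*?-->\s*", "", markdown_text, count=1, flags=re.DOTALL)


content = Path("airflow/providers/google/ADDITIONAL_INFO.md").read_text()
print(strip_leading_license_comment(content))
```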
> ## Additional limitations
>
> The [2020.11.13 release](#release-20201113) of this provider is only usable with Apache Airflow >= 1.10.12 version due to refactorings implemented in
One comment here @leahecole -> I think it's not precise.
The package, in general, should be usable and installable with Airflow <1.10.12 - only the GKEPodOperator has some problems with <1.10.12. This is because the cncf.kubernetes provider cannot be installed on Airflow < 1.10.11 (and there were some known issues with cncf.kubernetes + 1.10.11), and the cncf.kubernetes backport provider is an optional cross-dependency for the google backport provider. So we have actually not limited the google package to >=1.10.12 - you can still install it with Airflow 1.10.9 and it will continue to work (except for the GKEPodOperator).
At least this is what we think it is. As far as I remember, you mentioned there are problems with installing the provider in a Composer environment with 1.10.9, so I am wondering where it came from in this case.
I had problems installing it in a Composer environment with 1.10.10 (1.10.11 isn't offered in Composer), but not with 1.10.12. @mik-laj mentioned that the latest version of this package was only compatible with the latest version of Composer, but earlier versions of the package are compatible with older versions of Airflow. I'm happy to have you adjust the wording as you see fit - I'm just trying to emulate the language used with the kubernetes package and the info given to me by Kamil, and get it documented centrally so no one else has to figure this out the hard way 😄
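To summarize the constraint discussed above in a concrete (hedged) form: the version boundaries come from the comments in this thread, while the importlib.metadata/packaging check below is only an illustrative assumption, not how the provider actually declares its dependencies.

```python
# Illustration only: the google backport provider itself installs on earlier
# 1.10.x Airflow releases; only its optional cncf.kubernetes cross-dependency
# (used by GKEPodOperator) needs a newer Airflow.
from importlib.metadata import version  # stdlib in Python 3.8+

from packaging.version import Version

airflow_version = Version(version("apache-airflow"))

if airflow_version < Version("1.10.11"):
    print(
        "GKEPodOperator unavailable: the cncf.kubernetes backport provider "
        "cannot be installed on Airflow < 1.10.11."
    )
else:
    print("cncf.kubernetes backport provider can be installed; GKEPodOperator should work.")
```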
The Workflow run is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.
Hey @leahecole -> sorry for not getting back to you on this, but we are super busy preparing for 2.0.0 (and we have an out-of-band 1.10.14 for which I am working on getting a consistent set of dependencies). Will come back to it next week!
No prob - I was offline for a few days due to a US holiday. If you want to set up a time to chat about this synchronously, please let me know; if we do, I can update the comments with our discussion details for transparency :)
Ah yeah, sorry for that @leahecole. It was waiting for some "freer" time, but that time never came :(
Ha! No worries. I did it mainly to clean up my PRs because I'm not focused on this right now. As the backport providers mature, I think things will become clearer and this won't be as much of an issue.
I think backports are already kinda "on their way out" - they have two months until EOL (end of March) and we are not going to release them any more after that date. I am focusing more on the regular providers for 2.0. Maybe you can take a look instead at the new providers docs we are discussing in #13767 (comment), and specifically at the proposal for how the new "provider" documentation will look here: http://gabby-cough.surge.sh/. It would be great to get some comments from someone who might - sooner or later - want to use the providers! Any comments there are most welcome @leahecole!
I'll take a look at it either later today or tomorrow! |
After discovering this the hard way, I wanted to update the docs based on my Slack convo with @potiuk and @mik-laj.