
HADOOP-14239. S3A Retry Multiple S3 Key Deletion #208

Closed
wants to merge 1 commit

Conversation


@kazuyukitanimura kazuyukitanimura commented Mar 25, 2017

Hi @steveloughran @liuml07

Sorry for sending many requests.

I explained the problem here: https://issues.apache.org/jira/browse/HADOOP-14239

This pull request recursively retries deleting only the S3 keys that previously failed during multiple-object deletion, because the aws-java-sdk retry does not help here. If retries still fail, it falls back to single-key deletion.
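The retry shape described above can be sketched as follows. This is a minimal, self-contained illustration, not the actual S3A patch: `bulkDelete` stands in for a call that wraps `AmazonS3#deleteObjects` and reports the keys listed in `MultiObjectDeleteException#getErrors()`, and `singleDelete` stands in for a per-key delete; both are hypothetical names supplied here for the sketch.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.function.Function;

public class RetryBulkDelete {

    /**
     * Recursively retry a bulk delete on only the keys that failed,
     * then fall back to single-key deletes when retries are exhausted.
     * Returns the keys that could not be deleted at all.
     */
    static List<String> retryBulkDelete(
            List<String> keys,
            Function<List<String>, List<String>> bulkDelete, // returns failed keys
            Function<String, Boolean> singleDelete,          // returns success
            int retriesLeft) {
        List<String> failed = bulkDelete.apply(keys);
        if (failed.isEmpty()) {
            return failed;
        }
        if (retriesLeft > 0) {
            // Recurse on only the keys that failed in this round.
            return retryBulkDelete(failed, bulkDelete, singleDelete, retriesLeft - 1);
        }
        // Fallback: delete the remaining keys one at a time.
        List<String> stillFailed = new ArrayList<>();
        for (String key : failed) {
            if (!singleDelete.apply(key)) {
                stillFailed.add(key);
            }
        }
        return stillFailed;
    }

    public static void main(String[] args) {
        // Simulated backend: key "b" fails on the first bulk attempt only.
        Set<String> flaky = new HashSet<>(Collections.singleton("b"));
        Function<List<String>, List<String>> bulk = keys -> {
            List<String> failed = new ArrayList<>();
            for (String k : keys) {
                if (flaky.remove(k)) { // fail once, then succeed
                    failed.add(k);
                }
            }
            return failed;
        };
        Function<String, Boolean> single = key -> true;

        List<String> remaining =
            retryBulkDelete(Arrays.asList("a", "b", "c"), bulk, single, 2);
        if (!remaining.isEmpty()) {
            throw new AssertionError("all keys should eventually be deleted");
        }
        System.out.println("remaining failures: " + remaining);
    }
}
```

Recursing only on the failed subset keeps each retry request small, and the single-delete fallback bounds the recursion so a persistently failing batch still makes progress key by key.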

@kazuyukitanimura
Author

Closing PR as stated in Jira

shanthoosh pushed a commit to shanthoosh/hadoop that referenced this pull request Oct 15, 2019
- Added `close()` to the lifecycle of `OperatorImpl`s, and all `Function`s.
- Added unit tests to verify calls to `close()`

Author: vjagadish1989 <[email protected]>

Reviewers: Prateek Maheshwari <[email protected]>

Closes apache#208 from vjagadish1989/operator_functions