build(deps): bump pygithub from 1.51 to 1.55 in /infra/build/functions #5
Open

dependabot wants to merge 1 commit into master from dependabot/pip/infra/build/functions/pygithub-1.55
Conversation
Bumps [pygithub](https://github.com/pygithub/pygithub) from 1.51 to 1.55.

- [Release notes](https://github.com/pygithub/pygithub/releases)
- [Changelog](https://github.com/PyGithub/PyGithub/blob/master/doc/changes.rst)
- [Commits](PyGithub/PyGithub@v1.51...v1.55)

---

updated-dependencies:
- dependency-name: pygithub
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
dependabot bot added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Mar 24, 2022
evverx pushed a commit that referenced this pull request on Dec 18, 2022
cc @oliverchang @alan32liu after google#9100 and google#8448

After compiling locally, I can see that `./SystemSan ./target_dns -dict=vuln.dict` crashes in a few seconds with

```
===BUG DETECTED: Arbitrary domain name resolution===
===Domain resolved: .f.z===
===DNS request type: 0, class: 256===
==315== ERROR: libFuzzer: deadly signal
    #0 0x539131 in __sanitizer_print_stack_trace /src/llvm-project/compiler-rt/lib/asan/asan_stack.cpp:87:3
    #1 0x457c48 in fuzzer::PrintStackTrace() /src/llvm-project/compiler-rt/lib/fuzzer/FuzzerUtil.cpp:210:5
    #2 0x43c923 in fuzzer::Fuzzer::CrashCallback() /src/llvm-project/compiler-rt/lib/fuzzer/FuzzerLoop.cpp:233:3
    #3 0x7fa57940041f (/lib/x86_64-linux-gnu/libpthread.so.0+0x1441f) (BuildId: 7b4536f41cdaa5888408e82d0836e33dcf436466)
    #4 0x7fa5793ff7db in send (/lib/x86_64-linux-gnu/libpthread.so.0+0x137db) (BuildId: 7b4536f41cdaa5888408e82d0836e33dcf436466)
    #5 0x503ba4 in __interceptor_send /src/llvm-project/compiler-rt/lib/asan/../sanitizer_common/sanitizer_common_interceptors.inc:6802:17
    #6 0x7fa578abf462 (/lib/x86_64-linux-gnu/libresolv.so.2+0xb462) (BuildId: 4519041bde5b859c55798ac0745b0b6199cb7d94)
    #7 0x7fa578abbc43 in __res_context_query (/lib/x86_64-linux-gnu/libresolv.so.2+0x7c43) (BuildId: 4519041bde5b859c55798ac0745b0b6199cb7d94)
    #8 0x7fa578abc8ed in __res_context_search (/lib/x86_64-linux-gnu/libresolv.so.2+0x88ed) (BuildId: 4519041bde5b859c55798ac0745b0b6199cb7d94)
    #9 0x7fa578ad2cc1 (/lib/x86_64-linux-gnu/libnss_dns.so.2+0x2cc1) (BuildId: 3fac4ec397ba8e8938fe298f103113f315465130)
    #10 0x7fa578ad2e8b in _nss_dns_gethostbyname3_r (/lib/x86_64-linux-gnu/libnss_dns.so.2+0x2e8b) (BuildId: 3fac4ec397ba8e8938fe298f103113f315465130)
    #11 0x7fa578ad2f41 in _nss_dns_gethostbyname2_r (/lib/x86_64-linux-gnu/libnss_dns.so.2+0x2f41) (BuildId: 3fac4ec397ba8e8938fe298f103113f315465130)
    #12 0x7fa5792fdc9d in gethostbyname2_r (/lib/x86_64-linux-gnu/libc.so.6+0x130c9d) (BuildId: 1878e6b475720c7c51969e69ab2d276fae6d1dee)
    #13 0x7fa5792d179e (/lib/x86_64-linux-gnu/libc.so.6+0x10479e) (BuildId: 1878e6b475720c7c51969e69ab2d276fae6d1dee)
    #14 0x7fa5792d2f58 in getaddrinfo (/lib/x86_64-linux-gnu/libc.so.6+0x105f58) (BuildId: 1878e6b475720c7c51969e69ab2d276fae6d1dee)
    #15 0x4d93ac in getaddrinfo /src/llvm-project/compiler-rt/lib/asan/../sanitizer_common/sanitizer_common_interceptors.inc:2667:13
    #16 0x56c8d9 in LLVMFuzzerTestOneInput /out/SystemSan/target_dns.cpp:35:11
    #17 0x43dec3 in fuzzer::Fuzzer::ExecuteCallback(unsigned char const*, unsigned long) /src/llvm-project/compiler-rt/lib/fuzzer/FuzzerLoop.cpp:611:15
    #18 0x43d6aa in fuzzer::Fuzzer::RunOne(unsigned char const*, unsigned long, bool, fuzzer::InputInfo*, bool, bool*) /src/llvm-project/compiler-rt/lib/fuzzer/FuzzerLoop.cpp:514:3
    #19 0x43ed79 in fuzzer::Fuzzer::MutateAndTestOne() /src/llvm-project/compiler-rt/lib/fuzzer/FuzzerLoop.cpp:757:19
    #20 0x43fa45 in fuzzer::Fuzzer::Loop(std::__Fuzzer::vector<fuzzer::SizedFile, std::__Fuzzer::allocator<fuzzer::SizedFile> >&) /src/llvm-project/compiler-rt/lib/fuzzer/FuzzerLoop.cpp:895:5
    #21 0x42edaf in fuzzer::FuzzerDriver(int*, char***, int (*)(unsigned char const*, unsigned long)) /src/llvm-project/compiler-rt/lib/fuzzer/FuzzerDriver.cpp:912:6
    #22 0x458402 in main /src/llvm-project/compiler-rt/lib/fuzzer/FuzzerMain.cpp:20:10
    #23 0x7fa5791f1082 in __libc_start_main (/lib/x86_64-linux-gnu/libc.so.6+0x24082) (BuildId: 1878e6b475720c7c51969e69ab2d276fae6d1dee)
    #24 0x41f7ed in _start (/out/SystemSan/target_dns+0x41f7ed)
NOTE: libFuzzer has rudimentary signal handlers.
      Combine libFuzzer with AddressSanitizer or similar for better crash reports.
SUMMARY: libFuzzer: deadly signal
MS: 2 CrossOver-ManualDict- DE: "f.z"-; base unit: ac3478d69a3c81fa62e60f5c3696165a4e5e6ac4
0x66,0x2e,0x7a,
f.z
artifact_prefix='./'; Test unit written to ./crash-926813b2d6adde373f96a10594a5314951588384
Base64: Zi56
```

You can also try

```
echo -n f.z > toto
./SystemSan ./target_dns toto
```

Co-authored-by: Oliver Chang <[email protected]>
Co-authored-by: jonathanmetzman <[email protected]>
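For context, here is a minimal Python sketch of the pattern SystemSan is flagging: fuzzer-controlled bytes reaching name resolution. The actual target is the C++ file `target_dns.cpp` built into `SystemSan`; this Atheris-style analogue, its structure, and the port are illustrative assumptions, not code from this commit.

```python
# Hedged Python analogue of target_dns.cpp (the real target is C++); it feeds
# fuzzer-controlled bytes into name resolution, the behaviour SystemSan reports
# as "Arbitrary domain name resolution".
import socket
import sys

import atheris


def TestOneInput(data: bytes) -> None:
    # Treat the raw input as a host name, analogous to the getaddrinfo() call
    # at target_dns.cpp:35 in the stack trace above.
    hostname = data.decode("ascii", errors="ignore").strip()
    if not hostname:
        return
    try:
        socket.getaddrinfo(hostname, 80)  # the resolution attempt is the "bug"
    except socket.gaierror:
        pass  # failed lookups are expected for most inputs


if __name__ == "__main__":
    atheris.Setup(sys.argv, TestOneInput)
    atheris.Fuzz()
```

With a dictionary entry like "f.z", such a target reaches the resolver almost immediately, which matches the few-seconds crash reported above.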
evverx pushed a commit that referenced this pull request on Apr 22, 2023
We've still got an issue with crashes on the urllib3 requests test that uses the mock HTTP server. The fix in google#9958 to handle port mapping errors didn't resolve it, and I have a feeling there's an ordering issue. Looking at the error logs (https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=56500#c2), we appear to be throwing exceptions before the coverage run completes.

```
=== Uncaught Python exception: ===
MaxRetryError: HTTPConnectionPool(host='localhost', port=8011): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f4cdf33d1f0>: Failed to establish a new connection: [Errno 101] Network is unreachable'))
Traceback (most recent call last):
  File "fuzz_requests.py", line 109, in TestOneInput
  File "urllib3/_request_methods.py", line 118, in request
  File "urllib3/_request_methods.py", line 217, in request_encode_body
  File "urllib3/poolmanager.py", line 433, in urlopen
  File "urllib3/connectionpool.py", line 874, in urlopen
  File "urllib3/connectionpool.py", line 874, in urlopen
  File "urllib3/connectionpool.py", line 874, in urlopen
  File "urllib3/connectionpool.py", line 844, in urlopen
  File "urllib3/util/retry.py", line 505, in increment
MaxRetryError: HTTPConnectionPool(host='localhost', port=8011): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f4cdf33d1f0>: Failed to establish a new connection: [Errno 101] Network is unreachable'))

INFO: Instrumenting 3854 functions...
INFO: Instrumentation complete.
==10674== ERROR: libFuzzer: fuzz target exited
    #0 0x7f4ce0bac694 in __sanitizer_print_stack_trace /src/llvm-project/compiler-rt/lib/ubsan/ubsan_diag_standalone.cpp:31:3
    #1 0x7f4ce0b2df48 in fuzzer::PrintStackTrace() /src/llvm-project/compiler-rt/lib/fuzzer/FuzzerUtil.cpp:210:5
    #2 0x7f4ce0b12cdc in fuzzer::Fuzzer::ExitCallback() /src/llvm-project/compiler-rt/lib/fuzzer/FuzzerLoop.cpp:250:3
    #3 0x7f4ce09068a6 in __run_exit_handlers /build/glibc-SzIz7B/glibc-2.31/stdlib/exit.c:108:8
    #4 0x7f4ce0906a5f in exit /build/glibc-SzIz7B/glibc-2.31/stdlib/exit.c:139:3
    #5 0x7f4ce03b2c78 in libpython3.8.so.1.0
    #6 0x7f4ce03b76cf in libpython3.8.so.1.0
    #7 0x403ad2 in fuzz_requests.pkg
    #8 0x403e67 in fuzz_requests.pkg
    #9 0x7f4ce08e4082 in __libc_start_main /build/glibc-SzIz7B/glibc-2.31/csu/libc-start.c:308:16
    #10 0x40249d in fuzz_requests.pkg

SUMMARY: libFuzzer: fuzz target exited
```

This is an attempted fix inspired by the requests [fuzz_server.py](https://github.com/google/oss-fuzz/blob/master/projects/requests/fuzz_server.py), where the lifecycle of the test thread is managed within the server. Since the web server is created at the start of `TestOneInput`, I don't expect any timing or thread-initialisation issues.
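As a rough illustration of the approach described (not the actual `fuzz_requests.py` diff; the handler class, port selection, and retry count are assumptions), the sketch below keeps the mock server's lifecycle inside the harness so no request can precede or outlive it, and connection failures are swallowed rather than escaping as uncaught exceptions:

```python
# A minimal sketch, assuming a stdlib mock server; the real fuzz_requests.py /
# fuzz_server.py code may differ. The point is the lifecycle: the server is
# started before any request and shut down before the iteration ends.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

import urllib3


class _SilentHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # keep fuzzing output quiet


def TestOneInput(data: bytes) -> None:
    # Bind to port 0 so the OS picks a free port instead of assuming 8011 is usable.
    server = HTTPServer(("localhost", 0), _SilentHandler)
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    try:
        url = f"http://localhost:{server.server_address[1]}/"
        urllib3.PoolManager(retries=1).request("GET", url)
    except urllib3.exceptions.MaxRetryError:
        pass  # connection failures are uninteresting, not crashes
    finally:
        server.shutdown()
        thread.join()
        server.server_close()
```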
evverx pushed a commit that referenced this pull request on Sep 19, 2023
When running fuzz targets we check for validity by checking if `LLVMFuzzerTestOneInput` exists in the target file:

https://github.com/google/oss-fuzz/blob/8d1f1306fda3464fb3a7ec8b4227308d315ed495/infra/base-images/base-runner/coverage#L307-L314

However, this is not done in the post-processing step of the coverage utility:

https://github.com/google/oss-fuzz/blob/8d1f1306fda3464fb3a7ec8b4227308d315ed495/infra/base-images/base-runner/coverage#L415-L418

This causes coverage build issues, e.g. https://oss-fuzz-build-logs.storage.googleapis.com/log-b8d4899d-ecc3-498c-8485-2e88d162dc57.txt

```
Step #5: [INFO] Loading execution data file /workspace/out/libfuzzer-coverage-x86_64/dumps/OpenSSHConfigFuzzer.exec.
Step #5: [INFO] Analyzing 3 classes.
Step #5: [INFO] Loading execution data file /workspace/out/libfuzzer-coverage-x86_64/dumps/OpenSSHConfigFuzzer.exec.
Step #5: [INFO] Writing execution data to /workspace/out/libfuzzer-coverage-x86_64/dumps/jacoco.merged.exec.
Step #5: cp: cannot stat '/workspace/out/libfuzzer-coverage-x86_64/dumps/jsch-fuzzer-0.2.10-SNAPSHOT.jar_classes/*': No such file or directory
Step #5: ********************************************************************************
```

Signed-off-by: David Korczynski <[email protected]>
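For illustration, here is a Python rendering of the same validity check. The real coverage script is bash and greps the target file; the helper names and the .exec-to-binary mapping below are assumptions, not the actual change.

```python
# A hedged sketch of the idea: skip dump files whose corresponding target is not
# a real fuzz target, mirroring the LLVMFuzzerTestOneInput check applied when the
# targets are run. Function names and the .exec naming convention are assumptions.
import os


def is_fuzz_target(path: str) -> bool:
    """A binary (or wrapper script) counts as a fuzz target if it embeds the
    LLVMFuzzerTestOneInput symbol name, as the bash check at coverage#L307-L314 does."""
    if not os.path.isfile(path) or not os.access(path, os.X_OK):
        return False
    with open(path, "rb") as f:
        return b"LLVMFuzzerTestOneInput" in f.read()


def exec_dumps_to_process(dumps_dir: str, out_dir: str) -> list:
    """Filter JaCoCo .exec dumps so post-processing only sees real fuzz targets."""
    keep = []
    for name in os.listdir(dumps_dir):
        if not name.endswith(".exec"):
            continue
        target = os.path.join(out_dir, name[:-len(".exec")])
        if is_fuzz_target(target):
            keep.append(os.path.join(dumps_dir, name))
    return keep
```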
evverx pushed a commit that referenced this pull request on Oct 19, 2023
The coverage builder complains that the Rust stdlib sources are missing:

```
Step #5: error: /workspace/out/libfuzzer-coverage-x86_64/rustc/187b8131d4f760f856b214fce34534903276f2ef/library/core/src/panic.rs: No such file or directory
Step #5: warning: The file '/rustc/187b8131d4f760f856b214fce34534903276f2ef/library/core/src/panic.rs' isn't covered.
Step #5: [2023-10-04 06:50:42,764 DEBUG] Finished generating per-file code coverage summary.
Step #5: [2023-10-04 06:50:42,764 DEBUG] Generating file view html index file as: "/workspace/out/libfuzzer-coverage-x86_64/report/linux/file_view_index.html".
Step #5: Traceback (most recent call last):
Step #5:   File "/opt/code_coverage/coverage_utils.py", line 829, in <module>
Step #5:     sys.exit(Main())
Step #5:   File "/opt/code_coverage/coverage_utils.py", line 823, in Main
Step #5:     return _CmdPostProcess(args)
Step #5:   File "/opt/code_coverage/coverage_utils.py", line 780, in _CmdPostProcess
Step #5:     processor.PrepareHtmlReport()
Step #5:   File "/opt/code_coverage/coverage_utils.py", line 577, in PrepareHtmlReport
Step #5:     self.GenerateFileViewHtmlIndexFile(per_file_coverage_summary,
Step #5:   File "/opt/code_coverage/coverage_utils.py", line 450, in GenerateFileViewHtmlIndexFile
Step #5:     self.GetCoverageHtmlReportPathForFile(file_path),
Step #5:   File "/opt/code_coverage/coverage_utils.py", line 422, in GetCoverageHtmlReportPathForFile
Step #5:     assert os.path.isfile(
Step #5: AssertionError: "/rustc/187b8131d4f760f856b214fce34534903276f2ef/library/core/src/panic.rs" is not a file.
```

I think the sources would be copied in base-builder if we declare cras as a Rust project:

https://github.com/google/oss-fuzz/blob/d514fac92686c656633aa8549fd6f239c964b2bc/infra/base-images/base-builder/compile#L226-L231

Fixes: https://crbug.com/oss-fuzz/62974
evverx pushed a commit that referenced this pull request on Dec 4, 2023
… count (google#10277)

The current number of parallel fuzzers running is set to the number of available CPUs. This is causing issues in Tensorflow:

```
Step #5: error: Could not load coverage information
Step #5: error: No such file or directory: Could not read profile data!
Step #5: /usr/local/bin/coverage: line 75:  4501 Killed  llvm-profdata merge -j=1 -sparse $profraw_file_mask -o $profdata_file
...
Step #5: error: decode_compressed_fuzz: Failed to load coverage: No such file or directory
Step #5: error: Could not load coverage information
Step #5: error: No such file or directory: Could not read profile data!
Step #5: /usr/local/bin/coverage: line 75:  4873 Killed  lvm-cov show -instr-profile=$profdata_file -object=$target -line-coverage-gt=0 $shared_libraries $BRANCH_COV_ARGS $LLVM_COV_COMMON_ARGS > ${TEXTCOV_REPORT_DIR}/$target.covreport
Step #5: /usr/local/bin/coverage: line 75:  4897 Killed  llvm-profdata merge -j=1 -sparse $profraw_file_mask -o $profdata_file
...
Step #5: error: saved_model_fuzz: Failed to load coverage: No such file or directory
Step #5: error: Could not load coverage information
Step #5: error: No such file or directory: Could not read profile data!
Step #5: /usr/local/bin/coverage: line 75:  4638 Killed  llvm-profdata merge -j=1 -sparse $profraw_file_mask -o $profdata_file
Step #5: [2023-05-08 11:57:05,246 INFO] Finding shared libraries for targets (if any).
...
Step #5: [2023-05-08 11:57:09,911 INFO] Finished finding shared libraries for targets.
Step #5: /usr/local/bin/coverage: line 75:  4276 Killed  llvm-cov expor -summary-only -instr-profile=$profdata_file -object=$target $shared_libraries $LLVM_COV_COMMON_ARGS > $FUZZER_STATS_DIR/$target.json
Step #5: /usr/local/bin/coverage: line 75:  5450 Killed  llvm-profdata merge -j=1 -sparse $profraw_file_mask -o $profdata_file
Step #5: [2023-05-08 11:57:40,282 INFO] Finding shared libraries for targets (if any).
Step #5: [2023-05-08 11:57:40,323 INFO] Finished finding shared libraries for targets.
Step #5: error: end_to_end_fuzz: Failed to load coverage: No such file or directory
Step #5: error: Could not load coverage information
Step #5: error: No such file or directory: Could not read profile data!
```

[log](https://oss-fuzz-build-logs.storage.googleapis.com/log-050f4040-5009-4a23-81c4-9093922b4ffb.txt) (don't open it in a browser but `wget`/`curl` it, as it's quite a large file and will probably annoy the browser).

I assume this is because the fuzzers take up lots of memory. A Tensorflow fuzzer can be ~3GB and there are ~50 fuzzers in Tensorflow, so I think the artifacts read by `llvm-profdata merge` eat up memory, which consequently starts to crash processes on the system. I imagine this could happen for other projects with many large fuzzers as well.

Signed-off-by: David Korczynski <[email protected]>
Co-authored-by: Oliver Chang <[email protected]>
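The actual change is in the bash coverage script, but the underlying idea can be sketched in Python: bound the parallelism by available memory as well as CPU count. The ~3 GB per-target figure comes from the description above and is only an estimate.

```python
# A rough sketch of a memory-aware parallelism cap; the real coverage script is
# bash and its exact heuristic may differ. per_target_gb is an assumed estimate.
import multiprocessing
import os


def parallel_fuzzer_count(per_target_gb: float = 3.0) -> int:
    cpus = multiprocessing.cpu_count()
    # Total physical memory in GiB (Linux): pages * page size.
    total_gb = os.sysconf("SC_PHYS_PAGES") * os.sysconf("SC_PAGE_SIZE") / 2**30
    by_memory = max(1, int(total_gb // per_target_gb))
    # Never exceed the CPU count, but also never schedule more targets than memory allows.
    return max(1, min(cpus, by_memory))
```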
evverx pushed a commit that referenced this pull request on Feb 1, 2024
Base PR: apache/brpc#2420

NOTE: I can't enable memory sanitizer due to

```log
BAD BUILD: /tmp/not-out/tmpmptlk01q/fuzz_esp seems to have either startup crash or exit:
/tmp/not-out/tmpmptlk01q/fuzz_esp -rss_limit_mb=2560 -timeout=25 -seed=1337 -runs=4 < /dev/null
Uninitialized bytes in MemcmpInterceptorCommon at offset 15 inside [0x7030000000f0, 19)
==428==WARNING: MemorySanitizer: use-of-uninitialized-value
    #0 0x682b90 in __interceptor_memcmp /src/llvm-project/compiler-rt/lib/msan/../sanitizer_common/sanitizer_common_interceptors.inc:892:10
    #1 0x7fa8ef4cf62a in google::protobuf::SimpleDescriptorDatabase::DescriptorIndex<std::pair<void const*, int> >::FindLastLessOrEqual(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) (/tmp/not-out/tmpmptlk01q/lib/libprotobuf.so.17+0x15062a) (BuildId: 64affeb0f489ae4bcea211ed99e1eca15ff97d68)
    #2 0x7fa8ef4d259f in google::protobuf::SimpleDescriptorDatabase::DescriptorIndex<std::pair<void const*, int> >::AddSymbol(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::pair<void const*, int>) (/tmp/not-out/tmpmptlk01q/lib/libprotobuf.so.17+0x15359f) (BuildId: 64affeb0f489ae4bcea211ed99e1eca15ff97d68)
    #3 0x7fa8ef4d2a15 in google::protobuf::SimpleDescriptorDatabase::DescriptorIndex<std::pair<void const*, int> >::AddFile(google::protobuf::FileDescriptorProto const&, std::pair<void const*, int>) (/tmp/not-out/tmpmptlk01q/lib/libprotobuf.so.17+0x153a15) (BuildId: 64affeb0f489ae4bcea211ed99e1eca15ff97d68)
    #4 0x7fa8ef4cebef in google::protobuf::EncodedDescriptorDatabase::Add(void const*, int) (/tmp/not-out/tmpmptlk01q/lib/libprotobuf.so.17+0x14fbef) (BuildId: 64affeb0f489ae4bcea211ed99e1eca15ff97d68)
    #5 0x7fa8ef499f43 in google::protobuf::DescriptorPool::InternalAddGeneratedFile(void const*, int) (/tmp/not-out/tmpmptlk01q/lib/libprotobuf.so.17+0x11af43) (BuildId: 64affeb0f489ae4bcea211ed99e1eca15ff97d68)
    #6 0x7fa8ef49281d in protobuf_google_2fprotobuf_2fapi_2eproto::AddDescriptorsImpl() (/tmp/not-out/tmpmptlk01q/lib/libprotobuf.so.17+0x11381d) (BuildId: 64affeb0f489ae4bcea211ed99e1eca15ff97d68)
```

Signed-off-by: Arjun Singh <[email protected]>
evverx pushed a commit that referenced this pull request on Mar 7, 2024
This should hopefully resolve the recent build issue:

```
Step #5: warning: /workspace/out/libfuzzer-coverage-x86_64/dumps/parser.14845295624977050394_0.profraw: unsupported instrumentation profile format version
Step #5: error: no profile can be merged
Step #5: [2024-02-28 06:09:14,766 INFO] Finding shared libraries for targets (if any).
Step #5: [2024-02-28 06:09:14,775 INFO] Finished finding shared libraries for targets.
Step #5: error: parser: Failed to load coverage: No such file or directory
Step #5: error: Could not load coverage information
Step #5: error: No such file or directory: Could not read profile data!
Step #5: Traceback (most recent call last):
Step #5:   File "/usr/local/bin/profraw_update.py", line 129, in <module>
Step #5:     sys.exit(main())
Step #5:   File "/usr/local/bin/profraw_update.py", line 120, in main
Step #5:     profraw_latest = upgrade(profraw_base, sect_prf_cnts, sect_prf_data)
Step #5:   File "/usr/local/bin/profraw_update.py", line 87, in upgrade
Step #5:     relativize_address(data, offset + 16, dataref, sect_prf_cnts, sect_prf_data)
Step #5:   File "/usr/local/bin/profraw_update.py", line 35, in relativize_address
Step #5:     value = struct.unpack('Q', data[offset:offset + 8])[0]
Step #5: struct.error: unpack requires a buffer of 8 bytes
```
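A minimal sketch of the kind of guard that would avoid this traceback, assuming the error comes from reading an 8-byte field past the end of a truncated or newer-format profile; the actual fix in `profraw_update.py` may differ.

```python
# Guarded read, instead of letting struct.unpack raise
# "unpack requires a buffer of 8 bytes" on short .profraw data.
import struct


def read_u64(data: bytes, offset: int):
    """Return the native-order u64 at `offset`, or None when the buffer is too short."""
    if offset + 8 > len(data):
        return None
    return struct.unpack("Q", data[offset:offset + 8])[0]
```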
evverx pushed a commit that referenced this pull request on Oct 8, 2024
## Description

This will make it easier to debug coverage failures that are not reproducible locally. The failure that I am trying to debug:

- https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=62231
- https://oss-fuzz-build-logs.storage.googleapis.com/log-c420cf0c-f073-4c42-b75c-422971ef272e.txt

```
Step #5: Already have image (with digest): gcr.io/oss-fuzz-base/base-runner
Step #5: Entering python fuzzing
Step #5: Error happened getting coverage of fuzz_parse
Step #5: This is likely because Atheris did not exit gracefully
```

Similar log data is displayed in other blocks:

- https://github.com/google/oss-fuzz/blob/f7165902492d5cff5ee23c018875395061a3bd2b/infra/base-images/base-runner/coverage#L101-L105
- https://github.com/google/oss-fuzz/blob/f7165902492d5cff5ee23c018875395061a3bd2b/infra/base-images/base-runner/coverage#L149-L153
- https://github.com/google/oss-fuzz/blob/f7165902492d5cff5ee23c018875395061a3bd2b/infra/base-images/base-runner/coverage#L206-L210
- https://github.com/google/oss-fuzz/blob/f7165902492d5cff5ee23c018875395061a3bd2b/infra/base-images/base-runner/coverage#L255-L260

---

This PR is a continuation of google#12405 with a renamed branch to avoid trial-build errors:

```
ERROR: (gcloud.builds.submit) INVALID_ARGUMENT: invalid build: invalid build tag "testing-cm/display-coverage-log": must match format "^[\\w][\\w.-]{0,127}$"
```

Co-authored-by: Vitor Guidi <[email protected]>
Bumps pygithub from 1.51 to 1.55.

Release notes

Sourced from pygithub's releases.

... (truncated)

Changelog

Sourced from pygithub's changelog.

... (truncated)

Commits

- 3153833 Publish version 1.55
- 27ba783 Do not import from unpackaged paths in typing (#1926)
- 4faff23 Implement hash for CompletableGithubObject (#1922)
- e416810 Use property decorator to improve typing compatibility (#1925)
- 54b6a97 Fix :rtype: directive (#1927)
- babcbcd Update most URLs to docs.github.com (#1896)
- ad124ef key_id could be int on Github Enterprise (#1894)
- 2c77cfa Adjust to Github API changes regarding emails (#1890)
- 5aab6f5 Tighten asserts for new Permission tests (#1893)
- 7687961 Adding attributes "maintain" and "triage" to class "Permissions" (#1810)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)