Describe the bug

Parameter container_log_level does not work in the TensorFlow estimator.

To reproduce

I have a TensorFlow estimator built roughly as follows:
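The original snippet was not captured in this page, so the following is a minimal sketch of the kind of estimator configuration being described. The entry point, role, instance type, and data location are illustrative placeholders, not the reporter's actual settings; the relevant detail is container_log_level=logging.WARNING.

```python
import logging

from sagemaker.tensorflow import TensorFlow

# Illustrative values only; the reporter's actual entry point, role,
# instances, and data location were not included in the issue.
estimator = TensorFlow(
    entry_point="train.py",
    role="arn:aws:iam::111122223333:role/SageMakerRole",
    instance_count=1,
    instance_type="ml.p3.2xlarge",
    framework_version="2.1",
    py_version="py3",
    # The parameter in question: only WARNING and above should reach the logs.
    container_log_level=logging.WARNING,
)

estimator.fit("s3://my-bucket/training-data")
```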
With this setting, I expect that no logs below WARNING will be included in the training job logs. However, a lot of INFO-level logs are observed, particularly INFO logs about frequent checkpoint uploads from the instance to S3, which make the entire log extremely long. I tried logging.ERROR as well, with no luck.

Expected behavior
No logs except WARNING- and ERROR-level ones should be observed.
Screenshots or logs
If applicable, add screenshots or logs to help explain your problem.
System information
A description of your system. Please provide:
SageMaker Python SDK version: 2.5.1
Framework name (eg. PyTorch) or algorithm (eg. KMeans): TensorFlow
Framework version: 2.1
Python version: 3.7
CPU or GPU: GPU
Custom Docker image (Y/N): N
With the above logic, as long as the logging level is >= logging.INFO, boto3 and s3transfer will always be set to logging.INFO, and botocore will always be set to logging.WARN.

We should pass log_level through as-is to the training toolkit's logger configuration.
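For context, here is a minimal sketch of the pattern described above and of the proposed fix. The function names and structure are assumptions for illustration, not the actual sagemaker-training-toolkit source.

```python
import logging

def configure_logger_current(log_level=logging.INFO):
    # Sketch of the behavior described above (illustrative, not the real code):
    # for any log_level >= logging.INFO, the third-party loggers are pinned to
    # fixed levels, so a user-requested WARNING/ERROR never reaches them.
    logging.basicConfig(level=log_level)
    if log_level >= logging.INFO:
        logging.getLogger("boto3").setLevel(logging.INFO)
        logging.getLogger("s3transfer").setLevel(logging.INFO)
        logging.getLogger("botocore").setLevel(logging.WARN)

def configure_logger_proposed(log_level=logging.INFO):
    # Proposed behavior: forward the requested level unchanged, so
    # container_log_level=logging.WARNING actually silences INFO messages
    # such as the checkpoint-upload logs emitted via s3transfer.
    logging.basicConfig(level=log_level)
    for name in ("boto3", "s3transfer", "botocore"):
        logging.getLogger(name).setLevel(log_level)
```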