It looks like there might be a bug in https://github.com/newrelic/aws-log-ingestion/blob/master/src/function.py when reading from S3.

The stack trace looks like:
Traceback (most recent call last):
  File "/var/task/function.py", line 496, in lambda_handler
    _send_log_entry(log_line, context)
  File "/var/task/function.py", line 203, in _send_log_entry
    entry_type = _get_entry_type(log_entry)
  File "/var/task/function.py", line 309, in _get_entry_type
    if '"logGroup":"/aws/vpc/flow-logs"' in log_entry:
TypeError: a bytes-like object is required, not 'str'
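The failing line compares a str literal against log_entry, which at that point is still bytes; in Python 3 that mix raises exactly this error. A minimal sketch of the failure:

log_entry = b'{"logGroup":"/aws/vpc/flow-logs"}'  # bytes, e.g. straight from .read()
'"logGroup":"/aws/vpc/flow-logs"' in log_entry    # TypeError: a bytes-like object is required, not 'str'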
Steps to reproduce:
1. Hook up the Lambda to an S3 bucket (I'm not sure it matters which events you choose to trigger on, as long as there's data in the bucket).
2. Add a file to the bucket (see the sketch after this list).
3. Check the output of this Lambda.
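For step 2, a minimal sketch of the upload with boto3. The bucket and key names here are hypothetical; any plain (non-gzipped) object landing in the bucket that triggers the Lambda should do:

import boto3

s3 = boto3.client('s3')
# Hypothetical bucket/key: uploading an uncompressed object should
# reproduce the TypeError shown in the traceback above.
s3.put_object(Bucket='my-log-bucket', Key='app.log',
              Body=b'hello from a plain-text log file\n')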
The offending function in src/function.py is _get_s3_data. A minimal fix sketch, assuming the commented-out extension check is the intended behaviour (only gunzip keys ending in .gz) and that plain objects should be decoded as well, so the function always returns str:

import gzip
from io import BytesIO

import boto3


def _get_s3_data(bucket, key):
    '''
    This function gets a specific log file from the given S3 bucket and
    decompresses it if necessary.
    '''
    s3_client = boto3.client('s3')
    data = s3_client.get_object(Bucket=bucket, Key=key)['Body'].read()
    # Only gunzip objects whose key ends in '.gz'; anything else is
    # assumed to already be plain text.
    if key.split('.')[-1] == 'gz':
        data = gzip.GzipFile(fileobj=BytesIO(data)).read()
    # Decode in both branches so callers always receive str, not bytes.
    return data.decode('utf-8')
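If that reading is right, a non-gzipped upload currently flows through as raw bytes and fails downstream exactly as in the traceback; decoding in both branches means _get_entry_type always sees a str, so its substring checks work for compressed and uncompressed objects alike.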