Kibana 7.12 reporting - blank csv #100470

Closed
laurentiubanica opened this issue May 24, 2021 · 16 comments
Labels
bug - Fixes for quality problems that affect the customer experience
(Deprecated) Feature:Reporting - Use Reporting:Screenshot, Reporting:CSV, or Reporting:Framework instead
feedback_needed
impact:high - Addressing this issue will have a high level of impact on the quality/strength of our product.
loe:small - Small Level of Effort

Comments

@laurentiubanica

laurentiubanica commented May 24, 2021

Hi,

We are running Kibana 7.12.0 on docker with an Nginx reverse proxy in front of it.
The version of the Elastic cluster is also 7.12.0.
After I save the search in the Discover section and use Share, the status of the report shows as completed. However, the downloaded .csv contains only the names of the fields on the first row. The data shown in Discover is not in the .csv.

These are the reporting settings in kibana.yml:

xpack.reporting.enabled: true
xpack.reporting.queue.timeout: 600000
xpack.reporting.kibanaServer.port: 443
xpack.reporting.kibanaServer.protocol: https
xpack.reporting.kibanaServer.hostname: 0.0.0.0
xpack.reporting.encryptionKey: "..."
xpack.reporting.csv.maxSizeBytes: 1048576000
xpack.reporting.csv.scroll.size: 1048576000
xpack.reporting.csv.scroll.duration: 10m

Can you help us identify the issue?

Thank you

@laurentiubanica laurentiubanica added the bug label May 24, 2021
@botelastic botelastic bot added the needs-team label May 24, 2021
@lukeelmers lukeelmers added the (Deprecated) Feature:Reporting, Team:AppServices, and triage_needed labels May 26, 2021
@elasticmachine
Contributor

Pinging @elastic/kibana-app-services (Team:AppServices)

@botelastic botelastic bot removed the needs-team label May 26, 2021
@tsullivan
Member

Hi, is it possible that there is a conflict in the mapping of the data fields? For example, if multiple mappings apply to the index and each defines a different type for a particular field?

If so, that is fixed in 7.13 with #88303
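
A minimal way to check for such conflicts, sketched against the field capabilities API; the index pattern my-index-*, the credentials, and the use of jq are all assumptions here:

curl -s -u elastic:changeme "http://localhost:9200/my-index-*/_field_caps?fields=*" \
  | jq '.fields | to_entries[] | select(.value | keys | length > 1) | .key'

Any field this prints is mapped to more than one type across the matching indices, which is the kind of conflict described above.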

@exalate-issue-sync exalate-issue-sync bot added the impact:low and loe:small labels Jun 1, 2021
@tsullivan
Member

tsullivan commented Jun 3, 2021

ping @laurentiubanica for feedback

@laurentiubanica
Author

Hi,
I'm checking the mappings today and I'll get back to you as soon as I can.
Thank you for your feedback.

@exalate-issue-sync exalate-issue-sync bot added the impact:high and loe:medium labels and removed the impact:low and loe:small labels Jun 3, 2021
@laurentiubanica
Author

Hi,

There is only one template that applies to the index from which we are trying to run the reports.
There are no fields with conflicts.

When running the reports from Discover, we choose the index, save the search, and then select Share to run the report.
We get the same behavior for every kind of index (each using its own template).

However, when we export to .csv from various visualizations, the data does get populated in the .csv.

We haven't upgraded to 7.13 yet. We are using Kibana 7.12 on docker, with the same version of the cluster.

Thank you,

@laurentiubanica
Author

Forgot to mention that this occurred after we set up encryption between the nodes, in order to use the alerting capability.
Before setting up encryption, we had a different issue with reporting: only reports with less data were working.
The ones with large amounts of data were timing out without any results, even though we increased the timeout and the max size.
I don't know if this is related.

@laurentiubanica
Author

Hi,
We upgraded the cluster and Kibana app to 7.13.
The issue remains.
Not even the headers of the columns are shown anymore in the downloaded .csv file.

This is a sample backend log from the Kibana docker:

{"type":"response","@timestamp":"2021-06-07T18:02:37+00:00","tags":[],"pid":951,"method":"post","statusCode":200,"req":{"url":"/api/ui_counters/_report","method":"post","headers":{"connection":"Keep-Alive","proxy-connection":"Keep-Alive","host":"servers","content-length":"150","user-agent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:89.0) Gecko/20100101 Firefox/89.0","accept":"/","accept-language":"en-US,en;q=0.5","accept-encoding":"gzip, deflate, br","referer":"https://kibanaurl/app/discover","content-type":"application/json","kbn-version":"7.13.1","kbn-system-request":"true","origin":"https://kibanaurl"},"remoteAddress":"172.17.0.1","userAgent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:89.0) Gecko/20100101 Firefox/89.0","referer":"https://kibanaurl/app/discover"},"res":{"statusCode":200,"responseTime":950,"contentLength":15},"message":"POST /api/ui_counters/_report 200 950ms - 15.0B"}

@tsullivan
Member

Hi @laurentiubanica, the sample log you provided shows an unrelated HTTP request. CSV generation happens asynchronously as a background job, not as part of a request. It would help to share the logs that have the reporting tag. If you have multiple instances of Kibana, the logs would be found on the instance that the job was distributed to for execution.

  1. Please share any log info matching reporting or csv that you can find (a grep sketch follows at the end of this comment).

After I save the search in the Discover section and use Share, the status of the report shows as completed. However, the downloaded .csv contains only the names of the fields on the first row. The data shown in Discover is not in the .csv.

  2. Can you provide the POST URL that is available in Discover? Please make sure to edit out any sensitive information from the string.
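
For reference, a quick way to pull only the reporting-related lines from the Kibana container logs; a sketch, assuming the container is named kibana (a placeholder):

docker logs kibana 2>&1 | grep -iE '"reporting"|csv_searchsource'

In the 7.x JSON log format, reporting events carry "reporting" in their tags array, as in the queue-job line further down this thread.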

@laurentiubanica
Author

This would be a sample log related to the last report I ran:

{"type":"log","@timestamp":"2021-06-15T14:26:06+00:00","tags":["info","plugins","reporting","queue-job"],"pid":952,"message":"Queued csv_searchsource report: kpy51o2600qgdfc2101bsx4h"}
{"type":"response","@timestamp":"2021-06-15T14:26:06+00:00","tags":[],"pid":952,"method":"post","statusCode":200,"req":{"url":"/api/reporting/generate/csv_searchsource","method":"post","headers":{"connection":"Keep-Alive","proxy-connection":"Keep-Alive","host":"servers","content-length":"516","user-agent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:89.0) Gecko/20100101 Firefox/89.0","accept":"/","accept-language":"en-US,en;q=0.5","accept-encoding":"gzip, deflate, br","referer":"https://kibanalink/app/discover","content-type":"application/json","kbn-version":"7.13.1","origin":"https://kibanalink"},"remoteAddress":"172.17.0.1","userAgent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:89.0) Gecko/20100101 Firefox/89.0","referer":"https://kibanalink/app/discover"},"res":{"statusCode":200,"responseTime":331,"contentLength":2612},"message":"POST /api/reporting/generate/csv_searchsource 200 331ms - 2.6KB"}

@laurentiubanica
Author

{"type":"response","@timestamp":"2021-06-15T14:26:08+00:00","tags":[],"pid":952,"method":"get","statusCode":200,"req":{"url":"/api/reporting/jobs/list?page=0&ids=kpy51o2600qgdfc2101bsx4h","method":"get","headers":{"connection":"Keep-Alive","proxy-connection":"Keep-Alive","host":"servers","user-agent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:89.0) Gecko/20100101 Firefox/89.0","accept":"/","accept-language":"en-US,en;q=0.5","accept-encoding":"gzip, deflate, br","referer":"https://kibanalink/app/discover","content-type":"application/json","kbn-version":"7.13.1"},"remoteAddress":"172.17.0.1","userAgent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:89.0) Gecko/20100101 Firefox/89.0","referer":"https://kibanalink/app/discover"},"res":{"statusCode":200,"responseTime":36,"contentLength":2922},"message":"GET /api/reporting/jobs/list?page=0&ids=kpy51o2600qgdfc2101bsx4h 200 36ms - 2.9KB"}

{"type":"response","@timestamp":"2021-06-15T14:26:10+00:00","tags":[],"pid":952,"method":"get","statusCode":200,"req":{"url":"/api/reporting/jobs/download/kpy51o2600qgdfc2101bsx4h","method":"get","headers":{"connection":"Keep-Alive","proxy-connection":"Keep-Alive","host":"servers","user-agent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:89.0) Gecko/20100101 Firefox/89.0","accept":"text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,/;q=0.8","accept-language":"en-US,en;q=0.5","accept-encoding":"gzip, deflate, br","upgrade-insecure-requests":"1"},"remoteAddress":"172.17.0.1","userAgent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:89.0) Gecko/20100101 Firefox/89.0"},"res":{"statusCode":200,"responseTime":26,"contentLength":1},"message":"GET /api/reporting/jobs/download/kpy51o2600qgdfc2101bsx4h 200 26ms - 1.0B"}

@tsullivan
Member

I just noticed that your xpack.reporting.csv.scroll.size: 1048576000 is really high. Elasticsearch is likely having problems returning that many hits in a single response. Can you try setting it back to the default of 500? After doing that, if it is still not working, can you check whether a report with 500 documents or fewer can be generated?

xpack.reporting.csv.scroll.size
Number of documents retrieved from Elasticsearch for each scroll iteration during a CSV export. Defaults to 500.
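
For comparison, the out-of-the-box values look like this; a sketch of the relevant kibana.yml defaults in 7.x:

xpack.reporting.csv.maxSizeBytes: 10485760   # 10 MB
xpack.reporting.csv.scroll.size: 500         # documents per scroll page, not bytes
xpack.reporting.csv.scroll.duration: 30s

Note that scroll.size is a document count, so the 1048576000 in the config above appears to reuse the byte figure from maxSizeBytes.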

I parsed the search that Discover is sending to Reporting as:

{
  "browserTimezone": "UTC",
  "columns": [],
  "objectType": "search",
  "searchSource": {
    "fields": [
      {
        "field": "*",
        "include_unmapped": "true"
      }
    ],
    "filter": [
      {
        "meta": {
          "index": "9af5b780-7d7d-11ea-813a-35e6418f1055",
          "params": {}
        },
        "range": {
          "@timestamp": {
            "format": "strict_date_optional_time",
            "gte": "2021-06-15T14:10:56.886Z",
            "lte": "2021-06-15T14:25:56.886Z"
          }
        }
      }
    ],
    "index": "9af5b780-7d7d-11ea-813a-35e6418f1055",
    "parent": {
      "filter": [],
      "index": "9af5b780-7d7d-11ea-813a-35e6418f1055",
      "query": {
        "language": "kuery",
        "query": ""
      }
    },
    "sort": [
      {
        "@timestamp": "desc"
      }
    ],
    "version": true
  },
  "title": "newtest"
}

Does the filter > range > @timestamp section look correct according to what you see in Discover? Is browserTimezone: 'UTC' what you expect to see?
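
One way to sanity-check that this window actually contains documents is to replay the same range filter as a count query against the underlying indices. A sketch, with my-index-* and the credentials as placeholders (the 9af5b780-… value above is a saved-object id, not an index name):

curl -s -u elastic:changeme "http://localhost:9200/my-index-*/_count" \
  -H 'Content-Type: application/json' \
  -d '{"query":{"range":{"@timestamp":{"format":"strict_date_optional_time","gte":"2021-06-15T14:10:56.886Z","lte":"2021-06-15T14:25:56.886Z"}}}}'

A count of 0 here would explain an empty csv for the same time range.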

@laurentiubanica
Author

laurentiubanica commented Jun 17, 2021

Hi,
I have commented out the following lines, to go back to the default settings:
#xpack.reporting.csv.maxSizeBytes: 1048576000
#xpack.reporting.csv.scroll.size: 1048576000
#xpack.reporting.csv.scroll.duration: 10m
After doing this, the csv contained only one row with the following characters: @timestamp,"_source". Nothing else.

Then, I changed xpack.reporting.csv.scroll.size to 100 and ran a report for the last 15 minutes.
The csv got populated correctly with 18,465 lines of events.
However, the status message was "Max size reached", meaning that not all the events were included in the report.

Then, I uncommented the following line:
xpack.reporting.csv.maxSizeBytes: 1048576000
modified the value of xpack.reporting.csv.scroll.size to 400, and ran the same report.
The report was empty again.

Then I changed xpack.reporting.csv.scroll.size to 100. The result was a csv with 267k lines in the following incorrect format:
@timestamp,"_source"
Jun 17, 2021 @ 16:36:56.000,"-"

After that, I commented out xpack.reporting.csv.maxSizeBytes again, modified xpack.reporting.csv.scroll.size to 400, and ran a report for the last 15 minutes. The result was a single line with the following info: @timestamp,"_source".

Then I changed xpack.reporting.csv.scroll.size to 300; the csv was empty. The same for 200.

When I changed back to the settings that previously worked correctly:
#xpack.reporting.csv.maxSizeBytes: 1048576000
xpack.reporting.csv.scroll.size: 100
#xpack.reporting.csv.scroll.duration: 10m
the report was empty, again.

I really don't understand what's going on.

@tsullivan
Member

Hi, how much memory is available in the system? Is it possible to add more memory?
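
For reference, two quick ways to check this from the docker host; a sketch, with kibana as a placeholder container name:

docker stats --no-stream kibana   # memory usage and limit of the Kibana container
free -m                           # overall memory on the host

If the container is running up against its limit, the Node.js heap available to Kibana can be raised through the NODE_OPTIONS environment variable (for example NODE_OPTIONS="--max-old-space-size=2048"), which is a documented way to give the Kibana docker image more memory.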

@exalate-issue-sync exalate-issue-sync bot added the impact:low and impact:high labels and removed the impact:high and impact:low labels Jul 6, 2021
@laurentiubanica
Author

Hi,
It seems that this issue was caused by insufficient memory on the machine.
Thank you for your help and time !

@tsullivan
Member

Glad to hear things are working out!

@exalate-issue-sync exalate-issue-sync bot added the loe:small label and removed the loe:medium label Aug 12, 2022