Elasticsearch 7.2 throws an error: The length of [content] field of [files:61364] doc of [fulltextsearch] index has exceeded [1000000] #73
Comments
I experienced the same error message and also fixed it by setting a higher value for `index.highlight.max_analyzed_offset`.
I'm not too familiar with Elasticsearch, but is there a way to change the mapping / index configuration so as to use term vectors? Or are they already used, as the mapping suggests?
https://www.elastic.co/guide/en/elasticsearch/reference/current/term-vector.html
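For reference, term vectors are a mapping-time option, so they cannot be switched on for an already indexed field; the index would have to be created (or recreated and reindexed) with them enabled. A minimal sketch, assuming the indexed field is named `content`:

```
# Create the index with term vectors (including offsets) on the content
# field, so the highlighter can work without re-analyzing large texts.
curl -X PUT "localhost:9200/fulltextsearch" \
  -H 'Content-Type: application/json' -d '
{
  "mappings": {
    "properties": {
      "content": {
        "type": "text",
        "term_vector": "with_positions_offsets"
      }
    }
  }
}'
```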
I have the same problem. I index a lot of PDFs/images. Anthony
@prolibre you can disable highlighting completely to get rid of this error message by disabling the advanced setting `doc_table:highlight` (in Kibana).
As far as I know, Kibana can search several indexes that start with a specific prefix.
Hi,
Not really sure how to fix this issue.
Please check your index settings, as sketched below; the result should show the raised highlight limit. Please look at the commit: 5256e3d
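The original command and output are not shown above; a sketch of one way to inspect the settings, assuming the index is named `fulltextsearch`:

```
# Show the current settings of the index (name assumed):
curl -X GET "localhost:9200/fulltextsearch/_settings?pretty"
```

If the limit has been raised, the output should contain an entry along the lines of:

```
"highlight" : {
  "max_analyzed_offset" : "10000000"
}
```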
`onGetConfig` + `onIndexComparing`
This can resolve the problem, but modifying the configuration is not recommended: it can cause memory problems in Kibana and Elasticsearch. If you take this route, do not set the number too large. The root cause of the problem is that the document content is too large.
IIRC this was fixed a long time ago.
Elasticsearch 7.2 throws the following error:
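The message matches the issue title; Elasticsearch 7.x phrases the full error roughly as:

```
The length of [content] field of [files:61364] doc of [fulltextsearch] index has
exceeded [1000000] - maximum allowed to be analyzed for highlighting. This maximum
can be set by changing the [index.highlight.max_analyzed_offset] index level setting.
For large texts, indexing with offsets or term vectors is recommended!
```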
I could fix this by applying a curl command along the lines sketched below:
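A sketch of such a command, assuming the index is named `fulltextsearch` (per the error message) and an assumed new limit of 10000000:

```
# Raise the highlighting limit on the existing index (value assumed):
curl -X PUT "localhost:9200/fulltextsearch/_settings" \
  -H 'Content-Type: application/json' \
  -d '{ "index": { "highlight.max_analyzed_offset": 10000000 } }'
```

Note that this only raises the ceiling; highlighting very large documents will still consume memory accordingly.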