
Only add _type if ES version < 8 #892

Merged
9 commits merged on Nov 15, 2019
Conversation

robbavey (Contributor)

No description provided.

@robbavey (Contributor, Author)

Symptoms of this issue look like those described in elastic/logstash#11264: ingestion is not possible, and the logs contain entries such as:

[2019-10-23T20:23:28,933][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:".monitoring-logstash", :main], :non_running_pipelines=>[]}
[2019-10-23T20:23:29,466][ERROR][logstash.outputs.elasticsearch][main] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.0.2.15:9200/_bulk"}
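
For context: Elasticsearch 8 removed mapping types, so bulk requests that still include _type are rejected with HTTP 400, which surfaces as the retryable error above. A minimal Ruby sketch of the version gate this PR's title describes (method and variable names here are illustrative, not the plugin's actual code):

# Illustrative sketch only: include _type in a bulk action only when the
# connected cluster's major version is below 8, since Elasticsearch 8
# rejects bulk actions that carry a document type.
def bulk_action_params(event, es_major_version)
  params = {
    :_id    => event.get("document_id"),
    :_index => event.get("index")
  }
  # Only clusters before 8.x accept (or require) a document type.
  params[:_type] = "_doc" if es_major_version < 8
  params
end

With a gate like this in place, the same pipeline can target both 7.x and 8.x clusters without tripping the 400 on _bulk.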

@elasticsearch-bot self-assigned this Nov 15, 2019
@colinsurprenant (Contributor)

LGTM - left a cosmetic comment.

@robbavey (Contributor, Author)

Thanks @colinsurprenant - I committed your suggestion.

@manzoor77

[2023-03-17T11:25:11,285][ERROR][logstash.outputs.elasticsearch][main][3404e8877b56c05b348412b7a316fdaceb1867306a80dce71719c975094faeb9] Encountered a retryable error (will retry with exponential backoff) {:code=>400, :url=>"http://localhost:9200/_bulk", :content_length=>41360}

I need to index an existing PostgreSQL database in Elasticsearch. For this purpose I have set up Elasticsearch 7.17.4, Kibana 7.17.4, and Logstash 7.17.4 on my local machine. I downloaded a CSV file of the posts table from the database, containing almost 62k rows, and set up my Logstash config file as per the Logstash docs, i.e.:
input {
  file {
    path => "/Users/manzoorfaisal/Desktop/Laptop-Migration-2/logstash-7.17.4/source-posts.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    type => "doc"
    #skip_header => true
    #source => "message"
  }
}

filter {
  # Add your filters here
  csv {
    separator => ","
    columns => ["id","createat","updateat","deleteat","userid","channelid","rootid","originalid","message","type","props","hashtags","filenames","fileids","hasreactions","editat","ispinned","remoteid"]
    skip_header => true
  }

  mutate {
    # Field names are case-sensitive: the built-in version field is "@version".
    remove_field => ["updateat","deleteat","rootid","originalid","props","filenames","fileids","hasreactions","editat","ispinned","remoteid","@timestamp","@version","host","path"]
    add_field => {
      "teamid" => "new_value"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "post_index-7"
    document_id => "%{id}"
  }

  stdout { codec => rubydebug }
}
I need to create an index of the existing database in Elasticsearch using Logstash or any other method; with Logstash I am hitting the error above. Can anyone help me resolve it, or suggest another reliable method for reindexing existing PostgreSQL tables into an Elasticsearch index?
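
For reference, one commonly suggested alternative to the CSV export is the Logstash jdbc input plugin, which reads from PostgreSQL directly. A minimal sketch; the driver path, connection string, credentials, and query below are placeholders to adapt to your environment:

input {
  jdbc {
    # Placeholder paths and credentials - adjust for your setup.
    jdbc_driver_library => "/path/to/postgresql-42.x.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "postgres"
    jdbc_password => "secret"
    # Select only the columns you want indexed, instead of removing them later.
    statement => "SELECT id, createat, userid, channelid, message, type, hashtags FROM posts"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "post_index-7"
    document_id => "%{id}"
  }
}

Selecting columns in the SQL statement also avoids the mutate/remove_field step, and using the primary key as document_id keeps the run idempotent.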
