logstash_exporter_scrape report success when failing #18

Open
danielmotaleite opened this issue Feb 19, 2018 · 2 comments
Comments

@danielmotaleite

As per issue #13, I was trying the supplied query to detect errors, but it was not working. Checking the exporter directly, I get this:

 curl  http://127.0.0.1:9198/metrics -s | grep scrape
# HELP logstash_exporter_scrape_duration_seconds logstash_exporter: Duration of a scrape job.
# TYPE logstash_exporter_scrape_duration_seconds summary
logstash_exporter_scrape_duration_seconds{collector="info",result="success",quantile="0.5"} 0.000544097
logstash_exporter_scrape_duration_seconds{collector="info",result="success",quantile="0.9"} 0.001007189
logstash_exporter_scrape_duration_seconds{collector="info",result="success",quantile="0.99"} 0.001895548
logstash_exporter_scrape_duration_seconds_sum{collector="info",result="success"} 0.025157706999999998
logstash_exporter_scrape_duration_seconds_count{collector="info",result="success"} 38
logstash_exporter_scrape_duration_seconds{collector="node",result="success",quantile="0.5"} 0.000491171
logstash_exporter_scrape_duration_seconds{collector="node",result="success",quantile="0.9"} 0.000994146
logstash_exporter_scrape_duration_seconds{collector="node",result="success",quantile="0.99"} 0.00149815
logstash_exporter_scrape_duration_seconds_sum{collector="node",result="success"} 0.0230836
logstash_exporter_scrape_duration_seconds_count{collector="node",result="success"} 38

But logstash is not even running:

 curl -XGET 'localhost:9600/_node/stats/pipeline?pretty' -sv 
* Hostname was NOT found in DNS cache
*   Trying 127.0.0.1...
* connect to 127.0.0.1 port 9600 failed: Connection refused
* Failed to connect to localhost port 9600: Connection refused
* Closing connection 0

and the Docker logs even report it:

time="2018-02-19T18:06:30Z" level=error msg="Cannot retrieve metrics: Get http://localhost:9600/_node/stats: dial tcp 127.0.0.1:9600: getsockopt: connection refused" source="api_base.go:32"
time="2018-02-19T18:06:30Z" level=error msg="Cannot retrieve metrics: Get http://localhost:9600/_node: dial tcp 127.0.0.1:9600: getsockopt: connection refused" source="api_base.go:32"

On another machine, where I have a frozen logstash, the exporter reports this:

time="2018-02-19T18:07:54Z" level=error msg="Cannot retrieve metrics: Get http://localhost:9600/_node/stats: read tcp 127.0.0.1:16562->127.0.0.1:9600: read: connection reset by peer" source="api_base.go:32"
time="2018-02-19T18:07:54Z" level=error msg="Cannot retrieve metrics: Get http://localhost:9600/_node: read tcp 127.0.0.1:16560->127.0.0.1:9600: read: connection reset by peer" source="api_base.go:32"

Yet in this case, the exporter not only still reports "successful" scrapes, but also takes a long time to reply (about 100s), probably because it waits for the logstash response until it hits some timeout around 100s. So a locked logstash will block logstash-exporter for 100s.
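
As an illustration of what I'd expect (a minimal sketch, assuming the exporter uses Go's standard net/http client; the 10-second timeout is just an example value, not the exporter's current setting):

// Minimal sketch: an HTTP client with an explicit timeout for the logstash API.
// The 10s value is an assumed example, not what the exporter currently uses.
package main

import (
	"log"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 10 * time.Second}
	resp, err := client.Get("http://localhost:9600/_node/stats")
	if err != nil {
		// A frozen logstash now fails after ~10s instead of blocking for ~100s.
		log.Printf("Cannot retrieve metrics: %v", err)
		return
	}
	defer resp.Body.Close()
	log.Printf("logstash API responded with %s", resp.Status)
}

With a timeout like this, a hung logstash would at least surface as an error quickly instead of stalling the exporter's /metrics endpoint.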

@mpucholblasco

@danielmotaleite, the metric logstash_exporter_scrape_duration_seconds_count represents the scrape performed against the Prometheus endpoint, not the connection between logstash-exporter and logstash. In my opinion, it's better to use the feature presented in #13.

@mpucholblasco

After reviewing the code, you're right @danielmotaleite: this metric represents the time spent scraping data from logstash, not the scrape of logstash-exporter itself. I found the error and fixed it in https://github.com/sequra/logstash_exporter/pull/5
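
For anyone following along, the general shape of such a fix is to derive the result label from the collector's error instead of always reporting success (a minimal sketch; the interface and identifiers below are illustrative, not necessarily what the exporter or the linked pull request actually use):

// Sketch of the pattern: record each collector's scrape duration with a
// result label that reflects whether scraping logstash actually succeeded.
// Identifiers here are illustrative, not the exporter's real code.
package collector

import (
	"time"

	"github.com/prometheus/client_golang/prometheus"
)

// Collector is a minimal stand-in for the exporter's own collector interface.
type Collector interface {
	Collect(ch chan<- prometheus.Metric) error
}

var scrapeDuration = prometheus.NewSummaryVec(
	prometheus.SummaryOpts{
		Namespace: "logstash_exporter",
		Name:      "scrape_duration_seconds",
		Help:      "logstash_exporter: Duration of a scrape job.",
	},
	[]string{"collector", "result"},
)

// execute runs one collector and labels the observation "error" when the
// underlying scrape of logstash fails, instead of always reporting "success".
func execute(name string, c Collector, ch chan<- prometheus.Metric) {
	begin := time.Now()
	err := c.Collect(ch)
	duration := time.Since(begin)

	result := "success"
	if err != nil {
		result = "error"
	}
	scrapeDuration.WithLabelValues(name, result).Observe(duration.Seconds())
}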
