pipeline/inputs/node-exporter-metrics.md
7 additions, 7 deletions
@@ -18,13 +18,13 @@ This plugin is generally supported on Linux-based operating systems, with macOS
`scrape_interval` sets the default for all scrapes. To set granular scrape intervals, set the specific interval. For example, `collector.cpu.scrape_interval`. When using a granular scrape interval, if a value greater than `0` is used, it overrides the global default. Otherwise the global default is used.
-The plugin toplevel `scrape_interval` setting is the global default. Any custom settings for individual `scrape_intervals` override that specific metric scraping interval.
+The plugin top-level `scrape_interval` setting is the global default. Any custom settings for individual `scrape_intervals` override that specific metric scraping interval.
Each `collector.xxx.scrape_interval` option only overrides the interval for that specific collector and updates the associated set of provided metrics.
Overridden intervals only change the collection interval, not the interval for publishing the metrics which is taken from the global setting.
-For example, if the global interval is set to `5` and an override interval of `60` is used, the published metrics will be reported every five seconds. However, the specific collector will stay the same for sixty seconds until it's collected again.
+For example, if the global interval is set to `5` and an override interval of `60` is used, the published metrics will be reported every five seconds. However, the specific collector will stay the same for 60 seconds until it's collected again.
This helps with downsampling when collecting metrics.
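To illustrate the override behavior described in this hunk, here is a minimal `fluent-bit.yaml` sketch (not part of the changed file; it assumes only the `scrape_interval` and `collector.cpu.scrape_interval` options named in the surrounding text, plus the Prometheus Exporter output on port 2021 that the page already uses). The global interval of `5` drives both collection and publishing, while the CPU collector is re-collected only every 60 seconds.

```yaml
pipeline:
  inputs:
    - name: node_exporter_metrics
      tag: node_metrics
      # Global default: collect and publish every 5 seconds.
      scrape_interval: 5
      # Override: CPU metrics are re-collected only every 60 seconds,
      # but the last collected values are still published every 5 seconds.
      collector.cpu.scrape_interval: 60
  outputs:
    - name: prometheus_exporter
      match: node_metrics
      host: 0.0.0.0
      port: 2021
```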
@@ -90,7 +90,7 @@ This input always runs in its own [thread](../../administration/multithreading.m
### Configuration file
-In the following configuration file, the input plugin `node_exporter_metrics` collects metrics every 2 seconds and exposes them through the [Prometheus Exporter](../outputs/prometheus-exporter.md) output plugin on HTTP/TCP port 2021.
+In the following configuration file, the input plugin `node_exporter_metrics` collects metrics every two seconds and exposes them through the [Prometheus Exporter](../outputs/prometheus-exporter.md) output plugin on HTTP/TCP port 2021.
{% tabs %}
{% tab title="fluent-bit.yaml" %}
@@ -178,16 +178,16 @@ docker run -ti -v /proc:/host/proc \
### Fluent Bit with Prometheus and Grafana
-If you use dashboards for monitoring, Grafana is one options. The Fluent Bit source code repository contains a `docker-compose` example.
+If you use dashboards for monitoring, Grafana is one option. The Fluent Bit source code repository contains a `docker-compose` example.
-1. Download the Fluent Bit source code.
+1. Download the Fluent Bit source code:
```bash
git clone https://github.com/fluent/fluent-bit
cd fluent-bit/docker_compose/node-exporter-dashboard/
```
-1. Start the service and view your dashboard.
+1. Start the service and view your dashboard:
```bash
docker-compose up --force-recreate -d --build
@@ -209,6 +209,6 @@ docker-compose down
## Enhancement requests
-The plugin implements a subset of the available collectors in the original Prometheus Node exporter. If you would like a specific collector prioritized, open a Github issue by using the following template:
+The plugin implements a subset of the available collectors in the original Prometheus Node exporter. If you would like a specific collector prioritized, open a GitHub issue by using the following template: