DEVX-1531: add client config files and get all settings from configs across clients, update cloud client readmes (#441)
* add client config files and get all settings from configs across clients
add example.config files to each client, get all settings from config files, update readmes
* change .ccloud to .confluent, update config file instructions for clients
clients/cloud/c/README.md (+4, -24)

@@ -2,17 +2,11 @@
Produce messages to and consume messages from a Kafka cluster using the C client [librdkafka](https://github.com/edenhill/librdkafka).

# Prerequisites

* [librdkafka](https://github.com/edenhill/librdkafka) installed on your machine, see [installation instructions](https://github.com/edenhill/librdkafka/blob/master/README.md#instructions).
- To run this example, create a local file with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster.
- If this is a Confluent Cloud cluster, you must have:
- * Access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
- * Local file with configuration parameters to connect to your Confluent Cloud instance ([how do I find those?](https://docs.confluent.io/current/cloud/using/config-client.html#librdkafka-based-c-clients?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud)).
+ * Create a local file (e.g. at `$HOME/.confluent/librdkafka.config`) with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster. Follow [these detailed instructions](https://github.com/confluentinc/configuration-templates/tree/master/README.md) to properly create this file.
+ * If you are running on Confluent Cloud, you must have access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
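The new prerequisite points at the configuration-templates instructions instead of showing the file inline. For reference, a minimal sketch of what the librdkafka config file might contain, based on the example.config block this PR removes further down in the same README (placeholder values, not real credentials):

```bash
# Hypothetical contents of $HOME/.confluent/librdkafka.config, sketched from the
# example removed by this PR; substitute your own cluster values.
bootstrap.servers=<broker-1,broker-2,broker-3>
sasl.username=<api-key-id>
sasl.password=<secret-access-key>
# Confluent Cloud connections typically also need security.protocol and SASL
# mechanism settings; see librdkafka's CONFIGURATION.md for the full list.
```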
cc producer.c common.c json.c -o producer -lrdkafka -lm

- # Create a configuration file
- The configuration file must contain the bootstrap servers and the SASL username and password, as shown in your Confluent Cloud settings.
- Additional configuration properties are supported, see [CONFIGURATION.md](https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md) for the full list.
- $ cat $HOME/.ccloud/example.config
- bootstrap.servers=<broker-1,broker-2,broker-3>
- sasl.username=<api-key-id>
- sasl.password=<secret-access-key>

# Example 1: Hello World!

In this example, the producer writes JSON data to a topic in Confluent Cloud.

@@ -47,7 +27,7 @@ The consumer reads the same topic from Confluent Cloud and keeps a rolling sum o
1. Run the producer, passing in arguments for (a) the topic name, and (b) the local file with configuration parameters to connect to your Confluent Cloud instance:
Producing message #0 to test1: alice={ "count": 1 }

@@ -78,7 +58,7 @@ Message delivered to test1 [0] at offset 9 in 22.81ms: { "count": 10 }
2. Run the consumer, passing in arguments for (a) the same topic name as used above, (b) the local file with configuration parameters to connect to your Confluent Cloud instance. Verify that the consumer received all the messages, then press Ctrl-C to exit.
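The two hunks above update the run steps to the new config path. A rough sketch of the invocation they describe; the `producer` binary name comes from the compile command shown earlier, the `consumer` binary and the argument order (topic first, then config file) are assumptions taken from the README prose rather than something this diff shows:

```bash
# Sketch only: binary names and argument order assumed from the README text.
$ ./producer test1 $HOME/.confluent/librdkafka.config
$ ./consumer test1 $HOME/.confluent/librdkafka.config
```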
clients/cloud/clojure/README.md (+4, -18)

@@ -8,22 +8,8 @@ For more information, please see the [application development documentation](htt
* Java 8 or higher (Clojure 1.10 recommends using Java 8 or Java 11)
* The [Leiningen](https://leiningen.org/#install) tool to compile and run the demos
- To run this example, create a local file with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster.
- If this is a Confluent Cloud cluster, you must have:
- * Access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
- * Local file with configuration parameters to connect to your Confluent Cloud instance ([how do I find those?](https://docs.confluent.io/current/cloud/using/config-client.html#librdkafka-based-c-clients?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud)). Format the file as follows:
- $ cat ~/.ccloud/example.config
- bootstrap.servers=<broker-1,broker-2,broker-3>
- sasl.username=<api-key-id>
- sasl.password=<secret-access-key>
- ssl.endpoint.identification.algorithm=https
- security.protocol=SASL_SSL
- sasl.mechanism=PLAIN
+ * Create a local file (e.g. at `$HOME/.confluent/java.config`) with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster. Follow [these detailed instructions](https://github.com/confluentinc/configuration-templates/tree/master/README.md) to properly create this file.
+ * If you are running on Confluent Cloud, you must have access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster

# Example 1: Hello World!

@@ -34,7 +20,7 @@ The consumer reads the same topic from Confluent Cloud and keeps a rolling sum o
1. Run the producer, passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the topic name:

@@ -62,7 +48,7 @@ Produced record to topic test1 partiton [0] @ offest 9
2. Run the consumer, passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the same topic name as used above. Verify that the consumer received all the messages:
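Both Clojure run steps pass the config file first and the topic second. A minimal sketch of what that could look like with Leiningen; the `producer` and `consumer` aliases are hypothetical here (the real entry points are defined in this example's project.clj), so check that file before running:

```bash
# Hypothetical Leiningen aliases; the actual entry points live in project.clj.
$ lein producer $HOME/.confluent/java.config test1
$ lein consumer $HOME/.confluent/java.config test1
```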
clients/cloud/confluent-cli/README.md (+10, -22)

@@ -9,20 +9,8 @@ Produce messages to and consume messages from a Kafka cluster using [Confluent C
* [Confluent Platform 5.4](https://www.confluent.io/download/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), which includes the Confluent CLI
- To run this example, create a local file with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster.
- If this is a Confluent Cloud cluster, you must have:
- * Access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
- * Initialize a properties file at `$HOME/.ccloud/config` with configuration to your Confluent Cloud cluster:
+ * Create a local file (e.g. at `$HOME/.confluent/java.config`) with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster. Follow [these detailed instructions](https://github.com/confluentinc/configuration-templates/tree/master/README.md) to properly create this file.
+ * If you are running on Confluent Cloud, you must have access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster

# Example 1: Hello World!

@@ -34,12 +22,12 @@ The consumer reads the same topic from Confluent Cloud.
2. Run the [Confluent CLI producer](https://docs.confluent.io/current/cli/command-reference/confluent-produce.html#cli-confluent-produce?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), writing messages to topic `test1`, passing in additional arguments:
- * `--cloud`: write messages to the Confluent Cloud cluster specified in `$HOME/.ccloud/config`
+ * `--cloud`: write messages to the Confluent Cloud cluster specified in `$HOME/.confluent/java.config`
* `--property parse.key=true --property key.separator=,`: pass key and value, separated by a comma

@@ -58,7 +46,7 @@ When you are done, press `<ctrl>-d`.
2. Run the [Confluent CLI consumer](https://docs.confluent.io/current/cli/command-reference/confluent-consume.html#cli-confluent-consume?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), reading messages from topic `test1`, passing in additional arguments:
- * `--cloud`: read messages from the Confluent Cloud cluster specified in `$HOME/.ccloud/config`
+ * `--cloud`: read messages from the Confluent Cloud cluster specified in `$HOME/.confluent/java.config`
* `--property print.key=true`: print key and value (by default, it only prints value)
* `--from-beginning`: print all messages from the beginning of the topic
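Putting the flags from the two hunks above together, the produce and consume calls would look roughly like the sketch below. The flag set (`--cloud`, the `--property` options, `--from-beginning`) is taken from the README; the subcommand names themselves are an assumption, since they differ across Confluent Platform CLI versions, so verify with `confluent help` on your installation:

```bash
# Sketch only: subcommand naming is assumed; the flags are the ones documented above.
$ confluent produce test1 --cloud --property parse.key=true --property key.separator=,
$ confluent consume test1 --cloud --property print.key=true --from-beginning
```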
@@ -93,14 +81,14 @@ Note that your VPC must be able to connect to the Confluent Cloud Schema Registr
# View the list of registered subjects
$ curl -u <SR API KEY>:<SR API SECRET> https://<SR ENDPOINT>/subjects
- # Same as above, as a single bash command to parse the values out of $HOME/.ccloud/config

- 3. Add the following parameters to your local Confluent Cloud configuration file (``$HOME/.ccloud/config``). In the output below, substitute values for `<SR API KEY>`, `<SR API SECRET>`, and `<SR ENDPOINT>`.
+ 3. Add the following parameters to your local Confluent Cloud configuration file (``$HOME/.confluent/java.config``). In the output below, substitute values for `<SR API KEY>`, `<SR API SECRET>`, and `<SR ENDPOINT>`.

- $ cat $HOME/.ccloud/config
+ $ cat $HOME/.confluent/java.config
...
basic.auth.credentials.source=USER_INFO
schema.registry.basic.auth.user.info=<SR API KEY>:<SR API SECRET>
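For context, the Schema Registry section of the client config file normally also carries the endpoint URL alongside the two credential properties shown in the hunk. The `schema.registry.url` line in the sketch below is an assumption (the diff elides that part with `...`), included only to make the example self-contained:

```shell
# Sketch of the Schema Registry block in $HOME/.confluent/java.config.
# schema.registry.url is assumed here; the diff shows only "..." at that point.
schema.registry.url=https://<SR ENDPOINT>
basic.auth.credentials.source=USER_INFO
schema.registry.basic.auth.user.info=<SR API KEY>:<SR API SECRET>
```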
@@ -111,7 +99,7 @@ Note that your VPC must be able to connect to the Confluent Cloud Schema Registr
5. Run the [Confluent CLI producer](https://docs.confluent.io/current/cli/command-reference/confluent-produce.html#cli-confluent-produce?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), writing messages to topic `test2`, passing in additional arguments. The additional Schema Registry parameters are required to be passed in as properties instead of a properties file due to https://github.com/confluentinc/schema-registry/issues/1052.
clients/cloud/csharp/README.md (+6, -19)

@@ -6,21 +6,8 @@ Produce messages to and consume messages from a Kafka cluster using the .NET Pro
# Prerequisites

* [.NET Core 2.1](https://dotnet.microsoft.com/download) or higher to run the demo application
- To run this example, create a local file with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster.
- If this is a Confluent Cloud cluster, you must have:
- * Access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
- * Initialize a properties file at `$HOME/.ccloud/config` with configuration to your Confluent Cloud cluster:
+ * Create a local file (e.g. at `$HOME/.confluent/librdkafka.config`) with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster. Follow [these detailed instructions](https://github.com/confluentinc/configuration-templates/tree/master/README.md) to properly create this file.
+ * If you are running on Confluent Cloud, you must have access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster

# Example

@@ -40,10 +27,10 @@ Run the example application, passing in arguments for (a) whether to produce or
$ dotnet build

# Run the producer (Windows)
- $ dotnet run produce test1 %HOMEPATH%/.ccloud/config /path/to/curl/cacert.pem
+ $ dotnet run produce test1 $HOME/.confluent/librdkafka.config /path/to/curl/cacert.pem

# Run the producer (other)
- $ dotnet run produce test1 $HOME/.ccloud/config
+ $ dotnet run produce test1 $HOME/.confluent/librdkafka.config

You should see:

@@ -78,10 +65,10 @@ Run the consumer, passing in arguments for (a) whether to produce or consume (co
# Run the consumer (Windows)
- $ dotnet run consume test1 %HOMEPATH%/.ccloud/config /path/to/curl/cacert.pem
+ $ dotnet run consume test1 $HOME/.confluent/librdkafka.config /path/to/curl/cacert.pem

# Run the consumer (other)
- $ dotnet run consume test1 $HOME/.ccloud/config
+ $ dotnet run consume test1 $HOME/.confluent/librdkafka.config
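On Windows, the run commands above also pass a CA certificate bundle path (`/path/to/curl/cacert.pem`). As a hedged sketch of where such a bundle commonly comes from, one option is the curl project's published CA file; the download URL and destination path below are assumptions, not something this PR specifies:

```shell
# Assumption: fetch the curl CA bundle and point the consumer at it (Windows case).
$ curl -o cacert.pem https://curl.se/ca/cacert.pem
$ dotnet run consume test1 $HOME/.confluent/librdkafka.config ./cacert.pem
```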