
Commit 1104fdc

DEVX-1531: add client config files and get all settings from configs across clients, update cloud client readmes (#441)

* add client config files and get all settings from configs across clients: add example.config files to each client, get all settings from config files, update readmes
* change .ccloud to .confluent, update config file instructions for clients

1 parent d333312 commit 1104fdc


51 files changed: +237 -509 lines

clients/cloud/c/README.md (+4, -24)

````diff
@@ -2,17 +2,11 @@
 
 Produce messages to and consume messages from a Kafka cluster using the C client [librdkafka](https://github.com/edenhill/librdkafka).
 
-
 # Prerequisites
 
 * [librdkafka](https://github.com/edenhill/librdkafka) installed on your machine, see [installation instructions](https://github.com/edenhill/librdkafka/blob/master/README.md#instructions).
-
-To run this example, create a local file with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster.
-If this is a Confluent Cloud cluster, you must have:
-
-* Access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
-* Local file with configuration parameters to connect to your Confluent Cloud instance ([how do I find those?](https://docs.confluent.io/current/cloud/using/config-client.html#librdkafka-based-c-clients?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud)).
-
+* Create a local file (e.g. at `$HOME/.confluent/librdkafka.config`) with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster. Follow [these detailed instructions](https://github.com/confluentinc/configuration-templates/tree/master/README.md) to properly create this file.
+* If you are running on Confluent Cloud, you must have access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
 
 # Build the example applications
 
@@ -24,20 +18,6 @@ cc consumer.c common.c json.c -o consumer -lrdkafka -lm
 cc producer.c common.c json.c -o producer -lrdkafka -lm
 ```
 
-# Create a configuration file
-
-The configuration file must contain the bootstrap servers and
-the SASL username and password, as shown in your Confluent Cloud settings.
-
-Additional configuration properties are supported, see [CONFIGURATION.md](https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md) for the full list.
-
-```bash
-$ cat $HOME/.ccloud/example.config
-bootstrap.servers=<broker-1,broker-2,broker-3>
-sasl.username=<api-key-id>
-sasl.password=<secret-access-key>
-```
-
 # Example 1: Hello World!
 
 In this example, the producer writes JSON data to a topic in Confluent Cloud.
@@ -47,7 +27,7 @@ The consumer reads the same topic from Confluent Cloud and keeps a rolling sum o
 1. Run the producer, passing in arguments for (a) the topic name, and (b) the local file with configuration parameters to connect to your Confluent Cloud instance:
 
 ```bash
-$ ./producer test1 $HOME/.ccloud/example.config
+$ ./producer test1 $HOME/.confluent/librdkafka.config
 Creating topic test1
 Topic test1 successfully created
 Producing message #0 to test1: alice={ "count": 1 }
@@ -78,7 +58,7 @@ Message delivered to test1 [0] at offset 9 in 22.81ms: { "count": 10 }
 2. Run the consumer, passing in arguments for (a) the same topic name as used above, (b) the local file with configuration parameters to connect to your Confluent Cloud instance. Verify that the consumer received all the messages, then press Ctrl-C to exit.
 
 ```bash
-$ ./consumer test1 $HOME/.ccloud/example.config
+$ ./consumer test1 $HOME/.confluent/librdkafka.config
 Subscribing to test1, waiting for assignment and messages...
 Press Ctrl-C to exit.
 Received message on test1 [0] at offset 0: { "count": 1 }
````
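The inline sample config was removed from this README in favor of the configuration-templates instructions linked in the new prerequisites bullet. For orientation only (the linked template is authoritative, and every value below is a placeholder), a `$HOME/.confluent/librdkafka.config` that combines the settings this commit moves out of the code and docs would look like:

```
bootstrap.servers=<broker-1,broker-2,broker-3>
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=<api-key-id>
sasl.password=<secret-access-key>
```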

clients/cloud/c/common.c (-13)

````diff
@@ -53,19 +53,6 @@ rd_kafka_conf_t *read_config (const char *config_file) {
         }
 
         conf = rd_kafka_conf_new();
-
-        /* Set up basic Confluent Cloud connectivity parameters,
-         * we expect the bootstrap.servers, sasl.username, and sasl.password
-         * to be specified in the configuration file. */
-        if (rd_kafka_conf_set(conf, "security.protocol", "SASL_SSL",
-                              errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK ||
-            rd_kafka_conf_set(conf, "sasl.mechanisms", "PLAIN",
-                              errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
-                fprintf(stderr, "Configuration failed: %s\n", errstr);
-                rd_kafka_conf_destroy(conf);
-                return NULL;
-        }
-
         /* Read configuration file, line by line. */
         while (fgets(buf, sizeof(buf), fp)) {
                 char *s = buf;
````
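With the hardcoded `security.protocol` and `sasl.mechanisms` calls deleted, the surviving `fgets` loop is the whole configuration story: read each line, skip blanks and comments, split on the first `=`. A minimal Python sketch of that parse (illustrative only, not the C code itself):

```python
def read_config(path):
    """Parse a librdkafka-style key=value config file into a dict.

    Mirrors the loop kept in common.c after this commit: every setting,
    including security.protocol and sasl.mechanisms, now comes from the
    file rather than being hardcoded.
    """
    conf = {}
    with open(path) as fp:
        for raw in fp:
            line = raw.strip()
            if not line or line.startswith("#"):
                continue  # skip blank lines and comments
            key, _, value = line.partition("=")
            conf[key.strip()] = value.strip()
    return conf
```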

clients/cloud/clojure/README.md (+4, -18)

````diff
@@ -8,22 +8,8 @@ For more information, please see the [application development documentation](htt
 
 * Java 8 or higher (Clojure 1.10 recommends using Java 8 or Java 11)
 * The [Leiningen](https://leiningen.org/#install) tool to compile and run the demos
-
-To run this example, create a local file with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster.
-If this is a Confluent Cloud cluster, you must have:
-
-* Access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
-* Local file with configuration parameters to connect to your Confluent Cloud instance ([how do I find those?](https://docs.confluent.io/current/cloud/using/config-client.html#librdkafka-based-c-clients?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud)). Format the file as follows:
-
-```bash
-$ cat ~/.ccloud/example.config
-bootstrap.servers=<broker-1,broker-2,broker-3>
-sasl.username=<api-key-id>
-sasl.password=<secret-access-key>
-ssl.endpoint.identification.algorithm=https
-security.protocol=SASL_SSL
-sasl.mechanism=PLAIN
-```
+* Create a local file (e.g. at `$HOME/.confluent/java.config`) with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster. Follow [these detailed instructions](https://github.com/confluentinc/configuration-templates/tree/master/README.md) to properly create this file.
+* If you are running on Confluent Cloud, you must have access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
 
 # Example 1: Hello World!
 
@@ -34,7 +20,7 @@ The consumer reads the same topic from Confluent Cloud and keeps a rolling sum o
 1. Run the producer, passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the topic name:
 
 ```shell
-$ lein producer ~/.ccloud/example.config test1
+$ lein producer $HOME/.confluent/java.config test1
 
 Producing record: alice {"count":0}
 Producing record: alice {"count":1}
@@ -62,7 +48,7 @@ Produced record to topic test1 partiton [0] @ offest 9
 2. Run the consumer, passing in arguments for (a) the local file with configuration parameters to connect to your Confluent Cloud instance and (b) the same topic name as used above. Verify that the consumer received all the messages:
 
 ```shell
-$ lein consumer ~/.ccloud/example.config test1
+$ lein consumer $HOME/.confluent/java.config test1
 
 Waiting for message in KafkaConsumer.poll
 Consumed record with key alice and value {"count":0}, and updated total count to 0
````

clients/cloud/clojure/src/io/confluent/examples/clients/clj/consumer.clj (+2, -1)

````diff
@@ -11,7 +11,8 @@
 (defn- build-properties [config-fname]
   (with-open [config (jio/reader config-fname)]
     (doto (Properties.)
-      (.putAll {ConsumerConfig/KEY_DESERIALIZER_CLASS_CONFIG "org.apache.kafka.common.serialization.StringDeserializer"
+      (.putAll {ConsumerConfig/GROUP_ID_CONFIG, "clojure_example_group"
+                ConsumerConfig/KEY_DESERIALIZER_CLASS_CONFIG "org.apache.kafka.common.serialization.StringDeserializer"
                 ConsumerConfig/VALUE_DESERIALIZER_CLASS_CONFIG "org.apache.kafka.common.serialization.StringDeserializer"})
       (.load config))))
 
````
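The ordering in `build-properties` matters: `.putAll` sets the in-code defaults (now including the new `group.id`) before `(.load config)` reads the file, and `Properties.load` overwrites any key that appears in both. A sketch of that precedence in Python (names mirror the diff; this is not the Clojure code, and the property keys are illustrative):

```python
def build_properties(config_path):
    """Defaults first, then the config file on top: a key repeated in
    the file overrides the in-code default, matching Properties.putAll
    followed by Properties.load."""
    props = {
        "group.id": "clojure_example_group",
        "key.deserializer": "org.apache.kafka.common.serialization.StringDeserializer",
        "value.deserializer": "org.apache.kafka.common.serialization.StringDeserializer",
    }
    with open(config_path) as fp:  # analogous to (.load config)
        for raw in fp:
            line = raw.strip()
            if line and not line.startswith("#"):
                key, _, value = line.partition("=")
                props[key.strip()] = value.strip()
    return props
```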

clients/cloud/confluent-cli/README.md (+10, -22)

````diff
@@ -9,20 +9,8 @@ Produce messages to and consume messages from a Kafka cluster using [Confluent C
 
 * [Confluent Platform 5.4](https://www.confluent.io/download/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), which includes the Confluent CLI
 
-To run this example, create a local file with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster.
-If this is a Confluent Cloud cluster, you must have:
-
-* Access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
-* Initialize a properties file at `$HOME/.ccloud/config` with configuration to your Confluent Cloud cluster:
-
-```shell
-$ cat $HOME/.ccloud/config
-bootstrap.servers=<BROKER ENDPOINT>
-ssl.endpoint.identification.algorithm=https
-security.protocol=SASL_SSL
-sasl.mechanism=PLAIN
-sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username\="<API KEY>" password\="<API SECRET>";
-```
+* Create a local file (e.g. at `$HOME/.confluent/java.config`) with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster. Follow [these detailed instructions](https://github.com/confluentinc/configuration-templates/tree/master/README.md) to properly create this file.
+* If you are running on Confluent Cloud, you must have access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
 
 
 # Example 1: Hello World!
@@ -34,12 +22,12 @@ The consumer reads the same topic from Confluent Cloud.
 1. Create the topic in Confluent Cloud
 
 ```bash
-$ kafka-topics --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.ccloud/config | tail -1` --command-config $HOME/.ccloud/config --topic test1 --create --replication-factor 3 --partitions 6
+$ kafka-topics --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1` --command-config $HOME/.confluent/java.config --topic test1 --create --replication-factor 3 --partitions 6
 ```
 
 2. Run the [Confluent CLI producer](https://docs.confluent.io/current/cli/command-reference/confluent-produce.html#cli-confluent-produce?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), writing messages to topic `test1`, passing in additional arguments:
 
-* `--cloud`: write messages to the Confluent Cloud cluster specified in `$HOME/.ccloud/config`
+* `--cloud`: write messages to the Confluent Cloud cluster specified in `$HOME/.confluent/java.config`
 * `--property parse.key=true --property key.separator=,`: pass key and value, separated by a comma
 
 ```bash
@@ -58,7 +46,7 @@ When you are done, press `<ctrl>-d`.
 
 2. Run the [Confluent CLI consumer](https://docs.confluent.io/current/cli/command-reference/confluent-consume.html#cli-confluent-consume?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), reading messages from topic `test1`, passing in additional arguments:
 
-* `--cloud`: read messages from the Confluent Cloud cluster specified in `$HOME/.ccloud/config`
+* `--cloud`: read messages from the Confluent Cloud cluster specified in `$HOME/.confluent/java.config`
 * `--property print.key=true`: print key and value (by default, it only prints value)
 * `--from-beginning`: print all messages from the beginning of the topic
 
@@ -93,14 +81,14 @@ Note that your VPC must be able to connect to the Confluent Cloud Schema Registr
 # View the list of registered subjects
 $ curl -u <SR API KEY>:<SR API SECRET> https://<SR ENDPOINT>/subjects
 
-# Same as above, as a single bash command to parse the values out of $HOME/.ccloud/config
-$ curl -u $(grep "^schema.registry.basic.auth.user.info" $HOME/.ccloud/config | cut -d'=' -f2) $(grep "^schema.registry.url" $HOME/.ccloud/config | cut -d'=' -f2)/subjects
+# Same as above, as a single bash command to parse the values out of $HOME/.confluent/java.config
+$ curl -u $(grep "^schema.registry.basic.auth.user.info" $HOME/.confluent/java.config | cut -d'=' -f2) $(grep "^schema.registry.url" $HOME/.confluent/java.config | cut -d'=' -f2)/subjects
 ```
 
-3. Add the following parameters to your local Confluent Cloud configuration file (``$HOME/.ccloud/config``). In the output below, substitute values for `<SR API KEY>`, `<SR API SECRET>`, and `<SR ENDPOINT>`.
+3. Add the following parameters to your local Confluent Cloud configuration file (``$HOME/.confluent/java.config``). In the output below, substitute values for `<SR API KEY>`, `<SR API SECRET>`, and `<SR ENDPOINT>`.
 
 ```shell
-$ cat $HOME/.ccloud/config
+$ cat $HOME/.confluent/java.config
 ...
 basic.auth.credentials.source=USER_INFO
 schema.registry.basic.auth.user.info=<SR API KEY>:<SR API SECRET>
@@ -111,7 +99,7 @@ Note that your VPC must be able to connect to the Confluent Cloud Schema Registr
 4. Create the topic in Confluent Cloud
 
 ```bash
-$ kafka-topics --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.ccloud/config | tail -1` --command-config $HOME/.ccloud/config --topic test2 --create --replication-factor 3 --partitions 6
+$ kafka-topics --bootstrap-server `grep "^\s*bootstrap.server" $HOME/.confluent/java.config | tail -1` --command-config $HOME/.confluent/java.config --topic test2 --create --replication-factor 3 --partitions 6
 ```
 
 5. Run the [Confluent CLI producer](https://docs.confluent.io/current/cli/command-reference/confluent-produce.html#cli-confluent-produce?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), writing messages to topic `test2`, passing in additional arguments. The additional Schema Registry parameters are required to be passed in as properties instead of a properties file due to https://github.com/confluentinc/schema-registry/issues/1052.
````
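Several commands in this README shell out to `grep`/`cut` to pull a single value out of the config file. A rough Python equivalent of that one-liner, for readers who want the same lookup in a script (behavior differs slightly from `cut -d'=' -f2` when the value itself contains `=`):

```python
def config_value(path, key):
    """Analogue of: grep "^<key>" <path> | cut -d'=' -f2
    Return the value of the first line starting with the given key,
    or None if no such line exists."""
    with open(path) as fp:
        for line in fp:
            if line.startswith(key):
                return line.split("=", 1)[1].strip()
    return None
```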

clients/cloud/confluent-cli/confluent-cli-ccsr-example.sh (+1, -1)

````diff
@@ -1,6 +1,6 @@
 #!/bin/bash
 
-CONFIG_FILE=~/.ccloud/config
+CONFIG_FILE=$HOME/.confluent/java.config
 
 source ../../../utils/helper.sh
 
````

clients/cloud/confluent-cli/confluent-cli-example.sh (+1, -1)

````diff
@@ -1,6 +1,6 @@
 #!/bin/bash
 
-CONFIG_FILE=~/.ccloud/config
+CONFIG_FILE=$HOME/.confluent/java.config
 
 source ../../../utils/helper.sh
 
````

clients/cloud/csharp/Program.cs (+6, -17)

````diff
@@ -28,18 +28,6 @@ namespace CCloud
 {
     class Program
     {
-        /// <summary>
-        /// Extract the value associated with key <paramref name="key"/> from the
-        /// sasl.jaas.config (<paramref name="jaasConfig"/>) configuration value.
-        /// </summary>
-        static string ExtractJaasValue(string jaasConfig, string key)
-        {
-            var beginToken = key + "\\=\"";
-            var startIdx = jaasConfig.IndexOf(beginToken);
-            var endIdx = jaasConfig.IndexOf("\"", startIdx + beginToken.Length);
-            return jaasConfig.Substring(startIdx + beginToken.Length, endIdx - startIdx - beginToken.Length);
-        }
-
         static async Task<ClientConfig> LoadConfig(string configPath, string certDir)
         {
             try
@@ -49,14 +37,15 @@ static async Task<ClientConfig> LoadConfig(string configPath, string certDir)
                     .ToDictionary(
                         line => line.Substring(0, line.IndexOf('=')),
                         line => line.Substring(line.IndexOf('=') + 1));
-
+                Enum.TryParse(cloudConfig["sasl.mechanisms"], out SaslMechanism saslMechanism);
+                Enum.TryParse(cloudConfig["security.protocol"], out SecurityProtocol securityProtocol);
                 var clientConfig = new ClientConfig
                 {
                     BootstrapServers = cloudConfig["bootstrap.servers"].Replace("\\", ""),
-                    SaslMechanism = SaslMechanism.Plain,
-                    SecurityProtocol = SecurityProtocol.SaslSsl,
-                    SaslUsername = ExtractJaasValue(cloudConfig["sasl.jaas.config"], "username"),
-                    SaslPassword = ExtractJaasValue(cloudConfig["sasl.jaas.config"], "password")
+                    SaslMechanism = saslMechanism,
+                    SecurityProtocol = securityProtocol,
+                    SaslUsername = cloudConfig["sasl.username"],
+                    SaslPassword = cloudConfig["sasl.password"],
                 };
 
                 if (certDir != null)
````
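The replacement relies on C#'s `Enum.TryParse`, which on a failed parse returns `false` and leaves the `out` variable at its default rather than throwing. A loose Python sketch of that contract (the enum members here are illustrative assumptions for the sketch, not the Confluent.Kafka definitions):

```python
from enum import Enum

class SaslMechanism(Enum):
    # Hypothetical members mirroring the shape of the C# enum.
    Plain = "PLAIN"
    ScramSha256 = "SCRAM-SHA-256"

def try_parse(enum_cls, text, default):
    """Loose analogue of Enum.TryParse: match by member name or value,
    and on failure fall back to the caller's default instead of raising."""
    for member in enum_cls:
        if member.name == text or member.value == text:
            return member
    return default
```

One design note: the config file stores wire-format strings such as `PLAIN`, so a parser like this should decide explicitly how those map onto enum member names; silently falling back to a default can mask a typo in the config file.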

clients/cloud/csharp/README.md (+6, -19)

````diff
@@ -6,21 +6,8 @@ Produce messages to and consume messages from a Kafka cluster using the .NET Pro
 # Prerequisites
 
 * [.NET Core 2.1](https://dotnet.microsoft.com/download) or higher to run the demo application
-
-To run this example, create a local file with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster.
-If this is a Confluent Cloud cluster, you must have:
-
-* Access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
-* Initialize a properties file at `$HOME/.ccloud/config` with configuration to your Confluent Cloud cluster:
-
-```shell
-$ cat $HOME/.ccloud/config
-bootstrap.servers=<BROKER ENDPOINT>
-ssl.endpoint.identification.algorithm=https
-security.protocol=SASL_SSL
-sasl.mechanism=PLAIN
-sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username\="<API KEY>" password\="<API SECRET>";
-```
+* Create a local file (e.g. at `$HOME/.confluent/librdkafka.config`) with configuration parameters to connect to your Kafka cluster, which can be on your local host, [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud), or any other cluster. Follow [these detailed instructions](https://github.com/confluentinc/configuration-templates/tree/master/README.md) to properly create this file.
+* If you are running on Confluent Cloud, you must have access to a [Confluent Cloud](https://www.confluent.io/confluent-cloud/?utm_source=github&utm_medium=demo&utm_campaign=ch.examples_type.community_content.clients-ccloud) cluster
 
 # Example
 
@@ -40,10 +27,10 @@ Run the example application, passing in arguments for (a) whether to produce or
 $ dotnet build
 
 # Run the producer (Windows)
-$ dotnet run produce test1 %HOMEPATH%/.ccloud/config /path/to/curl/cacert.pem
+$ dotnet run produce test1 $HOME/.confluent/librdkafka.config /path/to/curl/cacert.pem
 
 # Run the producer (other)
-$ dotnet run produce test1 $HOME/.ccloud/config
+$ dotnet run produce test1 $HOME/.confluent/librdkafka.config
 ```
 
 You should see:
@@ -78,10 +65,10 @@ Run the consumer, passing in arguments for (a) whether to produce or consume (co
 
 ```shell
 # Run the consumer (Windows)
-$ dotnet run consume test1 %HOMEPATH%/.ccloud/config /path/to/curl/cacert.pem
+$ dotnet run consume test1 $HOME/.confluent/librdkafka.config /path/to/curl/cacert.pem
 
 # Run the consumer (other)
-$ dotnet run consume test1 $HOME/.ccloud/config
+$ dotnet run consume test1 $HOME/.confluent/librdkafka.config
 ```
 
 You should see:
````
