diff --git a/txeventq/okafka/.gitignore b/txeventq/okafka/.gitignore index 4f33f1ba..18997188 100644 --- a/txeventq/okafka/.gitignore +++ b/txeventq/okafka/.gitignore @@ -24,4 +24,3 @@ doc/ .settings/ ojdbc.properties -TxProducer/ diff --git a/txeventq/okafka/Quickstart/README.md b/txeventq/okafka/Quickstart/README.md index 99e1615a..4b66cb90 100644 --- a/txeventq/okafka/Quickstart/README.md +++ b/txeventq/okafka/Quickstart/README.md @@ -105,7 +105,7 @@ END; ## Step 4: Investigate and Try Simple Producer and Consumer -The repository contains 2 common OKafka application examples in `Simple` folder. +The repository contains two common OKafka application examples in `Simple` folder. 1. The Producer `ProducerOKafka.java` @@ -197,7 +197,7 @@ You should see some output that looks very similar to this: 13:33:31.862 [main] INFO org.oracle.okafka.clients.producer.KafkaProducer -- [Producer clientId=] Overriding the default retries config to the recommended value of 2147483647 since the idempotent producer is enabled. 13:33:31.865 [kafka-producer-network-thread | ] DEBUG org.oracle.okafka.clients.producer.internals.SenderThread -- [Producer clientId=] Starting Kafka producer I/O thread. 13:33:31.866 [kafka-producer-network-thread | ] DEBUG org.oracle.okafka.clients.producer.internals.SenderThread -- [Producer clientId=] Sender waiting for 100 -13:33:31.866 [main] INFO org.apache.kafka.common.utils.AppInfoParser -- Kafka version: 2.8.1 +13:33:31.866 [main] INFO org.apache.kafka.common.utils.AppInfoParser -- Kafka version: 3.7.1 13:33:31.867 [main] INFO org.apache.kafka.common.utils.AppInfoParser -- Kafka commitId: 839b886f9b732b15 13:33:31.867 [main] INFO org.apache.kafka.common.utils.AppInfoParser -- Kafka startTimeMs: 1724258011865 13:33:31.867 [main] DEBUG org.oracle.okafka.clients.producer.KafkaProducer -- [Producer clientId=] Kafka producer started @@ -229,8 +229,6 @@ Initiating close 13:33:48.738 [main] INFO org.apache.kafka.common.utils.AppInfoParser -- App info kafka.producer for unregistered 13:33:48.738 [main] DEBUG org.oracle.okafka.clients.producer.KafkaProducer -- [Producer clientId=] Kafka producer has been closed -BUILD SUCCESSFUL in 17s -3 actionable tasks: 3 executed ``` And, querying the topic `TOPIC_1` at the Database, you should see some output that looks very similar to this: @@ -290,7 +288,7 @@ gradle :Simple:Consumer:run ..... value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer -[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version: 2.8.1 +[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version: 3.7.1 [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId: 839b886f9b732b15 [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka startTimeMs: 1724268189943 [main] INFO org.oracle.okafka.clients.NetworkClient - [Consumer clientId=consumer-consumer_grp_1-1, groupId=consumer_grp_1] Available Nodes 1 @@ -301,10 +299,68 @@ gradle :Simple:Consumer:run [main] INFO org.oracle.okafka.clients.NetworkClient - [Consumer clientId=consumer-consumer_grp_1-1, groupId=consumer_grp_1] Reconnect successful to node 1:localhost:1521:FREEPDB1:FREE:OKAFKA_USER [main] INFO org.oracle.okafka.clients.Metadata - Cluster ID: FREE [main] INFO org.oracle.okafka.clients.NetworkClient - [Consumer clientId=consumer-consumer_grp_1-1, groupId=consumer_grp_1] Available Nodes 1 +No Record Fetched. 
Retrying in 1 second
+partition = 0, offset = 0, key = Just some key for OKafka0, value =0This is test with 128 characters Payload used to test Oracle Kafka. Read https://github.com/oracle/okafka/blob/master/README.md
+ partition = 0, offset = 1, key = Just some key for OKafka1, value =1This is test with 128 characters Payload used to test Oracle Kafka. Read https://github.com/oracle/okafka/blob/master/README.md
+ partition = 0, offset = 2, key = Just some key for OKafka2, value =2This is test with 128 characters Payload used to test Oracle Kafka. Read https://github.com/oracle/okafka/blob/master/README.md
+ partition = 0, offset = 3, key = Just some key for OKafka3, value =3This is test with 128 characters Payload used to test Oracle Kafka. Read https://github.com/oracle/okafka/blob/master/README.md
+ partition = 0, offset = 4, key = Just some key for OKafka4, value =4This is test with 128 characters Payload used to test Oracle Kafka. Read https://github.com/oracle/okafka/blob/master/README.md
+ partition = 0, offset = 5, key = Just some key for OKafka5, value =5This is test with 128 characters Payload used to test Oracle Kafka. Read https://github.com/oracle/okafka/blob/master/README.md
+ partition = 0, offset = 6, key = Just some key for OKafka6, value =6This is test with 128 characters Payload used to test Oracle Kafka. Read https://github.com/oracle/okafka/blob/master/README.md
+ partition = 0, offset = 7, key = Just some key for OKafka7, value =7This is test with 128 characters Payload used to test Oracle Kafka. Read https://github.com/oracle/okafka/blob/master/README.md
+ partition = 0, offset = 8, key = Just some key for OKafka8, value =8This is test with 128 characters Payload used to test Oracle Kafka. Read https://github.com/oracle/okafka/blob/master/README.md
+ partition = 0, offset = 9, key = Just some key for OKafka9, value =9This is test with 128 characters Payload used to test Oracle Kafka. Read https://github.com/oracle/okafka/blob/master/README.md
+ Committing records10
+No Record Fetched. Retrying in 1 second
+```
-.....
+## Step 5: Investigate and Try Administration API
+
+With the Administration API it is possible to create and delete topics.
+
+### Task 1: Try the Administration API
+
+Let’s build and run the Admin Example Class. Use your IDE or open a command line (or terminal) and navigate to the folder
+where you have the project files `/`. We can build and run the application by issuing the following command:
+
+```cmd
+gradle Simple:Admin:run
+Usage: java OKafkaAdminTopic [CREATE|DELETE] topic1 ... topicN
+```
+
+This command requires at least two parameters. The first specifies whether to create or delete the topics listed
+after it. For example:
+
+```shell
+gradle Simple:Admin:run --args="CREATE TOPIC_ADMIN_2 TOPIC_ADMIN_3"
+```
+
+As a result, you will see the two new topics created.
+
+```sql
+SQL> select name, queue_table, dequeue_enabled,enqueue_enabled, sharded, queue_category, recipients
+ 2 from all_queues
+ 3 where OWNER='OKAFKA_USER'
+ 4* and QUEUE_TYPE<>'EXCEPTION_QUEUE';
+
+NAME             QUEUE_TABLE      DEQUEUE_ENABLED    ENQUEUE_ENABLED    SHARDED    QUEUE_CATEGORY               RECIPIENTS
+________________ ________________ __________________ __________________ __________ ____________________________ _____________
+......
+TOPIC_ADMIN_2    TOPIC_ADMIN_2    YES                YES                TRUE       Sharded Queue                MULTIPLE
+TOPIC_ADMIN_3    TOPIC_ADMIN_3    YES                YES                TRUE       Sharded Queue                MULTIPLE
```
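Under the hood, `OKafkaAdminTopic` drives the standard Kafka `Admin` interface through OKafka's `AdminClient`, which connects to the database rather than to a broker. The snippet below is only a minimal sketch of that flow, not the example source: the class name and topic settings are illustrative, the connection properties mirror the Quickstart defaults, and the partition count and replication factor are assumptions to adjust for your environment.

```java
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.NewTopic;
import org.oracle.okafka.clients.admin.AdminClient;

public class CreateTopicSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Database listener host:port, service name, and the folder holding ojdbc.properties
        props.put("bootstrap.servers", "localhost:1521");
        props.put("oracle.service.name", "FREEPDB1");
        props.put("security.protocol", "PLAINTEXT");
        props.put("oracle.net.tns_admin", ".");

        // AdminClient.create() opens the database connection; each NewTopic becomes a TxEventQ topic
        try (Admin admin = AdminClient.create(props)) {
            NewTopic topic = new NewTopic("TOPIC_ADMIN_2", 1, (short) 0); // illustrative partition/replication values
            admin.createTopics(Collections.singletonList(topic)).all().get();
            System.out.println("Created topic " + topic.name());
        }
    }
}
```

Deleting follows the same pattern through `admin.deleteTopics(Collections.singletonList("TOPIC_ADMIN_2"))`.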
+
+## Transactions in OKafka Examples
+
+The Kafka Client for Oracle Transactional Event Queues allows developers to use the transaction API effectively.
+
+Transactions allow for atomic writes across multiple TxEventQ topics and partitions, ensuring that either all messages
+within the transaction are successfully written, or none are. For instance, if an error occurs during processing, the
+transaction may be aborted, preventing any of the messages from being committed to the topic or accessed by consumers.
+
+You can now build and run the [Transactional Examples](./Transactional/TRANSACTIONAL_EXAMPLES.MD).
+
## Want to Learn More?
- [Kafka APIs for Oracle Transactional Event Queues](https://docs.oracle.com/en/database/oracle/oracle-database/19/adque/)
diff --git a/txeventq/okafka/Quickstart/Simple/Admin/src/main/java/org/oracle/okafka/examples/OKafkaDeleteTopic.java b/txeventq/okafka/Quickstart/Simple/Admin/src/main/java/org/oracle/okafka/examples/OKafkaDeleteTopic.java
deleted file mode 100644
index ce0be2fb..00000000
--- a/txeventq/okafka/Quickstart/Simple/Admin/src/main/java/org/oracle/okafka/examples/OKafkaDeleteTopic.java
+++ /dev/null
@@ -1,58 +0,0 @@
-/*
- ** OKafka Java Client version 23.4.
- **
- ** Copyright (c) 2019, 2024 Oracle and/or its affiliates.
- ** Licensed under the Universal Permissive License v 1.0 as shown at https://oss.oracle.com/licenses/upl.
- */
-
-package org.oracle.okafka.examples;
-
-import java.util.Collections;
-import java.util.Properties;
-import java.util.concurrent.ExecutionException;
-
-import org.apache.kafka.clients.admin.Admin;
-import org.apache.kafka.clients.admin.CreateTopicsResult;
-import org.apache.kafka.clients.admin.NewTopic;
-import org.apache.kafka.common.KafkaFuture;
-import org.oracle.okafka.clients.admin.AdminClient;
-import org.oracle.okafka.clients.admin.DeleteTopicsResult;
-import org.oracle.okafka.clients.admin.KafkaAdminClient;
-
-public class OKafkaDeleteTopic {
-
-    public static void main(String[] args) {
-        Properties props = new Properties();
-        //IP or Host name where Oracle Database 23c is running and Database Listener's Port
-        props.put("bootstrap.servers", "localhost:1521");
-        //name of the service running on the database instance
-        props.put("oracle.service.name", "FREEPDB1");
-        props.put("security.protocol","PLAINTEXT");
-        // location for ojdbc.properties file where user and password properties are saved
-        props.put("oracle.net.tns_admin",".");
-
-        /*
-        //Option 2: Connect to Oracle Autonomous Database using Oracle Wallet
-        //This option to be used when connecting to Oracle autonomous database instance on OracleCloud
-        props.put("security.protocol","SSL");
-        // location for Oracle Wallet, tnsnames.ora file and ojdbc.properties file
-        props.put("oracle.net.tns_admin",".");
-        props.put("tns.alias","Oracle23ai_high");
-        */
-        try (Admin admin = AdminClient.create(props)) {
-
-            org.apache.kafka.clients.admin.DeleteTopicsResult delResult = admin.deleteTopics(Collections.singletonList("TXEQ"));
-
-            //DeleteTopicsResult delResult = kAdminClient.deleteTopics(Collections.singletonList("TEQ2"), new org.oracle.okafka.clients.admin.DeleteTopicsOptions());
-
-            Thread.sleep(5000);
-            System.out.println("Auto Clsoing admin now");
-        }
-        catch(Exception e)
-        {
-            System.out.println("Exception while creating topic " + e);
-            e.printStackTrace();
-        }
-    }
-
-}
diff --git
a/txeventq/okafka/Quickstart/Simple/Admin/src/main/resources/config.properties b/txeventq/okafka/Quickstart/Simple/Admin/src/main/resources/config.properties index c3a12208..e7a45cbf 100644 --- a/txeventq/okafka/Quickstart/Simple/Admin/src/main/resources/config.properties +++ b/txeventq/okafka/Quickstart/Simple/Admin/src/main/resources/config.properties @@ -6,7 +6,6 @@ bootstrap.servers= oracle.service.name= oracle.net.tns_admin= - #Option 2: Connect to Oracle Database deployed in Oracle Autonomous Cloud using Wallet security.protocol=PLAINTEXT #oracle.net.tns_admin= diff --git a/txeventq/okafka/Quickstart/Simple/Consumer/src/main/resources/config.properties b/txeventq/okafka/Quickstart/Simple/Consumer/src/main/resources/config.properties index 551aa833..1ef6b6fe 100644 --- a/txeventq/okafka/Quickstart/Simple/Consumer/src/main/resources/config.properties +++ b/txeventq/okafka/Quickstart/Simple/Consumer/src/main/resources/config.properties @@ -12,14 +12,16 @@ oracle.net.tns_admin= #tns.alias= # Application specific OKafka consumer properties -topic.name= +topic.name= group.id= - enable.auto.commit=true max.poll.records=1000 default.api.timeout.ms=180000 +# Start consuming from the beginning (Default = latest); +auto.offset.reset=earliest + key.deserializer=org.apache.kafka.common.serialization.StringDeserializer value.deserializer=org.apache.kafka.common.serialization.StringDeserializer diff --git a/txeventq/okafka/Quickstart/Simple/Producer/src/main/resources/config.properties b/txeventq/okafka/Quickstart/Simple/Producer/src/main/resources/config.properties index a790c136..c660f18b 100644 --- a/txeventq/okafka/Quickstart/Simple/Producer/src/main/resources/config.properties +++ b/txeventq/okafka/Quickstart/Simple/Producer/src/main/resources/config.properties @@ -12,7 +12,8 @@ oracle.net.tns_admin= #tns.alias= #Appliction specific OKafka Producer properties -topic.name= +allow.auto.create.topics=FALSE +topic.name= batch.size=200 linger.ms=100 diff --git a/txeventq/okafka/Quickstart/Transactional/Consumer/src/main/java/org/oracle/okafka/examples/TransactionalConsumerOKafka.java b/txeventq/okafka/Quickstart/Transactional/Consumer/src/main/java/org/oracle/okafka/examples/TransactionalConsumerOKafka.java index 65aedbaa..8b584c8c 100644 --- a/txeventq/okafka/Quickstart/Transactional/Consumer/src/main/java/org/oracle/okafka/examples/TransactionalConsumerOKafka.java +++ b/txeventq/okafka/Quickstart/Transactional/Consumer/src/main/java/org/oracle/okafka/examples/TransactionalConsumerOKafka.java @@ -7,6 +7,9 @@ package org.oracle.okafka.examples; +import java.io.FileNotFoundException; +import java.io.IOException; +import java.io.InputStream; import java.util.Properties; import java.sql.Connection; import java.time.Duration; @@ -24,68 +27,32 @@ import org.oracle.okafka.clients.consumer.KafkaConsumer; - public class TransactionalConsumerOKafka { - // Dummy implementation of ConsumerRebalanceListener interface - // It only maintains the list of assigned partitions in assignedPartitions list - static class ConsumerRebalance implements ConsumerRebalanceListener { - - public List assignedPartitions = new ArrayList(); + public static void main(String[] args) { + System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", "DEBUG"); - @Override - public synchronized void onPartitionsAssigned(Collection partitions) { - System.out.println("Newly Assigned Partitions:"); - for (TopicPartition tp :partitions ) { - System.out.println(tp); - assignedPartitions.add(tp); + // Get application properties + Properties 
appProperties = null; + try { + appProperties = getProperties(); + if (appProperties == null) { + System.out.println("Application properties not found!"); + System.exit(-1); } + } catch (Exception e) { + System.out.println("Application properties not found!"); + System.out.println("Exception: " + e); + System.exit(-1); } - @Override - public synchronized void onPartitionsRevoked(Collection partitions) { - System.out.println("Revoked previously assigned partitions. "); - for (TopicPartition tp :assignedPartitions ) { - System.out.println(tp); - } - assignedPartitions.clear(); - } - } + String topicName = appProperties.getProperty("topic.name", "TXEQ"); + appProperties.remove("topic.name"); // Pass props to build OKafkaProducer - public static void main(String[] args) { - Properties props = new Properties(); - - // Option 1: Connect to Oracle Database with database username and password - props.put("security.protocol","PLAINTEXT"); - //IP or Host name where Oracle Database 23ai is running and Database Listener's Port - props.put("bootstrap.servers", "localhost:1521"); - props.put("oracle.service.name", "freepdb1"); //name of the service running on the database instance - // location for ojdbc.properties file where user and password properties are saved - props.put("oracle.net.tns_admin","."); - - /* - //Option 2: Connect to Oracle Autonomous Database using Oracle Wallet - //This option to be used when connecting to Oracle autonomous database instance on OracleCloud - props.put("security.protocol","SSL"); - // location for Oracle Wallet, tnsnames.ora file and ojdbc.properties file - props.put("oracle.net.tns_admin","."); - props.put("tns.alias","Oracle23ai_high"); - */ - - //Consumer Group Name - props.put("group.id" , "CG1"); - props.put("enable.auto.commit","false"); - - // Maximum number of records fetched in single poll call - props.put("max.poll.records", 10); - - props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer"); - props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer"); - - Consumer consumer = new KafkaConsumer(props); + Consumer consumer = new KafkaConsumer(appProperties); ConsumerRebalanceListener rebalanceListener = new ConsumerRebalance(); - consumer.subscribe(Arrays.asList("TXEQ"), rebalanceListener); + consumer.subscribe(Arrays.asList(topicName), rebalanceListener); int expectedMsgCnt = 100; int msgCnt = 0; @@ -152,4 +119,55 @@ private static void processRecord(Connection conn, ConsumerRecord assignedPartitions = new ArrayList(); + + @Override + public synchronized void onPartitionsAssigned(Collection partitions) { + System.out.println("Newly Assigned Partitions:"); + for (TopicPartition tp :partitions ) { + System.out.println(tp); + assignedPartitions.add(tp); + } + } + + @Override + public synchronized void onPartitionsRevoked(Collection partitions) { + System.out.println("Revoked previously assigned partitions. 
"); + for (TopicPartition tp :assignedPartitions ) { + System.out.println(tp); + } + assignedPartitions.clear(); + } + } } diff --git a/txeventq/okafka/Quickstart/Transactional/Consumer/src/main/resources/config.properties b/txeventq/okafka/Quickstart/Transactional/Consumer/src/main/resources/config.properties new file mode 100644 index 00000000..4d9b62b7 --- /dev/null +++ b/txeventq/okafka/Quickstart/Transactional/Consumer/src/main/resources/config.properties @@ -0,0 +1,34 @@ +# OKafka Producer example properties + +#Properties to connect to Oracle Database +#Option 1: Connect to Oracle database using plaintext + +bootstrap.servers= +oracle.service.name= +oracle.net.tns_admin= + + +#Option 2: Connect to Oracle Database deployed in Oracle Autonomous Cloud using Wallet +#security.protocol=SSL +security.protocol=PLAINTEXT + +#oracle.net.tns_admin= +#tns.alias= + +# Application specific OKafka consumer properties +topic.name= +group.id= + +enable.auto.commit=false + +# Start consuming from the beginning (Default = latest); +auto.offset.reset=earliest + +# Maximum number of records fetched in single poll call +max.poll.records=10 + +default.api.timeout.ms=180000 + +key.deserializer=org.apache.kafka.common.serialization.StringDeserializer +value.deserializer=org.apache.kafka.common.serialization.StringDeserializer + diff --git a/txeventq/okafka/Quickstart/Transactional/Producer/src/main/java/org/oracle/okafka/examples/TransactionalProducerOKafka.java b/txeventq/okafka/Quickstart/Transactional/Producer/src/main/java/org/oracle/okafka/examples/TransactionalProducerOKafka.java index 703ed636..580c196e 100644 --- a/txeventq/okafka/Quickstart/Transactional/Producer/src/main/java/org/oracle/okafka/examples/TransactionalProducerOKafka.java +++ b/txeventq/okafka/Quickstart/Transactional/Producer/src/main/java/org/oracle/okafka/examples/TransactionalProducerOKafka.java @@ -7,90 +7,111 @@ package org.oracle.okafka.examples; -import org.oracle.okafka.clients.producer.KafkaProducer; import org.apache.kafka.clients.producer.Producer; import org.apache.kafka.clients.producer.ProducerRecord; -import org.apache.kafka.common.KafkaException; import org.apache.kafka.common.errors.DisconnectException; import org.apache.kafka.common.header.internals.RecordHeader; +import org.apache.kafka.common.KafkaException; + +import org.oracle.okafka.clients.producer.KafkaProducer; +import java.io.FileNotFoundException; +import java.io.IOException; +import java.io.InputStream; import java.sql.Connection; +import java.util.Optional; import java.util.Properties; +import java.util.concurrent.atomic.AtomicBoolean; public class TransactionalProducerOKafka { + public static void main(String[] args) { + System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", "DEBUG"); + + // Get application properties + Properties appProperties = null; + try { + appProperties = getProperties(); + if (appProperties == null) { + System.out.println("Application properties not found!"); + System.exit(-1); + } + } catch (Exception e) { + System.out.println("Application properties not found!"); + System.out.println("Exception: " + e); + System.exit(-1); + } + + + boolean failProcessing = false; + if ((args != null) && (args.length == 1)) { + failProcessing = args[0].equals("FAIL"); + } + + String topicName = appProperties.getProperty("topic.name", "TXEQ"); + appProperties.remove("topic.name"); // Pass props to build OKafkaProducer + + Producer producer = null; try { - Properties props = new Properties(); - - // Option 1: Connect to Oracle Database with 
database username and password - props.put("security.protocol","PLAINTEXT"); - //IP or Host name where Oracle Database 23ai is running and Database Listener's Port - props.put("bootstrap.servers", "localhost:1521"); - props.put("oracle.service.name", "freepdb1"); //name of the service running on the database instance - // location for ojdbc.properties file where user and password properties are saved - props.put("oracle.net.tns_admin","."); - - /* - //Option 2: Connect to Oracle Autonomous Database using Oracle Wallet - //This option to be used when connecting to Oracle autonomous database instance on OracleCloud - props.put("security.protocol","SSL"); - // location for Oracle Wallet, tnsnames.ora file and ojdbc.properties file - props.put("oracle.net.tns_admin","."); - props.put("tns.alias","Oracle23ai_high"); - */ - - props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer"); - props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer"); - - //Property to create a Transactional Producer - props.put("oracle.transactional.producer", "true"); - - producer = new KafkaProducer(props); + producer = new KafkaProducer(appProperties); int msgCnt = 100; - String jsonPayload = "{\"name\":\"Programmer"+msgCnt+"\",\"status\":\"classy\",\"catagory\":\"general\",\"region\":\"north\",\"title\":\"programmer\"}"; + int limitMessages = 33; + + String jsonPayload = "{\"name\":\"Programmer"+msgCnt+"\",\"status\":\"classy\",\"catagory\":\"general\"," + + "\"region\":\"north\",\"title\":\"programmer\"}"; + System.out.println(jsonPayload); producer.initTransactions(); Connection conn = ((KafkaProducer )producer).getDBConnection(); - String topicName = "TXEQ"; + // Produce 100 records in a transaction and commit. try { producer.beginTransaction(); - boolean fail = false; - for( int i=0;i producerRecord = - new ProducerRecord(topicName, i+"", jsonPayload); + new ProducerRecord(topicName, String.valueOf(messagesProduced), jsonPayload); producerRecord.headers().add(rH1).add(rH2); + try { - processRecord(conn, producerRecord); + processRecord(conn, producerRecord, failProcessing, messagesProduced, limitMessages); } catch(Exception e) { - //Retry processRecord or abort the Okafka transaction and close the producer - fail = true; + System.out.println(e.getMessage()); + //Retry processRecord or abort the OKafka transaction and close the producer + fail.set(true); break; } + producer.send(producerRecord); + messagesProduced++; } - if(fail) // Failed to process the records. Abort Okafka transaction + String returnMsg = String.format("Produced %d messages.", messagesProduced); + + if(fail.get()) { // Failed to process the records. Abort Okafka transaction producer.abortTransaction(); - else // Successfully process all the records. Commit OKafka transaction + System.out.println(returnMsg.concat(" Process aborted and not committed.")); + + } else { // Successfully process all the records. 
Commit OKafka transaction producer.commitTransaction(); + System.out.println(returnMsg); + } - System.out.println("Produced 100 messages."); - }catch( DisconnectException dcE) { + } catch( DisconnectException dcE) { producer.close(); - }catch (KafkaException e) { + } catch (KafkaException e) { producer.abortTransaction(); } } - catch(Exception e) - { + catch(Exception e) { System.out.println("Exception in Main " + e ); e.printStackTrace(); } @@ -98,8 +119,7 @@ public static void main(String[] args) { try { if(producer != null) producer.close(); - }catch(Exception e) - { + } catch(Exception e) { System.out.println("Exception while closing producer " + e); e.printStackTrace(); @@ -108,9 +128,40 @@ public static void main(String[] args) { } } - private static void processRecord(Connection conn, ProducerRecord record) throws Exception + private static void processRecord(Connection conn, ProducerRecord record, + boolean failProcessing, + int messagesProduced, int limitMessages) throws Exception { //Application specific logic + + // Samples logic to fail during execution and abort the transaction + if ((failProcessing) && (messagesProduced == limitMessages)) + throw new Exception("Exception while processing record"); + } + + private static java.util.Properties getProperties() throws IOException { + InputStream inputStream = null; + Properties appProperties; + + try { + Properties prop = new Properties(); + String propFileName = "config.properties"; + inputStream = TransactionalProducerOKafka.class.getClassLoader().getResourceAsStream(propFileName); + if (inputStream != null) { + prop.load(inputStream); + } else { + throw new FileNotFoundException("property file '" + propFileName + "' not found."); + } + appProperties = prop; + + } catch (Exception e) { + System.out.println("Exception: " + e); + throw e; + } finally { + if (inputStream != null) + inputStream.close(); + } + return appProperties; } } \ No newline at end of file diff --git a/txeventq/okafka/Quickstart/Transactional/Producer/src/main/resources/config.properties b/txeventq/okafka/Quickstart/Transactional/Producer/src/main/resources/config.properties new file mode 100644 index 00000000..520d53e4 --- /dev/null +++ b/txeventq/okafka/Quickstart/Transactional/Producer/src/main/resources/config.properties @@ -0,0 +1,31 @@ +# OKafka Producer example properties + +#Properties to connect to Oracle Database +#Option 1: Connect to Oracle database using plaintext +bootstrap.servers= +oracle.service.name= +oracle.net.tns_admin= + +#Option 2: Connect to Oracle Database deployed in Oracle Autonomous Cloud using Wallet +#security.protocol=SSL +security.protocol=PLAINTEXT + +#oracle.net.tns_admin= +#tns.alias= + +#Appliction specific OKafka Producer properties +allow.auto.create.topics=FALSE +topic.name= + +batch.size=200 +linger.ms=100 +buffer.memory=335544 + +enable.idempotence=true + +# Property to create a Transactional Producer +oracle.transactional.producer=true + +key.serializer=org.apache.kafka.common.serialization.StringSerializer +value.serializer=org.apache.kafka.common.serialization.StringSerializer + diff --git a/txeventq/okafka/Quickstart/Transactional/TRANSACTIONAL_EXAMPLES.MD b/txeventq/okafka/Quickstart/Transactional/TRANSACTIONAL_EXAMPLES.MD new file mode 100644 index 00000000..874aeda3 --- /dev/null +++ b/txeventq/okafka/Quickstart/Transactional/TRANSACTIONAL_EXAMPLES.MD @@ -0,0 +1,258 @@ +# OKafka Transactional Examples + +The repository contains two transactional OKafka application examples in `Transactional` folder. + +1. 
The Producer `TransactionalProducerOKafka.java`
+
+   - Produces 100 messages into the `TX_TOPIC_1` topic.
+
+2. The Consumer `TransactionalConsumerOKafka.java`
+
+   - Consumes 100 messages from the `TX_TOPIC_1` topic.
+
+## Task 1: Application Configuration
+
+The steps to create the topic and to configure the Producer and Consumer follow the same instructions detailed in the
+[Quickstart Guide](../README.md), with one difference: the Transactional Producer requires an additional
+property.
+
+```text
+# Property to create a Transactional Producer
+oracle.transactional.producer=true
+```
+
+## Task 2: Try the Producer
+Let’s build and run the Producer. Use your IDE or open a command line (or terminal) and navigate to the folder where you have the project
+files `/`. We can build and run the application by issuing the following command:
+
+```shell
+gradle Transactional:Producer:run
+```
+
+You should see some output that looks very similar to this:
+
+```text
+12:10:52 PM: Executing ':Transactional:Producer:TransactionalProducerOKafka.main()'...
+
+> Task :Transactional:Producer:TransactionalProducerOKafka.main()
+{"name":"Programmer100","status":"classy","catagory":"general","region":"north","title":"programmer"}
+[main] INFO org.oracle.okafka.clients.producer.ProducerConfig - ProducerConfig values:
+    acks = 1
+    batch.size = 200
+    bootstrap.servers = [localhost:1521]
+    buffer.memory = 335544
+    client.dns.lookup = use_all_dns_ips
+    client.id =
+    compression.type = none
+    connections.max.idle.ms = 540000
+    delivery.timeout.ms = 120000
+    enable.idempotence = true
+    ...
+
+[main] DEBUG org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Transactioal Producer set to true
+[main] DEBUG org.oracle.okafka.clients.Metadata - Update Metadata. isBootstap? true
+[main] DEBUG org.oracle.okafka.clients.Metadata - Updated cluster metadata version 1 to Cluster(id = null, nodes = [0:localhost:1521:FREEPDB1::], partitions = [], controller = null)
+[main] DEBUG org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Setting externally supplied database conneciton
+[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Setting externally supplied db connection null
+[main] DEBUG org.oracle.okafka.clients.producer.internals.OracleTransactionManager - [Producer clientId=] Transaction id null
+[main] WARN org.oracle.okafka.clients.producer.ProducerConfig - The configuration 'batch.size' was supplied but isn't a known config.
+[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version: 3.7.1
+[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId: 839b886f9b732b15
+[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka startTimeMs: 1724857854059
+[main] DEBUG org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Kafka producer started
+[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Transactional producer trying to connect to BootstrapNode.
0:localhost:1521:FREEPDB1:: +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Creating new connection for node 0:localhost:1521:FREEPDB1:: +[main] INFO org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Connecting to Oracle Database : jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(PORT=1521)(HOST=localhost))(CONNECT_DATA=(SERVICE_NAME=FREEPDB1))) +[main] INFO org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Database Producer Session_Info:59,29991. Process Id:103759. Instance Name:FREE +[main] INFO org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Connected nodes: [0:localhost:1521:FREEPDB1::] +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] CONNECTED NODES: [0:localhost:1521:FREEPDB1::] +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Leader node is null +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Controller Node null +[main] DEBUG org.oracle.okafka.clients.Metadata - Leader Node for Version null_1:0:localhost:1521:FREEPDB1:: +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] getDBConnection: Controller Node and LeaderNode set to 0:localhost:1521:FREEPDB1:: +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Returning already created db connection. oracle.jdbc.driver.T4CConnection@2b91004a +[main] DEBUG org.oracle.okafka.clients.producer.internals.OracleTransactionManager - [Producer clientId=] Client Transaction id FREE_OKAFKA_USER_59_29991_28_08_2024_12_10_56_638 +[main] DEBUG org.oracle.okafka.clients.producer.internals.OracleTransactionManager - [Producer clientId=] Transaction id FREE_OKAFKA_USER_59_29991_28_08_2024_12_10_56_638 +[main] DEBUG org.oracle.okafka.clients.producer.internals.OracleTransactionManager - [Producer clientId=] Client Transaction id FREE_OKAFKA_USER_59_29991_28_08_2024_12_10_56_638 +[main] DEBUG org.oracle.okafka.clients.producer.internals.OracleTransactionManager - [Producer clientId=] Client Transaction id FREE_OKAFKA_USER_59_29991_28_08_2024_12_10_56_638 +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Returning already created db connection. oracle.jdbc.driver.T4CConnection@2b91004a +.... +[main] DEBUG org.oracle.okafka.clients.Metadata - Updated cluster metadata version 2 to Cluster(id = FREE, nodes = [0:localhost:1521:FREEPDB1::], partitions = [Partition(topic = TX_TOPIC_1, partition = 0, leader = INSTANCE_0, replicas = [], isr = [], offlineReplicas = [])], controller = 0:localhost:1521:FREEPDB1::) +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Message for TopicPartition TX_TOPIC_1-0 +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Controller 0:localhost:1521:FREEPDB1:: +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Leader Node 0:localhost:1521:FREEPDB1:: +[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Available Topic Publishers Session_Info:59,29991. Process Id:103759. Instance Name:FREE. Acknowledge_mode:0. +.... 
+[main] DEBUG org.oracle.okafka.clients.producer.internals.OracleTransactionManager - [Producer clientId=] Commiting Transaction. TransactionState BEGIN. Local Transaction Id:FREE_OKAFKA_USER_59_29991_28_08_2024_12_10_56_638. Database Connection oracle.jdbc.driver.T4CConnection@2b91004a
+[main] DEBUG org.oracle.okafka.clients.producer.internals.OracleTransactionManager - [Producer clientId=] Commiting database transaction at instance: FREE. Session information: 59,29991. Process id:103759.
+Produced 100 messages.
+[main] INFO org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
+[main] INFO org.apache.kafka.common.metrics.Metrics - Metrics scheduler closed
+[main] INFO org.apache.kafka.common.metrics.Metrics - Closing reporter org.apache.kafka.common.metrics.JmxReporter
+[main] INFO org.apache.kafka.common.metrics.Metrics - Metrics reporters closed
+[main] INFO org.apache.kafka.common.utils.AppInfoParser - App info kafka.producer for unregistered
+[main] DEBUG org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Kafka producer has been closed
+Producer Closed
+```
+
+And, querying the topic `TX_TOPIC_1` at the Database, you should see some output that looks very similar to this:
+
+```sql
+SQL> select MSGID, ENQUEUE_TIME from TX_TOPIC_1;
+
+MSGID                               ENQUEUE_TIME
+___________________________________ __________________________________
+00000000000000000000000000660000    28/08/24 15:11:01,442151000 GMT
+....
+00000000000000000000000000666200    28/08/24 15:11:23,670875000 GMT
+00000000000000000000000000666300    28/08/24 15:11:23,888144000 GMT
+
+100 rows selected.
+```
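Both this run and the failure run in the next task follow the same commit-or-abort pattern from `TransactionalProducerOKafka.java`. The sketch below condenses that pattern; the topic name, payload, and connection values are illustrative (they mirror the Quickstart defaults), and error handling is simplified compared with the example source.

```java
import java.sql.Connection;
import java.util.Properties;

import org.apache.kafka.clients.producer.ProducerRecord;
import org.oracle.okafka.clients.producer.KafkaProducer;

public class TransactionalProduceSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:1521");      // database listener
        props.put("oracle.service.name", "FREEPDB1");
        props.put("security.protocol", "PLAINTEXT");
        props.put("oracle.net.tns_admin", ".");                 // folder containing ojdbc.properties
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("oracle.transactional.producer", "true");     // enables the transactional producer

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        producer.initTransactions();

        // The producer exposes the JDBC connection that carries the OKafka transaction,
        // so application SQL and the enqueued records commit or roll back together.
        Connection conn = producer.getDBConnection();

        try {
            producer.beginTransaction();
            for (int i = 0; i < 100; i++) {
                // ... application-specific work using conn could go here ...
                producer.send(new ProducerRecord<>("TX_TOPIC_1", String.valueOf(i), "payload-" + i));
            }
            producer.commitTransaction();    // all 100 records become visible atomically
        } catch (Exception e) {
            producer.abortTransaction();     // on failure nothing is committed to the topic
        } finally {
            producer.close();
        }
    }
}
```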
+
+## Task 3: Try the Producer with a Failure
+
+Now, let's see the transactional behavior at work when something goes wrong while the messages are being processed
+before sending. The Producer is prepared to abort the transaction when it reaches thirty-three messages if you ask it
+to fail by passing "FAIL" as a command-line argument.
+
+```shell
+ gradle Transactional:Producer:run --args="FAIL"
+```
+
+You should see some output that looks very similar to this:
+
+```text
+> Task :Transactional:Producer:run
+[main] INFO org.oracle.okafka.clients.producer.ProducerConfig - ProducerConfig values:
+    acks = 1
+    batch.size = 200
+    ....
+[main] DEBUG org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Transactioal Producer set to true
+[main] DEBUG org.oracle.okafka.clients.Metadata - Update Metadata. isBootstap? true
+[main] DEBUG org.oracle.okafka.clients.Metadata - Updated cluster metadata version 1 to Cluster(id = null, nodes = [0:localhost:1521:FREEPDB1::], partitions = [], controller = null)
+[main] DEBUG org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Setting externally supplied database conneciton
+[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Setting externally supplied db connection null
+[main] DEBUG org.oracle.okafka.clients.producer.internals.OracleTransactionManager - [Producer clientId=] Transaction id null
+[main] WARN org.oracle.okafka.clients.producer.ProducerConfig - The configuration 'batch.size' was supplied but isn't a known config.
+[main] WARN org.oracle.okafka.clients.producer.ProducerConfig - The configuration 'allow.auto.create.topics' was supplied but isn't a known config.
+[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version: 3.7.1
+[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId: 839b886f9b732b15
+[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka startTimeMs: 1724942627369
+[main] DEBUG org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Kafka producer started
+{"name":"Programmer100","status":"classy","catagory":"general","region":"north","title":"programmer"}
+[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Transactional producer trying to connect to BootstrapNode. 0:localhost:1521:FREEPDB1::
+....
+[main] INFO org.oracle.okafka.clients.Metadata - Cluster ID: FREE
+[main] DEBUG org.oracle.okafka.clients.Metadata - Updated cluster metadata version 2 to Cluster(id = FREE, nodes = [0:localhost:1521:FREEPDB1::], partitions = [Partition(topic = TX_TOPIC_1, partition = 0, leader = INSTANCE_0, replicas = [], isr = [], offlineReplicas = [])], controller = 0:localhost:1521:FREEPDB1::)
+[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Message for TopicPartition TX_TOPIC_1-0
+[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Controller 0:localhost:1521:FREEPDB1::
+[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Leader Node 0:localhost:1521:FREEPDB1::
+[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Available Topic Publishers Session_Info:58,59501. Process Id:153706. Instance Name:FREE. Acknowledge_mode:0.
+....
+[main] DEBUG org.oracle.okafka.clients.producer.internals.AQKafkaProducer - [Producer clientId=] Available Topic Publishers Session_Info:58,59501. Process Id:153706. Instance Name:FREE. Acknowledge_mode:0.Topics:[TX_TOPIC_1].
+Exception while processing record
+[main] INFO org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Aborting incomplete transaction
+Produced 33 messages. Process aborted and not committed.
+[main] INFO org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
+[main] INFO org.apache.kafka.common.metrics.Metrics - Metrics scheduler closed
+[main] INFO org.apache.kafka.common.metrics.Metrics - Closing reporter org.apache.kafka.common.metrics.JmxReporter
+[main] INFO org.apache.kafka.common.metrics.Metrics - Metrics reporters closed
+[main] INFO org.apache.kafka.common.utils.AppInfoParser - App info kafka.producer for unregistered
+[main] DEBUG org.oracle.okafka.clients.producer.KafkaProducer - [Producer clientId=] Kafka producer has been closed
+Producer Closed
+```
+
+And, querying the topic `TX_TOPIC_1` at the Database, you should see that the number of messages didn't change.
+
+## Task 4: Try the Consumer
+
+Let's now build and run the Consumer. Use your IDE or open a command line (or terminal) and navigate to the folder where you have the project files `/`.
We can build and run the application by issuing the following command: + +```shell +gradle Transactional:Consumer:run +``` + +You should see some output that looks very similar to this: + +```shell +> Task :Transactional:Consumer:TransactionalConsumerOKafka.main() +[main] INFO org.oracle.okafka.clients.consumer.ConsumerConfig - ConsumerConfig values: + allow.auto.create.topics = true + auto.commit.interval.ms = 5000 + auto.offset.reset = latest + bootstrap.servers = [localhost:1521] + check.crcs = true + client.dns.lookup = use_all_dns_ips + client.id = consumer-TX_CONSUMER_GRP_1-1 + client.rack = + connections.max.idle.ms = 540000 + default.api.timeout.ms = 180000 + enable.auto.commit = false + exclude.internal.topics = true + fetch.max.bytes = 52428800 + fetch.max.wait.ms = 500 + fetch.min.bytes = 1 + group.id = TX_CONSUMER_GRP_1 + group.instance.id = null + heartbeat.interval.ms = 3000 + .... + +[main] DEBUG org.oracle.okafka.clients.consumer.KafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Initializing Kafka Consumer +[main] DEBUG org.oracle.okafka.clients.consumer.KafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Setting up bootstrap cluster +[main] DEBUG org.oracle.okafka.clients.Metadata - Update Metadata. isBootstap? true +[main] DEBUG org.oracle.okafka.clients.Metadata - Updated cluster metadata version 1 to Cluster(id = null, nodes = [0:localhost:1521:FREEPDB1::], partitions = [], controller = null) +[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version: 3.7.1 +[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId: 839b886f9b732b15 +[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka startTimeMs: 1724860596395 +[main] DEBUG org.oracle.okafka.clients.consumer.KafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Kafka consumer initialized +[main] DEBUG org.oracle.okafka.clients.consumer.KafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Subscribed to topic(s): TX_TOPIC_1 +[main] INFO org.oracle.okafka.clients.NetworkClient - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Available Nodes 1 +[main] DEBUG org.oracle.okafka.clients.NetworkClient - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] 0:localhost:1521:FREEPDB1:: +[main] INFO org.oracle.okafka.clients.NetworkClient - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] All Known nodes are disconnected. Try one time to connect. +[main] INFO org.oracle.okafka.clients.NetworkClient - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Initiating connection to node 0:localhost:1521:FREEPDB1:: +[main] INFO org.oracle.okafka.clients.consumer.internals.AQKafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Connecting to Oracle Database : jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(PORT=1521)(HOST=localhost))(CONNECT_DATA=(SERVICE_NAME=FREEPDB1))) +[main] INFO org.oracle.okafka.clients.consumer.internals.AQKafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Database Consumer Session Info: 222,47901. 
Process Id 105388 Instance Name FREE +[main] DEBUG org.oracle.okafka.clients.NetworkClient - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Connection is established to node 1:localhost:1521:FREEPDB1:FREE:OKAFKA_USER +.... +[main] DEBUG org.oracle.okafka.clients.consumer.internals.AQKafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Returning Connection for node 1:localhost:1521:FREEPDB1:FREE:OKAFKA_USER +[main] DEBUG org.oracle.okafka.clients.consumer.internals.AQKafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Returning Connection for node 1:localhost:1521:FREEPDB1:FREE:OKAFKA_USER +partition = 0, offset = 100, key = 0, value ={"name":"Programmer100","status":"classy","catagory":"general","region":"north","title":"programmer"} + Header: RecordHeader(key = CLIENT_ID, value = [70, 73, 82, 83, 84, 95, 67, 76, 73, 69, 78, 84]) +Header: RecordHeader(key = REPLY_TO, value = [84, 88, 95, 84, 79, 80, 73, 67, 95, 49, 95, 82, 69, 84, 85, 82, 78]) +partition = 0, offset = 101, key = 1, value ={"name":"Programmer100","status":"classy","catagory":"general","region":"north","title":"programmer"} + Header: RecordHeader(key = CLIENT_ID, value = [70, 73, 82, 83, 84, 95, 67, 76, 73, 69, 78, 84]) +Header: RecordHeader(key = REPLY_TO, value = [84, 88, 95, 84, 79, 80, 73, 67, 95, 49, 95, 82, 69, 84, 85, 82, 78]) +partition = 0, offset = 102, key = 2, value ={"name":"Programmer100","status":"classy","catagory":"general","region":"north","title":"programmer"} +.... +[main] DEBUG org.oracle.okafka.clients.consumer.internals.AQKafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Committing now for node 1:localhost:1521:FREEPDB1:FREE:OKAFKA_USER=[TX_TOPIC_1-0] +partition = 0, offset = 197, key = 97, value ={"name":"Programmer100","status":"classy","catagory":"general","region":"north","title":"programmer"} + Header: RecordHeader(key = CLIENT_ID, value = [70, 73, 82, 83, 84, 95, 67, 76, 73, 69, 78, 84]) +Header: RecordHeader(key = REPLY_TO, value = [84, 88, 95, 84, 79, 80, 73, 67, 95, 49, 95, 82, 69, 84, 85, 82, 78]) +partition = 0, offset = 198, key = 98, value ={"name":"Programmer100","status":"classy","catagory":"general","region":"north","title":"programmer"} + Header: RecordHeader(key = CLIENT_ID, value = [70, 73, 82, 83, 84, 95, 67, 76, 73, 69, 78, 84]) +Header: RecordHeader(key = REPLY_TO, value = [84, 88, 95, 84, 79, 80, 73, 67, 95, 49, 95, 82, 69, 84, 85, 82, 78]) +partition = 0, offset = 199, key = 99, value ={"name":"Programmer100","status":"classy","catagory":"general","region":"north","title":"programmer"} + Header: RecordHeader(key = CLIENT_ID, value = [70, 73, 82, 83, 84, 95, 67, 76, 73, 69, 78, 84]) +Header: RecordHeader(key = REPLY_TO, value = [84, 88, 95, 84, 79, 80, 73, 67, 95, 49, 95, 82, 69, 84, 85, 82, 78]) +[main] DEBUG org.oracle.okafka.clients.consumer.internals.AQKafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Commit done +[main] DEBUG org.oracle.okafka.clients.NetworkClient - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Response Received OffsetCommit +[main] DEBUG org.oracle.okafka.clients.consumer.internals.ConsumerNetworkClient - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Commited to topic partiton: TX_TOPIC_1-0 with offset: OffsetAndMetadata{offset=199, leaderEpoch=null, metadata=''} +Received 100 Expected 100. Exiting Now. 
+Closing OKafka Consumer. Received 100 records. +[main] INFO org.apache.kafka.common.metrics.Metrics - Metrics scheduler closed +[main] INFO org.apache.kafka.common.metrics.Metrics - Closing reporter org.apache.kafka.common.metrics.JmxReporter +[main] INFO org.apache.kafka.common.metrics.Metrics - Metrics reporters closed +[main] INFO org.apache.kafka.common.utils.AppInfoParser - App info kafka.consumer for consumer-TX_CONSUMER_GRP_1-1 unregistered +[main] DEBUG org.oracle.okafka.clients.consumer.KafkaConsumer - [Consumer clientId=consumer-TX_CONSUMER_GRP_1-1, groupId=TX_CONSUMER_GRP_1] Kafka consumer has been closed +``` + +## License + +Copyright (c) 2019, 2024 Oracle and/or its affiliates. + +Released under the Universal Permissive License v1.0 as shown at +. + +[> Quickstart Guide](../README.md) \ No newline at end of file diff --git a/txeventq/okafka/Quickstart/build.gradle b/txeventq/okafka/Quickstart/build.gradle index 30bbb2d1..5998a3fb 100644 --- a/txeventq/okafka/Quickstart/build.gradle +++ b/txeventq/okafka/Quickstart/build.gradle @@ -11,26 +11,6 @@ buildscript { } } -allprojects { - - repositories { - mavenCentral() - } - - group = 'org.oracle.okafka' - version = '23.4.0.0' - - tasks.withType(Javadoc) { - // disable the crazy super-strict doclint tool in Java 8 - // noinspection SpellCheckingInspection - title ="Kafka Java Client for Oracle Transactional Event Queues 23.4.0.0" - options.addStringOption('Xdoclint:none', '-quiet') - options.windowTitle = "Oracle Database Transactional Event Queues Java API Reference" - options.header = """Oracle® Database Transactional Event Queues Java API Reference
23ai

FF46992-04
""" - options.bottom = """
Copyright © 2001, 2024, Oracle and/or its affiliates. All rights reserved.


""" - } -} - ext { gradleVersion = '8.10' minJavaVersion = JavaVersion.VERSION_17 @@ -41,22 +21,42 @@ ext { } -project(':Simple:Admin') { +allprojects { + + repositories { + mavenCentral() + } + apply plugin : 'java' apply plugin : 'application' sourceCompatibility = minJavaVersion targetCompatibility = minJavaVersion + group = 'org.oracle.okafka' + version = '23.5.0.0' + dependencies { // These dependencies are used by the application. - // https://mvnrepository.com/artifact/com.oracle.database.messaging/okafka - implementation group: 'com.oracle.database.messaging', name: 'okafka', version: '23.4.0.0' + implementation group: 'com.oracle.database.messaging', name: 'okafka', version: version implementation group: 'com.oracle.database.security', name: 'oraclepki', version: '23.4.0.24.05' - implementation group: 'org.apache.kafka', name: 'kafka-clients', version: '2.8.1' - implementation group: 'ch.qos.logback', name: 'logback-classic', version: '1.5.7' + implementation group: 'org.apache.kafka', name: 'kafka-clients', version: '3.7.1' + implementation group: 'org.slf4j', name: 'slf4j-simple', version: '2.0.16' + implementation group: 'org.slf4j', name: 'slf4j-api', version: '2.0.16' } + tasks.withType(Javadoc) { + // disable the crazy super-strict doclint tool in Java 8 + // noinspection SpellCheckingInspection + title ="Kafka Java Client for Oracle Transactional Event Queues ${rootProject.version}" + options.addStringOption('Xdoclint:none', '-quiet') + options.windowTitle = "Oracle Database Transactional Event Queues Java API Reference" + options.header = """Oracle® Database Transactional Event Queues Java API Reference
23ai

FF46992-04
""" + options.bottom = """
Copyright © 2001, 2024, Oracle and/or its affiliates. All rights reserved.


""" + } +} + +project(':Simple:Admin') { tasks.named('jar') { description('Generates okafka client admin jar ') @@ -75,26 +75,12 @@ project(':Simple:Admin') { tasks.named('run') { description('Run okafka client admin create topic') application { - mainClass = 'org.oracle.okafka.examples.OKafkaCreateTopic' + mainClass = 'org.oracle.okafka.examples.OKafkaAdminTopic' } } } project(':Simple:Producer') { - apply plugin: 'java' - apply plugin: 'application' - - sourceCompatibility = minJavaVersion - targetCompatibility = minJavaVersion - - dependencies { - // These dependencies are used by the application. - // implementation project(':clients') - implementation group: 'com.oracle.database.messaging', name: 'okafka', version: '23.4.0.0' - implementation group: 'com.oracle.database.security', name: 'oraclepki', version: '23.4.0.24.05' - implementation group: 'org.apache.kafka', name: 'kafka-clients', version: '2.8.1' - implementation group: 'ch.qos.logback', name: 'logback-classic', version: '1.5.7' - } tasks.named('jar') { description('Generates okafka client simple producer jar ') @@ -119,24 +105,6 @@ project(':Simple:Producer') { } project(':Simple:Consumer') { - apply plugin : 'java' - apply plugin : 'application' - - sourceCompatibility = minJavaVersion - targetCompatibility = minJavaVersion - - dependencies { - // These dependencies are used by the application. - // https://mvnrepository.com/artifact/com.oracle.database.messaging/okafka - implementation group: 'com.oracle.database.messaging', name: 'okafka', version: '23.4.0.0' - implementation group: 'com.oracle.database.security', name: 'oraclepki', version: '23.4.0.24.05' - implementation group: 'org.apache.kafka', name: 'kafka-clients', version: '2.8.1' - implementation group: 'org.slf4j', name: 'slf4j-simple', version: '2.0.16' - implementation group: 'org.slf4j', name: 'slf4j-api', version: '2.0.16' - - } - - tasks.named('jar') { description('Generates okafka client admin jar ') archiveBaseName = 'okafka' @@ -159,94 +127,48 @@ project(':Simple:Consumer') { } } -project(':Transactional:Consumer') { - apply plugin : 'java' - apply plugin : 'application' - - sourceCompatibility = minJavaVersion - targetCompatibility = minJavaVersion - - dependencies { - // These dependencies are used by the application. 
- // https://mvnrepository.com/artifact/com.oracle.database.messaging/okafka - implementation group: 'com.oracle.database.messaging', name: 'okafka', version: '23.4.0.0' - implementation group: 'com.oracle.database.security', name: 'oraclepki', version: '23.4.0.24.05' - implementation group: 'org.apache.kafka', name: 'kafka-clients', version: '2.8.1' - implementation group: 'ch.qos.logback', name: 'logback-classic', version: '1.5.7' - } - - +project(':Transactional:Producer') { tasks.named('jar') { - description('Generates okafka client consumer jar ') + description('Generates okafka client Transactional producer jar ') archiveBaseName = 'okafka' - archiveClassifier = 'consumer' + archiveClassifier = 'producer' from "${rootProject.projectDir}/LICENSE.txt" from "${rootProject.projectDir}/NOTICE" manifest { - attributes( 'Implementation-Title' : 'okafka consumer', + attributes( 'Implementation-Title' : 'okafka producer', 'Implementation-Version': project.version) } } tasks.named('run') { - description('Run okafka client Transactional consumer') + description('Run okafka client Transactional producer') application { - mainClass = 'org.oracle.okafka.examples.ConsumerOKafka' + mainClass = 'org.oracle.okafka.examples.TransactionalProducerOKafka' } } } -project(':Transactional:Producer') { - apply plugin : 'java' - apply plugin : 'application' - - sourceCompatibility = minJavaVersion - targetCompatibility = minJavaVersion - - dependencies { - // These dependencies are used by the application. - // implementation project(':clients') - implementation group: 'com.oracle.database.messaging', name: 'okafka', version: '23.4.0.0' - implementation group: 'com.oracle.database.security', name: 'oraclepki', version: '23.4.0.24.05' - implementation group: 'org.apache.kafka', name: 'kafka-clients', version: '3.7.1' - implementation group: 'ch.qos.logback', name: 'logback-classic', version: '1.5.7' - } - +project(':Transactional:Consumer') { tasks.named('jar') { - description('Generates okafka client Transactional producer jar ') + description('Generates okafka client consumer jar ') archiveBaseName = 'okafka' - archiveClassifier = 'producer' + archiveClassifier = 'consumer' from "${rootProject.projectDir}/LICENSE.txt" from "${rootProject.projectDir}/NOTICE" manifest { - attributes( 'Implementation-Title' : 'okafka producer', + attributes( 'Implementation-Title' : 'okafka consumer', 'Implementation-Version': project.version) } } tasks.named('run') { - description('Run okafka client Transactional producer') + description('Run okafka client Transactional consumer') application { - mainClass = 'org.oracle.okafka.examples.ProducerOKafka' + mainClass = 'org.oracle.okafka.examples.TransactionalConsumerOKafka' } } } - -configurations { - childJar -} - -dependencies { - subprojects.each { - childJar project(it.path) - } -} - -task multiProjectJar (type: Jar, dependsOn: configurations.childJar) { - description 'Generates a jar containing okafka client, all its dependencies and examples for okafka demo' - from { configurations.childJar.collect { zipTree(it) } } -} diff --git a/txeventq/okafka/Quickstart/gradle/wrapper/gradle-wrapper.properties b/txeventq/okafka/Quickstart/gradle/wrapper/gradle-wrapper.properties new file mode 100644 index 00000000..1af9e093 --- /dev/null +++ b/txeventq/okafka/Quickstart/gradle/wrapper/gradle-wrapper.properties @@ -0,0 +1,7 @@ +distributionBase=GRADLE_USER_HOME +distributionPath=wrapper/dists +distributionUrl=https\://services.gradle.org/distributions/gradle-8.5-bin.zip 
+networkTimeout=10000 +validateDistributionUrl=true +zipStoreBase=GRADLE_USER_HOME +zipStorePath=wrapper/dists diff --git a/txeventq/okafka/Quickstart/gradlew b/txeventq/okafka/Quickstart/gradlew index 1b6c7873..1aa94a42 100755 --- a/txeventq/okafka/Quickstart/gradlew +++ b/txeventq/okafka/Quickstart/gradlew @@ -55,7 +55,7 @@ # Darwin, MinGW, and NonStop. # # (3) This script is generated from the Groovy template -# https://github.com/gradle/gradle/blob/master/subprojects/plugins/src/main/resources/org/gradle/api/internal/plugins/unixStartScript.txt +# https://github.com/gradle/gradle/blob/HEAD/subprojects/plugins/src/main/resources/org/gradle/api/internal/plugins/unixStartScript.txt # within the Gradle project. # # You can find Gradle at https://github.com/gradle/gradle/. @@ -80,13 +80,11 @@ do esac done -APP_HOME=$( cd "${APP_HOME:-./}" && pwd -P ) || exit - -APP_NAME="Gradle" +# This is normally unused +# shellcheck disable=SC2034 APP_BASE_NAME=${0##*/} - -# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script. -DEFAULT_JVM_OPTS='"-Xmx64m" "-Xms64m"' +# Discard cd standard output in case $CDPATH is set (https://github.com/gradle/gradle/issues/25036) +APP_HOME=$( cd "${APP_HOME:-./}" > /dev/null && pwd -P ) || exit # Use the maximum available, or set MAX_FD != -1 to use that value. MAX_FD=maximum @@ -133,22 +131,29 @@ location of your Java installation." fi else JAVACMD=java - which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. + if ! command -v java >/dev/null 2>&1 + then + die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. Please set the JAVA_HOME variable in your environment to match the location of your Java installation." + fi fi # Increase the maximum file descriptors if we can. if ! "$cygwin" && ! "$darwin" && ! "$nonstop" ; then case $MAX_FD in #( max*) + # In POSIX sh, ulimit -H is undefined. That's why the result is checked to see if it worked. + # shellcheck disable=SC2039,SC3045 MAX_FD=$( ulimit -H -n ) || warn "Could not query maximum file descriptor limit" esac case $MAX_FD in #( '' | soft) :;; #( *) + # In POSIX sh, ulimit -n is undefined. That's why the result is checked to see if it worked. + # shellcheck disable=SC2039,SC3045 ulimit -n "$MAX_FD" || warn "Could not set maximum file descriptor limit to $MAX_FD" esac @@ -193,11 +198,15 @@ if "$cygwin" || "$msys" ; then done fi -# Collect all arguments for the java command; -# * $DEFAULT_JVM_OPTS, $JAVA_OPTS, and $GRADLE_OPTS can contain fragments of -# shell script including quotes and variable substitutions, so put them in -# double quotes to make sure that they get re-expanded; and -# * put everything else in single quotes, so that it's not re-expanded. + +# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script. +DEFAULT_JVM_OPTS='"-Xmx64m" "-Xms64m"' + +# Collect all arguments for the java command: +# * DEFAULT_JVM_OPTS, JAVA_OPTS, JAVA_OPTS, and optsEnvironmentVar are not allowed to contain shell fragments, +# and any embedded shellness will be escaped. +# * For example: A user cannot expect ${Hostname} to be expanded, as it is an environment variable and will be +# treated as '${Hostname}' itself on the command line. set -- \ "-Dorg.gradle.appname=$APP_BASE_NAME" \ @@ -205,6 +214,12 @@ set -- \ org.gradle.wrapper.GradleWrapperMain \ "$@" +# Stop when "xargs" is not available. +if ! 
command -v xargs >/dev/null 2>&1 +then + die "xargs is not available" +fi + # Use "xargs" to parse quoted args. # # With -n1 it outputs one arg per line, with the quotes and backslashes removed. diff --git a/txeventq/okafka/Quickstart/gradlew.bat b/txeventq/okafka/Quickstart/gradlew.bat index 107acd32..93e3f59f 100644 --- a/txeventq/okafka/Quickstart/gradlew.bat +++ b/txeventq/okafka/Quickstart/gradlew.bat @@ -14,7 +14,7 @@ @rem limitations under the License. @rem -@if "%DEBUG%" == "" @echo off +@if "%DEBUG%"=="" @echo off @rem ########################################################################## @rem @rem Gradle startup script for Windows @@ -25,7 +25,8 @@ if "%OS%"=="Windows_NT" setlocal set DIRNAME=%~dp0 -if "%DIRNAME%" == "" set DIRNAME=. +if "%DIRNAME%"=="" set DIRNAME=. +@rem This is normally unused set APP_BASE_NAME=%~n0 set APP_HOME=%DIRNAME% @@ -40,7 +41,7 @@ if defined JAVA_HOME goto findJavaFromJavaHome set JAVA_EXE=java.exe %JAVA_EXE% -version >NUL 2>&1 -if "%ERRORLEVEL%" == "0" goto execute +if %ERRORLEVEL% equ 0 goto execute echo. echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. @@ -75,13 +76,15 @@ set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar :end @rem End local scope for the variables with windows NT shell -if "%ERRORLEVEL%"=="0" goto mainEnd +if %ERRORLEVEL% equ 0 goto mainEnd :fail rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of rem the _cmd.exe /c_ return code! -if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1 -exit /b 1 +set EXIT_CODE=%ERRORLEVEL% +if %EXIT_CODE% equ 0 set EXIT_CODE=1 +if not ""=="%GRADLE_EXIT_CONSOLE%" exit %EXIT_CODE% +exit /b %EXIT_CODE% :mainEnd if "%OS%"=="Windows_NT" endlocal diff --git a/txeventq/okafka/SECURITY.md b/txeventq/okafka/SECURITY.md new file mode 100644 index 00000000..2ca81027 --- /dev/null +++ b/txeventq/okafka/SECURITY.md @@ -0,0 +1,38 @@ +# Reporting security vulnerabilities + +Oracle values the independent security research community and believes that +responsible disclosure of security vulnerabilities helps us ensure the security +and privacy of all our users. + +Please do NOT raise a GitHub Issue to report a security vulnerability. If you +believe you have found a security vulnerability, please submit a report to +[secalert_us@oracle.com][1] preferably with a proof of concept. Please review +some additional information on [how to report security vulnerabilities to Oracle][2]. +We encourage people who contact Oracle Security to use email encryption using +[our encryption key][3]. + +We ask that you do not use other channels or contact the project maintainers +directly. + +Non-vulnerability related security issues including ideas for new or improved +security features are welcome on GitHub Issues. + +## Security updates, alerts and bulletins + +Security updates will be released on a regular cadence. Many of our projects +will typically release security fixes in conjunction with the +Oracle Critical Patch Update program. Additional +information, including past advisories, is available on our [security alerts][4] +page. + +## Security-related information + +We will provide security related information such as a threat model, considerations +for secure use, or any known security issues in our documentation. Please note +that labs and sample code are intended to demonstrate a concept and may not be +sufficiently hardened for production use. 
+ +[1]: mailto:secalert_us@oracle.com +[2]: https://www.oracle.com/corporate/security-practices/assurance/vulnerability/reporting.html +[3]: https://www.oracle.com/security-alerts/encryptionkey.html +[4]: https://www.oracle.com/security-alerts/ diff --git a/txeventq/okafka/THIRD_PARTY_LICENSE.txt b/txeventq/okafka/THIRD_PARTY_LICENSE.txt new file mode 100644 index 00000000..b1a9d278 --- /dev/null +++ b/txeventq/okafka/THIRD_PARTY_LICENSE.txt @@ -0,0 +1,236 @@ +Apache Kafka +SLF4J +_______ + +Apache Kafka: +https://github.com/kafka-dev/kafka/blob/master/LICENSE +Copyright 2010 LinkedIn + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. + +You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + +See the License for the specific language governing permissions and limitations under the License. + +Note that the Gradle Wrapper may be useful in building an okafka jar file, and is available at: +https://github.com/BrightcoveOS/gradle-wrapper/blob/master/LICENSE and is licensed under the Apache License, Version 2.0 (the "License") +which is reproduced below for reference. + + +Apache License terms for reference: + + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + +"Derivative Works" shall mean any work, whether in Source or Object +form, that is based on (or derived from) the Work and for which the +editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. 
For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. + +"Contribution" shall mean any work of authorship, including +the original version of the Work and any modifications or additions +to that Work or Derivative Works thereof, that is intentionally +submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. 
You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + +6. Trademarks. This License does not grant permission to use the trade +names, trademarks, service marks, or product names of the Licensor, +except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. + +7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this + License. + + 8. Limitation of Liability. 
In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "{}" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + + Copyright {yyyy} {name of copyright owner} + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. + + +SLF4J: +Licensing terms are at https://www.slf4j.org/license.html and are reproduced below. + +Copyright (c) 2004-2017 QOS.ch + +All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated +documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, +copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software +is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE +FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION +WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. + +These terms are identical to those of the MIT License, also called the X License or the X11 License, which is a simple, permissive +non-copyleft free software license. It is deemed compatible with virtually all types of licenses, commercial or otherwise. +In particular, the Free Software Foundation has declared it compatible with GNU GPL. It is also known to be approved by the +Apache Software Foundation as compatible with Apache Software License.