Contributing to Camel-Kafka-connector

First of all, thank you for your interest in contributing to Camel-Kafka-connector.

The Apache Camel community is a great group of contributors, and Camel-Kafka-connector is one of the newest subprojects in the ecosystem.

There are multiple areas in which camel-kafka-connector could be improved. Here are some examples:

  • Read through the basic documentation - if something is not clear, let us know or fix it yourself.

  • Try the Getting Started guides.

  • Dig into the codebase and tune some operations or add new features.

  • Take a look at the open issues and leave a comment on an issue to let us know you are working on it.

Getting in touch

Apache Camel Kafka Connector is an Apache Software Foundation project.

All communication is subject to the ASF Code of Conduct.

There are various ways of communicating with the Camel community.

We can also be reached on the Zulip chat at https://camel.zulipchat.com.

We track issues using the issue tracker on GitHub.

When you’re ready to contribute, open a pull request against the camel-kafka-connector repository.

Expect your pull request to receive a review, and be prepared to respond to the review comments on GitHub.

Building the project

Basically, you can run:

mvn clean package

Build the project and run integration tests

To build the project, it is sufficient to run:

mvn clean install

To run the integration tests, you need to:

  • have Docker version 17.05 or higher running (a quick version check is shown after this list)

  • run:

mvn -DskipIntegrationTests=false clean verify package
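To quickly verify that the local Docker daemon meets that version requirement, you can run (a minimal check, assuming the standard docker CLI is installed):

docker version --format '{{.Server.Version}}'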

It is also possible to point the tests at external services. To do so, set the properties for the services that you want to use; the tests will then skip launching the local containers and use the existing remote instances instead. At the moment, the following properties can be set for remote testing:

  • kafka.instance.type

    • kafka.bootstrap.servers

  • aws-service.instance.type

    • access.key: AWS access key (mandatory for remote testing)

    • secret.key: AWS secret key (mandatory for remote testing)

    • aws.region: AWS region (optional)

    • aws.host: AWS host (optional)

  • aws-service.kinesis.instance.type

    • access.key: AWS access key (mandatory for remote testing)

    • secret.key: AWS secret key (mandatory for remote testing)

    • aws.region: AWS region (optional)

    • aws.host: AWS host (optional)

  • elasticsearch.instance.type

    • elasticsearch.host

    • elasticsearch.port

  • cassandra.instance.type

    • cassandra.host

    • cassandra.cql3.port

  • jms-service.instance.type

    • jms.broker.address

For example, you can run:

mvn -Dkafka.bootstrap.servers=host1:port -Dkafka.instance.type=remote -DskipIntegrationTests=false clean verify package
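The same pattern should apply to the other services listed above; for example, a run against an existing Cassandra instance might look like the following (a sketch, assuming cassandra.instance.type accepts the value remote in the same way as kafka.instance.type):

mvn -Dcassandra.instance.type=remote -Dcassandra.host=host1 -Dcassandra.cql3.port=9042 -DskipIntegrationTests=false clean verify package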

It’s possible to use a properties file to set these properties. To do so, use -Dtest.properties=/path/to/file.properties.
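For instance, the remote Kafka example above could be written to a properties file (the file name and location below are placeholders) and passed to Maven:

# contents of /path/to/file.properties
kafka.instance.type=remote
kafka.bootstrap.servers=host1:port

mvn -Dtest.properties=/path/to/file.properties -DskipIntegrationTests=false clean verify package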

Running Salesforce integration tests

The first step is to create an account on Salesforce. The service allows developers to create a free account for testing. The account can be created on their login website.

The next step is to create a new connected application. This will provide you with the API keys that allow the automation to run. The Salesforce help contains the documentation and steps related to this part of the process.

After you create the API keys, proceed to adjust a few more parameters necessary for the tests to run. Specifically, the IP Relaxation/range policy parameter should be adjusted (usually to the value "Relax IP restrictions").

You need to set up Change Data Capture to allow our tests to read from the Account table. The Change Data Capture Developer Guide in the Salesforce documentation is the recommended starting point for this.

Lastly, you need to create the configuration files for the sfdx CLI client. The CLI client interacts with the account, creating, updating and deleting records as required for the test execution.

To generate the configuration files, execute the following steps:

  1. Run the Salesforce CLI container: docker run --rm --name salesforce-cli -it -v /path/to/sfdx:/root/.sfdx salesforce/salesforcedx

  2. Within the container, use the following command to login: sfdx force:auth:device:login -s -d -i <client ID>

  3. Provide the client secret when requested and execute the steps requested by the CLI.

  4. Verify that you are logged in correctly using the following command: sfdx force:auth:list

It should present an output like:

#### authenticated orgs
ALIAS  USERNAME              ORG ID              INSTANCE URL                 OAUTH METHOD
─────  ────────────────────  ──────────────────  ───────────────────────────  ────────────
       your-user@email.com  SOME NUMERIC ID     https://eu31.salesforce.com  web

Note: after leaving the container you might need to adjust the permissions of the directory containing the sfdx configuration files (/path/to/sfdx).
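For example, on Linux you could restore ownership of that directory to your own user with a command like the following (an illustration only; adjust the path to your setup):

sudo chown -R $(id -u):$(id -g) /path/to/sfdx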

Using the IDs, credentials and configurations that you created, set the following system properties to run the tests with Maven:

  • -Dit.test.salesforce.enable=true to enable the test

  • -Dit.test.salesforce.client.id=<client ID> with the client ID obtained when you created the API keys

  • -Dit.test.salesforce.client.secret=<client secret> with the client secret obtained when you created the API keys

  • -Dit.test.salesforce.password=<user password> the password of your account

  • -Dit.test.salesforce.username=<user name> the username of your account.

  • -Dit.test.salesforce.sfdx.path=/path/to/sfdx the path to the sfdx configuration (see the note below).

Note: the it.test.salesforce.sfdx.path property should point to the directory containing the sfdx CLI client configuration.

To run the tests, enable the salesforce profile so that the DTOs are generated, and set the aforementioned properties to the values you configured previously:

mvn -U -Psalesforce -DskipIntegrationTests=false -Dit.test.salesforce.sfdx.path=/path/to/sfdx -Dit.test.salesforce.enable=true -Dit.test.salesforce.client.id=<client id> -Dit.test.salesforce.client.secret=<client secret> -Dit.test.salesforce.password=<password> -Dit.test.salesforce.username=<your account> compile test-compile test

Speeding up the build

Building the project may take several minutes to complete. This is, in part, because of how Maven works by default: downloading dependencies, going through the modules sequentially, building them and installing them.

Although this conservative approach helps the project ensure consistency and stability across builds, it fails to make full use of the processing power available on modern computers.

A few approaches to leveraging this available processing power have emerged recently. One of them works particularly well with this project: mvnd - the Maven Daemon. mvnd embeds Maven itself, leveraging its native parallelization and implementing lower-level build optimizations. It can greatly reduce the build time:

  • Build with Maven: 24:22 min.

  • Same build with mvnd: 05:09 min.

Note: consult the mvnd project documentation for details about the supported installation methods.
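For example, on systems where SDKMAN! is available, installation is typically a single command (shown as an illustration; the mvnd documentation remains the authoritative reference):

sdk install mvnd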

With mvnd installed and available on your system, building the project is as simple as replacing mvn with mvnd in almost all of the commands in this guide. A notable exception is the integration test execution.

To build the project using mvnd instead of Maven, use the following command:

mvnd -Darchetype.test.skip -DskipIntegrationTests=true clean install

Troubleshooting:

Most build problems when using mvnd are caused by misbehaving plugins. Should you face build problems using mvnd, it is recommended to retry the build using plain Maven.
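For instance, the mvnd build shown earlier can be retried with plain Maven using the same options:

mvn -Darchetype.test.skip -DskipIntegrationTests=true clean install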