kafka-tools
This project is based on the cppkafka library (https://github.com/mfontanini/cppkafka), which is in turn based on the librdkafka library (https://github.com/confluentinc/librdkafka).
This image is already available at GitHub Container Registry and Docker Hub for every repository tag, and also for the master branch as latest:
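For example (the image coordinates below are assumptions derived from the repository owner and name; check the registry for the exact path):

```bash
# Pull from GitHub Container Registry (image path is an assumption):
docker pull ghcr.io/testillano/kafka-tools:latest
# Or from Docker Hub:
docker pull testillano/kafka-tools:latest
```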
You could also build it using the script ./build.sh located at the project root:
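For example, from a repository checkout:

```bash
# Build the docker image(s) locally:
./build.sh
```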
This image is built with ./Dockerfile.
To run a compilation with this image, just run it with docker. The entrypoint (see ./deps/build.sh) falls back from cmake (looking for a CMakeLists.txt file at the project root, i.e. mounted on the working directory /code, in order to generate makefiles) to make, to build your source code. The builder script honors two environment variables: BUILD_TYPE (for cmake) and MAKE_PROCS (for make):
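A minimal invocation sketch, assuming the image tag shown earlier and the project sources mounted from the current directory:

```bash
# Compile the sources mounted on /code (image name is an assumption):
docker run --rm -it \
       -e BUILD_TYPE=Release \
       -e MAKE_PROCS=$(nproc) \
       -v "${PWD}":/code \
       ghcr.io/testillano/kafka-tools:latest
```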
This image is also available at GitHub Container Registry and Docker Hub for every repository tag, and also for the master branch as latest:
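For example (the builder image path is an assumption; check the registry for the exact name):

```bash
docker pull ghcr.io/testillano/kafka-tools_build:latest
```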
You could also build it using the script ./build.sh located at the project root:
This image is built with ./Dockerfile.build.
The builder image is used to build the project library. To run a compilation with this image, again, just run it with docker:
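For instance (the image name is an assumption):

```bash
# Build the project library, mounting the repository on the working directory /code:
docker run --rm -it -v "${PWD}":/code ghcr.io/testillano/kafka-tools_build:latest
```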
You could also generate documentation by passing extra arguments to the underlying entrypoint:
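For instance, assuming the entrypoint forwards its arguments as build targets (the 'doc' target shown here is hypothetical):

```bash
docker run --rm -it -v "${PWD}":/code ghcr.io/testillano/kafka-tools_build:latest doc
```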
You could also build the library using the script ./build.sh located at the project root:
This is a cmake-based project, so you may need to install cmake:
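For example, with your distribution package manager:

```bash
# Debian/Ubuntu:
sudo apt-get install -y cmake
# Fedora/CentOS:
sudo dnf install -y cmake
```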
And then generate the makefiles from the project root directory:
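For example:

```bash
# In-source generation from the project root:
cmake .
# Or out-of-source, keeping build artifacts in a separate directory:
mkdir -p build && cd build && cmake ..
```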
You could specify the build type, 'Debug' or 'Release', for example:
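```bash
cmake -DCMAKE_BUILD_TYPE=Release .
# or
cmake -DCMAKE_BUILD_TYPE=Debug .
```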
You could also change the compilers used:
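For example, through cmake variables (compiler paths are illustrative):

```bash
cmake -DCMAKE_C_COMPILER=/usr/bin/gcc -DCMAKE_CXX_COMPILER=/usr/bin/g++ .
```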
or
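through environment variables, read by cmake on the first configuration (compiler paths are illustrative):

```bash
CC=/usr/bin/clang CXX=/usr/bin/clang++ cmake .
```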
Optionally you could specify another prefix for installation:
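For example (the prefix below is just an illustration):

```bash
cmake -DCMAKE_INSTALL_PREFIX=/opt/kafka-tools .
```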
Download and extract the latest version from https://kafka.apache.org/downloads, for example:
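For instance (the version below is only illustrative; older releases move to the Apache archive):

```bash
wget https://downloads.apache.org/kafka/3.7.0/kafka_2.13-3.7.0.tgz
tar xzf kafka_2.13-3.7.0.tgz
cd kafka_2.13-3.7.0
```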
Install the JRE requirement:
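For example, on Debian/Ubuntu:

```bash
sudo apt-get install -y default-jre
```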
Then, start zookeeper and kafka server:
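Using the scripts shipped with the kafka distribution:

```bash
# From the kafka installation directory:
bin/zookeeper-server-start.sh config/zookeeper.properties &
bin/kafka-server-start.sh config/server.properties &
```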
Create a test topic:
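For example, a topic named 'test' on the local broker:

```bash
bin/kafka-topics.sh --create --topic test --bootstrap-server localhost:9092
```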
You could also test the kafka installation with this simple producer, just using the docker image, for example:
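A possible invocation, assuming the image name used earlier and a broker listening on the host:

```bash
# The default entrypoint is the simple producer, so no command is needed
# (image name is an assumption):
docker run --rm -it --network host ghcr.io/testillano/kafka-tools:latest
```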
You could omit the entrypoint, as that simple producer is the default one for the project image.
This is an advanced kafka producer which actively triggers kafka messages for every UDP reception. You can use netcat in bash to generate UDP messages easily:
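For example (address and port are assumptions; use the port the producer actually binds to):

```bash
echo -n "hello kafka" | nc -u -w1 127.0.0.1 8888
```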
But you could also use the h2agent project UDP client generator (https://github.com/testillano/h2agent/tree/master?tab=readme-ov-file#execution-of-udp-client-utility), which allows driving UDP traffic load with a specific rate and ramp-up time.
Powerful parsing capabilities allow creating any kind of message dynamically, using patterns in the configured message. This, together with the UDP client generator, enables virtually any kind of kafka production need.
It is recommended to read this guide about working with unix sockets and docker containers: https://github.com/testillano/h2agent/tree/master?tab=readme-ov-file#working-with-unix-sockets-and-docker-containers. There, udp-server-h2client is the functional equivalent of this udp-server-kafka-producer.
In the following example, we will produce 1000 messages per second during about 10 seconds, with the sequence number as the message content for the topic 'test':
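The exact udp-client options are described in the h2agent README linked above; as a rough bash-only sketch of a similar load (port and timing values are assumptions):

```bash
# Send sequence numbers 1..10000 at roughly 1000 messages/second
# to a local UDP port (8888 is an assumption; use the producer's bind port):
for i in $(seq 1 10000); do
  echo -n "${i}" > /dev/udp/127.0.0.1/8888
  sleep 0.001
done
```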
Please execute astyle formatting (using the frankwolf image) before any pull request:
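For example (the mount point and file patterns follow the usual frankwolf/astyle image usage, assumed here):

```bash
sources=$(find . -name "*.hpp" -o -name "*.cpp")
docker run -i --rm -v "${PWD}":/data frankwolf/astyle ${sources}
```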