What is a simple, effective way to debug custom Kafka connectors? - java

I'm working on a couple of Kafka connectors, and I don't see any errors in their creation/deployment in the console output; however, I am not getting the results I'm looking for (no results whatsoever, for that matter, desired or otherwise). I based these connectors on Kafka's example FileStream connectors, so my debugging technique was based on the use of the SLF4J Logger that is used in the example. I've searched the console output for the log messages that I thought would be produced, but to no avail. Am I looking in the wrong place for these messages? Or is there perhaps a better way to go about debugging these connectors?
Example uses of the SLF4J Logger that I referenced for my implementation:
Kafka FileStreamSinkTask
Kafka FileStreamSourceTask
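For reference, the logging pattern I copied from those examples looks roughly like this (an illustrative sketch of a custom sink task, not the exact upstream FileStream code; the class and messages are placeholders):

import java.util.Collection;
import java.util.Map;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTask;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MyCustomSinkTask extends SinkTask {
    private static final Logger log = LoggerFactory.getLogger(MyCustomSinkTask.class);

    @Override
    public String version() { return "0.1"; }

    @Override
    public void start(Map<String, String> props) {
        log.info("Starting MyCustomSinkTask with props {}", props);
    }

    @Override
    public void put(Collection<SinkRecord> records) {
        // These messages go to the Connect worker's log, not necessarily to the console I was watching.
        log.debug("Received {} records", records.size());
    }

    @Override
    public void stop() {
        log.info("Stopping MyCustomSinkTask");
    }
}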

I will try to answer your question in a broad way. A simple approach to connector development could be as follows:
Structure and build your connector source code by looking at one of the many Kafka connectors available publicly (you'll find an extensive list here: https://www.confluent.io/product/connectors/ )
Download the latest Confluent Open Source edition (>= 3.3.0) from https://www.confluent.io/download/
Make your connector package available to Kafka Connect in one of the following ways:
Store all your connector jar files (the connector jar plus dependency jars, excluding the Connect API jars) in a location on your filesystem and enable plugin isolation by adding this location to the plugin.path property in the Connect worker properties. For instance, if your connector jars are stored in /opt/connectors/my-first-connector, you will set plugin.path=/opt/connectors in your worker's properties (see below).
Store all your connector jar files in a folder under ${CONFLUENT_HOME}/share/java. For example: ${CONFLUENT_HOME}/share/java/kafka-connect-my-first-connector. (Needs to start with kafka-connect- prefix to be picked up by the startup scripts). $CONFLUENT_HOME is where you've installed Confluent Platform.
Optionally, increase your logging by changing the log level for Connect in ${CONFLUENT_HOME}/etc/kafka/connect-log4j.properties to DEBUG or even TRACE.
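As a rough illustration (the exact root logger line and logger names may differ between Connect versions, and com.example.myconnector stands in for your own connector's package), the change to connect-log4j.properties could look like:
log4j.rootLogger=DEBUG, stdout
log4j.logger.org.apache.kafka.connect.runtime=TRACE
log4j.logger.com.example.myconnector=TRACE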
Use Confluent CLI to start all the services, including Kafka Connect. Details here: http://docs.confluent.io/current/connect/quickstart.html
Briefly: confluent start
Note: The Connect worker's properties file currently loaded by the CLI is ${CONFLUENT_HOME}/etc/schema-registry/connect-avro-distributed.properties. That's the file you should edit if you choose to enable classloading isolation but also if you need to change your Connect worker's properties.
Once you have Connect worker running, start your connector by running:
confluent load <connector_name> -d <connector_config.properties>
or
confluent load <connector_name> -d <connector_config.json>
The connector configuration can be either in java properties or JSON format.
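For example, a minimal properties-style configuration for a hypothetical connector (the name, class, and topic below are placeholders for your own):
name=my-first-connector
connector.class=com.example.MyFirstSourceConnector
tasks.max=1
topic=my-topic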
Run confluent log connect to open the Connect worker's log file, or navigate directly to where your logs and data are stored by running
cd "$( confluent current )"
Note: you can change where your logs and data are stored during a session of the Confluent CLI by setting the environment variable CONFLUENT_CURRENT appropriately. E.g., given that /opt/confluent exists and is where you want to store your data, run:
export CONFLUENT_CURRENT=/opt/confluent
confluent current
Finally, to debug your connector interactively, a possible way is to apply the following before starting Connect with the Confluent CLI:
confluent stop connect
export CONNECT_DEBUG=y; export DEBUG_SUSPEND_FLAG=y;
confluent start connect
and then attach your debugger (for instance, remotely to the Connect worker on the default port 5005). To stop running Connect in debug mode, just run unset CONNECT_DEBUG; unset DEBUG_SUSPEND_FLAG; when you are done.
I hope the above will make your connector development easier and ... more fun!

I love the accepted answer. One thing: the environment variables didn't work for me. I'm using Confluent Community edition 5.3.1.
Here's what I did that worked:
I installed the Confluent CLI from here:
https://docs.confluent.io/current/cli/installing.html#tarball-installation
I ran Confluent using the command confluent local start
I got the Connect process details using the command ps -ef | grep connect
I copied the resulting command to an editor and added this argument (right after java):
-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005
Then I stopped Connect using the command confluent local stop connect
Then I ran the Connect command with the added argument.
Brief intermission ---
VS Code development is led by Erich Gamma, of Gang of Four fame, who also worked on Eclipse. VS Code is becoming a first-class Java IDE; see https://en.wikipedia.org/wiki/Erich_Gamma
Intermission over ---
Next I launched VS Code and opened the Debezium Oracle connector folder (cloned from here): https://github.com/debezium/debezium-incubator
Then I chose Debug - Open Configurations
and entered the remote attach debugging configuration (port 5005),
and then ran the debugger - it hits your breakpoints!
The Connect command should look something like this:
/Library/Java/JavaVirtualMachines/jdk1.8.0_221.jdk/Contents/Home/bin/java -agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005 -Xms256M -Xmx2G -server -XX:+UseG1GC -XX:MaxGCPauseMillis=20 -XX:InitiatingHeapOccupancyPercent=35 -XX:+ExplicitGCInvokesConcurrent -Djava.awt.headless=true -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false -Dkafka.logs.dir=/var/folders/yn/4k6t1qzn5kg3zwgbnf9qq_v40000gn/T/confluent.CYZjfRLm/connect/logs -Dlog4j.configuration=file:/Users/myuserid/confluent-5.3.1/bin/../etc/kafka/connect-log4j.properties -cp /Users/myuserid/confluent-5.3.1/share/java/kafka/*:/Users/myuserid/confluent-5.3.1/share/java/confluent-common/*:/Users/myuserid/confluent-5.3.1/share/java/kafka-serde-tools/*:/Users/myuserid/confluent-5.3.1/bin/../share/java/kafka/*:/Users/myuserid/confluent-5.3.1/bin/../support-metrics-client/build/dependant-libs-2.12.8/*:/Users/myuserid/confluent-5.3.1/bin/../support-metrics-client/build/libs/*:/usr/share/java/support-metrics-client/* org.apache.kafka.connect.cli.ConnectDistributed /var/folders/yn/4k6t1qzn5kg3zwgbnf9qq_v40000gn/T/confluent.CYZjfRLm/connect/connect.properties

The connector module is executed by the Kafka Connect framework. For debugging, we can use standalone mode: configure the IDE to use the ConnectStandalone main function as the entry point, as sketched below.
Create a debug configuration as follows. Remember to tick "Include dependencies with 'Provided' scope" if it is a Maven project.
The connector properties file needs to specify the connector class name via "connector.class" for debugging.
The worker properties file can be copied from the Kafka folder, e.g. /usr/local/etc/kafka/connect-standalone.properties.
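As a sketch, assuming a standard installation (the connector properties path is a placeholder), the IDE run/debug configuration then boils down to something like:
Main class: org.apache.kafka.connect.cli.ConnectStandalone
Program arguments: /usr/local/etc/kafka/connect-standalone.properties /path/to/my-connector.properties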

Related

Cannot connect to Wildfly in Dockerfile

I'm creating a custom Dockerfile with extensions for the official Keycloak Docker image. I want to change the web-context and add some custom providers.
Here's my Dockerfile:
FROM jboss/keycloak:7.0.0
COPY startup-config.cli /opt/jboss/tools/cli/startup-config.cli
RUN /opt/jboss/keycloak/bin/jboss-cli.sh --connect --controller=localhost:9990 --file="/opt/jboss/tools/cli/startup-config.cli"
ENV KEYCLOAK_USER=admin
ENV KEYCLOAK_PASSWORD=admin
and startup-config.cli file:
/subsystem=keycloak-server/:write-attribute(name=web-context,value="keycloak/auth")
/subsystem=keycloak-server/:add(name=providers,value="module:module:x.y.z.some-custom-provider")
But unfortunately I receive this error:
The controller is not available at localhost:9990: java.net.ConnectException: WFLYPRT0053: Could not connect to remote+http://localhost:9990. The connection failed: WFLYPRT0053: Could not connect to remote+http://localhost:9990. The connection failed: Connection refused
The command '/bin/sh -c /opt/jboss/keycloak/bin/jboss-cli.sh --connect --controller=localhost:9990 --file="/opt/jboss/tools/cli/startup-config.cli"' returned a non-zero code: 1
Is it a matter of invalid localhost? How should I refer to the management API?
Edit: I also tried with ENTRYPOINT instead of RUN, but the same error occurred during container initialization.
You are trying to have Wildfly load your custom config file at build time here. The trouble is that the Wildfly server is not running while the Dockerfile is building.
Wildfly actually already has you covered regarding automatically loading custom config; there is built-in support for what you want to do. You simply need to put your config file in a "magic location" inside the image.
You need to drop your config file here:
/opt/jboss/startup-scripts/
So that your Dockerfile looks like this:
FROM jboss/keycloak:7.0.0
COPY startup-config.cli /opt/jboss/startup-scripts/startup-config.cli
ENV KEYCLOAK_USER=admin
ENV KEYCLOAK_PASSWORD=admin
Excerpt from the keycloak documentation:
Adding custom script using Dockerfile
A custom script can be added by
creating your own Dockerfile:
FROM keycloak
COPY custom-scripts/ /opt/jboss/startup-scripts/
Now you can simply start the image, and the built-in feature in Keycloak (a Wildfly feature, really) will look for a config in that specific directory and attempt to load it.
Edit from comment with final solution:
While the original answer solved the issue with being able to pass configuration to the server at all, an issue remained with the content of the script. The following error was received when starting the container:
=========================================================================
Executing cli script: /opt/jboss/startup-scripts/startup-config.cli
No connection to the controller.
=========================================================================
The issue turned out to be in the startup-config.cli script, where the JBoss command embed-server, needed to initiate a connection to the JBoss instance, was missing. Also missing was the closing stop-embedded-server command. More about configuring JBoss in this manner is in the docs here: CHAPTER 8. EMBEDDING A SERVER FOR OFFLINE CONFIGURATION
The final script:
embed-server --std-out=echo
/subsystem=keycloak-server/theme=defaults/:write-attribute(name=cacheThemes,value=false)
/subsystem=keycloak-server/theme=defaults/:write-attribute(name=cacheTemplates,value=false)
stop-embedded-server
WildFly management interfaces are not available while the Docker image is being built. Your only option is to start the CLI in embedded mode, as discussed here: Running CLI commands in WildFly Dockerfile.
A more advanced approach consists of using the S2I installation scripts to trigger CLI commands.

How to remote debug an enterprise application running on web logic server in eclipse IDE(same machine)

I am new to the WebLogic application server and remote debugging, and have gone through several posts on setting up remote debugging. Some posts suggest editing the setDomainEnv.cmd file, while others suggest editing the startWeblogic.cmd file in my WEBLOGIC_HOME\user_projects\domains\my_domain\bin.
But neither of the solutions worked for me. Listed below are the solutions I tried:
1) Edit setDomainEnv.cmd file
set JAVA_DEBUG=-Xdebug -Xnoagent -Xrunjdwp:transport=dt_socket,address=%DEBUG_PORT%,server=y,suspend=n -Djava.compiler=NONE
set JAVA_OPTIONS=%JAVA_OPTIONS% %enableHotswapFlag% -ea -da:com.bea... -da:javelin... -da:weblogic... -ea:com.bea.wli... -ea:com.bea.broker... -ea:com.bea.sbconsole...
The port number is set to 8543 in the file
if "%DEBUG_PORT%"=="" (
set DEBUG_PORT=8453
)
2) Edit startWeblogic.cmd file
I added the following line at the top of the file
-Xdebug -Xnoagent -Xrunjdwp:transport=dt_socket,address=8543,server=y,suspend=n
Then in Eclipse, when I run the debug configuration (port number: 8543), I get: Failed to connect to remote VM. Connection refused.
Connection refused: connect
Please let me know:
1) How does remote debugging work?
2) How do I set up remote debugging in Eclipse with the WebLogic server?
3) What is the difference between the above 2 methods?
4) Where do I need to add the debug command (-Xdebug....) in the startWeblogic.cmd file (at the top)?
5) What is the purpose of the setDomainEnv.cmd file in the WebLogic server?
Thanks in advance
I think it should be sufficient to just set an environment variable before starting WebLogic. Start cmd and, before starting WebLogic, set debugFlag=true. This should make WebLogic open the debug port, which should default to something like 8453. You can also set DEBUG_PORT=8888 or any other free port if you want to change it.
Start WebLogic and verify that the port has been opened. You can use tools like cports or Process Explorer for that (or even netstat).
In case the debug port isn't open, check whether you are running WebLogic in development mode, because debugFlag can be ignored in production mode.
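A minimal illustration on Windows, assuming the default domain startup script (setting DEBUG_PORT is optional if the default suits you):
set debugFlag=true
set DEBUG_PORT=8888
startWeblogic.cmd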
In Eclipse, create a remote debug configuration (and remember that this option is not available in the Run configurations next to it on the toolbar):
In point 3 on the screenshot, select the project deployed to WebLogic that you want to debug.
If that does not work, post the problems you have.

FileNotFoundException while running SolrCloud on Tomcat

I have a Solr 4.2.0 server running under the Tomcat 7.0 container. I'm trying to wire it up with my external ZooKeeper (actually, it doesn't work with the embedded ZooKeeper either).
I tried these Java opts:
-Dbootstrap_confdir=./solr/collection1/conf
-Dcollection.configName=myconf
-DzkRun
-DnumShards=2
for running the embedded zookeeper.
And also these Java opts:
-Dbootstrap_confdir=./solr/collection1/conf
-Dcollection.configName=myconf
-DzkHost=localhost:2181
-DnumShards=2
for connecting to the external ZooKeeper.
In both cases I continue to get the same exception:
java.io.FileNotFoundException: File '.\solr\collection1\conf \admin-extra.html' does not exist
But the problem is that the file admin-extra.html exists, and it's right there. I can't figure out what the problem is.
From your exception, it seems your path has a whitespace after the config directory.
Try to define your bootstrap_confdir between double quotes, like:
-Dbootstrap_confdir="./solr/collection1/conf"

Unable to open debugger port in IntelliJ

Unable to open debugger port in intellij.
The port number 9009 matches the one which has been set in the configuration file for the application.
<java-config debug-options="-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=9009" system-classpath="" native-library-path-prefix="D:\Project\lib\windows\64bit" classpath-suffix="">
<jvm-options>-XX:MaxPermSize=192m</jvm-options>
<jvm-options>-client</jvm-options>
<jvm-options>-XX:+UnlockDiagnosticVMOptions</jvm-options>
<jvm-options>-XX:+LogVMOutput</jvm-options>
<jvm-options>-XX:LogFile=${com.sun.aas.instanceRoot}/logs/jvm.log</jvm-options>
<jvm-options>-Djava.endorsed.dirs=${com.sun.aas.installRoot}/modules/endorsed${path.separator}${com.sun.aas.installRoot}/lib/endorsed</jvm-options>
<jvm-options>-Djava.security.policy=${com.sun.aas.instanceRoot}/config/server.policy</jvm-options>
<jvm-options>-Djava.security.auth.login.config=${com.sun.aas.instanceRoot}/config/login.conf</jvm-options>
<jvm-options>-Dcom.sun.enterprise.security.httpsOutboundKeyAlias=s1as</jvm-options>
<jvm-options>-Djavax.net.ssl.keyStore=${com.sun.aas.instanceRoot}/config/keystore.jks</jvm-options>
<jvm-options>-Djavax.net.ssl.trustStore=${com.sun.aas.instanceRoot}/config/cacerts.jks</jvm-options>
<jvm-options>-Djava.ext.dirs=${com.sun.aas.javaRoot}/lib/ext${path.separator}${com.sun.aas.javaRoot}/jre/lib/ext${path.separator}${com.sun.aas.instanceRoot}/lib/ext</jvm-options>
<jvm-options>-Djdbc.drivers=org.apache.derby.jdbc.ClientDriver</jvm-options>
<jvm-options>-DANTLR_USE_DIRECT_CLASS_LOADING=true</jvm-options>
<jvm-options>-Dcom.sun.enterprise.config.config_environment_factory_class=com.sun.enterprise.config.serverbeans.AppserverConfigEnvironmentFactory</jvm-options>
<jvm-options>-Dosgi.shell.telnet.port=4766</jvm-options>
<jvm-options>-Dosgi.shell.telnet.maxconn=1</jvm-options>
<jvm-options>-Dosgi.shell.telnet.ip=127.0.0.1</jvm-options>
<jvm-options>-Dfelix.fileinstall.dir=${com.sun.aas.installRoot}/modules/autostart/</jvm-options>
<jvm-options>-Dfelix.fileinstall.poll=5000</jvm-options>
<jvm-options>-Dfelix.fileinstall.debug=1</jvm-options>
<jvm-options>-Dfelix.fileinstall.bundles.new.start=true</jvm-options>
<jvm-options>-Dorg.glassfish.web.rfc2109_cookie_names_enforced=false</jvm-options>
<jvm-options>-XX:NewRatio=2</jvm-options>
<jvm-options>-Xmx2048m</jvm-options>
</java-config>
Configuration in IntelliJ:
When I try to enable remote debugging for this application, it comes up with the following error:
You may have to change the debugger port if your port is already used by another program. To do so:
Run
Edit Configurations
Startup/Connection tab
Debug
Change the port here
Or, maybe in other versions:
Run
Edit Configurations
Remote > Remote debug in the list on the left
Configuration tab, Settings section
Port: change the port here
Add the following parameter debug-enabled="true" to this line in the glassfish configuration.
Example:
<java-config debug-options="-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=9009" debug-enabled="true"
system-classpath="" native-library-path-prefix="D:\Project\lib\windows\64bit" classpath-suffix="">
Then restart the GlassFish domain or service that uses this configuration.
I had the same problem and this solution also did the trick for me: Provide the IP 127.0.0.1 in the Intellij Debug configuration instead of the host name "localhost", in case you're using this hostname.
You must chmod +x the scripts (add execute permission to the *.sh or *.bat files). For example, I am using macOS:
cd /Users/donhuvy/Documents/tools/apache-tomcat-9.0.12/bin
sudo chmod +x *.sh
Then IntelliJ IDEA runs or debugs Apache Tomcat just fine.
In glassfish\domains\domain1\config\domain.xml, set this before starting the server:
<java-config classpath-suffix="" debug-options="-agentlib:jdwp=transport=dt_socket,address=9009,server=y,suspend=n" java-home="C:\Program Files\Java\jdk1.8.0_162" debug-enabled="true" system-classpath="">
or set debug-enabled="true" and server=y,suspend=n in http://localhost:4848/common/index.jsf
In the current IDEA 2018: Server Run Configuration - Debug - Port - address.
I'm hoping your problem has been solved by now. If not, try this... It looks like you have server=y for both your app and IDEA. IDEA should probably be server=n. Also, the (IDEA) client should have an address that includes both the host name and the port, e.g., address=127.0.0.1:9009.
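To make that concrete (a hedged example; adjust host and port to your setup): the application JVM would be started with -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=9009, while the IDEA side would be a Remote / "Attach to remote JVM" configuration with host 127.0.0.1 and port 9009, which corresponds to server=n on the client side.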
This one worked for me:
If the issue still persists (in case you are not using a GlassFish server), then close your IntelliJ IDEA and stop the server. This drops the port connections. Then start your server and IDEA again; this establishes fresh connections on the ports, resolving the issue.
For me, the problem was that catalina.sh didn't have execute permissions. The "Unable to open debugger port in intellij" message appeared in IntelliJ, but it somewhat masked the 'could not execute catalina.sh' error that appeared in the logs immediately prior.
This error can happen if Tomcat is already running. So make sure Tomcat isn't running in the background if you've asked IntelliJ to start it up (the default).
Also, check the full output window for more errors, as a more useful error may have preceded this one (as was the case with my configuration just now).
The answer is pretty simple. I also faced this problem and finally found a working solution:
Create a Debug configuration.
Create a Remote debug configuration with the appropriate settings.
First, run the Debug configuration; it reports waiting for socket 5005.
Then run the Remote debug configuration.
Try to connect with telnet. If it connects, it shows something like this:
$telnet 10.238.136.165 9999
Trying 10.238.136.165...
Connected to 10.238.136.165.
Escape character is '^]'.
Connection closed by foreign host.
If the port is not available (either because someone else is already connected to it or the port is not open, etc.), then it shows something like this:
$telnet 10.238.136.165 9999
Trying 10.238.136.165...
telnet: connect to address 10.238.136.165: Connection refused
telnet: Unable to connect to remote host
So I think one needs to check whether:
the application is properly listening on the port, or
someone else has already connected to it.
Also try to connect on that machine itself first, like:
$telnet localhost 9999
Set MAVEN_OPTS. It should work:
export MAVEN_OPTS="-Xdebug -Xnoagent -Djava.compiler=NONE -Xrunjdwp:transport=dt_socket,address=4000,server=y,suspend=n"
mvn spring-boot:run -Dserver.port=8090
Run your Spring Boot application with the given command to enable debugging on port 6006 while the server is up on port 8090:
mvn spring-boot:run -Drun.jvmArguments='-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=6006' -Dserver.port=8090
Your service/application might already be running. In the search box, enter "Services" and you will get a list of services. Stop yours and then try again.
I had the same issue; I just had to remove the HTTP protocol from the URL. That's it.
I hope it works for you.
I once had this problem too.
My solution was to work around it by killing the application that was using the port.
Here is an article that explains how to check which application is using which port, find it, and kill/close it.
In my case, I was not setting the debug port while starting the application.
I am using Tomcat to deploy 3 war files, and I forgot to configure the debug port.
Tomcat allows us to configure this via setenv.sh.
Here are the commands to create the setenv.sh file in the bin directory of my Tomcat installation and provide the debug arguments/port:
tee /usr/share/tomcat9/bin/setenv.sh << EOF
export CATALINA_OPTS="$CATALINA_OPTS -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"
EOF
service tomcat9 restart
Merely hitting the debug icon again fixed my problem in a few seconds.
Make sure to specify an SDK and Project SDK for your app under File --> Project Structure (Project | SDKs)

How to monitor c3p0 connections

I am using Hibernate in my JBoss war, with c3p0 for connection pooling, both configured within a hibernate.cfg.xml config file on my classpath:
<property name="connection.provider_class">org.hibernate.connection.C3P0ConnectionProvider</property>
I've seen that server.log generates lines with interesting information about the connection pool:
DEBUG [com.mchange.v2.resourcepool.BasicResourcePool] trace com.mchange.v2.resourcepool.BasicResourcePool#63f5e4b6 [managed: 10, unused: 9, excluded: 0]
For my pool monitoring (I am using Nagios) I'd like to provide a JSP that reports how many connections are being used and how many are free, as the log file shows.
How can I ask c3p0 how many managed and unused connections there are?
You can monitor your connection pool(s) via JMX. From the documentation:
Configuring and Managing c3p0 via JMX
If JMX libraries and a JMX MBeanServer are available in your environment (they are included in JDK 1.5 and above), you can inspect and configure your c3p0 DataSources via a JMX administration tool (such as jconsole, bundled with JDK 1.5). You will find that c3p0 registers MBeans under com.mchange.v2.c3p0: one with statistics about the library as a whole (called C3P0Registry), and an MBean for each PooledDataSource you deploy. You can view and modify your DataSource's configuration properties, track the activity of Connection, Statement, and Thread pools, and reset pools and DataSources via the PooledDataSource MBean. (You may wish to view the API docs of PooledDataSource for documentation of the available operations.)
By the way, there seem to be JMX plugins for Nagios, you're not forced to use a JSP.
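For instance, here is a minimal sketch of reading those statistics from code running in the same JVM as the pool (e.g. from a JSP or servlet in the war). It assumes the pool registers with the platform MBeanServer and that the attribute names numConnections, numBusyConnections, and numIdleConnections are exposed by the PooledDataSource MBean; verify both in jconsole first.

import java.lang.management.ManagementFactory;
import java.util.Set;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class C3p0PoolStats {
    public static void main(String[] args) throws Exception {
        MBeanServer mbs = ManagementFactory.getPlatformMBeanServer();
        // c3p0 registers one MBean per PooledDataSource under the com.mchange.v2.c3p0 domain.
        Set<ObjectName> pools =
                mbs.queryNames(new ObjectName("com.mchange.v2.c3p0:type=PooledDataSource,*"), null);
        for (ObjectName pool : pools) {
            // Attribute names assumed from the PooledDataSource MBean; confirm them in jconsole.
            Object managed = mbs.getAttribute(pool, "numConnections");
            Object busy = mbs.getAttribute(pool, "numBusyConnections");
            Object idle = mbs.getAttribute(pool, "numIdleConnections");
            System.out.println(pool + " -> managed=" + managed + ", busy=" + busy + ", idle=" + idle);
        }
    }
}

In a JSP you would perform the same lookup and write the numbers into the response instead of printing them.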
You can monitor with Icinga/Nagios like this.
Download JMXQuery from Google Code. You will need to check out revision 18, like so:
svn checkout -r 18 http://jmxquery.googlecode.com/svn/trunk/ jmxquery-read-only
Download this patch: wildcard patch for c3p0.
Use this command to patch the source code (make sure you are in the jmxquery-read-only/src/main directory):
patch -p0 -i wildcard_patch.diff
Now download Apache Maven and extract it using this command:
tar -zxvf apache-maven-*-bin.tar.gz
Now cd into the jmxquery-read-only folder and run the following command (assuming Apache Maven and JMXQuery are in the same folder):
../apache-maven-*/bin/mvn compile
Then run the following command:
../apache-maven-3.0.3/bin/mvn package
Now you should have produced a jmxquery.jar file that you can use to query the c3p0 connection pool like so (the check_jmx file can be obtained by downloading the JMXQuery code from the Google Code site as usual, using this link):
check_jmx -U service:jmx:rmi:///jndi/rmi://localhost:1090/jmxrmi -O com.mchange.v2.c3p0:type=PooledDataSource* -N 1 -A numBusyConnections -w 50 -c 100
