I am attempting to connect to a Cloudera environment using Kafka and stream data from a topic. I have been able to do this successfully in Java but not in Python. Python appears to connect, but it is unable to receive the logs. I don't believe my paths or servers are incorrect, because I have connected via Java with the same information.
I have done this successfully before with another Cloudera environment, in Python, and I'm basically copying and pasting from that code. With that being said, is it possible that there are some settings in Cloudera for this environment that are preventing me from receiving the logs via Python?
The python code:

from java.lang import System
from kafka import KafkaConsumer

System.setProperty('java.security.auth.login.config', '<path to jaas.conf>')
System.setProperty('java.security.krb5.conf', '<path to krb5.conf>')

broker = ['<broker1>:9092', '<broker2>:9092', '<broker3>:9092']
try:
    consumer = KafkaConsumer(bootstrap_servers=broker,
                             sasl_kerberos_service_name='kafka',
                             auto_offset_reset='earliest',
                             api_version=(1, 0, 1),
                             session_timeout_ms=30000,
                             enable_auto_commit=True,
                             sasl_mechanism='GSSAPI',
                             security_protocol='SASL_PLAINTEXT')
except Exception as e:
    message_consumer = "Error connecting to kafka: " + str(e)
    sendAlertEmail(message_consumer)
    logger1.error("Failed to connect to brokers: " + str(e))
To test the program I do:

for message in consumer:
    print(message)

When I run it against this environment it never makes it into the loop, even though I know there are logs for the topic.
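Since the Java side works against the same brokers, it can help to line the two configurations up. Below is a minimal sketch of the equivalent Java consumer; the topic name, group id, and deserializers are placeholders I have added, not details from the question. Note that it subscribes to the topic explicitly before polling, which the Python snippet above does not do.

// Minimal comparison sketch; "my-topic" and the group id are placeholders.
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KerberosConsumerCheck {
    public static void main(String[] args) {
        // Same JAAS/krb5 settings as the working Java configuration.
        System.setProperty("java.security.auth.login.config", "<path to jaas.conf>");
        System.setProperty("java.security.krb5.conf", "<path to krb5.conf>");

        Properties props = new Properties();
        props.put("bootstrap.servers", "<broker1>:9092,<broker2>:9092,<broker3>:9092");
        props.put("group.id", "python-debug-check");   // placeholder group id
        props.put("auto.offset.reset", "earliest");
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("sasl.kerberos.service.name", "kafka");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic"));   // explicit subscription
            for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(10))) {
                System.out.println(record.value());
            }
        }
    }
}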
Apologies if what I ask is trivial, but I am experimenting with Memcached and JMeter. I have a Memcached server set up (as far as I can tell) and am able to reach it via telnet IP PORT, and can store and retrieve values with the set and get commands.
Point me to a different application if this is the wrong choice, but my understanding was that JMeter should allow me to pound the server with equivalent set and get requests.
Unfortunately the experimental platform is a remote Linux PC running Rocky Linux, which is similar to CentOS/RedHat as I understand it (I didn't set this part up), so I do not have a GUI to launch on the Linux PC. I have, however, opened JMeter on my local Windows PC and understand I should be able to send the test file over and run it there.
I followed these instructions to try to set up a TCP Sampler and, after doing the additional step in the link regarding the pre-processor, set the "Text to send" field as below:
set tutorialspoint 0 900 9${CR}${LF}
memcached${CR}${LF}
quit${CR}${LF}
Running the above as a headless JMeter session (invoked as ./jmeter -n -t "Sample.jmx" -l testresults.jtl) doesn't generate any errors, but when I then connect via telnet I'm not seeing the value for the key "tutorialspoint" get updated. When doing the set and get manually I do see updates. Checking the log shows ResponseCode 200 OK as expected. Any ideas what I could be doing wrong? Is there a good method to debug something like this in a headless setup?
Thanks for your time.
I believe the easiest way is using a Memcached Java client library:
Download spymemcached-2.12.3.jar and drop it into the "lib" folder of your JMeter installation (or any other location on the JMeter classpath)
Restart JMeter to pick the .jar up
Add a JSR223 Sampler to your test plan and use the following code snippets:
def client = new net.spy.memcached.MemcachedClient(new InetSocketAddress('your-memcached-host', 11211)) - connects to the server (11211 is the default memcached port; adjust it if yours differs)
client.set('tutorialspoint', 900, 'memcached').get() - writes the value memcached to the tutorialspoint key with a 900-second (15-minute) expiration; calling .get() on the returned future waits for the write to complete
client.get('tutorialspoint') - reads the value of the tutorialspoint key
client.shutdown() - disconnects
More information on Groovy scripting in JMeter: Apache Groovy - Why and How You Should Use It
I have a requirement to pull files from and push files to an FTP server, but I am getting the error below whenever I try to transfer files in either direction. I am using the Talend open source data integration tool, and I have also tried custom Java code and get the same error. I am, however, able to transfer the files using the FileZilla FTP client.
Error Start ..

com.enterprisedt.net.ftp.ControlChannelIOException: Connection reset
    at com.enterprisedt.net.ftp.FTPControlSocket.readLine(FTPControlSocket.java:1014)
    at com.enterprisedt.net.ftp.FTPControlSocket.readReply(FTPControlSocket.java:1049)
    at com.enterprisedt.net.ftp.FTPControlSocket.sendCommand(FTPControlSocket.java:973)
    at com.enterprisedt.net.ftp.FTPControlSocket.createDataSocketPASV(FTPControlSocket.java:807)
    at com.enterprisedt.net.ftp.FTPControlSocket.createDataSocket(FTPControlSocket.java:563)
    at com.enterprisedt.net.ftp.FTPClient.setupDataSocket(FTPClient.java:2561)
    at com.enterprisedt.net.ftp.FTPClient.dir(FTPClient.java:3468)
    at vikas_sir.ftp_salesforce_and_vice_verasa_0_1.FTP_Salesforce_AND_Vice_Verasa.tFTPFileList_2Process(FTP_Salesforce_AND_Vice_Verasa.java:488)
    at vikas_sir.ftp_salesforce_and_vice_verasa_0_1.FTP_Salesforce_AND_Vice_Verasa.tFTPConnection_1Process(FTP_Salesforce_AND_Vice_Verasa.java:396)
    at vikas_sir.ftp_salesforce_and_vice_verasa_0_1.FTP_Salesforce_AND_Vice_Verasa.runJobInTOS(FTP_Salesforce_AND_Vice_Verasa.java:1085)
    at vikas_sir.ftp_salesforce_and_vice_verasa_0_1.FTP_Salesforce_AND_Vice_Verasa.main(FTP_Salesforce_AND_Vice_Verasa.java:942)

Error End ..
I have already tried the following:
Allowed the Java executables (java.exe, javaw.exe, etc.) through my firewall.
Tried disabling the firewall entirely.
Ran netsh advfirewall set global StatefulFTP disable as admin.
Disabled my antivirus.
Added TLS 1.1 and TLS 1.2 to the .ini file.
Tried on different local machines.
Tried writing custom Java code.
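Since both the Talend job and the custom Java attempt die while opening the PASV data socket (per the stack trace), one way to narrow things down outside of Talend is a small standalone listing test that can switch between passive and active mode. A minimal sketch using Apache Commons Net (a different client library from the edtFTPj one in the trace; host, port, user and password are placeholders):

// Standalone FTP listing test; host, port, user and password are placeholders.
import java.io.IOException;
import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;

public class FtpListCheck {
    public static void main(String[] args) throws IOException {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com", 21);
        if (!ftp.login("user", "password")) {
            throw new IOException("Login failed: " + ftp.getReplyString());
        }
        // The stack trace fails while setting up the PASV data connection,
        // so try both modes and see whether either one can list the directory.
        ftp.enterLocalPassiveMode();    // client opens the data connection
        // ftp.enterLocalActiveMode();  // server opens the data connection back to the client
        for (FTPFile f : ftp.listFiles()) {
            System.out.println(f.getName());
        }
        ftp.logout();
        ftp.disconnect();
    }
}

If active mode works where passive does not (or vice versa), that points at firewall or NAT handling of the data channel rather than at Talend itself.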
Below is a link to a screenshot of the Talend job:
Talend Job to pull the files from FTP server
Also, when I use a different set of FTP credentials I do not get any error; I can push and pull files from/into that FTP server.
Please let me know what I am doing wrong.
Any help will be greatly appreciated.
Thanks in Advance.
Amit
I'm running a Java server (Jetty, to be specific) on an AWS EC2 instance using WebSocket to connect to a client's browser. When I do this locally (hosting the server on my computer, not AWS), it runs fine. However, when I move the code to an EC2 instance, I get the following error message on the client-side:
WebSocket connection to 'ws://Elastic_IP:8080/?username=name_of_user' failed: Error during WebSocket handshake: Unexpected response code: 500
I made sure that the EC2 instance will accept traffic on port 8080.
On the server side, I'm getting many java.lang.NoClassDefFoundError exceptions when the connection is attempted. I do not get these errors when I run it locally. Perhaps there's an issue with how I'm compiling on the EC2 instance; however, it does compile without errors. I'm compiling and running the code using Eclipse locally, but I'm compiling and running the code on EC2 by hand (javac with lots of classpaths). It's likely that I made an error when compiling by hand, but I'm not sure what the error could be.
Any help would be greatly appreciated.
EDIT
After a little troubleshooting on my own, I realized that JSON.ParseException was the source of the issue. After I removed all calls to this class from the server code, the handshake completed and I was able to establish a connection between the server and the client.
However, I am now running into the following error when I receive a message from the client:
WARN:MyWebSocketHandler:qtp990368553-16: Unhandled Error (closing connection)
java.lang.RuntimeException: Cannot call method public void MyWebSocketHandler#onMessage(org.eclipse.jetty.websocket.api.Session, java.lang.String) with args: [org.eclipse.jetty.websocket.common.WebSocketSession, java.lang.String]
It seems that I defined the argument to be org.eclipse.jetty.websocket.api.Session, but during runtime the argument is actually org.eclipse.jetty.websocket.common.WebSocketSession. Any ideas on how this is happening or which one (Session vs WebSocketSession) I should use? The only capability I need is to send strings between the server and the client.
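For what it's worth, org.eclipse.jetty.websocket.common.WebSocketSession is Jetty's concrete implementation of the org.eclipse.jetty.websocket.api.Session interface, so declaring the parameter as the api.Session type is the usual approach for a string-only endpoint. A minimal sketch of an annotated handler (the class name and the echo behaviour are placeholders, not taken from the question):

// A minimal annotated handler that only exchanges strings; class name and
// echo behaviour are placeholders.
import java.io.IOException;
import org.eclipse.jetty.websocket.api.Session;
import org.eclipse.jetty.websocket.api.annotations.OnWebSocketConnect;
import org.eclipse.jetty.websocket.api.annotations.OnWebSocketMessage;
import org.eclipse.jetty.websocket.api.annotations.WebSocket;

@WebSocket
public class EchoSocket {

    @OnWebSocketConnect
    public void onConnect(Session session) {
        System.out.println("Connected: " + session.getRemoteAddress());
    }

    @OnWebSocketMessage
    public void onMessage(Session session, String message) throws IOException {
        // Declare the parameter as the api.Session interface; at runtime Jetty
        // passes in its WebSocketSession implementation of that interface.
        session.getRemote().sendString("echo: " + message);
    }
}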
I figured out a possible solution to my problem. Through Eclipse, I can export the Java project to a runnable jar ("packing" the libraries into the jar). Running it with java -jar <jar_filename> then works on the server and functions the same as on the local machine. However, I've noticed some performance issues (slow startup), so I don't think this is the best solution, but it is a solution.
I used to have a Hadoop cluster running on CentOS 6.5 that was built using a GUI installer, and it made everything work perfectly. For a bunch of reasons that are not relevant here, I needed to upgrade to CentOS 7. The GUI I used to install on 6.5 just doesn't have CentOS 7 as an option, so as a result I have to manually install it on a new CentOS 7 cluster. I keep running into this error:
Call From Cluster-Client-1/ip.add.ress to Cluster-Client-1:8020 failed on connection exception:
java.net.ConnectException: Connection refused; for more details see the website
The relevant port on my cluster is 50010, so I think that where it says "Cluster-Client-1:8020" it should say 50010, but I don't know which file controls that or how to change it. The IP address is correct. You probably need more information to answer my question, but I don't know what you need, so please ask away. There are currently no slave nodes in the cluster; I'm in the process of adding more, but I need to solve this error before I can move on.
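For reference, the host:port in that message comes from fs.defaultFS in core-site.xml: 8020 is the default NameNode RPC port that clients connect to, while 50010 is the DataNode data-transfer port. A minimal sketch of that property (the hostname is a placeholder):

<!-- core-site.xml: clients resolve the NameNode RPC endpoint from fs.defaultFS.
     "namenode-host" is a placeholder hostname. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode-host:8020</value>
  </property>
</configuration>

"Connection refused" on that port usually just means nothing is listening there, so it is also worth confirming that the NameNode daemon is actually running on Cluster-Client-1.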
Please help
I am trying to set up an application server for AWS Lambda, but on a local network, so that an application won't have to go out to the internet to execute. I would prefer to use a Linux box, and my programming environment is Java.
The skill from the Echo would execute and then communicate with the local server rather than going out to the internet and communicating with Amazon's application server.
My question is this: how do I set up the application server to handle the skill? I've done the example from Amazon; do I only need to have the Linux box run the Java application, or is there more to the setup than that? I see there are AMIs (Amazon Machine Images), but can I deploy those locally, or are they only for use on the AWS console?
Any insight into this would be great, thank you.
So this is how the usual Echo interaction works:
User ---> Echo ---> Skill ---> (Internet) Application server (I'm using Amazon-hosted AWS Lambda)
I would like to use:
User ---> Echo ---> Skill ---> (LAN) Application server (without ever using the internet)
Currently I have set up the Echo and a skill, but no application server on the LAN. What do I need for the application server? JAWS and something else?
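At its simplest, the LAN application server is just a process that accepts the skill's POST requests and returns JSON responses; whether that is enough in practice depends on how the skill is allowed to reach it. Purely as an illustration, here is a minimal embedded-Jetty sketch in Java (port, path, and the servlet body are placeholders I've chosen; parsing and verifying the actual Alexa request is not shown):

// A minimal embedded-Jetty sketch of a LAN-hosted endpoint (assumes Jetty 9.x
// with the javax.servlet API). Port, path, and the response are placeholders.
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlet.ServletHolder;

public class LocalSkillServer {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);                        // placeholder port
        ServletContextHandler context = new ServletContextHandler();
        context.addServlet(new ServletHolder(new HttpServlet() {
            @Override
            protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                    throws java.io.IOException {
                // Read the skill's JSON request from req and build a response here.
                resp.setContentType("application/json");
                resp.getWriter().write("{}");                    // placeholder response
            }
        }), "/skill");
        server.setHandler(context);
        server.start();
        server.join();
    }
}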
I'm not sure if this question is still relevant or not, but I'm using DEEP Framework to test the code locally and/or deploy it on AWS Lambda. Check this out:
npm install deepify -g
deepify run-lambda --help
run-lambda#1.6.8 - Run Lambda function locally
Usage example: deepify run-lambda path/to/the/lambda -e='{"Name":"John Doe"}'
Arguments:
path: The path to the Lambda (directory of handler itself)
Options:
--event|-e: JSON string used as the Lambda payload
--skip-frontend-build|-f: Skip picking up _build path from the microservices Frontend
--db-server|-l: Local DynamoDB server implementation (ex. LocalDynamo, Dynalite)
--version|-v: Prints command version
--help|-h: Prints command help
Also, you might want to consider using the server option:
deepify server --help
server#1.6.9 - Run local development server
Usage example: deepify server path/to/web_app -o
Arguments:
path: The path to the Lambda (directory of handler itself)
Options:
--build-path|-b: The path to the build (in order to pick up config)
--skip-frontend-build|-f: Skip picking up _build path from the microservices Frontend
--skip-backend-build|-s: Skip building backend (dependencies installation in Lambdas and linking aws-sdk)
--skip-build-hook|-h: Skip running build hook (hook.build.js)
--port|-p: Port to listen to
--db-server|-l: Local DynamoDB server implementation (ex. LocalDynamo, Dynalite)
--open-browser|-o: Open browser after the server starts
--version|-v: Prints command version
--help|-h: Prints command help
Disclosure: I am one of the contributors to this framework