Remove user from Active Directory - Java

I have an email distribution list "CTW DEV". I would like to remove the user 'rakeshdw' from Active Directory using Java. Please find my code below.
It throws an exception and the user is not removed. Please suggest the required changes. Thanks!
import java.util.Properties;
import javax.naming.Context;
import javax.naming.directory.BasicAttribute;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;
import javax.naming.directory.ModificationItem;
public class DeleteFromADGroup {
    private String adGroup, dn;
    private DirContext ctx;
    private String adminName = "intranet\\patilume";

    DeleteFromADGroup() {
        try {
            this.adGroup = "CN=CTW_DEV";
            this.dn = "OU=DistributionLists,OU=Messaging,DC=INTRANET,DC=INFOSYSINT,DC=com";
            Properties pr = new Properties();
            pr.setProperty(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            pr.setProperty(Context.PROVIDER_URL, "ldap://intranet.infosysint.com");
            pr.setProperty(Context.SECURITY_AUTHENTICATION, "simple");
            pr.setProperty(Context.SECURITY_CREDENTIALS, "myPassword"); // the password
            pr.setProperty(Context.SECURITY_PRINCIPAL, this.adminName);
            pr.setProperty(Context.REFERRAL, "ignore");
            this.ctx = new InitialDirContext(pr);
        } catch (Exception e) {
            System.out.println("Exception in constructor:");
            e.printStackTrace(); // print the real error instead of swallowing it
        }
    }

    public static void main(String[] args) {
        DeleteFromADGroup dadg = new DeleteFromADGroup();
        dadg.deleteUser("CTW_DEV", "rakeshdw");
    }

    private void deleteUser(String ADGroup, String username) {
        String groupName = "CN=" + ADGroup + ",OU=DistributionLists,OU=Messaging,DC=INTRANET,DC=INFOSYSINT,DC=com";
        try {
            ModificationItem[] mods = new ModificationItem[1];
            mods[0] = new ModificationItem(DirContext.REMOVE_ATTRIBUTE, new BasicAttribute("member", username));
            // update the group
            ctx.modifyAttributes(groupName, mods);
            ctx.close();
        } catch (Exception e) {
            System.out.println("Exception while removing user from DL:");
            e.printStackTrace();
        }
    }
}
The exception I get is:
javax.naming.OperationNotSupportedException: [LDAP: error code 53 - 0000054F: SvcErr: DSID-031A0FC0, problem 5003 (WILL_NOT_PERFORM), data 0

You need to pass the user's distinguished name (DN), not the plain username, as the value you remove from the member attribute in your deleteUser method. For example, it might be something like CN=rakeshdw,OU=People,DC=contoso,DC=com.
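To make that fix concrete, here is a minimal JDK-only sketch (javax.naming ships with the JDK). The OU path in the DN is a guess for illustration; in practice you would first search for the user's real DN, e.g. by sAMAccountName. Active Directory stores group membership as full DNs, which is why removing the bare username fails with WILL_NOT_PERFORM.

```java
import javax.naming.directory.BasicAttribute;
import javax.naming.directory.DirContext;
import javax.naming.directory.ModificationItem;

public class RemoveMemberSketch {
    // Builds the modification that removes one member from a group.
    // The value must be the user's full distinguished name.
    static ModificationItem removeMember(String userDn) {
        return new ModificationItem(
                DirContext.REMOVE_ATTRIBUTE,
                new BasicAttribute("member", userDn));
    }

    public static void main(String[] args) {
        // Hypothetical DN; look up the real one in your directory first.
        String userDn = "CN=rakeshdw,OU=People,DC=INTRANET,DC=INFOSYSINT,DC=com";
        ModificationItem mod = removeMember(userDn);
        // The attribute being modified is "member"
        System.out.println(mod.getAttribute().getID()); // member
    }
}
```

The resulting ModificationItem would then be passed to ctx.modifyAttributes(groupDn, mods) exactly as in the question's code.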


DataSource cannot be resolved - Weka

I have the following class to perform PCA on an ARFF file. I have added the Weka jar to my project, but I am still getting an error saying DataSource cannot be resolved, and I don't know what to do to resolve it. Can anyone suggest what could be wrong?
package project;
import weka.attributeSelection.PrincipalComponents;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
public class PCA {
    public static void main(String[] args) {
        try {
            // Load the data.
            DataSource source = new DataSource("../data/ingredients.arff");
            Instances data = source.getDataSet();
            // Perform PCA.
            PrincipalComponents pca = new PrincipalComponents();
            pca.setVarianceCovered(1.0);
            // pca.setCenterData(true);
            pca.setNormalize(true);
            pca.setTransformBackToOriginal(false);
            pca.buildEvaluator(data);
            // Show the data transformed into the eigenvector basis.
            Instances transformedData = pca.transformedData();
            System.out.println(transformedData);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
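"Cannot be resolved" is a compile-time classpath problem: the Weka jar is not actually on the build path the compiler sees, even if it was added somewhere in the IDE. A quick way to check which classes are reachable at runtime, sketched with only the JDK, is Class.forName; note that DataSource is a nested class of ConverterUtils, so its binary name uses a '$':

```java
public class ClasspathCheck {
    // Returns true if the named class can be loaded from the current classpath.
    static boolean onClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Prints false unless weka.jar is on the classpath of this JVM.
        System.out.println(onClasspath("weka.core.converters.ConverterUtils$DataSource"));
    }
}
```

If this prints false when run the same way the project runs, the jar needs to be added to that configuration (e.g. the Eclipse build path or the Maven/Gradle dependencies), not just copied into the project folder.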

Kafka to HDFS with Confluent source code

For my project, I need to build a class from the Confluent Java code that writes data from a Kafka topic to the HDFS filesystem.
It actually works from the CLI with connect-standalone, but I need to do the same thing with the source code, which I built successfully.
I have a problem with the SinkTask and HdfsConnector classes: an exception is thrown in the put method.
Here is my class code:
package io.confluent.connect.hdfs;
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.connect.errors.ConnectException;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTaskContext;
import io.confluent.connect.hdfs.avro.AvroFormat;
import io.confluent.connect.storage.common.StorageCommonConfig;
public class Main {
    private static Map<String, String> props = new HashMap<>();
    // TOPIC and PARTITION were undefined in the original code; define them here.
    protected static final String TOPIC = "test";
    protected static final int PARTITION = 0;
    protected static final TopicPartition TOPIC_PARTITION = new TopicPartition(TOPIC, PARTITION);
    protected static String url = "hdfs://localhost:9000";
    protected static SinkTaskContext context;

    public static void main(String[] args) {
        HdfsSinkConnector hk = new HdfsSinkConnector();
        HdfsSinkTask h = new HdfsSinkTask();
        props.put(StorageCommonConfig.STORE_URL_CONFIG, url);
        props.put(HdfsSinkConnectorConfig.HDFS_URL_CONFIG, url);
        props.put(HdfsSinkConnectorConfig.FLUSH_SIZE_CONFIG, "3");
        props.put(HdfsSinkConnectorConfig.FORMAT_CLASS_CONFIG, AvroFormat.class.getName());
        try {
            hk.start(props);
            Collection<SinkRecord> sinkRecords = new ArrayList<>();
            SinkRecord record = new SinkRecord("test", 0, null, null, null, null, 0);
            sinkRecords.add(record);
            h.initialize(context);
            h.start(props); // the task must be started with its config before put()
            h.put(sinkRecords);
            hk.stop();
        } catch (Exception e) {
            throw new ConnectException("Couldn't start HdfsSinkConnector due to configuration error", e);
        }
    }
}
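For comparison, the connect-standalone run that the question says already works is driven by a connector properties file roughly like the following; the connector name, topic, and HDFS URL here are placeholders matching the values used in the code above, not a verified production config:

```properties
# hdfs-sink.properties - sketch of a standalone HDFS sink connector config
name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=test
hdfs.url=hdfs://localhost:9000
flush.size=3
format.class=io.confluent.connect.hdfs.avro.AvroFormat
```

When driving the connector from code instead, the framework work that connect-standalone normally does (starting the task with this config, opening partitions, scheduling flushes) has to be reproduced manually, which is why calling put() on a task that was only initialized, never started, fails.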

Apache HttpClient: Timeout issue within a corporate network

This question is worth a $50 bill :)
I have a simple Java application that retrieves some data from a remote server with an HTTP GET request. When I run it from home it works, and when I run it inside my university it does not: it gives me a "Connection timed out: connect" error.
We do NOT have a proxy at my university; I verified this with the administrator.
I can access the URL of the remote server from the browser and test the POST/GET, and it works.
If it is a firewall problem, then why can the browser access the server and retrieve the data?
Here is my code:
package API;
import java.io.IOException;
import java.sql.SQLException;
import org.apache.http.HttpResponse;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.auth.BasicScheme;
import org.apache.http.impl.client.DefaultHttpClient;
import org.json.JSONException;
public class GetTweets1 {
    private static final String USER_AGENT = "Mozilla/5.0";

    public GetTweets1() {}

    public static void RetrieveAndStoreMessages() throws IOException, JSONException, SQLException {
        String url = "...my URL...";
        HttpClient client = new DefaultHttpClient();
        HttpGet request = new HttpGet(url);
        // request.addHeader("User-Agent", USER_AGENT);
        request.addHeader(BasicScheme.authenticate(new UsernamePasswordCredentials("account", "password"), "UTF-8", false));
        // request.addHeader("Accept", "application/json; charset=utf-8");
        HttpResponse response = client.execute(request);
    }

    public static void main(String[] args) throws Exception {
        RetrieveAndStoreMessages();
    }
}
Can you please help me resolve this problem? I am willing to give a $50 bill to the one who resolves it for me.
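One plausible explanation, offered as a guess rather than a diagnosis: browsers often pick up system proxy settings automatically (WPAD/PAC auto-configuration), so traffic can be proxied even when no proxy is explicitly configured, while a plain Java client connects directly; a firewall that only allows proxied traffic then times out direct connections. Independently of that, setting explicit timeouts makes the failure fast and observable. A JDK-only sketch (the URL and timeout values are illustrative):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class TimeoutSketch {
    // Prepares a connection with explicit timeouts. Nothing is sent over
    // the network until connect() or getInputStream() is called.
    static HttpURLConnection prepare(String url, int timeoutMs) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setConnectTimeout(timeoutMs); // fail fast instead of hanging on connect
        conn.setReadTimeout(timeoutMs);    // and on slow reads
        return conn;
    }

    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = prepare("http://example.com/", 5000);
        System.out.println(conn.getConnectTimeout()); // 5000
    }
}
```

If a proxy does turn out to be in play, java.net.Proxy (or the standard http.proxyHost/http.proxyPort system properties) can route the Java client through it the same way the browser goes.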

Java - Stanford NLP - Process all files in directory

I am using Stanford NLP to do some NER analysis on txt files. The problem is that so far I have not been able to read all the files in a directory; I have only been able to read simple Strings. What should the next step be to read several files? I tried with an Iterator but it did not work.
Please see my code below:
import java.io.IOException;
import java.util.List;
import java.util.Properties;
import edu.stanford.nlp.ling.CoreLabel;
import edu.stanford.nlp.ling.CoreAnnotations.NamedEntityTagAnnotation;
import edu.stanford.nlp.ling.CoreAnnotations.PartOfSpeechAnnotation;
import edu.stanford.nlp.ling.CoreAnnotations.SentencesAnnotation;
import edu.stanford.nlp.ling.CoreAnnotations.TextAnnotation;
import edu.stanford.nlp.ling.CoreAnnotations.TokensAnnotation;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;
import edu.stanford.nlp.sentiment.SentimentCoreAnnotations;
import edu.stanford.nlp.util.CoreMap;
public class NLPtest2 {
    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        props.setProperty("annotators", "tokenize, ssplit, pos, lemma, parse, ner, dcoref, sentiment");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
        // how can we read all documents in a directory instead of just a String??
        String text = "I work at Lalalala Ltd. It is awesome";
        Annotation annotation = new Annotation(text);
        pipeline.annotate(annotation);
        // Annotation annotation = pipeline.process(text);
        List<CoreMap> sentences = annotation.get(SentencesAnnotation.class);
        for (CoreMap sentence : sentences) {
            String sentiment = sentence.get(SentimentCoreAnnotations.SentimentClass.class);
            System.out.println(sentiment + "\t" + sentence);
            // traversing the words in the current sentence;
            // a CoreLabel is a CoreMap with additional token-specific methods
            for (CoreLabel token : sentence.get(TokensAnnotation.class)) {
                String word = token.get(TextAnnotation.class);          // the text of the token
                String pos = token.get(PartOfSpeechAnnotation.class);   // the POS tag of the token
                String ne = token.get(NamedEntityTagAnnotation.class);  // the NER label of the token
                System.out.println("Text:" + word + "//" + "Part of Speech:" + pos + "//" + "Entity Recognition:" + ne);
            }
        }
    }
}
import edu.stanford.nlp.io.*;
import edu.stanford.nlp.util.*;
import java.util.*;

public class ReadFiles {
    public static void main(String[] args) {
        List<String> filePaths = IOUtils.linesFromFile(args[0]);
        for (String filePath : filePaths) {
            String fileContents = IOUtils.stringFromFile(filePath);
        }
    }
}

java.lang.ClassCastException: com.mysql.jdbc.JDBC4ResultSet cannot be cast to com.mysql.jdbc.ResultSet

Below is the error message I receive when trying to run my code:
java.lang.ClassCastException: com.mysql.jdbc.JDBC4ResultSet cannot be cast to com.mysql.jdbc.ResultSet
I don't understand why this error occurs, so could someone give me the reason, or better yet a fix for the problem? Below is the entire code.
package com.hotel.database;
import java.sql.Connection;
import java.sql.Date;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.Vector;
import com.mysql.jdbc.PreparedStatement;
import com.mysql.jdbc.ResultSet;
import com.mysql.jdbc.Statement;
import com.hotel.beans.AgentBean;
import com.hotel.beans.CheckInBean;
import com.hotel.beans.CompanyBean;
import com.hotel.beans.CustomerBean;
import com.hotel.beans.RecieptBean;
import com.hotel.beans.ReservationBean;
import com.hotel.beans.ReservationDetailsBean;
import com.hotel.beans.RoomClassBean;
import com.hotel.beans.RoomRateBean;
import com.hotel.beans.RoomStatusBean;
import com.hotel.beans.TransportRecieptBean;
import com.hotel.beans.TransportfareBean;
import com.hotel.beans.VehicleBean;
import com.hotel.beans.VehicleDriverBean;
import com.hotel.beans.VehicleTypeBean;
/**
 *
 * @author lenovo
 */
public class Transport {
    Connection conn = null;
    ResultSet rs = null;
    PreparedStatement stmt = null;
    private Vector v;
    private LinkedList[] lst;
    private int stId;
    private Statement state;
    private ResultSet res;

    public Transport() {
        conn = ConnectionProvider.getConnection();
    }

    public LinkedList[] getWalkInGuests() throws Exception {
        int count = 0;
        stmt = (PreparedStatement) conn.prepareStatement("select count(distinct customerId) from transport where reservationId=-1");
        rs = (ResultSet) stmt.executeQuery();
        if (rs.next()) {
            count = rs.getInt(1);
        }
        lst = new LinkedList[count];
        if (lst.length < 1)
            return lst;
        count = 0;
        stmt = (PreparedStatement) conn.prepareStatement("SELECT customer.customer_id,customer.firstName,customer.lastName,customer.address from customer where customer.customer_id IN (select customerId from transport where reservationid=-1)");
        // stmt.setInt(1, )
        rs = (ResultSet) stmt.executeQuery();
        while (rs.next()) {
            lst[count] = new LinkedList();
            lst[count].add(rs.getInt("customer.customer_id"));
            lst[count].add(rs.getString("customer.firstName"));
            lst[count].add(rs.getString("customer.lastName"));
            lst[count].add(rs.getString("customer.address"));
            count++;
        }
        return lst;
    }
}
You have to change your imports. These are not right:
import com.mysql.jdbc.DriverManager;
import com.mysql.jdbc.Connection;
import com.mysql.jdbc.PreparedStatement;
Use these instead:
import java.sql.DriverManager;
import java.sql.Connection;
import java.sql.PreparedStatement;
The driver returns objects typed as the java.sql interfaces, so mixing the concrete com.mysql.jdbc classes with java.sql types makes the casts fail at runtime. That is why the ClassCastException occurs; drop the casts entirely and program against the java.sql interfaces.
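The failing cast can be reproduced with plain JDK classes. In this sketch the class names are made up for illustration: the "driver" hands back an object that implements the shared interface but is not the concrete class the caller casts to, which is exactly the JDBC4ResultSet situation above.

```java
public class CastDemo {
    interface Result {}                            // stands in for java.sql.ResultSet
    static class Jdbc4Result implements Result {}  // what the driver actually returns
    static class OtherResult implements Result {}  // what the code wrongly casts to

    public static void main(String[] args) {
        Result r = new Jdbc4Result();
        Result viaInterface = r; // always safe: program to the interface
        try {
            OtherResult bad = (OtherResult) r; // throws ClassCastException at runtime
        } catch (ClassCastException e) {
            System.out.println("cast failed: " + e.getMessage());
        }
    }
}
```

The lesson carries over directly: declare rs as java.sql.ResultSet and stmt as java.sql.PreparedStatement, and no cast is needed at all.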
