Read multiple CSV files in Apache Beam using Java

This code works well with just one file as input, but when I pass
D://beam//csv//*.csv
or D://beam//csv//20*.csv as the parameter, it throws:
Exception in thread "main" org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.nio.file.InvalidPathException: Illegal char <*> at index 17: D:\\beam\\csv\\20*.csv
at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:332)
at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:302)
at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:197)
at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:64)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:299)
at beam.wordcount.TestCsv.main(TestCsv.java:60)
Caused by: java.nio.file.InvalidPathException: Illegal char <*> at index 17: D:\\beam\\csv\\20*.csv
at sun.nio.fs.WindowsPathParser.normalize(Unknown Source)
at sun.nio.fs.WindowsPathParser.parse(Unknown Source)
at sun.nio.fs.WindowsPathParser.parse(Unknown Source)
at sun.nio.fs.WindowsPath.parse(Unknown Source)
at sun.nio.fs.WindowsFileSystem.getPath(Unknown Source)
at java.nio.file.Paths.get(Unknown Source)
at org.apache.beam.sdk.io.LocalFileSystem.matchOne(LocalFileSystem.java:217)
at org.apache.beam.sdk.io.LocalFileSystem.match(LocalFileSystem.java:90)
at org.apache.beam.sdk.io.FileSystems.match(FileSystems.java:119)
at org.apache.beam.sdk.io.FileSystems.match(FileSystems.java:140)
at org.apache.beam.sdk.io.FileSystems.match(FileSystems.java:152)
at org.apache.beam.sdk.io.FileIO$MatchAll$MatchFn.process(FileIO.java:636)
I don't know why it is throwing this error; * is supposed to match multiple files of the same type.
CODE
public interface BatchOptions extends PipelineOptions {
    @Description("Path to the data file(s) containing game data.")
    @Default.String("D:\\beam\\csv\\2020.csv")
    String getInput();
    void setInput(String value);
}

public static void main(String[] args) {
    BatchOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().as(BatchOptions.class);
    Pipeline pipeline = Pipeline.create(options);
    PCollection<FileIO.ReadableFile> lines = pipeline
        .apply(FileIO.match().filepattern(options.getInput()))
        .apply(FileIO.readMatches());
    pipeline.run().waitUntilFinish();
}

WindowsFileSystem does not expand *; it treats it as an illegal character in the path.
I would recommend passing the complete directory instead, like
D://beam//csv//
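If you do need glob behaviour on Windows, one workaround is to expand the pattern yourself and hand the resulting paths to FileIO.matchAll(). This is only a sketch built around the FileIO transforms from the question (the directory and pattern are the ones from the question; adjust as needed):
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.FileIO;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;

public class ReadCsvFilesOnWindows {
    public static void main(String[] args) throws IOException {
        // Expand the glob with java.nio so Beam's LocalFileSystem never sees the '*'.
        List<String> csvFiles = new ArrayList<>();
        try (DirectoryStream<Path> dir =
                Files.newDirectoryStream(Paths.get("D:/beam/csv"), "20*.csv")) {
            for (Path p : dir) {
                csvFiles.add(p.toString());
            }
        }
        Pipeline pipeline = Pipeline.create();
        PCollection<FileIO.ReadableFile> files = pipeline
                .apply(Create.of(csvFiles))   // assumes at least one file matched
                .apply(FileIO.matchAll())     // each path is matched as its own filepattern
                .apply(FileIO.readMatches());
        pipeline.run().waitUntilFinish();
    }
}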

Related

ClassCastException occurs when Flink DataStream sends a message to a remote stateful function

The DataStream job:
public static final FunctionType DEVICE = new FunctionType("com.github.f1xman.era.anomalydetection.device", "DeviceFunction");
public static void main(String[] args) throws Exception {
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StatefulFunctionsConfig statefunConfig = StatefulFunctionsConfig.fromEnvironment(env);
statefunConfig.setFactoryType(MessageFactoryType.WITH_KRYO_PAYLOADS);
DataStreamSource<String> names = env.addSource(new NamesSourceFunction());
DataStream<RoutableMessage> namesIngress = names.map(name -> RoutableMessageBuilder.builder()
.withTargetAddress(DEVICE, name)
.withMessageBody(name)
.build());
StatefulFunctionDataStreamBuilder.builder("example")
.withDataStreamAsIngress(namesIngress)
.withRequestReplyRemoteFunction(
requestReplyFunctionBuilder(DEVICE, URI.create("http://localhost:8080/statefun"))
)
.withConfiguration(statefunConfig)
.build(env);
env.execute("Flink Streaming Java API Skeleton");
}
When a String value is passed to .withMessageBody(...), the following exception occurs:
Exception in thread "main" org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.runtime.minicluster.MiniClusterJobClient.lambda$getJobExecutionResult$3(MiniClusterJobClient.java:137)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.rpc.akka.AkkaInvocationHandler.lambda$invokeRpc$1(AkkaInvocationHandler.java:258)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.doForward(FutureUtils.java:1389)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$null$1(ClassLoadingUtils.java:93)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$guardCompletionWithContextClassLoader$2(ClassLoadingUtils.java:92)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.akka.AkkaFutureUtils$1.onComplete(AkkaFutureUtils.java:47)
at akka.dispatch.OnComplete.internal(Future.scala:300)
at akka.dispatch.OnComplete.internal(Future.scala:297)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:224)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:221)
at scala.concurrent.impl.CallbackRunnable.run$$$capture(Promise.scala:60)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala)
at org.apache.flink.runtime.concurrent.akka.AkkaFutureUtils$DirectExecutionContext.execute(AkkaFutureUtils.java:65)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:68)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:284)
at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:621)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:24)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:23)
at scala.concurrent.Future.$anonfun$andThen$1(Future.scala:532)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
at scala.concurrent.impl.CallbackRunnable.run$$$capture(Promise.scala:60)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:63)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:100)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:100)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
at java.util.concurrent.ForkJoinTask.doExec$$$capture(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:252)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:242)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:684)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:444)
at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:316)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:314)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage$$$capture(ActorCell.scala:580)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
... 5 more
Caused by: org.apache.flink.statefun.flink.core.functions.StatefulFunctionInvocationException: An error occurred when attempting to invoke function FunctionType(com.github.f1xman.era.anomalydetection.device, DeviceFunction).
at org.apache.flink.statefun.flink.core.functions.StatefulFunction.receive(StatefulFunction.java:50)
at org.apache.flink.statefun.flink.core.functions.ReusableContext.apply(ReusableContext.java:74)
at org.apache.flink.statefun.flink.core.functions.LocalFunctionGroup.processNextEnvelope(LocalFunctionGroup.java:60)
at org.apache.flink.statefun.flink.core.functions.Reductions.processEnvelopes(Reductions.java:164)
at org.apache.flink.statefun.flink.core.functions.Reductions.apply(Reductions.java:149)
at org.apache.flink.statefun.flink.core.functions.FunctionGroupOperator.processElement(FunctionGroupOperator.java:90)
at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
at org.apache.flink.statefun.flink.core.feedback.FeedbackUnionOperator.sendDownstream(FeedbackUnionOperator.java:180)
at org.apache.flink.statefun.flink.core.feedback.FeedbackUnionOperator.processElement(FeedbackUnionOperator.java:86)
at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask$StreamTaskNetworkOutput.emitRecord(OneInputStreamTask.java:233)
at org.apache.flink.streaming.runtime.io.AbstractStreamTaskNetworkInput.processElement(AbstractStreamTaskNetworkInput.java:134)
at org.apache.flink.streaming.runtime.io.AbstractStreamTaskNetworkInput.emitNext(AbstractStreamTaskNetworkInput.java:105)
at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65)
at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:496)
at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:203)
at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:809)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:761)
at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:958)
at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:937)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:766)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:575)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.flink.statefun.sdk.reqreply.generated.TypedValue
at org.apache.flink.statefun.flink.core.reqreply.RequestReplyFunction.invoke(RequestReplyFunction.java:118)
at org.apache.flink.statefun.flink.core.functions.StatefulFunction.receive(StatefulFunction.java:48)
... 25 more
However, sending a String value to an embedded function works fine. The workaround I've found is to wrap the value in a TypedValue:
.withMessageBody(TypedValue.newBuilder()
.setValue(ByteString.copyFrom(name, StandardCharsets.UTF_8))
.setHasValue(true)
.setTypename("example/Name")
.build()
)
This approach requires the receiver function to unwrap the TypedValue and deserialize the ByteString. It looks too low-level for this kind of API. I believe this is the wrong usage of the Stateful Functions SDK for Flink DataStream integration. What is the correct way to implement interoperability between a Flink DataStream and remote Stateful Functions?
The job is inspired by the official examples.

Javassist CannotCompileException when trying to add a line to create a Map

I'm trying to instrument a method to do the following task.
Task - Create a Map and insert values into it.
Adding System.out.println lines doesn't cause any exception, but when I add the line that creates the Map, it throws a CannotCompileException due to a missing ';'. When I print the final string, it doesn't seem to be missing any. What am I doing wrong here?
public void createInsertAt(CtMethod method, int lineNo, Map<String,String> parameterMap)
throws CannotCompileException {
StringBuilder atBuilder = new StringBuilder();
atBuilder.append("System.out.println(\"" + method.getName() + " is running\");");
atBuilder.append("java.util.Map<String,String> arbitraryMap = new java.util.HashMap<String,String>();");
for (Map.Entry<String,String> entry : parameterMap.entrySet()) {
}
System.out.println(atBuilder.toString());
method.insertAt(1, atBuilder.toString());
}
The string obtained by printing the output of the StringBuilder is:
System.out.println("prepareStatement is running");java.util.Map arbitraryMap = new java.util.HashMap();
The exception received is:
javassist.CannotCompileException: [source error] ; is missing
at javassist.CtBehavior.insertAt(CtBehavior.java:1207)
at javassist.CtBehavior.insertAt(CtBehavior.java:1134)
at org.wso2.das.javaagent.instrumentation.InstrumentationClassTransformer.createInsertAt(InstrumentationClassTransformer.java:126)
at org.wso2.das.javaagent.instrumentation.InstrumentationClassTransformer.instrumentMethod(InstrumentationClassTransformer.java:100)
at org.wso2.das.javaagent.instrumentation.InstrumentationClassTransformer.transform(InstrumentationClassTransformer.java:37)
at sun.instrument.TransformerManager.transform(TransformerManager.java:188)
at sun.instrument.InstrumentationImpl.transform(InstrumentationImpl.java:424)
at sun.instrument.InstrumentationImpl.retransformClasses0(Native Method)
at sun.instrument.InstrumentationImpl.retransformClasses(InstrumentationImpl.java:144)
at org.wso2.das.javaagent.instrumentation.Agent.premain(Agent.java:39)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sun.instrument.InstrumentationImpl.loadClassAndStartAgent(InstrumentationImpl.java:382)
at sun.instrument.InstrumentationImpl.loadClassAndCallPremain(InstrumentationImpl.java:397)
Caused by: compile error: ; is missing
at javassist.compiler.Parser.parseDeclarationOrExpression(Parser.java:594)
at javassist.compiler.Parser.parseStatement(Parser.java:277)
at javassist.compiler.Javac.compileStmnt(Javac.java:567)
at javassist.CtBehavior.insertAt(CtBehavior.java:1186)
... 15 more
(Is there any way to debug these kinds of issues?) Some help, please.
Javassist's compiler doesn't support generics. Either remove or comment them out:
.append("java.util.Map arbitraryMap = new java.util.HashMap();")
or
.append("java.util.Map/*<String,String>*/ arbitraryMap = new java.util.HashMap/*<String,String>*/();")
The latter is useful as a comment for yourself only; of course, it has no special meaning to Javassist.
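Applied to the method from the question, the change is to strip the generics only from the generated source string, not from the surrounding Java code. A minimal sketch (the loop body that fills the map is a hypothetical illustration; the original left it empty):
public void createInsertAt(CtMethod method, int lineNo, Map<String, String> parameterMap)
        throws CannotCompileException {
    StringBuilder atBuilder = new StringBuilder();
    atBuilder.append("System.out.println(\"" + method.getName() + " is running\");");
    // Raw types only: Javassist's compiler cannot parse the <String,String> part.
    atBuilder.append("java.util.Map arbitraryMap = new java.util.HashMap();");
    for (Map.Entry<String, String> entry : parameterMap.entrySet()) {
        // Hypothetical: generate plain, generics-free statements for each entry.
        atBuilder.append("arbitraryMap.put(\"" + entry.getKey() + "\", \""
                + entry.getValue() + "\");");
    }
    System.out.println(atBuilder.toString());
    method.insertAt(1, atBuilder.toString()); // same insertion point as the original
}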

Extract Wikipedia Infobox data

I want to extract the data from a Wikipedia infobox and came upon the code in "Wikipedia infobox extraction in Java", which suggests a method to do so with Java. I am not as handy with Java as I am with Python, so I am using wikixmlj-r43.jar in Eclipse with this code:
import edu.jhu.nlp.wikipedia.*;
public class InfoboxParser {
public static void main(String[] args) throws Exception{
WikiXMLParser parser = WikiXMLParserFactory.getSAXParser("/home/siddhartha/Documents/wiki/enwiki-latest-pages-articles.xml");
parser.setPageCallback(new PageCallbackHandler() {
public void process(WikiPage page) {
InfoBox infobox=page.getInfoBox();
//do something with info box
}
});
parser.parse();
}
}
I am getting the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/tools/bzip2/CBZip2InputStream
at edu.jhu.nlp.wikipedia.WikiXMLParserFactory.getSAXParser(WikiXMLParserFactory.java:15)
at parser.InfoboxParser.main(InfoboxParser.java:7)
Caused by: java.lang.ClassNotFoundException: org.apache.tools.bzip2.CBZip2InputStream
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 2 more
I have added the JAR in Eclipse under Properties > Java Build Path > Libraries. As far as I can tell, it is not able to find the CBZip2InputStream class.
Please help.
Response res = Jsoup.connect("http://en.wikipedia.org/wiki/Carbon")
.execute();
String html = res.body();
Document doc = Jsoup.parseBodyFragment(html);
Element body = doc.body();
Elements tables = body.getElementsByTag("table");// hasClass("infobox bordered");
for (Element table : tables) {
if (table.className().equalsIgnoreCase("infobox bordered")) {
System.out.println(table.outerHtml());
break;
    }
}
This might help you.
https://code.google.com/p/wikixmlj/source/browse/trunk/tests/InfoBoxTest.java?spec=svn40&r=40
Replace the link in the code (data/newton.xml) with this:
http://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=content&format=xml&titles=your_title&rvsection=0
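If you just want to see what that API call returns for an article before wiring it into a parser, a plain-JDK sketch like the one below is enough (the title is only an example; rvsection=0 returns the lead section, which is where the infobox markup lives):
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class FetchInfoboxWikitext {
    public static void main(String[] args) throws Exception {
        String title = "Isaac_Newton"; // example title only
        URL url = new URL("https://en.wikipedia.org/w/api.php"
                + "?action=query&prop=revisions&rvprop=content"
                + "&format=xml&titles=" + title + "&rvsection=0");
        // Print the raw XML response; the infobox template sits inside the revision content.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(url.openStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}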

Activating a user account on creation in AD LDS

So, I'm writing code that will create user accounts in AD LDS. I can create the user, but the account is disabled.
I want the user to be active and to be able to change their password. I've tried some of the things suggested in this post, but it hasn't helped me.
Here's my code:
ctx = getConnection(adminUser, adminPassword);
// Create attributes for the new user
Attributes attributes = new BasicAttributes(true);
// Main attributes for user
attributes.put("objectClass", "user");
attributes.put("name", user.getFullName());
attributes.put("ms-DS-User-Account-Control-Computed",
Integer.toString(UF_NORMAL_ACCOUNT + UF_PASSWORD_EXPIRED));
try {
ctx.createSubcontext(getDistinguishedName(user.getFullName()),
attributes);
System.out.println("User successfully added!");
} catch (NamingException e) {
e.printStackTrace();
}
When I run this, I get the following error:
javax.naming.directory.NoSuchAttributeException: [LDAP: error code 16 - 00000057: LdapErr: DSID-0C090D11, comment: Error in attribute conversion operation, data 0, v23f0]; remaining name 'CN=Samuel King,CN=Users,CN=Agents,DC=CHESA,DC=local'
at com.sun.jndi.ldap.LdapCtx.mapErrorCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.processReturnCode(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.c_createSubcontext(Unknown Source)
at com.sun.jndi.toolkit.ctx.ComponentDirContext.p_createSubcontext(Unknown Source)
at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.createSubcontext(Unknown Source)
at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.createSubcontext(Unknown Source)
at javax.naming.directory.InitialDirContext.createSubcontext(Unknown Source)
at com.ceiwc.ActiveDirectory.createUserAccount(ActiveDirectory.java:114)
at com.ceiwc.TestAD.main(TestAD.java:24)
If I change the line where I'm updating the ms-DS-User-Account-Control-Computed to:
attributes.put("ms-DS-User-Account-Control-Computed", UF_NORMAL_ACCOUNT
+ UF_PASSWORD_EXPIRED);
i get the following error:
javax.naming.directory.InvalidAttributeValueException: Malformed 'ms-DS-User-Account-Control-Computed' attribute value; remaining name 'CN=Samuel King,CN=Users,CN=Agents,DC=CHESA,DC=local'
at com.sun.jndi.ldap.LdapClient.encodeAttribute(Unknown Source)
at com.sun.jndi.ldap.LdapClient.add(Unknown Source)
at com.sun.jndi.ldap.LdapCtx.c_createSubcontext(Unknown Source)
at com.sun.jndi.toolkit.ctx.ComponentDirContext.p_createSubcontext(Unknown Source)
at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.createSubcontext(Unknown Source)
at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.createSubcontext(Unknown Source)
at javax.naming.directory.InitialDirContext.createSubcontext(Unknown Source)
at com.ceiwc.ActiveDirectory.createUserAccount(ActiveDirectory.java:116)
at com.ceiwc.TestAD.main(TestAD.java:24)
So, what am I doing wrong? Is this the proper way to activate the account? Does someone have any code to help me out?
Thanks!
NuAlphaMan,
I think the exception has something to do with the fact that you use the CN as the name of the attribute instead of the Ldap-Display-Name, which is msDS-User-Account-Control-Computed. The description can be found here: http://msdn.microsoft.com/en-us/library/windows/desktop/ms677840(v=vs.85).aspx.
As for the second question of how to activate an account, I've found that there is an attribute userAccountControl (http://msdn.microsoft.com/en-us/library/windows/desktop/ms680832(v=vs.85).aspx#win_2008_r2) with the flag 0x00000002 (ADS_UF_ACCOUNTDISABLE) that disables an account. The only thing that crosses my mind is to read the value and flip the bit.
Regards, Dmitry
NoSuchAttributeException: "Indicates that the attribute specified in the modify or compare operation does not exist in the entry."
Malformed 'ms-DS-User-Account-Control-Computed' attribute value: means the value you supplied has the wrong type for the attribute.
Here is my working example, which I checked against Active Directory 2008:
public void mapToContext(int userAccountControl, DirContextAdapter context) {
context.setAttributeValue("userAccountControl", disableAccount(userAccountControl));
}
private String disableAccount(int userAccountControl) {
userAccountControl |= AccountControlFlags.ACCOUNTDISABLE;
return String.valueOf(userAccountControl);
}
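Conversely, to enable an account you clear that same bit instead of setting it. A small sketch reusing the AccountControlFlags constant from the example above:
private String enableAccount(int userAccountControl) {
    // Clear the ADS_UF_ACCOUNTDISABLE bit (0x0002); all other flags stay intact.
    userAccountControl &= ~AccountControlFlags.ACCOUNTDISABLE;
    return String.valueOf(userAccountControl);
}
// usage: context.setAttributeValue("userAccountControl", enableAccount(currentValue));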

stackoverflow exception while using String match in java

For a little university project I'm doing, I need to extract code samples from HTML given as a string.
To be more precise, I need to get everything in that HTML string between <code> and </code>.
I'm writing in Java and using String.matches to do that.
My code:
public static ArrayList<String> extractByHTMLtagDelimiters(String source, String startDelimiter, String endDelimiter){
ArrayList<String> results = new ArrayList<String>();
if (source.matches("([\t\n\r]|.)*" + startDelimiter + "([\t\n\r]|.)*" + endDelimiter)){
//source has some code samples in it
//get array entries of the form: {Some code}</startDelimiter>{something else}
String[] splittedSource = source.split(startDelimiter);
for (String sourceMatch : splittedSource){
if (sourceMatch.matches("([\t\n\r]|.)*" + endDelimiter + "([\t\n\r]|.)*")){
//current string has code sample in it (with some body leftovers)
//the code sample located before the endDelimiter - extract it
String codeSample = (sourceMatch.split(endDelimiter))[0];
//add the code samples to results
results.add(codeSample);
}
}
}
    return results;
}
I've tried to extract the samples from some HTML of ~1300 characters and got a pretty massive exception (it goes on and on for a few dozen lines):
Exception in thread "main" java.lang.StackOverflowError
at java.util.regex.Pattern$Branch.match(Unknown Source)
at java.util.regex.Pattern$GroupHead.match(Unknown Source)
at java.util.regex.Pattern$Loop.match(Unknown Source)
at java.util.regex.Pattern$GroupTail.match(Unknown Source)
at java.util.regex.Pattern$BranchConn.match(Unknown Source)
at java.util.regex.Pattern$CharProperty.match(Unknown Source)
at java.util.regex.Pattern$Branch.match(Unknown Source)
at java.util.regex.Pattern$GroupHead.match(Unknown Source)
at java.util.regex.Pattern$Loop.match(Unknown Source)
at java.util.regex.Pattern$GroupTail.match(Unknown Source)
at java.util.regex.Pattern$BranchConn.match(Unknown Source)
at java.util.regex.Pattern$CharProperty.match(Unknown Source)
at java.util.regex.Pattern$Branch.match(Unknown Source)
at java.util.regex.Pattern$GroupHead.match(Unknown Source)
at java.util.regex.Pattern$Loop.match(Unknown Source)
at java.util.regex.Pattern$GroupTail.match(Unknown Source)
at java.util.regex.Pattern$BranchConn.match(Unknown Source)
at java.util.regex.Pattern$CharProperty.match(Unknown Source)
at java.util.regex.Pattern$Branch.match(Unknown Source)
at java.util.regex.Pattern$GroupHead.match(Unknown Source)
at java.util.regex.Pattern$Loop.match(Unknown Source)
I've found the following bug report:
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=5050507
Is there anything I can do to still use String.matches? If not, can you please recommend some other way to do it without implementing HTML parsing myself?
Thanks a lot,
Dub.
You can just manually go through the input string using String's indexOf() method to find the start and end delimiters and extract the bits in between yourself.
public static void main(String[] args) {
String source = "<html>blah<code>this is awesome</code>more junk</html>";
String startDelim = "<code>";
String endDelim = "</code>";
int start = source.indexOf(startDelim);
int end = source.indexOf(endDelim);
String code = source.substring(start + startDelim.length(), end);
System.out.println(code);
}
If you need to find more than one, then just use indexOf again starting at the point you finished:
int nextStart = source.indexOf(startDelim, end + endDelim.length());
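Putting that together, here is a sketch of a helper that collects every delimited block with indexOf alone (no regex involved):
import java.util.ArrayList;
import java.util.List;

public static List<String> extractAll(String source, String startDelim, String endDelim) {
    List<String> results = new ArrayList<String>();
    int start = source.indexOf(startDelim);
    while (start >= 0) {
        int end = source.indexOf(endDelim, start + startDelim.length());
        if (end < 0) {
            break; // unmatched start delimiter; stop scanning
        }
        results.add(source.substring(start + startDelim.length(), end));
        // continue searching after the end delimiter we just consumed
        start = source.indexOf(startDelim, end + endDelim.length());
    }
    return results;
}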
Simply replace your regex pattern with "(?s).*".
This matches anything, including newlines, as you intended.
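For the extraction itself, the same idea can be applied directly with Pattern and Matcher. A sketch using (?s) plus a reluctant quantifier, which sidesteps the ([\t\n\r]|.)* alternation that blew the stack:
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public static List<String> extractCodeSamples(String source) {
    List<String> results = new ArrayList<String>();
    // (?s) lets '.' match newlines; '.*?' stops at the first closing tag.
    Matcher m = Pattern.compile("(?s)<code>(.*?)</code>").matcher(source);
    while (m.find()) {
        results.add(m.group(1));
    }
    return results;
}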
