I know I can compile individual "snippets" in Scala using the Toolbox like this:
import scala.reflect.runtime.universe
import scala.tools.reflect.ToolBox

object Compiler {
  val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()

  def main(args: Array[String]): Unit = {
    tb.eval(tb.parse("""println("hello!")"""))
  }
}
Is there any way I can compile more than just "snippets", i.e., classes that refer to each other? Like this:
import scala.reflect.runtime.universe
import scala.tools.reflect.ToolBox

object Compiler {
  private val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()

  val a: String =
    """
      |package pkg {
      |
      |class A {
      |  def compute(): Int = 42
      |}}
    """.stripMargin

  val b: String =
    """
      |import pkg._
      |
      |class B {
      |  def fun(): Unit = {
      |    new A().compute()
      |  }
      |}
    """.stripMargin

  def main(args: Array[String]): Unit = {
    val compiledA = tb.parse(a)
    val compiledB = tb.parse(b)
    tb.eval(compiledB)
  }
}
Obviously, my snippet doesn't work, as I have to tell the toolbox how to resolve A somehow:
Exception in thread "main" scala.tools.reflect.ToolBoxError: reflective compilation has failed:
not found: type A
Try
import scala.reflect.runtime.universe._
import scala.reflect.runtime.universe
import scala.tools.reflect.ToolBox

val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()

val a = q"""
  class A {
    def compute(): Int = 42
  }"""

val symbA = tb.define(a)

val b = q"""
  class B {
    def fun(): Unit = {
      new $symbA().compute()
    }
  }"""

tb.eval(b)
https://github.com/scala/scala/blob/2.13.x/src/compiler/scala/tools/reflect/ToolBox.scala#L131-L138
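define compiles the tree into the toolbox's virtual directory and returns a Symbol that later trees can splice in (see the linked source). A sketch of continuing the chain, extrapolated from the answer above rather than quoted from it (the class name B2 is mine):

// Sketch: each tb.define returns a Symbol usable in subsequent quasiquotes,
// so classes that refer to each other can be introduced one at a time.
val symbB = tb.define(q"""
  class B2 {
    def fun(): Int = new $symbA().compute()
  }""")
println(tb.eval(q"new $symbB().fun()")) // 42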
In cases more complex than those the toolbox can handle, you can always run the compiler manually:
import scala.reflect.internal.util.{AbstractFileClassLoader, BatchSourceFile}
import scala.reflect.io.{AbstractFile, VirtualDirectory}
import scala.tools.nsc.{Global, Settings}
import scala.reflect.runtime
import scala.reflect.runtime.universe
import scala.reflect.runtime.universe._

val a: String =
  """
    |package pkg {
    |
    |class A {
    |  def compute(): Int = 42
    |}}
  """.stripMargin

val b: String =
  """
    |import pkg._
    |
    |class B {
    |  def fun(): Unit = {
    |    println(new A().compute())
    |  }
    |}
  """.stripMargin

val directory = new VirtualDirectory("(memory)", None)
compileCode(List(a, b), List(), directory)
val runtimeMirror = createRuntimeMirror(directory, runtime.currentMirror)
val bInstance = instantiateClass("B", runtimeMirror)
runClassMethod("B", runtimeMirror, "fun", bInstance) // 42

def compileCode(sources: List[String], classpathDirectories: List[AbstractFile], outputDirectory: AbstractFile): Unit = {
  val settings = new Settings
  classpathDirectories.foreach(dir => settings.classpath.prepend(dir.toString))
  settings.outputDirs.setSingleOutput(outputDirectory)
  settings.usejavacp.value = true
  val global = new Global(settings)
  val files = sources.zipWithIndex.map { case (code, i) => new BatchSourceFile(s"(inline-$i)", code) }
  (new global.Run).compileSources(files)
}

def instantiateClass(className: String, runtimeMirror: Mirror, arguments: Any*): Any = {
  val classSymbol = runtimeMirror.staticClass(className)
  val classType = classSymbol.typeSignature
  val constructorSymbol = classType.decl(termNames.CONSTRUCTOR).asMethod
  val classMirror = runtimeMirror.reflectClass(classSymbol)
  val constructorMirror = classMirror.reflectConstructor(constructorSymbol)
  constructorMirror(arguments: _*)
}

def runClassMethod(className: String, runtimeMirror: Mirror, methodName: String, classInstance: Any, arguments: Any*): Any = {
  val classSymbol = runtimeMirror.staticClass(className)
  val classType = classSymbol.typeSignature
  val methodSymbol = classType.decl(TermName(methodName)).asMethod
  val instanceMirror = runtimeMirror.reflect(classInstance)
  val methodMirror = instanceMirror.reflectMethod(methodSymbol)
  methodMirror(arguments: _*)
}

//def runObjectMethod(objectName: String, runtimeMirror: Mirror, methodName: String, arguments: Any*): Any = {
//  val objectSymbol = runtimeMirror.staticModule(objectName)
//  val objectModuleMirror = runtimeMirror.reflectModule(objectSymbol)
//  val objectInstance = objectModuleMirror.instance
//  val objectType = objectSymbol.typeSignature
//  val methodSymbol = objectType.decl(TermName(methodName)).asMethod
//  val objectInstanceMirror = runtimeMirror.reflect(objectInstance)
//  val methodMirror = objectInstanceMirror.reflectMethod(methodSymbol)
//  methodMirror(arguments: _*)
//}

def createRuntimeMirror(directory: AbstractFile, parentMirror: Mirror): Mirror = {
  val classLoader = new AbstractFileClassLoader(directory, parentMirror.classLoader)
  universe.runtimeMirror(classLoader)
}
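And because compileCode accepts classpath directories, a later batch can in principle be compiled against the output of an earlier one. A sketch extrapolated from the helpers above, not something the answer itself demonstrates:

// Sketch: compile B separately, against the directory that holds A.
val first = new VirtualDirectory("(memory-1)", None)
compileCode(List(a), List(), first)
val second = new VirtualDirectory("(memory-2)", None)
compileCode(List(b), List(first), second)
// Chain the class loaders so B (in `second`) can resolve A (in `first`).
val mirror1 = createRuntimeMirror(first, runtime.currentMirror)
val mirror2 = createRuntimeMirror(second, mirror1)
runClassMethod("B", mirror2, "fun", instantiateClass("B", mirror2)) // 42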
That's my project on writing a Minecraft plugin using Scala as the main language, supported by Java. I'm using Gradle with some settings to combine Scala and Java.
I had a working prototype of one of the modules and then did some refactoring, which somehow triggered the exception. As far as I've researched, there are only 3 to 8 search results related to the "NoDenotation.owner" exception. Now I have no idea how to resolve it, and there are no comments on it on Stack Overflow either.
Here is my build.gradle file (I'll fix the shadowing part later):
plugins {
    id 'java'
    id 'scala'
    id 'com.github.johnrengelman.shadow' version '7.1.2'
}

group = 'com.danikvitek'
version = '1.0'

sourceSets {
    main {
        scala.srcDirs = ["$projectDir/src/main/mixed"]
        resources.srcDirs = ["$projectDir/src/main/resources"]
    }
    test {
        scala.srcDirs = ["$projectDir/src/test/mixed"]
        resources.srcDirs = ["$projectDir/src/test/resources"]
    }
}

repositories {
    mavenCentral()
    maven {
        name = 'papermc-repo'
        url = 'https://papermc.io/repo/repository/maven-public/'
    }
    maven {
        name = 'sonatype'
        url = 'https://oss.sonatype.org/content/groups/public/'
    }
    maven {
        name = 'codemc-snapshots'
        url = 'https://repo.codemc.io/repository/maven-snapshots/'
    }
    maven { url "https://repo.dmulloy2.net/repository/public/" }
}

dependencies {
    implementation 'org.hibernate:hibernate-core:5.6.3.Final'
    implementation 'org.hibernate:hibernate-entitymanager:5.6.3.Final'
    implementation 'mysql:mysql-connector-java:8.0.25'
    implementation 'net.wesjd:anvilgui:1.5.3-SNAPSHOT'
    compileOnly 'io.papermc.paper:paper-api:1.17.1-R0.1-SNAPSHOT'
    // compileOnly group: "com.comphenix.protocol", name: "ProtocolLib", version: "4.7.0";
    compileOnly group: 'org.scala-lang', name: 'scala3-library_3', version: '3.1.0'
    testImplementation group: 'org.scala-lang', name: 'scala3-library_3', version: '3.1.0'
}

jar {
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
}

def targetJavaVersion = 16
java {
    def javaVersion = JavaVersion.toVersion(targetJavaVersion)
    sourceCompatibility = javaVersion
    targetCompatibility = javaVersion
    if (JavaVersion.current() < javaVersion) {
        toolchain.languageVersion = JavaLanguageVersion.of(targetJavaVersion)
    }
}

tasks.withType(JavaCompile).configureEach {
    if (targetJavaVersion >= 10 || JavaVersion.current().isJava10Compatible()) {
        options.release = targetJavaVersion
    }
}

processResources {
    def props = [version: version]
    inputs.properties props
    filteringCharset 'UTF-8'
    filesMatching('plugin.yml') {
        expand props
    }
}
And here is the exception stack trace:
## Exception when compiling 84 sources to D:\Desktop\Minecraft plugins\PoliticsCountryS\build\classes\scala\main
java.lang.AssertionError: NoDenotation.owner
dotty.tools.dotc.core.SymDenotations$NoDenotation$.owner(SymDenotations.scala:2503)
dotty.tools.dotc.typer.Typer.canAssign$1(Typer.scala:959)
dotty.tools.dotc.typer.Typer.typedAssign(Typer.scala:996)
dotty.tools.dotc.typer.Typer.typedUnnamed$1(Typer.scala:2762)
dotty.tools.dotc.typer.Typer.typedUnadapted(Typer.scala:2818)
dotty.tools.dotc.typer.Typer.typed(Typer.scala:2883)
dotty.tools.dotc.typer.Typer.typed(Typer.scala:2887)
dotty.tools.dotc.typer.Typer.traverse$1(Typer.scala:2936)
dotty.tools.dotc.typer.Typer.typedStats(Typer.scala:2959)
dotty.tools.dotc.typer.Typer.typedBlockStats(Typer.scala:1027)
dotty.tools.dotc.typer.Typer.typedBlock(Typer.scala:1031)
dotty.tools.dotc.typer.Typer.typedUnnamed$1(Typer.scala:2763)
dotty.tools.dotc.typer.Typer.typedUnadapted(Typer.scala:2818)
dotty.tools.dotc.typer.ProtoTypes$FunProto.$anonfun$5(ProtoTypes.scala:431)
dotty.tools.dotc.typer.ProtoTypes$FunProto.cacheTypedArg(ProtoTypes.scala:359)
dotty.tools.dotc.typer.ProtoTypes$FunProto.typedArg(ProtoTypes.scala:432)
dotty.tools.dotc.typer.Applications$ApplyToUntyped.typedArg(Applications.scala:853)
dotty.tools.dotc.typer.Applications$ApplyToUntyped.typedArg(Applications.scala:853)
dotty.tools.dotc.typer.Applications$Application.addTyped$1(Applications.scala:544)
dotty.tools.dotc.typer.Applications$Application.matchArgs(Applications.scala:609)
dotty.tools.dotc.typer.Applications$Application.matchArgs(Applications.scala:609)
dotty.tools.dotc.typer.Applications$Application.matchArgs(Applications.scala:609)
dotty.tools.dotc.typer.Applications$Application.init(Applications.scala:447)
dotty.tools.dotc.typer.Applications$TypedApply.<init>(Applications.scala:735)
dotty.tools.dotc.typer.Applications$ApplyToUntyped.<init>(Applications.scala:852)
dotty.tools.dotc.typer.Applications.ApplyTo(Applications.scala:1060)
dotty.tools.dotc.typer.Applications.ApplyTo$(Applications.scala:317)
dotty.tools.dotc.typer.Typer.ApplyTo(Typer.scala:107)
dotty.tools.dotc.typer.Applications.simpleApply$1(Applications.scala:907)
dotty.tools.dotc.typer.Applications.realApply$5$$anonfun$4(Applications.scala:986)
dotty.tools.dotc.typer.Typer.tryEither(Typer.scala:3011)
dotty.tools.dotc.typer.Applications.realApply$1(Applications.scala:997)
dotty.tools.dotc.typer.Applications.typedApply(Applications.scala:1035)
dotty.tools.dotc.typer.Applications.typedApply$(Applications.scala:317)
dotty.tools.dotc.typer.Typer.typedApply(Typer.scala:107)
dotty.tools.dotc.typer.Typer.typedUnnamed$1(Typer.scala:2755)
dotty.tools.dotc.typer.Typer.typedUnadapted(Typer.scala:2818)
dotty.tools.dotc.typer.Typer.typed(Typer.scala:2883)
dotty.tools.dotc.typer.Typer.typed(Typer.scala:2887)
dotty.tools.dotc.typer.Typer.typedExpr(Typer.scala:3003)
dotty.tools.dotc.typer.Typer.typedParent$2(Typer.scala:2321)
dotty.tools.dotc.typer.Typer.$anonfun$40(Typer.scala:2396)
dotty.tools.dotc.core.Decorators$ListDecorator$.loop$1(Decorators.scala:92)
dotty.tools.dotc.core.Decorators$ListDecorator$.mapconserve$extension(Decorators.scala:108)
dotty.tools.dotc.typer.Typer.typedClassDef(Typer.scala:2396)
dotty.tools.dotc.typer.Typer.typedTypeOrClassDef$2(Typer.scala:2743)
dotty.tools.dotc.typer.Typer.typedNamed$1(Typer.scala:2747)
dotty.tools.dotc.typer.Typer.typedUnadapted(Typer.scala:2817)
dotty.tools.dotc.typer.Typer.typed(Typer.scala:2883)
dotty.tools.dotc.typer.Typer.typed(Typer.scala:2887)
dotty.tools.dotc.typer.Typer.traverse$1(Typer.scala:2909)
dotty.tools.dotc.typer.Typer.typedStats(Typer.scala:2959)
dotty.tools.dotc.typer.Typer.typedPackageDef(Typer.scala:2532)
dotty.tools.dotc.typer.Typer.typedUnnamed$1(Typer.scala:2788)
dotty.tools.dotc.typer.Typer.typedUnadapted(Typer.scala:2818)
dotty.tools.dotc.typer.Typer.typed(Typer.scala:2883)
dotty.tools.dotc.typer.Typer.typed(Typer.scala:2887)
dotty.tools.dotc.typer.Typer.typedExpr(Typer.scala:3003)
dotty.tools.dotc.typer.TyperPhase.liftedTree1$1(TyperPhase.scala:56)
dotty.tools.dotc.typer.TyperPhase.typeCheck$$anonfun$1(TyperPhase.scala:62)
dotty.tools.dotc.core.Phases$Phase.monitor(Phases.scala:411)
dotty.tools.dotc.typer.TyperPhase.typeCheck(TyperPhase.scala:63)
dotty.tools.dotc.typer.TyperPhase.runOn$$anonfun$1(TyperPhase.scala:105)
scala.runtime.function.JProcedure1.apply(JProcedure1.java:15)
scala.runtime.function.JProcedure1.apply(JProcedure1.java:10)
scala.collection.immutable.List.foreach(List.scala:333)
dotty.tools.dotc.typer.TyperPhase.runOn(TyperPhase.scala:105)
dotty.tools.dotc.Run.runPhases$4$$anonfun$4(Run.scala:261)
scala.runtime.function.JProcedure1.apply(JProcedure1.java:15)
scala.runtime.function.JProcedure1.apply(JProcedure1.java:10)
scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
dotty.tools.dotc.Run.runPhases$5(Run.scala:272)
dotty.tools.dotc.Run.compileUnits$$anonfun$1(Run.scala:280)
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
dotty.tools.dotc.util.Stats$.maybeMonitored(Stats.scala:68)
dotty.tools.dotc.Run.compileUnits(Run.scala:289)
dotty.tools.dotc.Run.compileSources(Run.scala:222)
dotty.tools.dotc.Run.compile(Run.scala:206)
dotty.tools.dotc.Driver.doCompile(Driver.scala:39)
dotty.tools.dotc.Driver.process(Driver.scala:199)
dotty.tools.dotc.Main.process(Main.scala)
xsbt.CachedCompilerImpl.run(CachedCompilerImpl.java:67)
xsbt.CompilerInterface.run(CompilerInterface.java:59)
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.base/java.lang.reflect.Method.invoke(Method.java:568)
sbt.internal.inc.AnalyzingCompiler.call(AnalyzingCompiler.scala:248)
sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:122)
sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:95)
sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4(MixedAnalyzingCompiler.scala:91)
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:186)
sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$3(MixedAnalyzingCompiler.scala:82)
sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$3$adapted(MixedAnalyzingCompiler.scala:77)
sbt.internal.inc.JarUtils$.withPreviousJar(JarUtils.scala:215)
sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:77)
sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:146)
sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:343)
sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:343)
sbt.internal.inc.Incremental$.doCompile(Incremental.scala:120)
sbt.internal.inc.Incremental$.$anonfun$compile$4(Incremental.scala:100)
sbt.internal.inc.IncrementalCommon.recompileClasses(IncrementalCommon.scala:180)
sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:98)
sbt.internal.inc.Incremental$.$anonfun$compile$3(Incremental.scala:102)
sbt.internal.inc.Incremental$.manageClassfiles(Incremental.scala:155)
sbt.internal.inc.Incremental$.compile(Incremental.scala:92)
sbt.internal.inc.IncrementalCompile$.apply(Compile.scala:75)
sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:348)
sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:301)
sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:168)
sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:248)
sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:74)
org.gradle.api.internal.tasks.scala.ZincScalaCompiler.execute(ZincScalaCompiler.java:157)
org.gradle.api.internal.tasks.scala.ZincScalaCompilerFacade.execute(ZincScalaCompilerFacade.java:47)
org.gradle.api.internal.tasks.scala.ZincScalaCompilerFacade.execute(ZincScalaCompilerFacade.java:32)
org.gradle.api.internal.tasks.compile.daemon.AbstractDaemonCompiler$CompilerWorkAction.execute(AbstractDaemonCompiler.java:135)
org.gradle.workers.internal.DefaultWorkerServer.execute(DefaultWorkerServer.java:63)
org.gradle.workers.internal.AbstractClassLoaderWorker$1.create(AbstractClassLoaderWorker.java:49)
org.gradle.workers.internal.AbstractClassLoaderWorker$1.create(AbstractClassLoaderWorker.java:43)
org.gradle.internal.classloader.ClassLoaderUtils.executeInClassloader(ClassLoaderUtils.java:97)
org.gradle.workers.internal.AbstractClassLoaderWorker.executeInClassLoader(AbstractClassLoaderWorker.java:43)
org.gradle.workers.internal.IsolatedClassloaderWorker.run(IsolatedClassloaderWorker.java:49)
org.gradle.workers.internal.IsolatedClassloaderWorker.run(IsolatedClassloaderWorker.java:30)
org.gradle.workers.internal.WorkerDaemonServer.run(WorkerDaemonServer.java:85)
org.gradle.workers.internal.WorkerDaemonServer.run(WorkerDaemonServer.java:55)
org.gradle.process.internal.worker.request.WorkerAction$1.call(WorkerAction.java:138)
org.gradle.process.internal.worker.child.WorkerLogEventListener.withWorkerLoggingProtocol(WorkerLogEventListener.java:41)
org.gradle.process.internal.worker.request.WorkerAction.run(WorkerAction.java:135)
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.base/java.lang.reflect.Method.invoke(Method.java:568)
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:61)
java.base/java.lang.Thread.run(Thread.java:833)
Here are the places where the new code was added:
PMenuCommand.scala:
package com.danikvitek.politicscountry.ui.command
import com.danikvitek.politicscountry.ui.gui.prefab.PMenu
import com.danikvitek.politicscountry.ui.gui.{Menu, MenuHandler}
import com.danikvitek.politicscountry.utils.Translation
import org.bukkit.{Bukkit, ChatColor}
import org.bukkit.command.{Command, CommandSender, TabExecutor}
import org.bukkit.entity.Player
import org.bukkit.inventory.Inventory
import java.util
class PMenuCommand extends TabExecutor {
  override def onCommand(sender: CommandSender, command: Command,
                         alias: String, args: Array[String]): Boolean = {
    sender match {
      case player: Player =>
        if player hasPermission "politicscountry.command.pmenu" then {
          val menu: Menu = new PMenu(player)
          MenuHandler.openMenu(player, menu)
        }
        else
          player sendMessage Translation.`you have no permission to use this command`
      case _ => sender sendMessage (ChatColor.RED.toString + "You need to be a player to use this command")
    }
    true
  }

  override def onTabComplete(sender: CommandSender, command: Command,
                             alias: String, args: Array[String]): util.List[String] = {
    new util.ArrayList[String]
  }
}
PMenu.scala:
package com.danikvitek.politicscountry.ui.gui.prefab
import com.danikvitek.politicscountry.ui.gui.{Button, Menu, MenuHandler}
import com.danikvitek.politicscountry.utils.{ItemBuilder, Translation}
import org.bukkit.{Bukkit, ChatColor, Material}
import org.bukkit.event.inventory.InventoryClickEvent
import org.bukkit.inventory.{Inventory, ItemStack}
import com.danikvitek.politicscountry.api.controller.{FormationManager, ResidentManager}
import com.danikvitek.politicscountry.RichPlayer
import com.danikvitek.politicscountry.gameplay.residency.Resident
import net.kyori.adventure.text.Component
import net.wesjd.anvilgui.AnvilGUI
import org.bukkit.entity.Player
import scala.collection.mutable
class PMenu(player: Player)
  extends DoubleChestMenu(Component.text(Translation.`pmenu inventory title`)) {

  private lazy val resident = player.toResident

  private lazy val statusButton = new Button(Material.RED_STAINED_GLASS_PANE, Translation.`status`) {
    override def onClick(menu: Menu, event: InventoryClickEvent): Unit = {
      event setCancelled true
      val statusMenu: Menu = new Status(resident)
      MenuHandler.closeMenu(player)
      MenuHandler.openMenu(player, statusMenu)
    }
  }

  private lazy val createFormationButton = new Button(Material.RED_STAINED_GLASS_PANE, Translation.`create formation`) {
    override def onClick(menu: Menu, event: InventoryClickEvent): Unit = {
      event setCancelled true
      var title: String = null
      new AnvilGUI.Builder()
        .title(Translation.`create formation`)
        .itemLeft(new ItemStack(Material.PAPER))
        .text(Translation.`input formation title`)
        .onClose { p =>
          MenuHandler.closeMenu(p)
          p.performCommand("pmenu")
        }
        .onComplete { (p, t) =>
          title = t
          AnvilGUI.Response.close()
        }
        .open(player)
      val chunk = player.getLocation.getChunk()
      FormationManager.create(title, chunk, resident)
    }
  }

  setButton(24, statusButton)
  if resident.feud.isEmpty then
    setButton(13, createFormationButton)
}
SingleChestMenu.scala:
package com.danikvitek.politicscountry.ui.gui.prefab
import com.danikvitek.politicscountry.ui.gui.{Button, Menu}
import com.danikvitek.politicscountry.utils.Translation
import net.kyori.adventure.text.Component
import org.bukkit.{Bukkit, Material}
import org.bukkit.event.inventory.InventoryClickEvent
abstract class SingleChestMenu(title: Component)
  extends Menu(Bukkit.createInventory(null, 27, title)) {

  def this(title: String) = this(Component.text(title))

  for {
    slot <- Seq(0, 1, 7, 8, 18, 19, 25, 26)
  } setButton(slot, new Button(Material.BLUE_STAINED_GLASS_PANE, " ") {
    override def onClick(menu: Menu, event: InventoryClickEvent): Unit = event setCancelled true
  })
}
DoubleChestMenu.scala:
package com.danikvitek.politicscountry.ui.gui.prefab
import com.danikvitek.politicscountry.ui.gui.{Button, Menu}
import com.danikvitek.politicscountry.utils.Translation
import net.kyori.adventure.text.Component
import org.bukkit.event.inventory.InventoryClickEvent
import org.bukkit.{Bukkit, Material}
abstract class DoubleChestMenu(title: Component)
  extends Menu(Bukkit.createInventory(null, 54, title)) {

  def this(title: String) = this(Component text title)

  for {
    slot <- Seq(0, 1, 2, 6, 7, 8, 9, 17, 36, 44, 45, 46, 47, 51, 52, 53)
  } setButton(slot, new Button(Material.BLUE_STAINED_GLASS_PANE, " ") {
    override def onClick(menu: Menu, event: InventoryClickEvent): Unit = event setCancelled true
  })
}
Status.scala:
package com.danikvitek.politicscountry.ui.gui.prefab
import com.danikvitek.politicscountry.gameplay.residency.Resident
import com.danikvitek.politicscountry.ui.gui.{Button, Menu, MenuHandler}
import com.danikvitek.politicscountry.utils.{ItemBuilder, Translation}
import com.danikvitek.politicscountry.api.controller.ResidentManager
import com.danikvitek.politicscountry.data.model.FormationType
import net.kyori.adventure.text.Component
import net.kyori.adventure.text.format.{NamedTextColor, TextColor}
import net.kyori.adventure.util.RGBLike
import org.bukkit.{Bukkit, ChatColor, Material}
import org.bukkit.event.inventory.InventoryClickEvent
class Status(resident: Resident) extends SingleChestMenu(Translation.`status`) {
  private val residentStatus = resident.getStatus

  setButton(11, new Button(
    new ItemBuilder(Material.RED_STAINED_GLASS_PANE)
      .setDisplayName(Translation.`status - resident role`)
      .addLore(Translation.`status - resident role map`(residentStatus._1))
      .build
  ) {
    override def onClick(menu: Menu, event: InventoryClickEvent): Unit = event setCancelled true
  })

  setButton(13, new Button(
    new ItemBuilder(Material.RED_STAINED_GLASS_PANE)
      .setDisplayName(Translation.`status - formation`)
      .setLore {
        val feud = residentStatus._2
        val commonwealth = residentStatus._3
        val state = residentStatus._4
        val alliance = residentStatus._5
        var lore = List.empty[String]
        feud foreach { f =>
          val typeName = f.typeName
            .getOrElse(Translation.`status - formation type map`(FormationType.FEUD))
          lore = lore :+
            (Translation.`status - formation type color` + typeName + ": " + f.title)
        }
        commonwealth foreach { c =>
          val typeName = c.typeName
            .getOrElse(Translation.`status - formation type map`(FormationType.COMMONWEALTH))
          lore = lore :+
            (Translation.`status - formation type color` + typeName + ": " + c.title)
        }
        state foreach { s =>
          val typeName = s.typeName
            .getOrElse(Translation.`status - formation type map`(FormationType.STATE))
          lore = lore :+
            (Translation.`status - formation type color` + typeName + ": " + s.title)
        }
        alliance foreach { a =>
          val typeName = a.typeName
            .getOrElse(Translation.`status - formation type map`(FormationType.ALLIANCE))
          lore = lore :+
            (Translation.`status - formation type color` + typeName + ": " + a.title)
        }
        if lore.sizeIs == 0 then lore = lore :+
          (Translation.`status - formation type color` + Translation.`status - formation type nothing`)
        lore
      }
      .build
  ) {
    override def onClick(menu: Menu, event: InventoryClickEvent): Unit = event setCancelled true
  })

  setButton(15, new Button(
    new ItemBuilder(Material.RED_STAINED_GLASS_PANE)
      .setDisplayName(Translation.`status - resident plots`)
      .setLore {
        val distances = residentStatus._6
          .toSeq
          .map { rp =>
            val homePlotChunk = resident.feud.get.homePlot.chunk
            math.hypot(homePlotChunk.getX - rp.chunk.getX, homePlotChunk.getZ - rp.chunk.getZ)
          }
        val chunks = residentStatus._6
          .toSeq
          .map(_.chunk)
        var plotsList = (distances zip chunks)
          .sortWith(_._1 < _._1)
          .map(_._2)
          .map { chunk =>
            Translation.`status - resident plot color` + Translation.`status - plot at` +
              s" (${chunk.getX}, ${chunk.getZ})"
          }
        if plotsList.sizeIs == 0 then plotsList = plotsList :+
          (Translation.`status - resident plot color` + Translation.`status - no plots`)
        plotsList
      }
      .build
  ) {
    override def onClick(menu: Menu, event: InventoryClickEvent): Unit = event setCancelled true
  })

  setButton(19, new Button(
    Material.YELLOW_STAINED_GLASS_PANE,
    Translation.`menu - back`
  ) {
    override def onClick(menu: Menu, event: InventoryClickEvent): Unit = {
      event setCancelled true
      val player = resident.player
      MenuHandler closeMenu player
      player performCommand "pmenu"
    }
  })
}
I am attempting to read Avro data from Kafka using Spark Streaming, but I receive the following error message:
Streaming Query Exception caught!: org.apache.spark.sql.streaming.StreamingQueryException: Job aborted.
=== Streaming Query ===
Identifier: [id = 8b54c92d-6bbc-4dbc-84d0-55b762c21ba2, runId = 4bc92b3c-343e-4886-b0bc-0777b89f9ec8]
Current Committed Offsets: {KafkaV2[Subscribe[customer-avro4]]: {"customer-avro":{"0":17}}}
Current Available Offsets: {KafkaV2[Subscribe[customer-avro4]]: {"customer-avro":{"0":20}}}
Current State: ACTIVE
Thread State: RUNNABLE
Any idea what the issue might be and how to resolve it? The code is below (inspired by the xebia-france spark-structured-streaming blog). I think it actually ran earlier, but now there is a problem.
import com.databricks.spark.avro.SchemaConverters
import io.confluent.kafka.schemaregistry.client.{CachedSchemaRegistryClient, SchemaRegistryClient}
import io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer
import org.apache.avro.Schema
import org.apache.avro.generic.GenericRecord
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.StreamingQueryException
object AvroConsumer {
  private val topic = "customer-avro4"
  private val kafkaUrl = "http://localhost:9092"
  private val schemaRegistryUrl = "http://localhost:8081"

  private val schemaRegistryClient = new CachedSchemaRegistryClient(schemaRegistryUrl, 128)
  private val kafkaAvroDeserializer = new AvroDeserializer(schemaRegistryClient)

  private val avroSchema = schemaRegistryClient.getLatestSchemaMetadata(topic + "-value").getSchema
  private val sparkSchema = SchemaConverters.toSqlType(new Schema.Parser().parse(avroSchema))

  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder
      .appName("ConfluentConsumer")
      .master("local[*]")
      .getOrCreate()

    spark.sparkContext.setLogLevel("ERROR")

    spark.udf.register("deserialize", (bytes: Array[Byte]) =>
      DeserializerWrapper.deserializer.deserialize(bytes)
    )

    val kafkaDataFrame = spark
      .readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", kafkaUrl)
      .option("subscribe", topic)
      .load()

    val valueDataFrame = kafkaDataFrame.selectExpr("""deserialize(value) AS message""")

    import org.apache.spark.sql.functions._

    val formattedDataFrame = valueDataFrame.select(
      from_json(col("message"), sparkSchema.dataType).alias("parsed_value"))
      .select("parsed_value.*")

    val writer = formattedDataFrame
      .writeStream
      .format("parquet")
      .option("checkpointLocation", "hdfs://localhost:9000/data/spark/parquet/checkpoint")

    while (true) {
      val query = writer.start("hdfs://localhost:9000/data/spark/parquet/total")
      try {
        query.awaitTermination()
      }
      catch {
        case e: StreamingQueryException => println("Streaming Query Exception caught!: " + e)
      }
    }
  }

  object DeserializerWrapper {
    val deserializer: AvroDeserializer = kafkaAvroDeserializer
  }

  class AvroDeserializer extends AbstractKafkaAvroDeserializer {
    def this(client: SchemaRegistryClient) {
      this()
      this.schemaRegistry = client
    }

    override def deserialize(bytes: Array[Byte]): String = {
      val genericRecord = super.deserialize(bytes).asInstanceOf[GenericRecord]
      genericRecord.toString
    }
  }
}
Figured it out: the problem was not with the Spark-Kafka integration directly, as I had thought, but with the checkpoint information inside the HDFS filesystem. Deleting and recreating the checkpoint folder in HDFS solved it for me.
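For reference, the same cleanup can be done programmatically. A minimal sketch, assuming the Hadoop client is on the classpath and the paths match the code above:

import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Recursively delete the stale checkpoint directory (path taken from the
// checkpointLocation option above) so the stream starts from a clean state.
val fs = FileSystem.get(new URI("hdfs://localhost:9000"), new Configuration())
fs.delete(new Path("/data/spark/parquet/checkpoint"), true)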
I am trying to expand a bunch of AD groups using Scala. Based on the code given here:
http://www.thetekblog.com/2010/06/active-directory-with-ldap-retrieving-all-members-of-a-group/
I wrote the following code:
package com.abhi
import java.util
import javax.naming.ldap._
import javax.naming._
import java.util.Hashtable
import javax.naming.directory.{SearchControls, SearchResult}
object LDAPScala extends App {
  val base = "ou=Foo,dc=MYCOMPANY,dc=COM"
  val env = new util.Hashtable[String, String]()
  env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory")
  env.put(Context.SECURITY_AUTHENTICATION, "simple")
  env.put(Context.SECURITY_PRINCIPAL, "foo#mycompany.com")
  env.put(Context.SECURITY_CREDENTIALS, "Bar")
  env.put(Context.PROVIDER_URL, "ldap://ldapserver.mycompany.com:389")

  val groupList = List("Group1", "Group2", "Group3")

  try {
    val ctx = new InitialLdapContext(env, null)
    val searchCtls = new SearchControls()
    searchCtls.setSearchScope(SearchControls.SUBTREE_SCOPE)
    val attributes = Array("member", "memberof")
    searchCtls.setReturningAttributes(attributes)
    for {
      group <- groupList
    } {
      val searchFilter = s"(&(objectCategory=group)(name=${group}))"
      val answers = ctx.search(base, searchFilter, searchCtls)
      while (answers.hasMoreElements) {
        val answer = answers.next()
        val attributes = answer.getAttributes.getAll
        while (attributes.hasMore) {
          val attr = attributes.nextElement()
          val everyone = attr.getAll
          while (everyone.hasMore) {
            val person = everyone.next()
            println(person)
          }
        }
      }
    }
  } catch {
    case e: Exception =>
      println(e.getMessage)
      println(e.getStackTrace)
  }
}
This code works, and I can see a list of users in each group, like this:
CN=User1,OU=Users,OU=Accounts,OU=tor,OU=CA,OU=AMER,OU=Regions,DC=FOO,DC=COM
CN=User2,OU=Users,OU=Accounts,OU=LON,OU=UK,OU=EMEA,OU=Regions,DC=FOO,DC=COM
CN=User3,OU=Users,OU=Accounts,OU=pla,OU=US,OU=AMER,OU=Regions,DC=FOO,DC=COM
Three questions:
1. I need the login IDs (I think they are called sAMAccountNames), but here the CN contains people's actual names, not their login IDs.
2. Will this give me all the members? I remember that AD has some kind of limitation where it truncates the list of users in a group if there are too many.
3. I don't know if my code above will work if there are groups within groups.
I was able to convert the CNs to sAMAccountNames. The final code is:
package com.abhi
import java.util
import javax.naming.ldap._
import javax.naming._
import javax.naming.directory.SearchControls
import scala.collection.mutable.ArrayBuffer
object LDAPScala extends App {
  val base = "dc=FOO,dc=COM"
  val env = new util.Hashtable[String, String]()
  env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory")
  env.put(Context.SECURITY_AUTHENTICATION, "simple")
  env.put(Context.SECURITY_PRINCIPAL, "user#foo.com")
  env.put(Context.SECURITY_CREDENTIALS, "pass")
  env.put(Context.PROVIDER_URL, "ldap://adserver.foo.com:389")

  val groupList = List("group1", "group2", "group3")

  try {
    val people = for {
      group <- groupList
      cn <- queryAD(base, s"(&(objectCategory=group)(name=${group}))", "member")
      sam <- queryAD(cn, "(sAMAccountName=*)", "samaccountname")
      name <- getName(cn)
    } yield (group, sam, name)
    people.foreach { case (g, s, n) => println(s"$g,$s,$n") }
  } catch {
    case e: Exception =>
      println(e.getMessage)
      println(e.getStackTrace)
  }

  def getName(cn: String): Option[String] = {
    val regex = """^CN=([\w\s\d]*),.*$""".r
    cn match {
      case regex(name) => Some(name)
      case _ => None
    }
  }

  def queryAD(base: String, searchFilter: String, attribute: String): List[String] = {
    val ctx = new InitialLdapContext(env, null)
    val searchCtls = new SearchControls()
    searchCtls.setSearchScope(SearchControls.SUBTREE_SCOPE)
    searchCtls.setReturningAttributes(Array(attribute))
    val answers = ctx.search(base, searchFilter, searchCtls)
    val retVal = ArrayBuffer[String]()
    while (answers.hasMoreElements) {
      val answer = answers.next()
      val member = answer.getAttributes.get(attribute).getAll
      while (member.hasMoreElements) {
        val person = member.next().toString
        retVal += person
      }
    }
    retVal.toList
  }
}
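On question 3: plain member lookups do not traverse nested groups, but Active Directory has a transitive matching rule that does. A sketch reusing the queryAD helper above; the group DN here is hypothetical, not from the post:

// LDAP_MATCHING_RULE_IN_CHAIN (OID 1.2.840.113556.1.4.1941) makes AD evaluate
// group membership transitively, so members of nested groups are returned too.
val groupDn = "CN=group1,OU=Groups,DC=FOO,DC=COM" // hypothetical DN
val nestedFilter = s"(&(objectCategory=person)(memberOf:1.2.840.113556.1.4.1941:=$groupDn))"
val nestedMembers = queryAD(base, nestedFilter, "samaccountname")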
How can I call this function from Java? Or do I need a wrapper in Scala?
package com.datastax.spark.connector

class DataFrameFunctions(dataFrame: DataFrame) extends Serializable {
  ...
  def createCassandraTable(
      keyspaceName: String,
      tableName: String,
      partitionKeyColumns: Option[Seq[String]] = None,
      clusteringKeyColumns: Option[Seq[String]] = None)(
      implicit
      connector: CassandraConnector = CassandraConnector(sparkContext.getConf)): Unit = {
  ...
I used the following code:
DataFrameFunctions frameFunctions = new DataFrameFunctions(dfTemp2);
Seq<String> argumentsSeq1 = JavaConversions.asScalaBuffer(Arrays.asList("CategoryName")).seq();
Option<Seq<String>> some1 = new Some<Seq<String>>(argumentsSeq1);
Seq<String> argumentsSeq2 = JavaConversions.asScalaBuffer(Arrays.asList("DealType")).seq();
Option<Seq<String>> some2 = new Some<Seq<String>>(argumentsSeq2);
frameFunctions.createCassandraTable("coupons", "IdealFeeds", some1, some2, connector);
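Since the question also asks about a Scala-side wrapper: below is a minimal sketch of one with a Java-friendly signature. The object name, imports, and null-handling are my assumptions, not from the post:

import com.datastax.spark.connector.DataFrameFunctions
import com.datastax.spark.connector.cql.CassandraConnector
import org.apache.spark.sql.DataFrame
import java.util.{List => JList}
import scala.collection.JavaConverters._

// Hypothetical wrapper: exposes createCassandraTable with java.util.List
// parameters so Java callers avoid the Seq/Option plumbing shown above.
object CassandraTableHelper {
  def createTable(df: DataFrame, keyspace: String, table: String,
                  partitionKeys: JList[String], clusteringKeys: JList[String],
                  connector: CassandraConnector): Unit =
    new DataFrameFunctions(df).createCassandraTable(
      keyspace, table,
      Option(partitionKeys).map(_.asScala.toSeq),  // null becomes None
      Option(clusteringKeys).map(_.asScala.toSeq))(connector)
}

From Java this would then read: CassandraTableHelper.createTable(dfTemp2, "coupons", "IdealFeeds", Arrays.asList("CategoryName"), Arrays.asList("DealType"), connector);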
Edit: using Kryo 1.04.
I'm currently serializing a User class in Scala that contains a java.sql.Timestamp field. For some reason, Kryo can't find a zero-arg constructor and throws an error:
Caused by: com.esotericsoftware.kryo.SerializationException: Class cannot be created (missing no-arg constructor): java.sql.Timestamp
Serialization trace:
created (com.threetierlogic.AccountService.models.User)
at com.esotericsoftware.kryo.Kryo.newInstance(Kryo.java:688)
at com.esotericsoftware.kryo.Serializer.newInstance(Serializer.java:75)
at com.esotericsoftware.kryo.serialize.FieldSerializer.readObjectData(FieldSerializer.java:200)
at com.esotericsoftware.kryo.serialize.FieldSerializer.readObjectData(FieldSerializer.java:220)
at com.esotericsoftware.kryo.serialize.FieldSerializer.readObjectData(FieldSerializer.java:200)
at com.esotericsoftware.kryo.Serializer.readObject(Serializer.java:61)
at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:589)
... 84 more
Caused by: java.lang.InstantiationException: java.sql.Timestamp
at java.lang.Class.newInstance0(Class.java:340)
at java.lang.Class.newInstance(Class.java:308)
at com.esotericsoftware.kryo.Kryo.newInstance(Kryo.java:676)
... 90 more
This is part of a converter class that converts domain objects for Riak. Here's the class:
/**
* Kryo converter for passing domain objects into Riak
*/
class UserConverter(val bucket: String) extends Converter[User] {
  def fromDomain(domainObject: User, vclock: VClock): IRiakObject = {
    val key = domainObject.guid
    if (key == null) throw new NoKeySpecifedException(domainObject)
    val kryo = new Kryo()
    kryo.register(classOf[User])
    kryo.register(classOf[Timestamp])
    val ob = new ObjectBuffer(kryo)
    val value = ob.writeObject(domainObject)
    RiakObjectBuilder.newBuilder(bucket, key)
      .withValue(value)
      .withVClock(vclock)
      .withContentType(Constants.CTYPE_OCTET_STREAM)
      .build()
  }

  def toDomain(riakObject: IRiakObject): User = {
    if (riakObject == null) null
    val kryo = new Kryo()
    kryo.register(classOf[User])
    kryo.register(classOf[Timestamp])
    val ob = new ObjectBuffer(kryo)
    ob.readObject(riakObject.getValue(), classOf[User])
  }
}
Do I need to extend Timestamp and create a zero-argument constructor, or is there a better workaround?
If I need to upgrade to 2.20, what's the replacement for ObjectBuffer without writing to a file?
A quick look at the Kryo home page suggests that, in the absence of a zero-arg constructor, you can create what Kryo calls an "instantiator strategy" to handle that class. Look in the "Object Creation" section.
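With newer Kryo versions, that strategy can be installed globally. A minimal sketch, assuming Kryo 2.x with Objenesis on the classpath; this is my illustration, not code quoted from the Kryo docs:

import com.esotericsoftware.kryo.Kryo
import org.objenesis.strategy.StdInstantiatorStrategy

val kryo = new Kryo()
// The Objenesis-based strategy creates instances without invoking any
// constructor, so java.sql.Timestamp no longer needs a no-arg constructor.
kryo.setInstantiatorStrategy(new StdInstantiatorStrategy())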
You can do something like this:
class KryoSO {
  import com.esotericsoftware.kryo.KryoSerializable
  import de.javakaffee.kryoserializers.KryoReflectionFactorySupport
  import com.esotericsoftware.kryo.Kryo
  import com.esotericsoftware.kryo.Serializer
  import java.io.{ InputStream, OutputStream }
  import com.esotericsoftware.kryo.io.{ Output, Input }
  import java.sql.Timestamp

  object TimestampSerializer extends Serializer[Timestamp] {
    override def write(kryo: Kryo, output: Output, t: Timestamp): Unit = {
      output.writeLong(t.getTime(), true)
    }
    override def read(kryo: Kryo, input: Input, t: Class[Timestamp]): Timestamp = {
      new Timestamp(input.readLong(true))
    }
    override def copy(kryo: Kryo, original: Timestamp): Timestamp = {
      new Timestamp(original.getTime())
    }
  }

  val kryo: Kryo = new KryoReflectionFactorySupport
  kryo.addDefaultSerializer(classOf[Timestamp], TimestampSerializer)

  def serialize(o: Any, os: OutputStream): Unit = {
    val output = new Output(os)
    this.kryo.writeClassAndObject(output, o)
    output.flush()
  }

  def deserialize(is: InputStream): Any = {
    kryo.readClassAndObject(new Input(is))
  }
}
val k = new KryoSO
val b = new java.io.ByteArrayOutputStream
val timestamp = new java.sql.Timestamp(System.currentTimeMillis())
k.serialize(timestamp, b)
val result = k.deserialize(new java.io.ByteArrayInputStream(b.toByteArray()))
println(timestamp)
println(result.getClass)
println(result.isInstanceOf[java.sql.Timestamp])
println(timestamp == result)
Result:
2013-02-07 10:59:19.482
class java.sql.Timestamp
true
true