OPAL - How to configure project properties

While running an analysis (e.g. CHADemo from the OPAL source code), I always get a warning:
[warn][OPAL] the property org.opalj.threads.CPUBoundTasks is unspecified
A former question suggested the config file in the Common project under /src/main/resources/reference.conf. So I tried to add the following lines to the file, but I still get the same warning:
org.opalj.threads {
    CPUBoundTasks = "8"
    IOBoundTasks = "8"
}
Besides, when I import OPAL as a library in my own project and create a reference.conf in "/src/main/resources/" of that project, I run into the same problem.

Those values are JVM system properties that are set when the project is launched via sbt. The file you need is located in OPAL's root directory. When you open the file local.sbt.template you will see the following:
//
// Optional configuration settings specific to each developers machine.
//
// If your CPU uses hyperthreading, it is recommended to specify the
// number of physical cores and the number of hyperthreaded cores;
// this will speed up the overall execution.
javaOptions in ThisBuild ++= Seq(
"-Dorg.opalj.threads.CPUBoundTasks=16", // Number of physical (not hyperthreaded) cores/CPUs
"-Dorg.opalj.threads.IOBoundTasks=32" // Number of (hyperthreaded) cores * 1,5
)
// If you want to disable assertions, uncomment the following line.
// Assertions are heavily used throughout OPAL and have a
// significant performance impact. However, at development time it is
// HIGHLY recommended to turn on assertions!
//scalacOptions in ThisBuild += "-Xdisable-assertions"
//
//scalacOptions in ThisBuild -= "-Ywarn-unused-import"
//scalacOptions in ThisBuild -= "-Ywarn-unused"
If you want to configure the number of cores used, remove the .template suffix from the file name and adapt the values to your needs. Then rebuild OPAL.
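For illustration, a -D flag set via javaOptions ends up as a plain JVM system property. A minimal Scala sketch of how such a property can be read (the fallback to availableProcessors() is my assumption for illustration, not OPAL's actual logic):

```scala
// Sketch: reading the system property that local.sbt passes via -D.
// The fallback to availableProcessors() is an assumption, not OPAL's behavior.
val cpuBoundTasks: Int =
  Option(System.getProperty("org.opalj.threads.CPUBoundTasks"))
    .map(_.toInt)
    .getOrElse(Runtime.getRuntime.availableProcessors())
```

This also explains why editing reference.conf did not silence the warning: the property is looked up on the JVM level, so it must be supplied at launch time.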

Related

Xtend - saved files contain repeated data

I don't know why, when I generate files with fsa.generateFile(fileName, finalString), the files are created fine, but when I clean the project, the output is doubled.
Even if I delete the file, it keeps growing.
Is this a problem with my code or with Eclipse?
Thank you.
You store the file content as a member of the generator for some reason and never reset it:
val root = resource?.allContents?.head as ProblemSpecification;
s += readFile(path_sigAlloyDeclaration+"sigAlloyDeclaration.txt")
I assume s should either be local to the doGenerate method or be reset at its start:
s = ""
val root = resource?.allContents?.head as ProblemSpecification;
s += readFile(path_sigAlloyDeclaration+"sigAlloyDeclaration.txt")

sbt plugin - How to make sure my settingKey is resolve before everything else?

I have a settingKey[Level.Value] for the log level. I would like to set this value before everything else so that my log level is applied to everything.
(I know there is setLogLevel, but I would like to set the level for my plugin only.)
If my setting is named myLogLevel, I tried calling myLogLevel.value inside each of my tasks and settings.
The issue seems to be that if I don't use the value it yields, the setting is not executed.
My setting is something like this:
myLogLevel := {
val theValueSetByTheUser = myLogLevel.value
MyLogLibrary.setLevel(theValueSetByTheUser)
theValueSetByTheUser
}
So what should I do? Should I call println(myResultLevel) everywhere just so the value is used? That seems silly.
Thank you.
I found a solution:
You need two settingKeys.
The first one is myLogLevel: settingKey[Level.Value].
Consumers of your plugin can set it to whatever they want in their build.sbt, and you can provide a default value in your settings: myLogLevel := Level.Info.
Then you have a second setting, myLog: settingKey[Logger], which is not exposed to the consumer and which you define like this:
myLog := {
createLogOfLevel(myLogLevel.value)
}
And in your other tasks and settings, you can now call:
val log = myLog.value
log.info("message")
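Assembled into an AutoPlugin, the two-setting pattern might look like the sketch below. createLogOfLevel is the answer's hypothetical helper; the stub here (printing to stdout with a level filter) is purely illustrative:

```scala
import sbt._

object MyLoggingPlugin extends AutoPlugin {
  object autoImport {
    val myLogLevel = settingKey[Level.Value]("Log level used by this plugin")
    val myLog      = settingKey[Logger]("Logger derived from myLogLevel (not for users)")
  }
  import autoImport._

  // Hypothetical helper: a Logger that drops messages below the chosen level.
  private def createLogOfLevel(level: Level.Value): Logger = new Logger {
    def log(l: Level.Value, message: => String): Unit =
      if (l.id >= level.id) println(s"[$l] $message")
    def success(message: => String): Unit = ()
    def trace(t: => Throwable): Unit = ()
  }

  override def projectSettings: Seq[Setting[_]] = Seq(
    myLogLevel := Level.Info,                        // default, user-overridable
    myLog      := createLogOfLevel(myLogLevel.value) // derived from the user's choice
  )
}
```

Because myLog is derived from myLogLevel inside the settings graph, any override the user places in build.sbt is picked up automatically, with no ordering tricks required.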

Why does sbt.Extracted's append method remove previously defined TaskKeys?

There is a suitable method in sbt.Extracted to add a TaskKey to the current state. Assume I have inState: State:
val key1 = TaskKey[String]("key1")
Project.extract(inState).append(Seq(key1 := "key1 value"), inState)
I ran into strange behavior when doing it twice. In the following example I get an exception:
val key1 = TaskKey[String]("key1")
val key2 = TaskKey[String]("key2")
val st1: State = Project.extract(inState).append(Seq(key1 := "key1 value"), inState)
val st2: State = Project.extract(st1).append(Seq(key2 := "key2 value"), st1)
Project.extract(st2).runTask(key1, st2)
leads to:
java.lang.RuntimeException: */*:key1 is undefined.
The question is - why does it work like this? Is it possible to add several TaskKeys while executing the particular task by several calls to sbt.Extracted.append?
The example sbt project is sbt.Extracted append-example; to reproduce the issue, just run sbt fooCmd.
Josh Suereth posted the answer to the sbt-dev mailing list. Quote:
The append function is pretty dirty/low-level. This is probably a bug in its implementation (or the lack of documentation), but it blows away any other appended setting when used.
What you want to do (I think) is append into the current "Session" so things will stick around and the user can remove what you've done via the "session clear" command.
Additionally, the settings you're passing are in "raw" or "fully qualified" form. If you'd like the setting you write to work exactly the same as it would from a build.sbt file, you need to transform it first, so the Scopes match the current project, etc.
We provide a utility in sbt-server that makes it a bit easier to append settings into the current session:
https://github.com/sbt/sbt-remote-control/blob/master/server/src/main/scala/sbt/server/SettingUtil.scala#L11-L29
I have tested the proposed solution and that works like a charm.
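For completeness, newer sbt versions expose this session-aware variant directly on Extracted as appendWithSession; whether it is available depends on your sbt version, so treat the following as a sketch:

```scala
// Sketch, assuming an sbt version that provides Extracted.appendWithSession:
// appending into the session preserves settings added by earlier calls.
val st1: State = Project.extract(inState).appendWithSession(Seq(key1 := "key1 value"), inState)
val st2: State = Project.extract(st1).appendWithSession(Seq(key2 := "key2 value"), st1)
Project.extract(st2).runTask(key1, st2) // key1 is still defined
```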

Check if a command exists using qmake

I am working on a project which incorporates C code, as well as (MASM-like) assembly. I want to be able to compile it on Linux, as well as Windows, thus I am using a third-party assembler (jwasm) as follows:
QMAKE_PRE_LINK += jwasm -coff -Fo$$assembly_obj $$PWD/assembly.asm
(Here, assembly_obj holds the directory where I want jwasm to save its output. By the way: when using jwasm it is critical to specify all the parameters first and only then the input files, otherwise it will ignore the parameters.)
To make it easier for other people to compile the project, I would like to be able to check if jwasm is in their path, and if not, emit an error() telling them how to fix this. However, I am not sure if this is even possible using qmake. I have tried:
exists("jwasm") { # Always false
message("jwasm found!")
}
as well as:
packagesExist(jwasm) { # Always true
message("jwasm found!")
}
I have looked around in the qmake docs, but couldn't find any other alternatives...
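One construct that may help is qmake's system() test function, which runs a shell command and succeeds only if the command exits with status 0. The exact probe below is an untested sketch (and the redirection syntax is Unix-style):

```qmake
# Untested sketch: treat jwasm as present if running it succeeds.
# The redirection is Unix-style; on Windows use "> NUL 2>&1" instead.
!system(jwasm -h > /dev/null 2>&1) {
    error("jwasm was not found in your PATH; please install it before building.")
}
```

Whether this works as-is depends on jwasm returning a zero exit code for its help output, which would need checking.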

Define Custom Source File Dependencies in SBT

How can I cause SBT to recompile some file A whenever another non-scala file B changes?
I have defined a macro:
printMacro("path/to/file")
which creates a string literal from the file indicated by "path/to/file".
Whenever that file changes, the file that uses that macro needs to be recompiled to reflect those changes. I can use watchSources to monitor that file for changes and recompile the project when it does, but because of the incremental compiler, this recompile doesn't actually do anything.
I'll almost certainly need to write a plugin to accomplish this, but I cannot find which hooks into sbt will enable me to write such a plugin.
EDIT: Recompiling the whole project isn't desirable, since there might be multiple tracked files and the project itself might be very large.
How about this solution, which is based on FileFunction.cached?
Basically, define a function which takes:
cacheBaseDirectory - the place where it keeps its cache metadata
inStyle - which determines how it checks for changes
action - which is invoked when an observed file has changed
The function returns another function, which takes the set of monitored files.
def cached(cacheBaseDirectory: File, inStyle: FilesInfo.Style)(action: => Unit): Set[File] => Unit = {
import Path._
lazy val inCache = Difference.inputs(cacheBaseDirectory / "in-cache", inStyle)
inputs => {
inCache(inputs) { inReport =>
if (inReport.modified.nonEmpty) action
}
}
}
This is how you can use it in build.sbt
val recompileWhenFileChanges = taskKey[Unit]("Recompiles the project when a file changes")
recompileWhenFileChanges := {
val base = baseDirectory.value
val mySpecialFile = baseDirectory.value / "path" / "to" / "file" / "test.txt"
val cache = cacheDirectory.value / "my_cache_dir"
val cachedFunction = cached(cache, FilesInfo.lastModified)(IO.delete((classDirectory in Compile).value))
cachedFunction(mySpecialFile.get.toSet)
}
compile in Compile := ((compile in Compile) dependsOn recompileWhenFileChanges).value
The task deletes the classDirectory only if the file has changed. Deleting the classDirectory forces the project to recompile.
Lastly, we make the original compile depend on our newly created task.
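For comparison, a version built directly on FileFunction.cached (which the answer mentions) might look like the untested sketch below; the cache location under streams and the file names are placeholders:

```scala
recompileWhenFileChanges := {
  // Hoist .value calls out of the closure, as required by sbt's task macros.
  val classDir = (classDirectory in Compile).value
  val tracked  = baseDirectory.value / "path" / "to" / "file" / "test.txt"
  val invalidate = FileFunction.cached(
    streams.value.cacheDirectory / "my_cache_dir", // placeholder cache location
    FilesInfo.lastModified
  ) { (in: Set[File]) =>
    IO.delete(classDir) // wipe outputs so the next compile is a full one
    Set.empty[File]     // no output files to track
  }
  invalidate(Set(tracked))
}
```

The behavior is the same idea as the answer's cached helper: the deletion (and hence the full recompile) only happens when the tracked file's timestamp changes.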