How to increase the heap size used by a .zip distribution created via "sbt dist"? - scala

I'm using a package built with sbt, and I'm following its instructions, which say: run
sbt dist in the package folder.
After the process I get a .zip file that contains the bin files when I unzip it (Windows system).
However, I'd like to increase the heap size/memory allocated for this package/program.
I've tried all the methods listed below, but none of them worked.
I've searched a lot, including all the methods listed here, but they seem to use sbt directly (sbt run, sbt project_name, etc.), which doesn't seem very related to my problem, I guess.
Thank you all for the comments! Here is the link to the "Build and Link" documentation of the project I'm trying to use: https://github.com/anskarl/LoMRF/blob/develop/docs/7_1_build_and_link_lomrf.md

The author of that project answered the question and here is the solution: https://github.com/anskarl/LoMRF/issues/24.
For example, to set the heap to 8 GB, run the following in the shell:
LOMRF_JVM_ARGS="-Xmx8g" lomrf
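A hedged sketch of the same idea in other environments (the LOMRF_JVM_ARGS variable name comes from the issue above; whether the zip also ships a lomrf.bat that reads the same variable is an assumption, and the trailing arguments are placeholders):

# Unix-like shells: export once for the session, then run the packaged script as usual
export LOMRF_JVM_ARGS="-Xmx8g"
lomrf ...

# Windows cmd.exe (assumption: the generated lomrf.bat honors the same variable)
set LOMRF_JVM_ARGS=-Xmx8g
lomrf.bat ...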

Related

IntelliJ IDEA 13: new Scala SBT project hasn't src directory structure generated

I followed the getting started video on the JetBrains website to set up IntelliJ IDEA 13.1 Community Edition to work with Scala. Scala plugin v0.36.431 had been installed. When I created a new Scala SBT project with the wizard, no src/ directory structure was generated in the project. Only two sbt files were generated:
scala-course/
├── build.sbt
└── project
    └── plugins.sbt
From the video and other documents I know that there should be a src/ directory structure, including src/main/scala, src/test/scala, etc. sbt uses the same directory structure as Maven for source files by default.
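For reference, a sketch of the default layout sbt expects (same convention as Maven; only the parts relevant here are shown):

src/
├── main/
│   ├── scala/
│   └── resources/
└── test/
    ├── scala/
    └── resources/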
I can create those folders manually and mark them as source roots, but that is tedious. So my question is: why doesn't the IntelliJ IDEA new-project wizard generate the directory structure described in the documentation? Was I doing something wrong? I checked the preferences and couldn't find anything that seems related.
Normally it should create these folders automatically. It may take a while though; it takes a couple of seconds in my case.
When creating the project, make sure you have selected Scala -> SBT, then proceed with the wizard.
Once Finish is clicked, the project will be loaded. This part takes a couple of seconds, and no src/main/scala or src/test/scala is generated until it's done. Watch the bottom of the screen to see when it finishes.
Once the process is finished, you'll see the folders.
If that's not the case, check the settings. You should have Create directories for empty content roots automatically checked. You may also want to check Use auto-import to automatically propagate changes in build.sbt.
After changing the settings (if a change is required) you may need to refresh the project.
This can also happen if you do not have a JDK selected. For some reason you no longer get the option to select an SDK, so you must make sure you have configured this beforehand. To fix this, do the following:
From the welcome screen, go to
Configure -> Project Defaults -> Project Structure and add the JDK.
Source:
What's the reason for "Error:Cannot determine Java VM executable in selected JDK"?
Thanks to lpiepiora; his hint helped me find out the reason.
Because my sbt installation was brand new, there was nothing in ~/.ivy2/cache/ and ~/.sbt/boot/, so sbt needed to download the required dependencies from network repositories. It happened that my internet proxy had a problem and the download got stuck.
Also note that if you quit IntelliJ IDEA while sbt is running in the background, the next time you will get an error about waiting for a lock file. You have to remove the lock file from the filesystem and restart IntelliJ IDEA.
After fixing the network problem, everything worked as promised. It took several minutes, depending on network speed, to download the required jar files. Once that finished, the src/ directory structure was created.
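A hedged sketch of clearing those locks from a shell (the exact lock-file names vary between sbt and Ivy versions, so treat these paths as assumptions and remove only what you actually find):

# remove stale launcher/Ivy locks left behind by a killed sbt process (paths are assumptions)
rm -f ~/.sbt/boot/sbt.boot.lock
rm -f ~/.ivy2/.sbt.ivy.lock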
I followed the instructions in this thread, but I got a Java crash in the final sbt phase with the configuration below, and I think this info may be useful:
The problem happened with IDEA 2016.2, sbt 0.13.8 (I later tried to import using 0.13.12 but the crash was the same), Scala 2.11.8 and Ubuntu 16.04.
The only way I could make it work was to use Java 8 instead of Java 9.
error: error while loading package, Missing dependency 'object java.lang.Object in compiler mirror', required by /home/jbamaral/.sbt/boot at xsbt.boot.Boot.main(Boot.scala)
...
stack log here
...
[error] scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
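To point the build at Java 8 on Ubuntu, as the answer above suggests, a minimal shell sketch (the JDK install path is an assumption; adjust it to wherever Java 8 lives on your machine):

# make Java 8 the JVM that sbt and the IDE importer pick up (path is an assumption for Ubuntu)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"
java -version   # should now report 1.8.x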

Running tests on Intellij: Class not found

I'm evaluating IntelliJ (13.0.2 133.696) and cannot get JUnit tests to run from within the IDE.
My project is a multi-module Gradle project and uses Scala.
The test class is located under src/test/scala/xxx/xxxxx/xxx/xxxx/xxxxx and every time I try to run it from the IDE I get the same error:
Class not found: "xxx.xxxxx.xxx.xxxx.xxxxx.AccountRepositoryTest"
The test class is nothing fancy, a simple JUnit test:
@RunWith(classOf[SpringJUnit4ClassRunner])
@ContextConfiguration(classes = Array(classOf[DataConfig], classOf[SettingsConfig]))
class AccountRepositoryTest extends AssertionsForJUnit {
I've found a related question, Cannot run Junit tests from IDEA 13.0 IDE for imported gradle projects, but the provided fix (upgrading to 13.0.2) does not work.
I've even tried upgrading to the latest EAP, and it's still the same issue.
I looked through some of these answers, fussed with Project Settings, tried a few things, etc. and nothing worked. (Full disclosure: I'm not trying to juggle Gradle here; I'm just using Maven, but I don't see what this has to do with Gradle.)
I'm using IDEA 14.
What I found to work, because it simply seemed IntelliJ had lost its way, was this:
$ rm -rf .idea project-name.iml
Then I relaunched IntelliJ and did File -> Open -> navigated to the root of my project, etc.; in short, I just recreated my project.
IntelliJ got over it. I may have messed something up originally in this project, as I had done plenty of refactoring of both package and class names and had even changed the project name. (It was probably my fault it happened.)
I had this same problem, and in my case the problem was due to the "Project compiler output" path being left blank in Project Settings.
To fix it I created a classes directory in my project root, and set Project compiler output to the absolute path (use the … button to browse).
Go to Project Settings -> Project.
Fill in Project compiler output:
ex. D:\repo\Project\out
Go to Module -> Paths
Make sure that:
output path is like D:\repo\Project\out\production
test output path like D:\repo\Project\out\test
Should work!
Simply 'Build > Rebuild Project' worked for me.
Check the Run/Debug configuration for that test:
"Use classpath and SDK of module:" should point to your module.
Also, your module must have a Scala facet and that class must be inside the
"Test source folders".
You can try to invalidate the cache and restart. That usually will resolve issues when you add new dependencies / classes.
Make sure the package of your test class and that of the class you are testing are not the same. If both the test case and the class have the same package, the compiler will look in the src folder and ignore the test folder.
I had the same problem. I changed the path in Module Settings -> Modules -> Paths -> Test output path to my directory for test class bytecode (with "Exclude output paths" on). Now everything works!
IDEA restart solved the issue for me.
Just make sure the folder of your test file is marked as a test folder in IntelliJ IDEA. That worked for me.
If you have multiple directories containing source files with the same name, add a package declaration to your class source file if it's not present!
I had the same problem: IntelliJ wasn't finding the test output path. Running the regular application had no problems, however.
For me, the fix was changing from the inherited project compile paths to using the module compile output paths.
Project Settings -> Modules -> (Your module) -> Paths (tab)
Change the radio button to "Use module compile output path". For me the autofilled suggestion worked; you may need to manually put in the correct test output path if the suggestion doesn't. Remember to apply the settings change.
As for a Gradle project X, deleting both
X/build
X/out
and running the tests again resolved this issue for me.
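A minimal shell sketch of that cleanup, assuming a standard Gradle wrapper setup (the task name is a placeholder; use whatever you normally run):

# from the project root: remove stale build output, then re-run the tests
rm -rf build out
./gradlew test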
Try these, in this order:
rebuild the project (and, strangely, selecting another app and then reselecting the IDEA one for a context switch seems to force a file reload?!)
invalidate caches / restart IDEA
reimport the project / create a new project
Just renaming the file to something different and back worked for me :)
Modify the <content> tag of the module's .iml file as shown below. It works for me.
<content url="file://$MODULE_DIR$">
  <sourceFolder url="file://$MODULE_DIR$/src/main/java" isTestSource="false" />
  <sourceFolder url="file://$MODULE_DIR$/src/test/java" isTestSource="true" />
  <excludeFolder url="file://$MODULE_DIR$/target" />
</content>
Had the same problem, fixed it by recreating the project in a directory path that had no spaces, colons, periods or other special characters anywhere in the full path. Apparently IntelliJ can be finicky about the project path.
My issue remained after building, cleaning and rebuilding, and closing and reopening the project (using IntelliJ), and the project compiler output was correct; in the end I just deleted the out folder from my project directory. Note that my issue only affected a newly added JUnit test calling a newly added method; the rest of the JUnit tests were working fine.
For me it was that one test was failing and the error was misleading.
So:
I re-imported the module in IntelliJ as a standalone project
Ran the test
Fixed the test issues
Ran the modules again and it started to work.
tl;dr:
Missing file extensions can cause this error.
details:
In my case the test classes were missing the .java extension, e.g. a file named UserTest instead of UserTest.java.
It was hard to find; everything looked normal from within the IDE (apparently IntelliJ uses the file contents rather than the extension to display the corresponding file icon).
It was not an issue as long as I used mvn from the command line (with the Surefire plugin enabled in pom.xml) or a dedicated Maven launcher configuration, but it caused the initial error message when launching via IntelliJ's test or coverage menus/buttons.
Everything worked as expected once I added the missing file extensions.
The root cause might be hidden in the Java Compiler settings; it was for me, at least. In particular, feel free to adjust those settings using some hints from another thread.

Is it possible to use the Typesafe Activator launcher to achieve the same thing as the zip release?

As a scaffolding tool, the official release has a size of 238 MB, which is too big, and I already have a repo on my local machine. Why does Activator ship another repo and keep downloading the already existing dependencies into it?
The launcher is actually really small (< 50K). Is it possible to use it like sbt, i.e. just use the Activator launcher to achieve the same functions as the full release?
Yes. The repository is only there for convenience. The actual size of the "bootloader" of Activator is something like 2 MB. If you delete the "repository" directory from the Activator zip, everything will still work, but dependencies will be downloaded on demand.
Another hidden feature: if you download a particular template from the website (go to http://typesafe.com/activator/template/activator-akka-spray and click the download link), you'll get a standalone launcher for Activator.
The activator.bat/activator/activator-launcher-<version>.jar files are the only portion of the distribution you really need to run.
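A minimal sketch of that trimming, assuming an unpacked distribution laid out as activator-x.y.z/ (the version and directory names are placeholders):

# drop the bundled repository; dependencies will be fetched on demand from then on
rm -rf activator-x.y.z/repository
# the launcher script plus activator-launcher-<version>.jar are all that's still needed
./activator-x.y.z/activator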

Eclipse OutOfMemoryError Flex 3

I can only write one line of code and build; the next change triggers an out-of-memory error in Eclipse (java.lang.OutOfMemoryError). I am running Eclipse through a shortcut with the parameters:
-vmargs -Xms1024m -Xmx1024m -XX:+UseParallelGC -XX:PermSize=128M -XX:MaxPermSize=256M
I know this question has been asked many times, but I was wondering whether separating the code into SWC libraries would fix the problem. I think I have tried everything on the net, unsuccessfully. My project is just too big! How does the build operation work with libraries? Does it recompile unchanged libraries? Will this work? I also want to try this approach because the build takes too long. Does anyone know of good documentation where I can understand what the build does?
In my experience, it will not recompile the unchanged libraries unless you clean the project. I don't think there is documentation for this. I've used the eclipse.ini file to change the VM arguments, and this seems to have worked well for me, so try that.
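A hedged sketch of where those flags would live in eclipse.ini instead of the shortcut (the values below just echo the question's settings and are assumptions to tune; everything after -vmargs is passed to the JVM, so raise -Xmx there as needed):

-vmargs
-Xms1024m
-Xmx1024m
-XX:+UseParallelGC
-XX:PermSize=128M
-XX:MaxPermSize=256M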
If you still need to speed up compilation, I've previously hacked it in the following manner:
Remove all the dependencies on the library projects from the main project (so each project builds without disturbing the others).
In the build path of the library project (the output text field), point it to
the lib location of the main project.
Under Build Path > Libraries tab, add the swc files there and attach the source to the library project.
When you build the main project now, it does not take a lot of time, as it does not know about the library projects.
Hope it helps.

Netbeans project to scripted build

I'm trying to convert a NetBeans 6.9.1 project into a scripted build (without NetBeans). Of course, it fails (or I wouldn't be asking for help).
The failure says that the org.apache.commons.httpclient package does not exist. (Of course, it worked when we ran the build in NetBeans.)
Now I know exactly where the commons-httpclient.jar file is located in my project structure, but I can't seem to tell the compiler about it via the Ant build files and the NetBeans property files.
Perhaps related to this: when I ran "ant -v" to build my software, it said
Property lib.mystuff.classpath has not been set. This variable is important, I guess, because
the file nbproject/project.properties uses lib.mystuff.classpath in its definition of javac.classpath, which of course tells the Java compiler where to find the JARs.
So... when moving a NetBeans project to a NetBeans-independent scripted build, how can the build script set these properties? Also, how can I ensure that the jar file gets included in the Ant build?
I appreciate any help I can get, as I am a Java newbie.
UPDATE AFTER ACCEPTING ANSWER FROM vkraemer:
There are a few best practices for build scripts for production software:
Put everything needed for a build under a single directory tree. (NetBeans = fail)
Put everything in source code control. (I did that.)
The first line of the build script should clear all environment variables.
The next section of the build script should explicitly set all environment variables to values that are known to work.
The rest of the build should be able to execute using command-line programs such as javac, ant, cc, etc., and must not depend on firing up an IDE such as Eclipse or NetBeans.
It is a shame that NetBeans makes this hard.
I did a quick look in a Java Application project and found the following...
javac.classpath = ${libs.MyStuff.classpath}
libs.MyStuff.classpath is defined in %HOME%/.netbeans/6.9.1/build.properties.
You may be able to get by doing the following...
ant -Dlibs.MyStuff.classpath=c:\a\b\c.jar
You would need to do more if you have multiple jar files in the MyStuff library that you created in NetBeans.
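A hedged sketch of that multi-jar case, reusing the property name from the answer above (the jar paths are placeholders; the classpath separator is ';' on Windows and ':' on Unix-like systems):

ant -Dlibs.MyStuff.classpath="c:\a\b.jar;c:\a\c.jar;c:\a\d.jar"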