How to specify a Chisel3 search path?

This may or may not be a duplicate of How to use chisel module as package.
Again, for scala/sbt/maven experts this may be obvious, but for old-school ASIC designers it's not:
I have a project PROJ with the standard directory structure PROJ/src/main/scala/myproj.scala. How do I use some Chisel code from an external library LIB, e.g. from /usr/libs/LIB/src/main/scala/{stuff}.scala?

(Not a full answer, more a warning than an answer)
"Search path" sounds a bit concerning, so I'd just want to make sure that you don't expect something like a C/C++ build that's searching for some files on some file systems.
Before proceeding, it might be helpful to ponder on the thought that the entire scala / java / kotlin / maven / sbt / gradle / ... ecosystem is "internet-centric", not "file-system-centric". It essentially assumes that all packages are available under a globally unique identifier in some online repository (even when they are not, local installation will make them look as if they came from a public repository, see below). Local file systems are used only as temporary local caches (and it is assumed that you as a human will not look into those caches without a good reason). In general, it tries really hard not to depend on the machine on which it's built: everything it needs is specified in the build.sbt, presence or absence of any files in /usr/lib is irrelevant.
If you want to use a package, you have to declare it as a dependency in your build config (the SBT documentation explains how to do this; Maven Central even provides a helpful little text field from which you can copy the correctly formatted pieces of the config).
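A minimal sketch of what such a declaration looks like in build.sbt (the chisel3 coordinates below are the ones published on Maven Central; the versions are just examples, pick whatever your project actually needs):

    // build.sbt for PROJ -- minimal sketch, versions are examples only
    scalaVersion := "2.13.10"

    // pulls chisel3 from the public repository by its coordinates
    libraryDependencies += "edu.berkeley.cs" %% "chisel3" % "3.5.6"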
If your package does not come from any public repository, you'll first have to install it locally (the SBT documentation also explains how to "install a package locally"; it's a short SBT command that copies the package into the local cache on your file system, so that other projects that depend on it can pretend it came from some repository).
If you have just the src/foo/bar/baz/stuff.scala files but no build.sbt file, you'll probably first want to convert the library into a proper sbt project, then build it, then install it locally. You need a JAR; adding .scala files to the CLASSPATH won't buy you anything, they must be compiled first. Doing anything to the CLASSPATH manually is essentially hopeless anyway; the only sane way is to let SBT take care of everything.
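A rough sketch of that workflow, with made-up coordinates (the organization, name and version below are placeholders, not something taken from the question):

    // /usr/libs/LIB/build.sbt -- hypothetical sketch for the external library
    organization := "com.example"
    name         := "lib"
    version      := "0.1.0-SNAPSHOT"
    scalaVersion := "2.13.10"

    // running `sbt publishLocal` inside LIB builds the JAR and copies it
    // into the local Ivy cache, as if it had come from a repository

    // PROJ/build.sbt -- then depend on it by those same coordinates
    libraryDependencies += "com.example" %% "lib" % "0.1.0-SNAPSHOT"

After that, PROJ builds against LIB without ever pointing at /usr/libs directly.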

Related

How do you tell sbt-eclipse to ignore (errors of) a very specific folder under /src

I have an infrastructure project that contains other projects as resources (because it compiles them on the fly). One of those contained projects deliberately fails to compile.
This makes the entire project show up in Eclipse as "with errors".
How can I make sbt-eclipse configure Eclipse so that, e.g., anything under src/main/resources/foo is ignored?
Of course this isn't exactly the scenario Eclipse was built for, but might there be some clean way around it? As much as it matters, sbt itself does not try to compile these resources.
If not, maybe there is a way to tell Eclipse not to even load source directories under src/main/resources?
Thanks!

How does sbt integrate with IntelliJ?

Is there a definitive doc somewhere that explains all the magic that happens behind the "Typesafe Activator" generation of an "IntelliJ supported" project?
The sbt build files look absolutely monstrous, and I have no idea what IntelliJ looks for, or where.
This is frustrating because, when working from two different PCs, the Scala seed project refers to different hard-coded paths.
Is there a good place to start?
Last time I checked, the Typesafe Activator was using sbt as the underlying build tool. When creating an IntelliJ project it would thus use the sbt-idea plugin.
I guess a possible place to start would be that plugin's documentation.
However, I think there is something else going on here: you have the Activator installed on two different PCs and are trying to share the project between them, whether via version control or by copying the folders.
The sbt-idea plugin will indeed write some absolute paths into the IDEA project files (most likely the absolute paths to the sbt-managed libraries in the Ivy cache of your home folder), since those are required for the IntelliJ project to work.
There should be no reason to "share" the IDEA project files; they should be considered machine-specific, should not be checked into source control, and should not be expected to work when copied from one computer to another. You are expected to regenerate them for each computer the project is worked on.
If that sounds like a burden, you may want to install the IntelliJ Scala plugin instead. Once installed, its sbt integration lets you import any sbt project even if you haven't generated the IntelliJ support in the Activator. Have a look at the features page; there is a video showing how to use the plugin.
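For completeness, a rough sketch of how the sbt-idea plugin mentioned above was typically wired in (the version number here is an assumption; check the plugin's README for the release matching your sbt version):

    // project/plugins.sbt -- sketch only; the version is an assumption, see the sbt-idea README
    addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")

Running the plugin's gen-idea task then regenerates the IDEA project files, which is exactly the "regenerate them on each computer" workflow described above.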

How to get Eclipse to create bin/main and bin/test

I want my Ant build to take all Java sources from src/main/*, compile them, and place them inside bin/main/. I also want it to compile src/test/* sources to bin/test/. I want this behavior because I want to package the binaries into different JARs, and if they all just go to a single bin/ directory it will be impossible (well, extremely difficult!) to know which class files belong where.
When I go to configure my build path and then click the Source tab, I see an area toward the bottom that reads "Default output folder:" and allows you to browse for its location.
I'm wondering how to create bin/main and bin/test in an existing project without "breaking" Eclipse (it happens). I'm also wondering whether, if I just have my Ant build create and delete those directories during the clean-and-build process, Eclipse might not care what the default output is set to. But I can't find any documentation either way.
Thanks in advance for any help here.
By default, Eclipse uses a single output folder per project for your compiled Java files, so it will not do the split you want out of the box (i.e. compile src/main to bin/main and src/test to bin/test).
You can, if you want, create two Eclipse projects, one main project and one test project, where the test project depends on (and tests) the main project. However, in that case each project should be in its own directory structure, which is not what you are asking for. But this is a common approach.
Another way, which I would recommend, would be not to mix Ant compilation and Eclipse's compilation. Make the Ant script the way you describe (i.e. compile the main and test directories separately and create two separate JAR files). Change the Eclipse compile directory to something different, for instance bin/eclipse. Use the Ant script when making official builds or building for release, and use Eclipse's building only for development/debugging. This way, your two build systems will not get in each other's way and confuse each other.
Hope this answers your question and I understood it correctly. Good luck!

ant deployment issues

I am looking to make our deployments here not suck, and I need some help. If you can help me with these few things, I owe you beer.
Right now, whenever I make a change that's not to the JSPs, I need to do a clean (including Tomcat), otherwise my change doesn't take. This is really annoying.
Any clues as to what I can change to make it work?
My current build is really simple, just the regular old javac, war, deploy.
One thing worth mentioning is that there is no build dir: the project itself contains a WEB-INF, the javac is done in place, and the war task then excludes all the .java resources and wars up the project.
Edit:
I am looking to fix this problem with the least amount of effort, so while switching to Maven and learning how to use it might solve this problem, it will create another one ;)
You've already identified some of the weaknesses in your current build.
The easiest way I can suggest to clean it up would be to start with the directory structure.
I highly recommend using the Maven directory structure; I would go further and suggest using Maven as a build tool instead of Ant, though for some folks that remains open for debate.
The Maven directory structure has been well thought out. I really like working on projects that use it, because they follow a convention that saves me a lot of time: from previous experience I already know where to find the application components:
java source
unit test source
resources etc.
Also, by following the convention, the Maven plugins work with less configuration required.
Another useful advantage I get from working on Maven-based projects is good code metrics to measure the health of the application. There are various reports available as Maven plugins, which will give you new insight into your codebase, including:
checkstyle
pmd
findbugs
and more.
Created a build directory where everything gets copied before a build
Added some flags to avoid copying things that rarely change, like images (and to not remove them on clean)
Started using the ant-reload task after deploying code
Now I don't need to restart Tomcat on every build, and the build takes much less time.

Installing Java libraries

As I'm quite new to Java, I would like to know the proper procedure for installing new libraries (those that are not available in my Linux distribution's repositories).
Where should I place them, and how do I install them?
For instance, I downloaded openCsv (http://opencsv.sourceforge.net/), and I have no idea how to install it.
Java libraries don't really need to be "installed" like other applications. All you need to do is put the JAR file in a specific location and add it to your classpath; how you do that depends on the Linux distro you are using. If you are making a web application in Eclipse, you can drop the .jar file into the WebRoot/WEB-INF/lib folder and it will be bundled with your project.
Be sure that the path where you place the libraries is included in the $CLASSPATH environment variable.
For Eclipse: Project -> Properties -> Java Build Path -> Add JARs...
It's up to you really - I use /opt/javalib, but you might consider a directory in /usr/local as well.
You can store them wherever you wish. You can store them within the JRE distribution directories, but I wouldn't recommend that.
Instead, I would store them per project (so you can easily have different versions for each project; some libraries have different names for each version, some don't) and adopt a standard such as a lib/ directory. That way you can have standard build scripts (Ant etc.) that operate in the same way (if you're using Maven, then there's a standard place per project: src/main/resources).
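Incidentally, if the project happens to be an sbt project (as in the Chisel question at the top of this page), the lib/ convention is already built in: sbt treats JARs dropped into lib/ as unmanaged dependencies with no configuration at all. A small sketch of pointing it at a different folder (the folder name below is made up):

    // build.sbt -- sketch only; JARs in lib/ are picked up automatically as
    // unmanaged dependencies, and the directory can be relocated if desired
    unmanagedBase := baseDirectory.value / "custom_lib"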
You could use Maven to manage any dependencies to those libraries.
Maven will automatically download all needed JAR files and put them in a local repository (the location is configurable).
This makes upgrading to new versions of various libraries very easy as you just declare the version you want and Maven does the rest.
Beware: Maven is something to get used to and the initial learning curve is steep.
The rewards come if you have everything set up properly and Maven takes care of compiling, packaging, distribution, site creation, release management, and so on.