Refactor Java programmatically - Eclipse

As part of a rebranding, we have to change names in one of our huge projects.
Is there a way to refactor my Java code base (rename classes and packages) programmatically, using Eclipse or other tools?

Eclipse can persist the refactorings you apply into an XML script and replay that script later; see https://help.eclipse.org/neon/index.jsp?topic=%2Forg.eclipse.jdt.doc.user%2Fconcepts%2Fconcept-refactoring.htm.
It is not that hard to write such a script by hand, especially if you are only renaming classes and packages (refactoring within a method body needs much more fine-grained information).
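If you would rather drive the renames from code than replay a recorded script, JDT also exposes its refactorings through descriptors. The sketch below is not from the answer above; it assumes the code runs inside an Eclipse/OSGi environment (for example a headless plug-in) with the project already imported into the workspace, and the project and type names are placeholders.

    // Rough sketch: rename a type via a JDT refactoring descriptor.
    // The classes come from org.eclipse.jdt.core, org.eclipse.jdt.core.refactoring,
    // org.eclipse.ltk.core.refactoring and org.eclipse.core.resources.
    IJavaProject project = JavaCore.create(
            ResourcesPlugin.getWorkspace().getRoot().getProject("MyHugeProject"));
    IType oldType = project.findType("com.oldbrand.Main");

    RenameJavaElementDescriptor descriptor = (RenameJavaElementDescriptor)
            RefactoringCore.getRefactoringContribution(IJavaRefactorings.RENAME_TYPE)
                    .createDescriptor();
    descriptor.setProject(project.getElementName());
    descriptor.setJavaElement(oldType);
    descriptor.setNewName("NewBrandMain");
    descriptor.setUpdateReferences(true);

    // Build the refactoring from the descriptor and run it with its precondition checks.
    RefactoringStatus status = new RefactoringStatus();
    Refactoring refactoring = descriptor.createRefactoring(status);
    PerformRefactoringOperation operation = new PerformRefactoringOperation(
            refactoring, CheckConditionsOperation.ALL_CONDITIONS);
    ResourcesPlugin.getWorkspace().run(operation, new NullProgressMonitor());

Package renames should work the same way with IJavaRefactorings.RENAME_PACKAGE instead of RENAME_TYPE.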

Related

Spring Cloud Contract: Preferred way to import repeating parts in Contract DSL

When writing contracts for an API, I found myself repeating the same things over and over. For example, regex validations for complex JSON objects need to be copy-pasted.
That's tedious and not very DRY.
I'm using the Kotlin DSL and Maven.
I tried to extract the common parts into another file so they can be reused (Kotlin extension functions ftw).
After trying multiple ways to reuse that file, I gave up.
The common parts should be as close to the actual contracts as possible. I don't want to move them into another project and build them separately, as they are a vital part of the contracts.
I tried the following:
Just putting the file in the same directory and importing the functions, hoping it would be resolved as it would be in plain Java - did not work at all (my expectations were low; it was worth a shot).
Putting it in another Maven module and adding that as a dependency of the spring-cloud-contract-maven-plugin. That worked, as long as the dependent module had been built and installed in the local Maven repository; if no built version was available, Maven could not resolve it.
Experimenting with the Kotlin script annotations @file:Import and @file:DependsOn to tackle the issue - no luck.
Is there another way that I missed? Is there a preferred way of doing this?
This must be a common issue, right?

Read an ".sbt-like" file for highly customizable configurations in Scala

I'd like to write a simple app to crawl/scrape some web content. The more I delve into the analysis, the more I realize that the instructions I need to give the app in order to navigate through the site and extract the content I want may be very specific to the target website's structure.
The idea was to write some configuration files defining how the app should browse and scrape each specific site, but describing such behaviour could be challenging unless the configuration file can contain some actual Scala code.
So the idea is to write some code that builds a scraper object instance by reading a file written in an .sbt-like format and injecting the code it contains.
First of all, I need to know where to start to achieve such a task: what library should I use?
Could it be easier to write some sbt tasks and use sbt as the core of the app instead of writing one from scratch? What would the limitations of that approach be?
I apologize for being so general, but I don't have the slightest idea where to start. I'd like you to point me in the right direction and post some docs to read.
Note that the app is meant to be a CLI tool; no graphical interface is needed.

IJavaProject without Eclipse Environment in JDT

I have an exported Eclipse Java project on my server and I want to be able to compile the project and use ASTParser with JDT.
I'm able to compile the project using BatchCompiler, but it runs on the console and gives me PrintWriters instead of an array of problems and errors. Also, I want to be able to use proposals as in Eclipse, and BatchCompiler wasn't built for that purpose.
Therefore I tried to use ASTParser; it can work with either a char[] or an ICompilationUnit. CompletionProposalCollector and org.eclipse.jdt.internal.compiler.Compiler.Compiler need an ICompilationUnit, so I have to create an ICompilationUnit, which can only be created through an IJavaProject (https://dl.dropboxusercontent.com/u/10773282/2012/eclipse_workspace.pdf), in order to use these features.
It seems the only way to create an IJavaProject is to use ResourcesPlugin.getWorkspace(), but that throws java.lang.IllegalStateException: Workspace is closed. on my machine, and the reason seems to be that my program is not an Eclipse plug-in.
Is there any way to create IJavaProject without Eclipse environment?
From the comments, it looks like you are trying to do more than just parsing; you actually want to get some form of content assist.
I'm afraid that you're asking for too much. There is no simple way to get the power and flexibility of JDT outside of a running Eclipse instance (believe me, I've tried). But if you are brave and strong-willed, you can try one of the following:
Run a headless Eclipse on your server that works on top of an actual workspace. This would be the easiest to implement, but would be the most resource intensive and least flexible way of doing things.
Use the JDT core jar, and create alternate implementations of the IResource hierarchy and of the parts of JFace that are used by the parser and the CompletionEngine. This would likely be the most feature-rich way to go, but also the most brittle. I can't guarantee that this would work, as you may need to create some very complex stubs for internal Eclipse non-API classes.
Avoid the CompletionEngine and the ASTParser entirely and just use the batch compiler. You would then need to provide an alternate implementation of org.eclipse.jdt.internal.compiler.env.INameEnvironment. This implementation would be able to find types, files, and compilation units in your actual project structure. You'd need to reimplement support for content assist, but this would most likely work reasonably well.
I am actually fairly interested in doing something like this (but I lack the time to do it). If you are seriously considering creating a headless JDT that can run on a server, feel free to ask for more information. I am quite familiar with JDT internals.
I've had a similar problem. Here is how to use ASTParser without Eclipse (it just needs the core JDT JAR on the classpath): http://blog.pdark.de/2010/11/05/using-eclipse-to-parse-java-code/
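For reference, a minimal sketch of that standalone usage might look like the one below. It assumes only the org.eclipse.jdt.core JAR (and its dependencies) on the classpath; the source string is a placeholder, and depending on your JDT version you may need a different JLS constant than JLS8.

    import org.eclipse.jdt.core.dom.AST;
    import org.eclipse.jdt.core.dom.ASTParser;
    import org.eclipse.jdt.core.dom.CompilationUnit;

    public class StandaloneParse {
        public static void main(String[] args) {
            String source = "public class Foo { void bar() {} }";

            // Parse a plain source string - no workspace and no IJavaProject involved.
            ASTParser parser = ASTParser.newParser(AST.JLS8);
            parser.setKind(ASTParser.K_COMPILATION_UNIT);
            parser.setSource(source.toCharArray());
            // Binding resolution needs a classpath and unit name (or a real Java
            // project), so it stays off in this minimal example.
            parser.setResolveBindings(false);

            CompilationUnit unit = (CompilationUnit) parser.createAST(null);
            System.out.println(unit.getProblems().length + " problem(s) found");
        }
    }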

How to create several Flash applications sharing a common codebase in FlashDevelop/ActionScript 3.0?

Situation:
I need several swf/exe output files compiled in FlashDevelop from several projects. More than 60% of the ActionScript 3.0 source is common to all projects; the rest is project-specific. How can I organize that in FlashDevelop? I want a "one click to build all" setup without duplicating the common codebase (so when I need to fix something, I don't have to copy-paste the fix into several files).
All sources are under development and will change very often.
A straightforward solution is to make an external classpath, for instance:
c:\dev\shared_src\
c:\dev\project1\
c:\dev\project2\
Then configure each project:
Project Properties > Classpath
Add Classpath > select '../shared_src'
PS: of course you should keep everything under source control.
Using svn:externals you could structure your repository in such a way that the common parts are stored just once in the source control system, so changes can be synchronised with a single commit and update cycle.
For example, imagine that you have ^/ProjectA and ^/ProjectB, each of which requires ^/Common as a subdirectory.
Using svn:externals, pull ^/Common into both projects.
The exact nature of doing this will depend on the version of svn you use, and any client you use (such as TortoiseSvn). Refer to the relevant edition of the svn book for specifics.
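For example (an illustration only, assuming svn 1.5 or later, where ^/ resolves against the repository root - adjust the paths to your layout), from a working copy of ProjectA:

    svn propset svn:externals "^/Common Common" .
    svn commit -m "Pull in Common as an external"
    svn update    # fetches ^/Common into the Common subdirectory

ProjectB would get the same property, so a fix committed to ^/Common shows up in both projects on their next update.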
The ease of implementing this will depend quite a lot on how separate the common code currently is in your application. Pulling in directories as directories is much more practical than trying to pull individual files into an existing directory, and unfortunately wildcards for file paths are not supported.
However, based on your description of your aim, this is the most straightforward solution I can imagine.
Hope this helps.

Managing a scala project with a Makefile

First of all, I know how to write a basic/intermediate-level makefile. In my C++ projects I use a makefile that does a lot of things automatically. The most important part for me is that it automatically detects all source files (which are always in the same folder) using wildcards, uses that to predict the names (and locations) of all object files, and compiles appropriately.
Recently I've been trying to achieve the same effect with my Scala projects, but I've hit two obstacles:
Compiled class files which belong to packages are stored inside subdirectories (like com/me/mypack/). This is a problem because make needs to find these files to check their timestamps, and I have no idea how to do that automatically.
Some source files (such as those defining a package object) generate class files with different naming conventions. Again, make needs to know where these class files are, and I don't know how to do that automatically.
The consequence is that the "problematic" source files are recompiled every time I run make (which is aggravated by Scala's long compile times). I'd like to know how to fix that without having to manually write out the entire list of expected class files.
EDIT As an extra note: I'd like to avoid placing the source files in subdirectories. I like keeping them all in the same directory for several reasons.
You should use sbt or Maven for Scala. These are designed specifically for the way Scala and Java work, and they will be much easier to set up and use. They also provide many more features than make does.
These tools do a variety of things. Compiling is a big one, but they are also important for dependency management. Also, sbt (and probably Maven?) does "incremental compilation", so that only sources that have changed (and their dependents) are recompiled, which speeds up builds.
I personally use sbt, but I know people who prefer Maven.
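To give an idea of the scale involved, a minimal build.sbt is only a few lines. The sketch below is an assumption-laden example rather than a required setup: the Scala version is arbitrary, the flat-layout setting matches your "everything in one directory" preference, and the Compile / scalaSource syntax needs sbt 1.x.

    // Minimal build.sbt sketch; run "sbt compile" (or "sbt ~compile" to recompile
    // incrementally whenever a source file changes).
    name := "myproject"
    scalaVersion := "2.13.12"

    // Optional: keep all Scala sources in the project's base directory instead of
    // the default src/main/scala layout.
    Compile / scalaSource := baseDirectory.value

sbt then tracks class-file locations and dependencies itself, so there is no need to list expected class files anywhere.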