How to use OrTools in Java (Spring Boot) - or-tools

I'm struggling to add OrTools to our Spring Boot project. What I've done first is add the dependencies using Gradle:
implementation 'com.google.ortools:ortools-java:9.2.9972'
implementation 'com.google.protobuf:protobuf-java:3.19.4'
But when using CpModel we get a Protobuf error, even though Gradle downloads the dependency:
java.lang.NoClassDefFoundError: com/google/protobuf/MessageOrBuilder
So I tried to load the native libraries using the OrTools Loader:
Loader.loadNativeLibraries();
And here I get another error and can't find the solution:
Resource ortools-darwin-aarch64/ was not found in ClassLoader jdk.internal.loader.ClassLoaders$AppClassLoader@3b192d32
Do you have any idea? Has anyone got OrTools working with Spring?
(I'm on macOS ARM)

or-tools is a C++ library wrapped in Java using SWIG. Currently we only provide/support the amd64 CPU architecture on the Linux, macOS, and Windows platforms.
I'll try to open a Java M1 issue, but since we still don't have any M1 machines for testing and development, don't expect support soon ;)
PS: https://github.com/google/or-tools/issues?q=apple+m1+
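
For reference, on a currently supported platform (amd64 Linux, macOS, or Windows), a minimal smoke test looks roughly like the sketch below; the class name and the tiny CP-SAT model are made up for illustration, and the key point is that Loader.loadNativeLibraries() must run before any solver class is touched:

import com.google.ortools.Loader;
import com.google.ortools.sat.CpModel;
import com.google.ortools.sat.CpSolver;
import com.google.ortools.sat.CpSolverStatus;
import com.google.ortools.sat.IntVar;

public class OrToolsSmokeTest {
    public static void main(String[] args) {
        // Extracts the platform-specific JNI library from the JAR and loads it;
        // this is the step that fails with the darwin-aarch64 resource error above.
        Loader.loadNativeLibraries();

        // A trivial model: two integer variables forced to take different values.
        CpModel model = new CpModel();
        IntVar x = model.newIntVar(0, 10, "x");
        IntVar y = model.newIntVar(0, 10, "y");
        model.addAllDifferent(new IntVar[] {x, y});

        CpSolver solver = new CpSolver();
        CpSolverStatus status = solver.solve(model);
        System.out.println(status + ": x=" + solver.value(x) + ", y=" + solver.value(y));
    }
}

If this fails with the ortools-darwin-aarch64 resource error from the question, the JAR simply does not ship a native library for your CPU architecture, and no classpath or Spring configuration change will fix it.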

Related

How to solve the 'Unable to find Asm for stackmap generation' error on startup of STS?

I am trying to use the Spring Tool Suite 3.8.3 on Ubuntu 16.04. Upon startup I get this error:
An internal error occurred during: "Initializing Java Tooling"
with the detailed message:
An internal error occurred during: "Initializing Java Tooling". Unable to find Asm for stackmap generation (Looking for 'aj.org.objectweb.asm.ClassReader'). Stackmap generation for woven code is required to avoid verify errors on a Java 1.7 or higher runtime when weaving type org.eclipse.jdt.core.search.SearchPattern when weaving classes when weaving
I have to admit that I have no idea what I should do here and I failed to find any pointers online. Any advice or hint is welcome.
For me it was due to the Scala plug-in, which I had installed a few days back. Uninstall the Scala plugin and switch to JDK 8 or JDK 11, and it will work.
If anyone is still having a similar problem with Eclipse and needs to use Java 11 or higher (Eclipse nowadays seems to require Java 11), then have a look at this plugin, as it seems to be required by the Kotlin plugin:
https://marketplace.eclipse.org/content/aspectj-development-tools
I think you have installed Java 9 on your system; that's why you are facing compatibility issues and getting the stated errors. I therefore suggest you downgrade your Java version to 8, with which it works well and you won't face these errors.
You can install Java 8 from this link!
I had to downgrade the JRE used to run Eclipse. Downgrading from Java 15 to Java 11 solved the problem.
Use the -vm option in eclipse.ini, e.g.:
-vm
c:\Dev\jdk-11\bin
Note that -vm and its path must be on two separate lines, and the entry must appear before -vmargs.
My STS did not even start after installing the Scala plugin. I had to manually delete the Scala JARs and folders from the STS /plugins directory. After that it started working.

NoSuchMethod exception in Flink when using dataset with custom object array

I have a problem with Flink:
java.lang.NoSuchMethodError: org.apache.flink.api.java.typeutils.ObjectArrayTypeInfo.getInfoFor(Lorg/apache/flink/api/common/typeinfo/TypeInformation;)Lorg/apache/flink/api/java/typeutils/ObjectArrayTypeInfo;
at LowLevel.FlinkImplementation.FlinkImplementation$$anon$6.<init>(FlinkImplementation.scala:28)
at LowLevel.FlinkImplementation.FlinkImplementation.<init>(FlinkImplementation.scala:28)
at IRLogic.GmqlServer.<init>(GmqlServer.scala:15)
at it.polimi.App$.main(App.scala:20)
at it.polimi.App.main(App.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
The line with the problem is this one:
implicit val regionTypeInformation =
  api.scala.createTypeInformation[FlinkDataTypes.FlinkRegionType]
In FlinkRegionType I have an Array of custom objects.
I developed the app with the Maven plugin in the IDE and everything works fine, but when I move to the version I downloaded from the website I get the error above.
I am using Flink 0.9
I was thinking that some library might be missing, but I am using Maven to handle everything. Moreover, reading through the code of ObjectArrayTypeInfo.java, it doesn't seem to be the problem.
A NoSuchMethodError commonly indicates a version mismatch between the libraries a Flink program was compiled with and the system the program is executed on, especially if the same code works in an IDE setup where the compile and execution libraries are the same.
In that case, you should check the versions of the Flink dependencies, for example in the Maven POM file.
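
A quick, generic way to see which JAR a class is actually resolved from at runtime (a diagnostic sketch, not part of the original answer) is to print its code source; if the IDE run and the downloaded distribution report different locations, you have found the mismatch:

import org.apache.flink.api.java.typeutils.ObjectArrayTypeInfo;

public class WhichJar {
    public static void main(String[] args) {
        // Prints the JAR (or class directory) the class was loaded from.
        // Note: getCodeSource() can be null for boot-classpath classes.
        System.out.println(ObjectArrayTypeInfo.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}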

How to fix the sigar library when I run a spray application?

I have an sbt project written in Scala. The project uses Akka and Spray. There is a class with a main function. When I run the Scala console application, sometimes I get:
[on-spray-can-akka.actor.default-dispatcher-4] [DEBUG] [2014-11-07 16:48:30,336] Sigar: no sigar-amd64-winnt.dll in java.library.path
org.hyperic.sigar.SigarException: no sigar-amd64-winnt.dll in java.library.path
I do not change anything, run it again, and it runs fine. So it can run successfully or fail several times in a row. How can I fix this?
UPDATED
Also, when it starts normally, there is this message:
[INFO] [11/07/2014 17:02:36.772] [on-spray-can-akka.actor.default-dispatcher-2]
[Cluster(akka://myApp)] Cluster Node [akka.tcp://myApp@127.0.0.1:2551] - Metrics will be
retreived from MBeans, and may be incorrect on some platforms. To increase metric accuracy
add the 'sigar.jar' to the classpath and the appropriate platform-specific native libary to
'java.library.path'. Reason: java.lang.IllegalArgumentException: java.lang.UnsatisfiedLinkError:
org.hyperic.sigar.Sigar.getPid()J
Sigar is a native library for gathering performance stats, used by the Typesafe Console's atmos Scala library. If you're not interested in hooking up the Typesafe Console to your application, you can simply remove all references to the atmos library from the sbt build script and app config files without affecting your app's functionality.
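
If you do want to keep Sigar, the platform-specific native library (here sigar-amd64-winnt.dll) must sit in a directory listed in java.library.path, alongside sigar.jar on the classpath. A small check along these lines (a generic diagnostic sketch, not from the original answer) turns the intermittent failure into a deterministic one:

import org.hyperic.sigar.Sigar;

public class SigarCheck {
    public static void main(String[] args) throws Exception {
        // Directories the JVM searches for native libraries; the Sigar
        // DLL/.so must live in one of them.
        System.out.println(System.getProperty("java.library.path"));

        // Triggers the native call: throws UnsatisfiedLinkError if the
        // library is missing, reproducing the error from the log above.
        System.out.println("pid: " + new Sigar().getPid());
    }
}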

Using guava in griffon gives Prohibited package exception

I am using Griffon and want to add the Guava libraries as a dependency in my project. However, when I do this, even without using a single class from it, I get the following exception:
Compilation error: BUG! exception in phase 'canonicalization' in source unit '/home/wdb/myproject/griffon-app/controllers/MyController.groovy' Prohibited package name: java.util.concurrent
Any idea what might be wrong? This is my Java version (on Ubuntu 11.10):
wdb@wdb-laptop:~$ java -version
java version "1.6.0_27"
Java(TM) SE Runtime Environment (build 1.6.0_27-b07)
Java HotSpot(TM) Server VM (build 20.2-b06, mixed mode)
I found this link that talks about using the bootclasspath for a similar problem, but that seems a bit drastic.
regards,
Wim
My wild guess is that our bootclasspath copy of java.util.concurrent.ExecutorService (necessary due to an incompatible change between JDK5 and JDK6) is showing up in your classpath. I don't really know Maven, but I would think that, because we identify the dependency as "provided", this shouldn't be happening.
That's not really an answer, but I hope it's enough to get you or someone else started.
It must be that Griffon does not honor the 'provided' scope. I managed to get it working by editing BuilderConfig.groovy to:
compile('com.google.guava:guava:10.0.1') {
    exclude 'guava-bootstrap'
}

How to exclude work module from the compile, when deploying to GAE?

Our standard module file is: Myproject.gwt.xml
We have added an extra module file for fast compilation called:
MyprojectWork.gwt.xml
When deploying to GAE, it compiles both Myproject.gwt.xml and MyprojectWork.gwt.xml.
How can I exclude MyprojectWork.gwt.xml from the compile when deploying to GAE?
Config:
GWT SDK 1.7.0
Google Plugin for Eclipse 3.5
GAE SDK 1.2.2
If you use Maven, then this may solve your problem: stackoverflow #1745315