Implementing Isolation Forest in Spark Scala

I am trying to implement the Isolation Forest algorithm in a Spark Scala Maven project, as explained at this link: iforest example.
My question is: when I try to build the suggested code I get this error:
object iforest is not a member of package org.apache.spark.ml
I tried importing org.apache.spark.ml and also changed the spark-core dependency to version 2.2.0:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
Any suggestions please?

You can try this Spark/Scala implementation of the isolation forest algorithm, which has artifacts available in the public Maven Central repository.
You can declare the dependency in your project's pom.xml as:
<dependency>
    <groupId>com.linkedin.isolation-forest</groupId>
    <artifactId>isolation-forest_3.2.0_2.12</artifactId>
    <version>2.0.8</version>
</dependency>
Other available artifact versions are listed here.
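For illustration, once that artifact is on the classpath, training and scoring could look roughly like the sketch below. The IsolationForest class path and setter names are taken from the library's README as I recall them, and the toy DataFrame and column names are made up, so verify both against the version you actually use.

import org.apache.spark.sql.SparkSession
import org.apache.spark.ml.feature.VectorAssembler
// Class and parameter names follow the linkedin/isolation-forest README;
// verify them against the artifact version you pull in.
import com.linkedin.relevance.isolationforest.IsolationForest

object IsolationForestSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("iforest-sketch").master("local[*]").getOrCreate()
    import spark.implicits._

    // Toy numeric data; replace with your own DataFrame and column names.
    val data = (1 to 100).map(i => ((i % 10).toDouble, (i % 7).toDouble)).toDF("x1", "x2")

    // Spark ML estimators expect a single vector column of features.
    val assembled = new VectorAssembler()
      .setInputCols(Array("x1", "x2"))
      .setOutputCol("features")
      .transform(data)

    val isolationForest = new IsolationForest()
      .setNumEstimators(100)          // number of trees in the forest
      .setContamination(0.05)         // expected fraction of outliers
      .setFeaturesCol("features")
      .setScoreCol("outlierScore")
      .setPredictionCol("predictedLabel")
      .setRandomSeed(1)

    val model  = isolationForest.fit(assembled)
    val scored = model.transform(assembled) // adds the outlierScore and predictedLabel columns
    scored.show()

    spark.stop()
  }
}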

This spark-iforest artifact is not included in the official Spark distribution, nor is it published to any central artifact repository, so to use it you need to build it yourself, either as a separate library or inside your project.
The library should not have used the package name of an external project (org.apache.spark.ml) in the first place, because it gives the false impression that it is available within Spark itself.
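If you do build spark-iforest yourself and install it into your local repository, usage would look roughly like the sketch below. The IForest class lives under org.apache.spark.ml.iforest (the very package the question's error refers to), and the setter names here follow that project's README as I recall them, so treat them as assumptions and check them against the sources you build.

// Assumes `assembled` is a DataFrame with a "features" vector column,
// e.g. produced by a VectorAssembler as in the previous sketch.
import org.apache.spark.ml.iforest.IForest

val iforest = new IForest()
  .setNumTrees(100)        // number of isolation trees
  .setMaxSamples(256)      // sub-sample size per tree
  .setContamination(0.05)  // expected fraction of anomalies
  .setBootstrap(false)
  .setSeed(123456L)

val model = iforest.fit(assembled)
val predictions = model.transform(assembled) // adds the model's anomaly score and prediction columns
predictions.show()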

Related

Scala unit test in a maven project with IntelliJ

I have a Scala project with Maven as the build tool, and I am using the IntelliJ IDE.
What is the ideal folder structure for tests in this case? And what testing library should I use?
What is the ideal folder structure for tests in this case?
In this documentation, check out the section "Explaining this Archetype".
It tells you the ideal folder structure for tests.
what testing library should I use?
I think ScalaTest is a great testing tool for Scala. You can add the following dependency to your pom.xml, or check out this link.
<!-- https://mvnrepository.com/artifact/org.scalatest/scalatest -->
<dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_2.13</artifactId>
    <version>3.1.1</version>
    <scope>test</scope>
</dependency>
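For example, with that dependency in place, a minimal suite under src/test/scala (the standard Maven test source folder) could look like this; the class name and assertions are just placeholders:

// src/test/scala/example/StringUtilSpec.scala
package example

import org.scalatest.funsuite.AnyFunSuite

// A trivial ScalaTest suite; replace these assertions with tests for your own classes.
class StringUtilSpec extends AnyFunSuite {

  test("toUpperCase converts every character") {
    assert("spark".toUpperCase === "SPARK")
  }

  test("split breaks a comma-separated line into fields") {
    assert("a,b,c".split(",").toList === List("a", "b", "c"))
  }
}

Note that for mvn test to pick the suite up you typically also need the scalatest-maven-plugin (or a JUnit runner) configured in the build.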
Let me know if it helps!!

Not able to import com.adobe.cq.social.srp.utilities.api.SocialResourceUtilities package in AEM 6.2

Can anyone give me the dependency to add to the global and core POMs so that I can use this interface in my Java class?
<dependency>
    <groupId>com.adobe.cq.social</groupId>
    <artifactId>cq-social-srp-api</artifactId>
    <version>1.2.35</version>
    <scope>provided</scope>
</dependency>
To find the dependency, go to http://<domain>:<port>/system/console/depfinder and search for the package/class name. If the package is exported by bundles already present in the instance, the interface will display the Maven dependency to use.
The above dependency is from version 6.3; repeat the steps above to find the dependency version available on 6.2.

Fix com.day.cq.commons,version=[5.7,6) can't be resolved in OSGi Bundles in AEM 6.2

When working with AEM 6.2 you might have come across the error below:
com.day.cq.commons,version=[5.7,6)-->can't be resolved
This happens in AEM 6.2 versions; I have answered below.
Another solution for this issue is:
Copy the cq-commons-5.9.26.jar file into your local Maven repository under .m2/repository/com/day/cq/cq-commons/5.9.26
and update the pom dependency to the following:
<dependency>
    <groupId>com.day.cq</groupId>
    <artifactId>cq-commons</artifactId>
    <version>5.9.26</version>
    <scope>provided</scope>
</dependency>
Solution:
Add the below dependency to the AEM core project's pom.xml file:
<dependency>
    <groupId>com.day.cq</groupId>
    <artifactId>cq-commons</artifactId>
    <version>5.7.4</version>
</dependency>
Add the Import-Package instruction to the core pom.xml:
<Import-Package>
    com.day.cq.commons;version="[5.7.0,7.0)",
</Import-Package>
Build with Maven and deploy the project to AEM; the bundle should be in Active state.
References:
The Import-Package instruction is a list of packages that are required by the bundle's contained packages. The default for this header is "*", resulting in importing all referred packages.
This header rarely has to be explicitly specified. However, in certain cases when there is an unwanted import, such an import can be removed by using a negation package pattern. The package patterns work in the same way as for Export-Package, which means they are ordered.
For example, if you wanted to import all packages except org.foo.impl you would specify "!org.foo.impl,*"

Is it possible to use the Scala-Play ecosystem via Maven only? How?

I need to develop a Scala Play application in a very controlled environment where Maven is established and no other build tool (e.g. sbt) is allowed. I would of course really like sbt, but I can't have it here. Hence my question: is it possible to use Maven only to set up a Scala Play application? Note that I do not want to use Java, for productivity reasons.
I have used Scala Play before and am very accustomed to the Play sbt plugin, which sets everything up nicely for Activator etc., but unfortunately I do not have that choice here; however, I can do anything with Maven.
Needless to say, I have tried to set up sbt and Activator locally to fetch dependencies by tunneling through the existing Nexus Maven repository, without success. The Nexus instance doesn't have an Ivy2 repository and I am not allowed to create one.
There is a Play Maven plugin (though it is still in beta):
https://github.com/play2-maven-plugin/play2-maven-plugin
I have not used it myself, but I have built a few Play modules that depend on Play and build completely with Maven, using Play from the Maven Central repository. Hope this helps.
<dependency>
    <groupId>com.typesafe.play</groupId>
    <artifactId>play-java_2.11</artifactId>
    <version>${play.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>com.typesafe.play</groupId>
    <artifactId>play-cache_2.11</artifactId>
    <version>${play.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>com.typesafe.play</groupId>
    <artifactId>play-json_2.11</artifactId>
    <version>${play.version}</version>
    <scope>provided</scope>
</dependency>

java.lang.NoClassDefFoundError: scala/reflect/ClassManifest

I am getting an error when trying to run an example on Spark. Can anybody please let me know what changes I need to make to my pom.xml to run programs with Spark?
Currently Spark only works with Scala 2.9.3. It does not work with later versions of Scala. I saw the error you describe when I tried to run the SparkPi example with SCALA_HOME pointing to a 2.10.2 installation. When I pointed SCALA_HOME at a 2.9.3 installation instead, things worked for me. Details here.
You should add a dependency on scala-reflect to your Maven build:
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-reflect</artifactId>
    <version>2.10.2</version>
</dependency>
I ran into the same issue using the Scala-Redis 2.9 client (incompatible with Scala 2.10), and adding a dependency on scala-reflect does not help. Indeed, scala-reflect is packaged as its own jar, but it does not include the missing class, which has been deprecated since Scala 2.10.0 (see this thread).
The correct answer is to point to a Scala installation that includes this class. In my case, using the Scala-Redis client, McNeill's answer helped: I pointed to Scala 2.9.3 using sbt and everything worked as expected.
In my case, the error was raised in Kafka's API. Changing the dependency from
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.9.2</artifactId>
    <version>0.8.1.1</version>
</dependency>
to
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.6.1</version>
</dependency>
fixed the problem.
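For reference, with the spark-streaming-kafka_2.10 1.6.1 artifact above on the classpath, a minimal receiver-based stream might be wired up like the sketch below; the ZooKeeper quorum, consumer group, and topic name are placeholders you would replace with your own.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-stream-sketch").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // createStream(ssc, zkQuorum, consumerGroup, Map(topic name -> number of consumer threads))
    val messages = KafkaUtils
      .createStream(ssc, "localhost:2181", "sketch-group", Map("my-topic" -> 1))
      .map(_._2) // the stream yields (key, message) pairs; keep only the message

    messages.print()

    ssc.start()
    ssc.awaitTermination()
  }
}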