Metaphone or Soundex for Scala - scala

I have found Apache's implementation of Soundex and Metaphone in Java, but I would prefer to keep the text comparison libraries I am using Scala-only if possible. Google searches have yielded nothing useful for finding either of these algorithms in Scala.
Worst case scenario I can translate these algorithms into Scala but that is less than ideal.
http://commons.apache.org/codec/

You are looking for Stringmetric from https://stackoverflow.com/users/554647/rocky-madden :
https://github.com/rockymadden/stringmetric

Not to answer my own question or anything, but a viable option would be to utilize a Java library and create some companion objects in Scala to help expose them more appropriately and to allow the code to document itself more effectively.
// Wrapper object for org.apache.commons.codec.language.Metaphone in /lib/commons-codec-1.7
import org.apache.commons.codec.language.{Metaphone => CommonsMetaphone}
object Metaphone {
  private val metaphone = new CommonsMetaphone
  metaphone setMaxCodeLen 5
  def encode(str: String): String = {
    metaphone encode str
  }
}
Implementation:
val str_meta = Metaphone encode "Starbucks"
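The same pattern works for Soundex, also from commons-codec (a sketch mirroring the wrapper above; only the wrapper object is new, Soundex.encode is the commons-codec API):
import org.apache.commons.codec.language.{Soundex => CommonsSoundex}
object Soundex {
  private val soundex = new CommonsSoundex
  def encode(str: String): String = soundex encode str
}
val str_soundex = Soundex encode "Starbucks"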

Related

How to find max date from stream in scala using CompareTo?

I am new to Scala and trying to explore how I can use Java functionalities with Scala.
I have a stream of LocalDate, which is a Java class, and I am trying to find the maximum date in my list.
var processedResult: Stream[LocalDate] = List(javaList)
  .toStream
  .map { s =>
    // some processing
    LocalDate.parse(s, formatter)
  }
I know this can be done easily in Java by using .compare() and .compareTo(), but I am not sure how to use the same thing over here.
Also, I have no idea how Ordering works in Scala when it comes to sorting.
Can anyone suggest how I can get this done?
First of all, there are a lot of minor details I will point out, since it seems you are pretty new to the language and I expect these to help you with your learning path.
First, avoid var at all costs, especially when learning.
While mutability has its place and is not always wrong, forcing you to avoid it while learning will help you. Particularly, avoid it when it doesn't provide any value; like in this case.
Second, this List(javaList) doesn't do what you think it does. It creates a single element Scala List whose unique element is a Java List. What you probably want is to transform that Java List into a Scala one, for that you can use the CollectionConverters.
import scala.jdk.CollectionConverters._ // This works if you are in 2.13
// if you are in 2.12 or lower use: import scala.collection.JavaConverters._
val scalaList = javaList.asScala.toList
Third, it is not clear why you want to use a Scala Stream: a Stream is for infinite or very large collections where you want all transformations to be made lazily and elements to be produced only as they are consumed (also, by the way, it was deprecated in 2.13 in favour of LazyList).
Maybe you are confused because in Java you need a "Stream" to apply functional operations like map? If so, note that in Scala all collections provide the same rich API.
Fourth, Ordering is a typeclass, which is a functional pattern for polymorphism. On its own this is a very broad question, so I won't answer it here, but I hope the two links provide insight.
The TL;DR is simple: an Ordering for a type T knows how to order (sort) elements of type T. Thus, operations like max will work for any collection of any element type if, and only if, the compiler can prove the existence of an Ordering for that type; if it can, it will pass such a value implicitly to the method call for you. Again, the implicits topic is very broad and deserves its own question.
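As a toy illustration of the typeclass idea (Person and its Ordering are made up for this example):
case class Person(name: String, age: Int)
// The compiler proves an Ordering[Person] exists and passes it implicitly to max.
implicit val byAge: Ordering[Person] = Ordering.by(_.age)
val oldest = List(Person("Ann", 40), Person("Bob", 25)).max // Person(Ann,40)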
Now for your particular question, you can just call max or maxOption in the List or Stream and that is all.
Note that max will throw if the List is empty, whereas maxOption returns an Option which will be empty (None) for an empty input; idiomatic Scala favours the latter over the former.
If you really want to use compareTo then you can provide your own Ordering.
scalaList.maxOption(Ordering.fromLessThan[LocalDate]((d1, d2) => d1.compareTo(d2) < 0))
Ordering[A] is a type class which defines how to compare two elements of type A. So to compare LocalDates you need an Ordering[LocalDate] instance.
LocalDate implements Comparable in Java, and Scala conveniently provides Ordering instances for Comparables, so when you invoke:
Ordering[java.time.LocalDate]
in the REPL, you'll see that Scala is able to provide you the instance without you needing to do anything (you could take a look at the list of methods provided by this typeclass).
Since you have an Ordering in implicit scope whose type matches the Stream's element type (e.g. Stream[LocalDate] needs Ordering[LocalDate]), you can call the .max method... and that's it.
val processedResult : Stream[LocalDate] = ...
val newestDate: LocalDate = processedResult.max
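Putting the pieces together, a minimal sketch of the whole conversion (the sample dates and the ISO formatter are assumptions for the example; java.util.List.of needs Java 9+, and maxOption and scala.jdk.CollectionConverters need Scala 2.13+):
import java.time.LocalDate
import java.time.format.DateTimeFormatter
import scala.jdk.CollectionConverters._
val javaList: java.util.List[String] = java.util.List.of("2021-03-01", "2020-12-31")
val formatter = DateTimeFormatter.ISO_LOCAL_DATE
val newest: Option[LocalDate] =
  javaList.asScala.toList                    // Java List -> Scala List
    .map(s => LocalDate.parse(s, formatter))
    .maxOption                               // None if the input was empty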

Convert a Seq[String] to a case class in a typesafe way

I have written a parser which transforms a String to a Seq[String] following some rules. This will be used in a library.
I am trying to transform this Seq[String] to a case class. The case class would be provided by the user (so there is no way to guess what it will be).
I have thought of the shapeless library because it seems to implement the right features and to be mature, but I have no idea how to proceed.
I have found this question with an interesting answer, but I can't see how to adapt it to my needs. Indeed, in that answer there is only one type to parse (String), and the library iterates inside the String itself. It probably requires a deep change in the way things are done, and I have no clue how.
Moreover, if possible, I want to make this process as easy as possible for the user of my library. So, if possible, unlike the answer linked above, the HList type would be guessed from the case class itself (however, according to my research, it seems the compiler needs this information).
I am a bit new to the type system and all these beautiful things, if anyone is able to give me an advice on how to do, I would be very happy!
Kind Regards
--- EDIT ---
As ziggystar requested, here is a sketch of the needed signatures:
//Let's say we are just parsing a CSV.
#onUserSide
case class UserClass(i:Int, j:Int, s:String)
val list = Seq("1,2,toto", "3,4,titi")
// User transforms his case class to a function with something like:
val f = (UserClass.apply _).curried
// The function created in 1/ is injected in the parser
val parser = new Parser(f)
// The Strings to convert to case classes are provided as an argument to the parse() method.
val finalResult:Seq[UserClass] = parser.parse(list)
// The transformation is done in two steps inside the parse() method:
// 1/ first we have: val list = Seq("1,2,toto", "3,4,titi")
// 2/ then we have a call to internalParserImplementedSomewhereElse(list)
// val parseResult is now equal to Seq(Seq("1", "2", "toto"), Seq("3","4", "titi"))
// 3/ finally Shapeless does its magic trick and we have Seq(UserClass(1,2,"toto"), UserClass(3,4,"titi"))
#insideTheLibrary
class Parser[A](function: A) {
  // The internal parser takes each String provided through the argument of the method and
  // transforms it into a Seq[String], so the Seq[String] provided becomes a Seq[Seq[String]].
  private def internalParserImplementedSomewhereElse(l: Seq[String]): Seq[Seq[String]] = {
    ...
  }
  /*
   * Types A and B are both related to the case class provided by the user:
   * - A is the type of the case class viewed as a function,
   * - B is the type of the original case class (can be deduced from type A).
   */
  private def convert2CaseClass[B](list: Seq[String]): B = {
    // do something with Shapeless
    // I don't know what to put inside ???
  }
  def parse(l: Seq[String]) = {
    val parseResult: Seq[Seq[String]] = internalParserImplementedSomewhereElse(l)
    val finalResult = parseResult.map(convert2CaseClass)
    finalResult // it is a Seq[CaseClassProvidedByUser]
  }
}
Inside the library, some implicits would be available to convert each String to the correct type as it is deduced by Shapeless (similar to the answer proposed in the link above), like string.toInt, string.toDouble, and so on...
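A minimal sketch of what such a Shapeless-based conversion could look like (assuming shapeless 2.x and Scala 2.12+; the FieldReader and RowReader type classes and all names below are hypothetical, not part of any library):
import shapeless._
// How to read a single field from a String
trait FieldReader[A] { def read(s: String): A }
object FieldReader {
  implicit val intReader: FieldReader[Int] = s => s.toInt
  implicit val stringReader: FieldReader[String] = s => s
}
// How to read a whole row (Seq[String]) into an HList, field by field
trait RowReader[L <: HList] { def read(row: Seq[String]): L }
object RowReader {
  implicit val hnilReader: RowReader[HNil] = _ => HNil
  implicit def hconsReader[H, T <: HList](implicit fh: FieldReader[H], ft: RowReader[T]): RowReader[H :: T] =
    row => fh.read(row.head) :: ft.read(row.tail)
}
// Generic[B] gives the HList representation L of the case class B,
// so a row can be read into L and then turned back into B.
class Convert2CaseClass[B] {
  def apply[L <: HList](row: Seq[String])(implicit gen: Generic.Aux[B, L], reader: RowReader[L]): B =
    gen.from(reader.read(row))
}
def convert2CaseClass[B] = new Convert2CaseClass[B]
// Usage sketch:
//   case class UserClass(i: Int, j: Int, s: String)
//   convert2CaseClass[UserClass](Seq("1", "2", "toto")) // UserClass(1,2,"toto")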
Maybe there are other ways to design it; it's just what I have in mind after playing with Shapeless for a few hours.
This uses a very simple library called product-collections
import com.github.marklister.collections.io._
case class UserClass(i:Int, j:Int, s:String)
val csv = Seq("1,2,toto", "3,4,titi").mkString("\n")
csv: String =
1,2,toto
3,4,titi
CsvParser(UserClass).parse(new java.io.StringReader(csv))
res28: Seq[UserClass] = List(UserClass(1,2,toto), UserClass(3,4,titi))
And to serialize the other way:
scala> res28.csvIterator.toList
res30: List[String] = List(1,2,"toto", 3,4,"titi")
product-collections is orientated towards csv and a java.io.Reader, hence the shims above.
This answer will not tell you how to do exactly what you want, but it will solve your problem. I think you're overcomplicating things.
What is it you want to do? It appears to me that you're simply looking for a way to serialize and deserialize your case classes - i.e. convert your Scala objects to a generic string format and the generic string format back to Scala objects. You seem to already have the serialization step defined, and you're asking how to do the deserialization.
There are a few serialization/deserialization options available for Scala. You do not have to hack away with Shapeless or Scalaz to do it yourself. Try to take a look at these solutions:
Java serialization/deserialization. The regular serialization/deserialization facilities provided by the Java environment. Requires explicit casting and gives you no control over the serialization format, but it's built in and doesn't require much work to implement.
JSON serialization: there are many libraries that provide JSON generation and parsing for Scala. Take a look at play-json, spray-json and Argonaut, for example.
The Scala Pickling library is a more general library for serialization/deserialization. Out of the box it comes with some binary and some JSON format, but you can create your own formats.
Out of these solutions, at least play-json and Scala Pickling use macros to generate serializers and deserializers for you at compile time. That means that they should both be typesafe and performant.
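For instance, with play-json a deserializer for a case class can be derived at compile time by a macro (a sketch; the JSON payload below is made up and unrelated to the CSV-style input in the question):
import play.api.libs.json._
case class UserClass(i: Int, j: Int, s: String)
object UserClass {
  // Macro-derived Reads and Writes for the case class
  implicit val format: Format[UserClass] = Json.format[UserClass]
}
val parsed: UserClass = Json.parse("""{"i": 1, "j": 2, "s": "toto"}""").as[UserClass]
val serialized: String = Json.toJson(UserClass(3, 4, "titi")).toString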

Scala's compiler lexer

I'm trying to get a list of tokens (I'm most interested in keywords) and their positions for a given scala source file.
I think there is a lexer utility inside the Scala compiler, but I can't find it. Can you point me in the right direction?
A simple lexer for a Scala-like language is provided in the standard library.
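A rough sketch of using one such lexer, StdLexical from scala-parser-combinators (a separate module since Scala 2.11), to get tokens with their positions; note it is not the real compiler lexer, and the keyword and delimiter lists below are only partial, illustrative choices:
import scala.util.parsing.combinator.lexical.StdLexical
import scala.util.parsing.input.Reader
val lexical = new StdLexical
lexical.reserved ++= Seq("class", "object", "def", "val", "var", "import", "package")
lexical.delimiters ++= Seq("{", "}", "(", ")", "[", "]", "=", ".", ",", ":")
var scanner: Reader[lexical.Token] = new lexical.Scanner("class A { def f(x: Int) = x }")
while (!scanner.atEnd) {
  println(s"${scanner.pos} ${scanner.first}") // position and token
  scanner = scanner.rest
}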
A small utility program which tokenizes Scala source using the same lexer as the compiler does lives here
Scalariform has an accurate Scala lexer you can use:
import scalariform.lexer._
val tokens = ScalaLexer.rawTokenise("class A", forgiveErrors = true)
val keywords = tokens.filter(_.tokenType.isKeyword)
val comments = tokens.filter(_.tokenType.isComment)
Parser combinators might help with what you are trying to achieve here, especially if later on you are not only interested in keyword parsing.

Is string concatenation in scala as costly as it is in Java?

In Java, it's a common best practice to do string concatenation with StringBuilder due to the poor performance of appending strings using the + operator. Is the same practice recommended for Scala or has the language improved on how java performs its string concatenation?
Scala uses Java strings (java.lang.String), so its string concatenation is the same as Java's: the same thing is taking place in both. (The runtime is the same, after all.) There is a special StringBuilder class in Scala, that "provides an API compatible with java.lang.StringBuilder"; see http://www.scala-lang.org/api/2.7.5/scala/StringBuilder.html.
But in terms of "best practices", I think most people would generally consider it better to write simple, clear code than maximally efficient code, except when there's an actual performance problem or a good reason to expect one. The + operator doesn't really have "poor performance", it's just that s += "foo" is equivalent to s = s + "foo" (i.e. it creates a new String object), which means that, if you're doing a lot of concatenations to (what looks like) "a single string", you can avoid creating unnecessary objects — and repeatedly recopying earlier portions from one string to another — by using a StringBuilder instead of a String. Usually the difference is not important. (Of course, "simple, clear code" is slightly contradictory: using += is simpler, using StringBuilder is clearer. But still, the decision should usually be based on code-writing considerations rather than minor performance considerations.)
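For illustration, a minimal sketch of the two approaches (the loop and sample data are made up; Scala's scala.collection.mutable.StringBuilder wraps a java.lang.StringBuilder):
val parts = Seq("foo", "bar", "baz")
// += creates a brand-new String on every iteration
var s = ""
for (p <- parts) s += p
// A StringBuilder reuses one internal buffer; a single String is created at the end
val sb = new StringBuilder
for (p <- parts) sb.append(p)
val result = sb.toString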
Scala's string concatenation works the same way as Java's does.
val x = 5
"a"+"b"+x+"c"
is translated to
(new StringBuilder()).append("ab").append(BoxesRunTime.boxToInteger(x)).append("c").toString()
StringBuilder is scala.collection.mutable.StringBuilder. That's the reason why the value appended to the StringBuilder is boxed by the compiler.
You can check the behavior by decompiling the bytecode with javap.
I want to add: if you have a sequence of strings, then there is already a method to create a new string out of them (all items, concatenated). It's called mkString.
Example: (http://ideone.com/QJhkAG)
val example = Seq("11111", "2222", "333", "444444")
val result = example.mkString
println(result) // prints "111112222333444444"
Scala uses java.lang.String as the type for strings, so it is subject to the same characteristics.

SQL DSL for Scala

I am struggling to create a SQL DSL for Scala. The DSL is an extension to Querydsl, which is a popular Query abstraction layer for Java.
I am struggling now with really simple expressions like the following
user.firstName == "Bob" || user.firstName == "Ann"
As Querydsl already supports an expression model which can be used here, I decided to provide conversions from proxy objects to Querydsl expressions. In order to use the proxies I create an instance like this
import com.mysema.query.alias.Alias._
var user = alias(classOf[User])
With the following implicit conversions I can convert proxy instances and proxy property call chains into Querydsl expressions
import com.mysema.query.alias.Alias._
import com.mysema.query.types.expr._
import com.mysema.query.types.path._
object Conversions {
  def not(b: EBoolean): EBoolean = b.not()
  implicit def booleanPath(b: Boolean): PBoolean = $(b)
  implicit def stringPath(s: String): PString = $(s)
  implicit def datePath(d: java.sql.Date): PDate[java.sql.Date] = $(d)
  implicit def dateTimePath(d: java.util.Date): PDateTime[java.util.Date] = $(d)
  implicit def timePath(t: java.sql.Time): PTime[java.sql.Time] = $(t)
  implicit def comparablePath(c: Comparable[_]): PComparable[_] = $(c)
  implicit def simplePath(s: Object): PSimple[_] = $(s)
}
Now I can construct expressions like this
import com.mysema.query.alias.Alias._
import com.mysema.query.scala.Conversions._
var user = alias(classOf[User])
var predicate = (user.firstName like "Bob") or (user.firstName like "Ann")
I am struggling with the following problem.
eq and ne are already available as methods in Scala, so the conversions aren't triggered when they are used
This problem can be generalized as follows: when using method names that are already available on Scala types, such as eq, ne, startsWith etc., one needs some kind of escaping to trigger the implicit conversions.
I am considering the following
Uppercase
var predicate = (user.firstName LIKE "Bob") OR (user.firstName LIKE "Ann")
This is for example the approach in Circumflex ORM, a very powerful ORM framework for Scala with similar DSL aims. But this approach would be inconsistent with the query keywords (select, from, where etc), which are lowercase in Querydsl.
Some prefix
var predicate = (user.firstName :like "Bob") :or (user.firstName :like "Ann")
The context of the predicate usage is something like this
var user = alias(classOf[User])
query().from(user)
  .where(
    (user.firstName like "Bob") or (user.firstName like "Ann"))
  .orderBy(user.firstName asc)
  .list(user);
Do you see better options or a different approach for SQL DSL construction for Scala?
So the question basically boils down to two cases
Is it possible to trigger an implicit type conversion when using a method that already exists in a superclass (e.g. eq)?
If it is not possible, what would be the most Scalaesque syntax to use for methods like eq and ne?
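One common workaround in Scala query DSLs (Squeryl and Slick, for instance, use ===) is to sidestep the clash entirely: since eq and ne are already defined on AnyRef, no implicit conversion is ever searched for, but a symbolic alias that exists only on the wrapper type is always resolved through the conversion. A minimal sketch of the idea, independent of Querydsl's actual API (Predicate, Eq, Ne, Or and RichPath are made-up names):
sealed trait Predicate {
  def or(that: Predicate): Predicate = Or(this, that)
}
final case class Eq[A](column: String, value: A) extends Predicate
final case class Ne[A](column: String, value: A) extends Predicate
final case class Or(left: Predicate, right: Predicate) extends Predicate
// The wrapper carries the column name; === and !== cannot clash with AnyRef.eq / AnyRef.ne
final case class RichPath[A](column: String) {
  def ===(value: A): Predicate = Eq(column, value)
  def !==(value: A): Predicate = Ne(column, value)
}
// Usage sketch:
//   val firstName = RichPath[String]("firstName")
//   val predicate = (firstName === "Bob") or (firstName === "Ann")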
EDIT
We got Scala support in Querydsl working by using alias instances and a $-prefix based escape syntax. Here is a blog post on the results : http://blog.mysema.com/2010/09/querying-with-scala.html
There was a very good talk at Scala Days: Type-safe SQL embedded in Scala by Christoph Wulf.
See the video here: Type-safe SQL embedded in Scala by Christoph Wulf
Mr Westkämper - I was pondering this problem, and I wondered if it would be possible to use 'tracer' objects, where the basic data types such as Int and String would be extended so that they contained source information, and the results of combining them would likewise hold within themselves their sources and the nature of the combination.
For example, your user.firstName method would return a TracerString, which extends String, but which also indicates that the String corresponds to a column in a relation. The == method would be overridden such that it returns an EqualityTracerBoolean which extends Boolean. This would preserve the standard Scala semantics. However, the constructor for EqualityTracerBoolean would record the fact that the result of the expression was derived by comparing a column in a relation to a string constant. Your 'where' method could then analyse the EqualityTracerBoolean object returned by the conditional expression, evaluated over a dummy argument, in order to derive the expression used to create it.
There would have to be override defs for inequality operators, as well as plus and minus, for Ints, and whatever else you wished to represent from sql, and corresponding tracer classes for each of these. It would be a bit of a project!
Anyway, I decided not to bother, and use squeryl instead.
I didn't have the exact same problem with jOOQ, as I'm using a bit more verbose operator names: equal, notEqual, etc instead of eq, ne. On the other hand, there is a val operator in jOOQ for explicitly creating bind values, which I had to overload with value, as val is a keyword in Scala. Is overloading operators an option for you? I documented my attempts of running jOOQ in Scala here:
http://lukaseder.wordpress.com/2011/12/11/the-ultimate-sql-dsl-jooq-in-scala/
Just like you, I had also thought about capitalising all keywords in a major release (including SELECT, FROM, etc). But that leaves an open question about whether "compound" keywords should be split into two method calls, or connected by an underscore: GROUP().BY() or GROUP_BY(). WHEN().MATCHED().THEN().UPDATE() or WHEN_MATCHED_THEN_UPDATE(). Since the result is not really satisfying, I guess it's not worth breaking backwards-compatibility for such a fix, even if the two-method-call option would look very, very nice in Scala, as . and () can be omitted. So maybe jOOQ and QueryDSL should both be "wrapped" (as opposed to "extended") by a dedicated Scala API?
What about decompiling the bytecode at runtime? I started to write such a tool:
http://h2database.com/html/jaqu.html#natural_syntax
I know it's a hack, so please don't vote -1 :-) I just wanted to mentioned it. It's a relatively novel approach. Instead of decompiling at runtime, it might be possible to do it at compile time using an annotation processor, not sure if that's possible using Scala (and not sure if it's really possible with Java, but Project Lombok seems to do something like that).