Trying to compare a String with a BigDecimal in OpenJPA using CriteriaBuilder

list.add(build.equal(bondRoot.get(Bond_.writingCompanyCode), dtlRoot.get(ArPstdDtl_.companycd).as(String.class)));
but I am getting the following error:
Caused by: org.apache.openjpa.persistence.ArgumentException: No metadata
was found for type "class java.lang.String". The class is not
enhanced.
Could someone help me out with this?

Changing the type from BigDecimal to String requires a conversion, not a cast. The as method cannot convert from one type to another; it is purely for typecasting, as documented:
Perform a typecast upon the expression, returning a new expression
object. This method does not cause type conversion: the runtime type
is not changed. Warning: may result in a runtime failure.
The Criteria API does not offer a conversion from BigDecimal to String. Database-vendor-specific functions can be invoked via CriteriaBuilder.function.
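The cast-versus-conversion distinction is easy to demonstrate in plain Java: a cast never changes the runtime type of an object, so "casting" a BigDecimal to a String simply fails at runtime, while an actual conversion is an explicit method call. (This is a standalone sketch, not OpenJPA code.)

```java
import java.math.BigDecimal;

public class CastVsConvert {
    public static void main(String[] args) {
        Object n = BigDecimal.valueOf(42);

        try {
            // A cast only reinterprets the reference; the runtime type stays BigDecimal.
            String s = (String) n;
            System.out.println(s);
        } catch (ClassCastException e) {
            System.out.println("cast failed: no conversion was performed");
        }

        // An actual conversion requires an explicit method call:
        String converted = ((BigDecimal) n).toPlainString();
        System.out.println(converted); // prints "42"
    }
}
```

This is exactly why as(String.class) fails: it is a typecast, not a conversion, so the BigDecimal value never becomes a String.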

Related

Scala var type usage results in type mismatch

I thought that Scala's var was cool: it helps avoid a lot of boilerplate code and makes it possible to concentrate on functionality. However, I now face something really strange. When I compile my program, I get an error message from sbt:
type mismatch;
found: java.sql.Connection
required: String
this.conn = DriverManager.getConnection(
^
Note that the compiler points to the conn property of the class, and this property is defined in the class like so:
class Db{
private var conn = ""
....
}
So why does the compiler care about type matching, if this is Scala and I'm using the var data type?
var is not a data type. It is a keyword for declaring and defining a mutable variable. The type is not dynamic; it is still inferred at compile time. In this case conn is inferred to be a String, and it is completely identical to writing
private var conn: String = ""
The whole point of Scala's type system is to disallow passing incompatible types around. It's failing because, obviously, you cannot assign an SQL connection to a variable of type String. Type inference does not allow you to ignore the types of objects, it just lets the compiler figure it out where possible.
var is not a data type; it's one of the keywords used to define variables in Scala. The other one is val. Whether you use var or val only affects whether the variable you define can be reassigned (var) or is read-only (val). It does not affect the type of the variable in any way.
Regardless of whether you use var or val, the type of a variable is either specified explicitly (by writing : theType after the variable name) or inferred implicitly from the value you assign it to.
In your example, you did not explicitly provide a type, so the inferred type was String, since that is the type of the initial value "".
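Java 10+'s var behaves the same way and makes a compact, runnable illustration of this compile-time inference (a Java sketch, not Scala code; the JDBC URL is a made-up placeholder):

```java
public class InferenceDemo {
    public static void main(String[] args) {
        var conn = "";                // type inferred as String at compile time
        // conn = new Object();      // would not compile: incompatible types

        conn = "jdbc:postgresql://localhost/db"; // reassigning the same type is fine
        System.out.println(conn.getClass().getSimpleName()); // prints "String"
    }
}
```

As in Scala, the inference happens once, at the declaration site; the variable is mutable but its type is fixed.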

How to disallow null as a value in json4s deserialization

I'm deserializing data from JSON into case classes and just came across some malformed JSON providing null for a non-optional object. I would prefer this to be a parser failure instead of setting the field value to null. That way I could safely assume that my case classes are properly populated if they parse, but I can't seem to find a way to configure the parser for this.
It is possible to initialize a field with a null value and then use @JsonInclude(Include.NON_NULL) to ignore the field if no value was assigned to it.
You can check for the existence of the field and interpret it as being null or not.
You could use a Jackson deserializer; see:
Deserialization sample
and:
Another Deserialization sample
You can also attach a deserializer to the field:
@JsonDeserialize(using = SomeDeserializer.class)
private String definition;
and then do the validation in the deserializer.
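The "check for the existence of the field" idea can be sketched independently of any JSON library. Below is a minimal Java illustration over a parsed map standing in for the JSON AST (requireField and the field name are made up for the example); it fails fast when a required key is absent or explicitly null:

```java
import java.util.HashMap;
import java.util.Map;

public class RequiredFields {
    // Throws if the key is absent or was parsed as an explicit null.
    static Object requireField(Map<String, Object> json, String key) {
        if (!json.containsKey(key) || json.get(key) == null) {
            throw new IllegalArgumentException(
                "required field '" + key + "' is missing or null");
        }
        return json.get(key);
    }

    public static void main(String[] args) {
        Map<String, Object> parsed = new HashMap<>();
        parsed.put("name", null); // malformed input: non-optional field set to null
        try {
            requireField(parsed, "name");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The same validation could live inside a custom deserializer, so that parsing fails instead of producing a half-populated object.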

Why isn't DbSet covariant?

I have a factory function that returns a DbSet(Of IItemType). The actual return type will always be an implementation of IItemType, for example DbSet(Of CategoryType).
I thought covariance was supported in generics and that this method would work fine, but I get an exception when I try to run my code:
Unable to cast object of type 'System.Data.Entity.DbSet`1[CategoryType]' to type 'System.Data.Entity.DbSet`1[IItemType]'.
DbSet<T> cannot be covariant. Firstly, in C# only interfaces can be covariant, not classes. Secondly, DbSet<T> has both covariant and contravariant methods.
Take the following two examples.
DbSet<CategoryType> set = context.Set<CategoryType>();
IQueryable<IItemType> query = set.Where(x => x.Foo == Bar);
We know for a fact that every CategoryType is an IItemType, so this can always work.
Conversely, however, try this...
DbSet<CategoryType> set = context.Set<CategoryType>();
IItemType newItemType = new ProductType();
set.Add(newItemType); // Compiler error.
Not every IItemType is a CategoryType. So if we could cast DbSet<CategoryType> to DbSet<IItemType>, we would get runtime errors when adding. Since the compiler knows this might not always work, it won't let you do the cast.
However, there are interfaces on which DbSet<T> does allow covariance. You can try casting it to IQueryable<IItemType>, for example.
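The same read/write tension shows up with Java generics, which makes for a compact, runnable illustration (a Java sketch; the type names mirror the question but this is not EF code):

```java
import java.util.ArrayList;
import java.util.List;

public class CovarianceDemo {
    interface ItemType { String label(); }
    static class CategoryType implements ItemType {
        public String label() { return "category"; }
    }
    static class ProductType implements ItemType {
        public String label() { return "product"; }
    }

    public static void main(String[] args) {
        List<CategoryType> categories = new ArrayList<>();
        categories.add(new CategoryType());

        // Covariant (read-only) view: safe, since every CategoryType is an ItemType.
        List<? extends ItemType> readOnly = categories;
        System.out.println(readOnly.get(0).label()); // prints "category"

        // But the covariant view rejects writes at compile time:
        // readOnly.add(new ProductType()); // error: not every ItemType is a CategoryType
    }
}
```

A type that both produces and consumes T, as DbSet<T> does, can be neither safely covariant nor safely contravariant; only one-directional views (like IQueryable<out T>) can be.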
However it sounds like you are trying to query against the DbSet using a query against the interface... try the following
DbSet<CategoryType> set = context.Set<CategoryType>();
IQueryable<IItemType> cast = set.OfType<IItemType>(); // (or .Cast<IItemType>() works too)
IQueryable<IItemType> query = cast.Where(x => x ....);
It looks like they could be covariant. But there is a host of differences between in-memory programming and programming against a query provider.
For Entity Framework to materialize an object from the data store it requires its exact type and the mapping from database columns to the type's properties. An interface can represent any type and EF is not (yet) able to pick a suitable implementation itself. Maybe support for interfaces will be featured in future releases.
The same applies to casting entities to an interface in an EF query (a case I just added to my list of differences).

How can I coerce an integer to an enum type in PowerShell?

Read carefully before you answer! I want to cast an integer to an enum type where the integer value is not actually defined in the enum. In VB.Net, it is possible to directly cast any integer to an integer-based enum type using DirectCast. Is there some way to accomplish this natively in PowerShell?
I need to do this in PowerShell in order to call a method on an Office Interop object (Access.Application.SysCmd) which takes an enumeration value as its first argument (AcSysCmdAction), but where the actual value I need to pass (603, for the undocumented export-to-accde action) is not included in the PIA enum definition. PowerShell's built-in type conversion causes it to convert either a number or a string to the applicable enumeration type, but it will not coerce an int value that is not in the enum; instead it throws an invalid conversion exception. Right now I'm resorting to a dynamically compiled ScriptControl which calls SysCmd via VBScript, but I'd like to keep everything in PowerShell if possible.
You could call the Enum class's ToObject method:
$day = [Enum]::ToObject([DayOfWeek], 90)
Well, I just figured out a way. In PowerShell, it appears enum objects have a property called value__ which can be directly set to any value!
#Create a variable with the desired enum type
$ops = [System.Text.RegularExpressions.RegexOptions]0
#Directly set the value__ property
$ops.value__ = 3450432

Scala timestamp/date zero argument constructor?

Squeryl requires a zero-argument constructor when using Option[] fields. I figured out how to supply such a default for a Long (0L), but how do I create one for a Timestamp or Date?
Essentially I need to finish this:
def this() = this(0L,"",TIMESTAMP,TIMESTAMP,0L,"","","","",Some(""),Some(""),"",DATE,DATE,false,false,false,Some(0L),Some(0),Some(0L))
Below is how I originally found the timestamp and date problem.
Background
Getting the following error in my Play! 2.0 Scala app (also using Squeryl):
Caused by: java.lang.RuntimeException: Could not deduce Option[] type of field 'startOrder' of class models.Job
This field in models.Job:
@Column("start_order")
var startOrder: Option[Int],
In the Postgres DB it is defined as an integer. Does Play! 2.0 handle models differently, is this a bug, or is it a Squeryl problem? Thanks!
Stack trace (looks like a Squeryl problem):
Caused by: java.lang.RuntimeException: Could not deduce Option[] type of field 'startOrder' of class models.Job
at scala.sys.package$.error(package.scala:27) ~[scala-library.jar:na]
at scala.Predef$.error(Predef.scala:66) ~[scala-library.jar:0.11.2]
at org.squeryl.internals.FieldMetaData$$anon$1.build(FieldMetaData.scala:441) ~[squeryl_2.9.1-0.9.4.jar:na]
at org.squeryl.internals.PosoMetaData$$anonfun$3.apply(PosoMetaData.scala:111) ~[squeryl_2.9.1-0.9.4.jar:na]
at org.squeryl.internals.PosoMetaData$$anonfun$3.apply(PosoMetaData.scala:80) ~[squeryl_2.9.1-0.9.4.jar:na]
at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:176) ~[scala-library.jar:0.11.2]
If startOrder is defined in the class as
val startOrder: Option[java.sql.Timestamp]
then I believe
Some(new java.sql.Timestamp(0))
should be passed to the constructor.
Option is used when a value may or may not be present. If there is a value, you wrap it in Some; if there is no value, you use None.
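For comparison, Java's Optional expresses the same Some/None distinction (a Java sketch of the idea; the variable names are made up):

```java
import java.sql.Timestamp;
import java.util.Optional;

public class OptionDemo {
    public static void main(String[] args) {
        // Scala's Some(new java.sql.Timestamp(0)) corresponds to a present Optional:
        Optional<Timestamp> startTime = Optional.of(new Timestamp(0L));
        System.out.println(startTime.isPresent()); // prints "true"

        // Scala's None corresponds to an empty Optional:
        Optional<Timestamp> missing = Optional.empty();
        System.out.println(missing.isPresent()); // prints "false"
    }
}
```

In both languages the wrapper makes "no value" an explicit, typed state rather than a null reference.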