I am currently using MapStruct in my project. I have a target field 't1' which could be mapped from either source field 's1' or 's2': it should first try to map 's1' to 't1', and if 's1' is null it should fall back to 's2'. In both cases the source fields are nested objects and need some processing before they are mapped to the target. I've been looking around, but I wasn't able to find any solution for this. Does MapStruct support fallback mapping?
Below is an example, where UserRequest is the target object and User is the source. I would primarily like to map user.address.postalCode to the id of the UserRequest, and if it is null, map user.location.id to it instead. Can anyone please help me with this?
class UserRequest {
    String id;
}

class User {
    String name;
    Address address;
    OverrideLocation location;
}

class Address {
    String postalCode;
}

class OverrideLocation {
    String id;
}
For now I've been using @AfterMapping in MapStruct to accomplish this: the primary mapping is specified as part of @Mapping, and the fallback mapping is specified in an @AfterMapping method. However, I was wondering if there is a better way to deal with it. I looked into defaultValue and defaultExpression as well, but I wasn't able to use them in this case.
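Here is roughly what my current @AfterMapping workaround looks like (simplified: the mapper name is just for illustration, the extra processing of the nested objects is left out, and the classes above are assumed to have getters and setters):

import org.mapstruct.AfterMapping;
import org.mapstruct.Mapper;
import org.mapstruct.Mapping;
import org.mapstruct.MappingTarget;

@Mapper
public interface UserRequestMapper {

    // primary mapping: nested source field user.address.postalCode -> target id
    @Mapping(target = "id", source = "address.postalCode")
    UserRequest toUserRequest(User user);

    // fallback: if the primary source was null, take user.location.id instead
    @AfterMapping
    default void fallbackToLocationId(User user, @MappingTarget UserRequest target) {
        if (target.getId() == null && user.getLocation() != null) {
            target.setId(user.getLocation().getId());
        }
    }
}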
I'm using EF 6.2.0, Code First from Database, in a .NET 4.5.2 environment.
I have this DbSet:
<Table("Firebird.PLANT4")>
Partial Public Class PLANT4
<Key>
<DatabaseGenerated(DatabaseGeneratedOption.None)>
Public Property ID_PLANT4 As Integer
Public Property STATUS As Integer
<Required>
<StringLength(20)>
Public Property DESCRIPTION1 As String
Public Property COUNTER As Long
Public Property RESET_COUNTER As Integer
End Class
When I execute this code:
Using dbContext As New DbModel
    dbContext.Database.Connection.ConnectionString = GetFbConnectionString()
    dbContext.Database.Connection.Open()
    Dim plant As PLANT4 = dbContext.PLANT4.Find(1)
    plant.RESET_COUNTER = 1
    dbContext.SaveChanges()
End Using
I get the error: "DESCRIPTION1 field is required".
The exception is thrown during SaveChanges.
I can't understand where the problem is: if I inspect "plant" in the debugger, all fields are there (ID_PLANT4 = 1 is an existing row), and DESCRIPTION1 in particular is not Nothing.
I can simply remove the "Required" attribute and it works, but the attribute is a consequence of the db column not allowing nulls, so I don't think this is the right way to go.
I can even add this line of code, just after the "Using" statement:
dbContext.Configuration.ValidateOnSaveEnabled = False
and it works, but again I don't think this is the right way to go.
What is the reason for this behavior?
Eventually I found the problem: the DESCRIPTION1 field is populated by default with 20 spaces. It is not null, but it is a string consisting only of spaces. During validation EF treats such a whitespace-only string as if it were missing (the Required data annotation rejects strings that are empty or all whitespace unless AllowEmptyStrings is set), so an exception is thrown because it's a required field.
"Required" attribute is not needed, but I generate my POCO classes by "Code First from Database", so if a VARCHAR field is declared as "not null" it is automatically generated with the "Required" attribute. Now I think it's better allowing nulls for VARCHAR columns.
I am wondering if there is a way to map a String on the Java side to an ObjectId for manual references, and vice versa. For example, in:
User = [{ _id: ObjectId('123'),
          pics_id: ObjectId('123'), ...
        }, ... ]
pics_id is a manual reference to another collection. The following code:
class User {
    @Id String id;
    @Field("pics_id") String picId;
}
stores the pics_id as String instead of ObjectId.
Is there any way to make this happen without using the ObjectId class instead of String? IMHO, using ObjectId in the Java code would make the code look a bit strange, as some ids are Strings (such as fields annotated with @Id) and some are ObjectIds. Thank you.
With the upcoming Spring Data MongoDB 2.2 release it is possible to define the desired target type via the @Field annotation.
The type information is passed down to the conversion subsystem so that e.g. a plain String can be stored as Code or ObjectId.
class User {
    @Id String id;
    @Field(targetType = FieldType.OBJECT_ID)
    String picId;
}
Please have a look at the documentation of 2.2.0.RC1 for more details.
I have an EF POCO class property with DataAnnotations. They include the FK, mandatory, and max-length constraints.
[Required(ErrorMessage = "Company name cannot be empty")]
[StringLength(128, ErrorMessage = "The CompanyName should be 128 characters or less.")]
[Index(IsUnique = true)]
public string CompanyName { get; set; }
I am trying to move all these into EntityTypeConfigurations and am struggling to move the ErrorMessages.
Can anyone give me a pointer on how to get this done?
As you can read here, constraints configured by fluent mappings will be evaluated only in the context. They don't trickle through to the UI, as data annotations do (when used with the correct framework). So the EF team figured it wouldn't make sense to craft a user-friendly error message here. The validation will just throw a standard DbValidationError saying something like:
The field Name must be a string or array type with a maximum length of '128'
So you need the annotations if you want your own custom messages.
In a Grails service I am trying to save an object to MongoDB:
Cover saveCover = new Cover()
saveCover.id = url
saveCover.url = url
saveCover.name = name
saveCover.sku = sku
saveCover.price = price
saveCover.save()
Cover domain looks like this:
class Cover {
    String id
    String name
    String url
    String sku
    String price
}
So I want to have a custom id based on the url, but during the save I get this error:
Could not commit Datastore transaction; nested exception is
org.grails.datastore.mapping.core.OptimisticLockingException: The
instance was updated by another user while you were editing
But if I don't use the setters and just pass all the values in the constructor, the exception goes away. Why?
As reported in the documentation here:
Note that if you manually assign an identifier, then you will need to use the insert method instead of the save method, otherwise GORM can't work out whether you are trying to achieve an insert or an update
So you need to use the insert method instead of save when the id generator is 'assigned':
cover.insert(failOnError: true)
If you do not define the mapping like this:
static mapping = {
    id generator: 'assigned'
}
and use the insert method, you'll get an auto-generated ObjectId:
"_id" : "5496e904e4b03b155725ebdb"
This exception occurs when you assign an id to a new model and try to save it because GORM thinks it should be doing an update.
Why this exception occurs
When I ran into this issue I was using 1.3.0 of the grails-mongo plugin. That uses 1.1.9 of the grails datastore core code. I noticed that the exception gets generated on line 847(ish) of NativeEntryEntityPersister. This code updates an existing domain object in the db.
Above that, on line 790, isUpdate is computed, which determines whether the operation is an update. isInsert is false (it is only true when an insert is forced), and readObjectIdentifier returns the id that has been assigned to the object, so isUpdate ends up evaluating to true.
Fixing the exception
Thanks to the && !isInsert on line 791, if you force an insert the insert code gets called and, sure enough, the exception goes away. However, when I did this the assigned id wasn't saved; a generated object id was used instead. I saw that the fix for this is on line 803, where the code checks whether the generator is set to "assigned".
To fix that you can add the following mapping.
class Cover {
    String id
    String name
    String url
    String sku
    String price

    static mapping = {
        id generator: 'assigned'
    }
}
A side effect of this is that you will always need to assign an id for new Cover domain objects.
So I know that I can write an interface like the one below and Spring Data will automatically generate the necessary database access code for me. Now what I'd like to do is add a query method that counts the number of entities matching a set of criteria.
public interface EventRegistrationRepository extends JpaRepository<EventRegistration, String> {
    List<EventRegistration> findByUser_EmailAddress(String email);
    int countByEvent_Code(String eventCode);
}
As of now the countBy method causes this error:
Caused by: org.springframework.data.mapping.PropertyReferenceException: No property count found for type com.brazencareerist.brazenconnect.model.relational.EventRegistration
What's the proper syntax for what I'm trying to do here?
This works as expected as of the just released Spring Data JPA 1.4.0.M1.
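Once you are on a release that supports count projections, the derived method can be called like any other repository method. A minimal sketch, assuming constructor injection into a service (the service class name and the capacity check below are only illustrative):

import org.springframework.stereotype.Service;

@Service
public class EventRegistrationService {

    private final EventRegistrationRepository registrations;

    public EventRegistrationService(EventRegistrationRepository registrations) {
        this.registrations = registrations;
    }

    // countByEvent_Code is derived from the method name: it counts
    // registrations whose associated Event has the given code
    public boolean isEventFull(String eventCode, int capacity) {
        return registrations.countByEvent_Code(eventCode) >= capacity;
    }
}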