How to create a GORM model from gRPC protobuf [duplicate] - postgresql

I am trying to figure out how to integrate the gorm.Model fields (deleted_at, created_at, id, etc.) into my proto3 definitions. However, I can't find a datetime type for proto3. I tried looking for documentation on how to serialize the gorm fields to strings (since proto3 handles strings), but I haven't found anything.
Has anyone been able to successfully use the gorm model fields in their proto definitions? I'm using go-micro's plugin to generate *pb.go files.
Here's my current message definition, which doesn't work. It seems like empty strings are being stored in the database for deleted_at, since querying for deleted_at IS NULL returns nothing from the Postgres database.
message DatabaseConfig {
  string address = 1;
  int32 port = 2;
  string databaseName = 3;
  string username = 4;
  string password = 5;
  string databaseType = 6;
  string queryStatement = 7;
  int32 id = 8;
  string createdAt = 9;
  string updatedAt = 10;
  string deletedAt = 11;
}
UPDATE:
I've updated my proto definition to the following, but GORM still isn't properly using the Id, CreatedAt, UpdatedAt, and DeletedAt fields.
syntax = "proto3";

package go.micro.srv.importer;

import "google/protobuf/timestamp.proto";
import "github.com/gogo/protobuf/gogoproto/gogo.proto";

service ImporterService {
  rpc CreateDatabaseConfig(DatabaseConfig) returns (Response) {}
  rpc RetrieveDatabaseConfig(GetRequest) returns (Response) {}
  rpc UpdateDatabaseConfig(DatabaseConfig) returns (Response) {}
  rpc DeleteDatabaseConfig(DatabaseConfig) returns (Response) {}
}

message GetRequest {}

message DatabaseConfig {
  string address = 1;
  int32 port = 2;
  string databaseName = 3;
  string username = 4;
  string password = 5;
  string databaseType = 6;
  string queryStatement = 7;
  int32 id = 8;
  google.protobuf.Timestamp createdAt = 9 [(gogoproto.stdtime) = true];
  google.protobuf.Timestamp updatedAt = 10 [(gogoproto.stdtime) = true];
  google.protobuf.Timestamp deletedAt = 11 [(gogoproto.stdtime) = true];
}

message Response {
  bool created = 1;
  DatabaseConfig database_config = 2;
  repeated DatabaseConfig databaseConfigs = 3;
}
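One way to sidestep the mismatch between the generated struct and what GORM expects is to keep the persistence model separate from the generated message and convert at the boundary. A minimal sketch in Go, assuming GORM v1 (github.com/jinzhu/gorm) and that the stdtime option yields *time.Time fields; DatabaseConfigModel and toModel are illustrative names, not generated code:

import "github.com/jinzhu/gorm"

// DatabaseConfigModel is what GORM persists. gorm.Model supplies
// ID, CreatedAt, UpdatedAt and a nullable DeletedAt for soft deletes.
type DatabaseConfigModel struct {
	gorm.Model
	Address        string
	Port           int32
	DatabaseName   string
	Username       string
	Password       string
	DatabaseType   string
	QueryStatement string
}

// toModel copies the RPC message into the persistence model. With
// (gogoproto.stdtime) = true the timestamp fields are *time.Time,
// so a nil DeletedAt cleanly means "not deleted".
func toModel(pb *DatabaseConfig) *DatabaseConfigModel {
	m := &DatabaseConfigModel{
		Address:        pb.Address,
		Port:           pb.Port,
		DatabaseName:   pb.DatabaseName,
		Username:       pb.Username,
		Password:       pb.Password,
		DatabaseType:   pb.DatabaseType,
		QueryStatement: pb.QueryStatement,
	}
	m.ID = uint(pb.Id)
	if pb.CreatedAt != nil {
		m.CreatedAt = *pb.CreatedAt
	}
	if pb.UpdatedAt != nil {
		m.UpdatedAt = *pb.UpdatedAt
	}
	m.DeletedAt = pb.DeletedAt
	return m
}

GORM then manages the id and timestamp columns itself, and the gRPC layer only converts at the edges.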

The protoc-gen-gorm project did not work for me. It looks like there is some blending of proto2 and proto3 happening, and ultimately I was unable to get it to work.
My solution was to create a script that does post-processing after I generate the Go files from protobuf.
If this were my proto file, profile/profile.proto:

message Profile {
  uint64 id = 1;
  string name = 2;
  bool active = 3;
  // ...
}
The standard protoc command turns this into profile/profile.pb.go:
// ...
type Profile struct {
	state         protoimpl.MessageState
	sizeCache     protoimpl.SizeCache
	unknownFields protoimpl.UnknownFields

	Id     uint64 `protobuf:"varint,1,opt,name=id,proto3" json:"id,omitempty"`
	Name   string `protobuf:"bytes,2,opt,name=name,proto3" json:"name,omitempty"`
	Active bool   `protobuf:"varint,3,opt,name=active,proto3" json:"active,omitempty"`
}
// ...
I use this script, gorm.sh (note that primary key is set with its own gorm tag, while column types use type:):

#!/bin/bash
# append a gorm tag to the struct field whose json tag matches $1
g () {
  sed "s/json:\"$1,omitempty\"/json:\"$1,omitempty\" gorm:\"$2\"/"
}

cat "$1" \
  | g "id" "primary_key" \
  | g "name" "type:varchar(100)" \
  > "$1.tmp" && mv "$1"{.tmp,}
I invoke it on the generated file with ./gorm.sh profile/profile.pb.go, and the resulting profile/profile.pb.go is:
// ...
type Profile struct {
	state         protoimpl.MessageState
	sizeCache     protoimpl.SizeCache
	unknownFields protoimpl.UnknownFields

	Id     uint64 `protobuf:"varint,1,opt,name=id,proto3" json:"id,omitempty" gorm:"primary_key"`
	Name   string `protobuf:"bytes,2,opt,name=name,proto3" json:"name,omitempty" gorm:"type:varchar(100)"`
	Active bool   `protobuf:"varint,3,opt,name=active,proto3" json:"active,omitempty"`
}
// ...

Try to use protoc-gen-gorm. It will create an additional .pb.gorm.go file alongside the generated code.
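For reference, protoc-gen-gorm drives the generation from annotations in the .proto file itself; a rough sketch (option names taken from that project's documentation, so treat this as illustrative):

import "gorm/options/gorm.proto";

message Profile {
  // mark the message as ormable so an ORM-side type is generated
  option (gorm.opts).ormable = true;

  uint64 id = 1;
  string name = 2;
}

The generated .pb.gorm.go then contains the ORM-side type plus converters between it and the protobuf struct.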

It might be an option to use something like https://github.com/favadi/protoc-go-inject-tag to generate the tags automatically (I am still looking into this myself).
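With that tool the tags live as comments in the .proto and get injected into the generated Go file afterwards. A sketch of the idea (comment syntax as per that project's README; double-check against the current version):

message Profile {
  // @inject_tag: gorm:"primary_key"
  uint64 id = 1;
  // @inject_tag: gorm:"type:varchar(100)"
  string name = 2;
}

followed by running something like protoc-go-inject-tag -input=./profile/profile.pb.go after protoc.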

Related

MapStruct protobuf List to Pojo Mapping

As part of a gRPC API, I am trying to map the proto-generated classes to POJOs. This is the .proto file:
message AccountModelProto {
  repeated VerificationModelProto verification = 1;
}

message VerificationModelProto {
  string status = 1;
  string comment = 2;
  string verificationType = 3;
  repeated VerificationAttributeModelProto verificationAttributes = 4;
}

message VerificationAttributeModelProto {
  string type = 1;
  string label = 2;
  bool attributeStatus = 3;
}
The mapper for the above code is shown below. Going by the examples MapStruct provides, I shouldn't need to declare an explicit mapping from the verification list to the repeated proto field, but I am getting this compiler error:
error: Can't map property "Collection verification" to "VerificationModelProto verificationList". Consider to declare/implement a mapping method: "VerificationModelProto map(Collection value)".
@Mapping(source = "verification", target = "verificationList", qualifiedByName = "verificationModelToVerificationProtoMapping")
AccountModelProto map(AccountModel accountModel);

@Named("verificationModelToVerificationProtoMapping")
default VerificationModelProto map(VerificationModel verificationModel) {
    VerificationModelProto.Builder builder = VerificationModelProto.newBuilder()
            .setComment(verificationModel.getComment())
            .setStatus(verificationModel.getStatus().toString())
            .setVerificationType(verificationModel.getVerificationType());
    for (VerificationAttributeModel verificationAttributeModel : verificationModel.getVerificationAttributes()) {
        builder.addVerificationAttributes(getVerificationAttributeBuilder(verificationAttributeModel));
    }
    return builder.build();
}

default VerificationAttributeModelProto getVerificationAttributeBuilder(VerificationAttributeModel verificationAttributeModel) {
    VerificationAttributeModelProto.Builder builder = VerificationAttributeModelProto.newBuilder()
            .setAttributeStatus(verificationAttributeModel.getAttributeStatus())
            .setType(verificationAttributeModel.getType())
            .setLabel(verificationAttributeModel.getLabel());
    return builder.build();
}
How do I get past this? I have already set the collection mapping strategy to CollectionMappingStrategy.ADDER_PREFERRED.

Basic where clause with InfluxDB and Grafana

I'm struggling with a really simple task: making a SELECT property_a WHERE property_b = 'my_value' query against InfluxDB.
I have the following data struct:
type RegionsJsonData struct {
	Data                      string  `json:"data"`
	Stato                     string  `json:"stato"`
	CodiceRegione             int     `json:"codice_regione"`
	DenominazioneRegione      string  `json:"denominazione_regione"`
	Lat                       float64 `json:"lat"`
	Long                      float64 `json:"long"`
	RicoveratiConSintomi      int     `json:"ricoverati_con_sintomi"`
	TerapiaIntensiva          int     `json:"terapia_intensiva"`
	TotaleOspedalizzati       int     `json:"totale_ospedalizzati"`
	IsolamentoDomiciliare     int     `json:"isolamento_domiciliare"`
	TotaleAttualmentePositivi int     `json:"totale_attualmente_positivi"`
	NuoviAttualmentePositivi  int     `json:"nuovi_attualmente_positivi"`
	DimessiGuariti            int     `json:"dimessi_guariti"`
	Deceduti                  int     `json:"deceduti"`
	TotaleCasi                int     `json:"totale_casi"`
	Tamponi                   int     `json:"tamponi"`
	Datetime                  time.Time
}
This structure is populated with some data; then I insert the struct into InfluxDB with the following statement:
var m map[string]interface{} = make(map[string]interface{})
m["codice_regione"] = provinceData[i].CodiceRegione
m["denominazione_regione"] = provinceData[i].DenominazioneRegione
m["lat"] = provinceData[i].Lat
m["long"] = provinceData[i].Long
m["ricoverati_con_sintomi"] = provinceData[i].RicoveratiConSintomi
m["terapia_intensiva"] = provinceData[i].TerapiaIntensiva
m["totale_ospedalizzati"] = provinceData[i].TotaleOspedalizzati
m["isolamento_domiciliare"] = provinceData[i].IsolamentoDomiciliare
m["totale_attualmente_positivi"] = provinceData[i].TotaleAttualmentePositivi
m["nuovi_attualmente_positivi"] = provinceData[i].NuoviAttualmentePositivi
m["dimessi_guariti"] = provinceData[i].DimessiGuariti
m["deceduti"] = provinceData[i].Deceduti
m["totale_casi"] = provinceData[i].TotaleCasi
m["tamponi"] = provinceData[i].Tamponi
pts[i] = client.Point{
	Measurement: "regions_data",
	Tags:        nil,
	Time:        provinceData[i].Datetime,
	Fields:      m,
}
The data is inserted into InfluxDB, and I'm able to display some graphs using Grafana.
However, I need to create a graph for every "denominazione_regione" value, so I built a query for that in Grafana, but unfortunately no data is displayed. Am I missing something?
How do I write a WHERE clause from Grafana using InfluxDB?
As pointed out by @JanGaraj, if you want to query by one of these values, you need to save it as a tag, like this:
var m map[string]interface{} = make(map[string]interface{})
m["total_deaths"] = worldData[i].TotalDeaths
m["total_cases"] = worldData[i].TotalCases
m["new_deaths"] = worldData[i].NewDeaths
m["new_cases"] = worldData[i].NewCases
pts[i] = client.Point{
	Measurement: "all_world_data",
	Tags:        map[string]string{"nation": worldData[i].State},
	Time:        worldData[i].Date,
	Fields:      m,
}
Then you can run a query like the following:
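For example, with the same v1 Go client as above, where c is the connected client (the database name "covid" and the tag value "Italy" are placeholders for illustration):

// query the fields for a single nation via the tag we stored above
q := client.Query{
	Command:  `SELECT "total_cases", "total_deaths" FROM "all_world_data" WHERE "nation" = 'Italy'`,
	Database: "covid",
}
if response, err := c.Query(q); err == nil && response.Error() == nil {
	fmt.Println(response.Results)
}

The same WHERE "nation" = 'Italy' condition can be entered in Grafana's query editor.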

Build dynamic LINQ queries from a string - Use Reflection?

I have some Word templates (maybe thousands). Each template has merge fields which will be filled from the database. I don't want to write separate code for every template and then build and deploy the application whenever a template is changed or a field is added!
Instead, I'm trying to define all merge fields in a separate XML file, and for each field I want to write the "query" which will be called when needed. For example:
mergefield1 will call the query "Case.Parties.FirstOrDefault.NameEn"
mergefield2 will call the query "Case.CaseNumber"
mergefield3 will call the query "Case.Documents.FirstOrDefault.DocumentContent.DocumentType"
and so on.
So, for a particular template I scan its merge fields, and for each merge field I take its "query definition" and run that request against the database using Entity Framework and LINQ. It already works for queries like "TimeSlots.FirstOrDefault.StartDateTime" or "Case.CaseNumber".
This will be an engine which generates Word documents and fills in the merge fields defined in the XML; it should work for any new template or merge field.
For now, I have a working version using reflection:
public string GetColumnValueByObjectByName(Expression<Func<TEntity, bool>> filter = null, string objectName = "", string dllName = "", string objectID = "", string propertyName = "")
{
    string objectDllName = objectName + ", " + dllName;
    Type type = Type.GetType(objectDllName);
    Guid oID = new Guid(objectID);
    dynamic entity = context.Set(type).Find(oID); // get the object by Type and ObjectID
    string value = ""; // the value which will be filled with data from the database
    // all LINQ method names, saved as a list of strings
    IEnumerable<string> linqMethods = typeof(System.Linq.Enumerable).GetMethods(BindingFlags.Static | BindingFlags.Public).Select(s => s.Name).ToList();
    if (propertyName.Contains('.'))
    {
        string[] properties = propertyName.Split('.');
        dynamic object1 = entity;
        IEnumerable<dynamic> child = new List<dynamic>();
        for (int i = 0; i < properties.Length; i++)
        {
            if (i < properties.Length - 1 && linqMethods.Contains(properties[i + 1]))
            {
                // the next segment is a LINQ method, so fetch the collection
                child = type.GetProperty(properties[i]).GetValue(object1, null);
            }
            else if (linqMethods.Contains(properties[i]))
            {
                // for now this works only with FirstOrDefault; later it can be
                // extended to handle ToList and other LINQ methods
                object1 = child.Cast<object>().FirstOrDefault();
                type = object1.GetType();
            }
            else
            {
                object1 = type.GetProperty(properties[i]).GetValue(object1, null);
                type = object1.GetType();
            }
        }
        value = object1.ToString();
    }
    return value;
}
I'm not sure if this is the best approach. Does anyone have a better suggestion, or has someone already done something like this?
In short: the idea is to build generic LINQ queries against the database from a string like "Case.Parties.FirstOrDefault.NameEn".
Your approach is very good. I have no doubt that it already works.
Another approach is using an expression tree, as @Egorikas suggested.
Disclaimer: I'm the owner of the project Eval-Expression.NET
In short, this library allows you to evaluate almost any C# code at runtime (What you exactly want to do).
I would suggest you use my library instead. To keep the code:
More readable
Easier to support
Add some flexibility
Example
public string GetColumnValueByObjectByName(Expression<Func<TEntity, bool>> filter = null, string objectName = "", string dllName = "", string objectID = "", string propertyName = "")
{
    string objectDllName = objectName + ", " + dllName;
    Type type = Type.GetType(objectDllName);
    Guid oID = new Guid(objectID);
    object entity = context.Set(type).Find(oID); // get the object by Type and ObjectID
    var value = Eval.Execute("x." + propertyName, new { x = entity });
    return value.ToString();
}
The library also allows you to use dynamic strings with IQueryable.
Wiki: LINQ-Dynamic

How can I deserialize an ADO.NET DataTable that contains null values using Json.NET?

I am attempting to use Newtonsoft.Json.Net35 Version 4.0.2.0 to deserialize an ADO.NET DataTable that contains null values. Serialization works fine:
[Test]
public void SerializeDataTableWithNull()
{
    var table = new DataTable();
    table.Columns.Add("item");
    table.Columns.Add("price", typeof(double));
    table.Rows.Add("shirt", 49.99);
    table.Rows.Add("pants", 54.99);
    table.Rows.Add("shoes"); // no price
    var json = JsonConvert.SerializeObject(table);
    Assert.AreEqual(@"["
        + @"{""item"":""shirt"",""price"":49.99},"
        + @"{""item"":""pants"",""price"":54.99},"
        + @"{""item"":""shoes"",""price"":null}]", json);
}
Deserialization works fine if values are missing:
[Test]
public void DeserializeDataTableWithImplicitNull()
{
    const string json = @"["
        + @"{""item"":""shirt"",""price"":49.99},"
        + @"{""item"":""pants"",""price"":54.99},"
        + @"{""item"":""shoes""}]";
    var table = JsonConvert.DeserializeObject<DataTable>(json);
    Assert.AreEqual("shirt", table.Rows[0]["item"]);
    Assert.AreEqual("pants", table.Rows[1]["item"]);
    Assert.AreEqual("shoes", table.Rows[2]["item"]);
    Assert.AreEqual(49.99, (double)table.Rows[0]["price"], 0.01);
    Assert.AreEqual(54.99, (double)table.Rows[1]["price"], 0.01);
    Assert.IsInstanceOf(typeof(System.DBNull), table.Rows[2]["price"]);
}
If, however, values are explicitly null:
[Test]
public void DeserializeDataTableWithExplicitNull()
{
    const string json = @"["
        + @"{""item"":""shirt"",""price"":49.99},"
        + @"{""item"":""pants"",""price"":54.99},"
        + @"{""item"":""shoes"",""price"":null}]";
    var table = JsonConvert.DeserializeObject<DataTable>(json);
    Assert.AreEqual("shirt", table.Rows[0]["item"]);
    Assert.AreEqual("pants", table.Rows[1]["item"]);
    Assert.AreEqual("shoes", table.Rows[2]["item"]);
    Assert.AreEqual(49.99, (double)table.Rows[0]["price"], 0.01);
    Assert.AreEqual(54.99, (double)table.Rows[1]["price"], 0.01);
    Assert.IsInstanceOf(typeof(System.DBNull), table.Rows[2]["price"]);
}
DeserializeObject throws "System.ArgumentException : Cannot set Column 'price' to be null. Please use DBNull instead." The following workaround works for my particular JSON:
var regex = new Regex(@",?""[_\w]+"":null");
var nullless = regex.Replace(json, string.Empty);
var table = JsonConvert.DeserializeObject<DataTable>(nullless);
but like all regular expression-based kludges this is clearly brittle.
Finally, the questions:
Is this a bug?
Json.NET has many events that can be hooked. Is there a way to get notified when a null value is encountered and explicitly set the value to System.DBNull?
Thanks in advance,
Frank
It looks like this is a bug which is easily fixed by replacing
dr[columnName] = reader.Value
with
dr[columnName] = reader.Value ?? System.DBNull.Value
in Newtonsoft.Json.Converters.DataTableConverter. I have entered an issue in the tracker.