Using LEFT JOIN on Golang and mapping to a struct - postgresql

I have the following structs in Go:
type Struct1 struct {
    ID         int64 `db:"id"`
    InternalID int64 `db:"internal_id"`
    Structs2   []Struct2
}
type Struct2 struct {
    ID         int64  `db:"id"`
    InternalID int64  `db:"internal_id"`
    SomeText   string `db:"some_text"`
}
The relation between these two is: there can only be one Struct1, connected to N Struct2 rows by internal_id. So I am running this query:
SELECT *
FROM struct1 st1
LEFT JOIN struct2 st2
    ON st1.internal_id = st2.internal_id
LIMIT 10
OFFSET (1 - 1) * 10;
By executing this query from Go, I want to know if I can create a slice of Struct1, with the matching Struct2 values correctly mapped into each element. If it is possible, how can I do it? Thanks in advance.
I'm using Postgres and sqlx.

This is entirely dependent on the library/client you're using to connect to the database. However, I have never seen a library support what you're trying to do, so I'll provide a custom implementation you can adapt. Full disclosure: I did not test this, but I hope it gives you the general idea, and anyone should feel free to add edits.
package main

type Struct1 struct {
    ID         int64 `db:"id"`
    InternalID int64 `db:"internal_id"`
    Structs2   []Struct2
}

type Struct2 struct {
    ID         int64  `db:"id"`
    InternalID int64  `db:"internal_id"`
    SomeText   string `db:"some_text"`
}

type Row struct {
    Struct1
    Struct2
}

func main() {
    var rows []*Row
    // Decode the response into the rows variable using whatever SQL client you use.

    // We'll group the struct1s into a map, then iterate over all the rows,
    // appending each row's struct2 to the struct1 it belongs to. This
    // effectively turns the flat rows into one-to-many relationships
    // between struct1 and struct2.
    mapped := map[int64]*Struct1{}
    for _, r := range rows {
        // The key of the map is the internal ID of struct1 (that's the ONE
        // in the one-to-many relationship).
        if _, ok := mapped[r.Struct1.InternalID]; !ok {
            // Initialize the key if this is the first row with struct1's
            // internal ID.
            mapped[r.Struct1.InternalID] = &r.Struct1
        }
        // Append the struct2 (the MANY in the one-to-many relationship) to
        // the struct1's slice of struct2s.
        mapped[r.Struct1.InternalID].Structs2 = append(mapped[r.Struct1.InternalID].Structs2, r.Struct2)
    }

    // Then convert the map to a slice if needed.
    results := make([]*Struct1, 0, len(mapped))
    for _, v := range mapped {
        results = append(results, v)
    }
    _ = results
}
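Since the question mentions sqlx, here is a minimal, untested sketch of how the rows slice above could be filled before the grouping loop. The connection string and the explicit column list are assumptions, and because of the LEFT JOIN the struct2 columns can be NULL, so in practice you would scan them into sql.Null* types or COALESCE them in the query. It assumes the imports github.com/jmoiron/sqlx, the github.com/lib/pq driver, and log:
    db, err := sqlx.Connect("postgres", "postgres://user:pass@localhost/mydb?sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }

    // Select the columns explicitly so the Scan order is unambiguous.
    sqlRows, err := db.Queryx(`
        SELECT st1.id, st1.internal_id, st2.id, st2.internal_id, st2.some_text
        FROM struct1 st1
        LEFT JOIN struct2 st2 ON st1.internal_id = st2.internal_id
        LIMIT 10 OFFSET 0`)
    if err != nil {
        log.Fatal(err)
    }
    defer sqlRows.Close()

    var rows []*Row
    for sqlRows.Next() {
        r := &Row{}
        if err := sqlRows.Scan(
            &r.Struct1.ID, &r.Struct1.InternalID,
            &r.Struct2.ID, &r.Struct2.InternalID, &r.Struct2.SomeText,
        ); err != nil {
            log.Fatal(err)
        }
        rows = append(rows, r)
    }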

Related

Unable to INSERT/UPDATE data with custom type in postgresql using golang

I am trying to insert/update data in PostgreSQL using jackc/pgx into a table that has a column of a custom type. This is the custom type written as a Go struct:
// Added this struct as a type in PSQL
type DayPriceModel struct {
    Date  time.Time `json:"date"`
    High  float32   `json:"high"`
    Low   float32   `json:"low"`
    Open  float32   `json:"open"`
    Close float32   `json:"close"`
}

// The 2 columns in my table
type SecuritiesPriceHistoryModel struct {
    Symbol  string          `json:"symbol"`
    History []DayPriceModel `json:"history"`
}
I have written this code for inserting data:
func insertToDB(data SecuritiesPriceHistoryModel) {
    DBConnection := config.DBConnection
    _, err := DBConnection.Exec(context.Background(),
        "INSERT INTO equity.securities_price_history (symbol) VALUES ($1)",
        data.Symbol, data.History)
}
But I am unable to insert the custom data type (DayPriceModel).
I am getting an error
Failed to encode args[1]: unable to encode
The error is very long and mostly shows my data so I have picked out the main part.
How do I INSERT data into PSQL with such custom data types?
PS: An implementation using jackc/pgx is preferred, but database/sql would do just fine.
I'm not familiar enough with pgx to know how to set up support for arrays of composite types. But, as already mentioned in the comments, you can implement the driver.Valuer interface and have that implementation produce a valid literal. This also applies when you are storing slices of structs: you just need to declare a named slice type, have it implement the valuer, and then use it instead of the unnamed slice.
// named slice type
type DayPriceModelList []DayPriceModel

// The syntax for an array-of-composites literal looks like this:
// '{"(foo,123)", "(bar,987)"}'. So the implementation below must
// return the slice contents in that format.
func (l DayPriceModelList) Value() (driver.Value, error) {
    // nil slice? produce NULL
    if l == nil {
        return nil, nil
    }
    // empty slice? produce empty array
    if len(l) == 0 {
        return []byte{'{', '}'}, nil
    }

    out := []byte{'{'}
    for _, v := range l {
        // This assumes that the date field in the pg composite type accepts
        // the default time.Time format. If that is not the case then you
        // simply provide v.Date in a format which the composite's field
        // understands, e.g. v.Date.Format("<layout that the pg composite understands>")
        x := fmt.Sprintf(`"(%s,%f,%f,%f,%f)",`,
            v.Date,
            v.High,
            v.Low,
            v.Open,
            v.Close)
        out = append(out, x...)
    }
    out[len(out)-1] = '}' // replace last "," with "}"
    return out, nil
}
And when you are writing the insert query, make sure to add an explicit cast right after the placeholder, e.g.
type SecuritiesPriceHistoryModel struct {
    Symbol  string            `json:"symbol"`
    History DayPriceModelList `json:"history"` // use the named slice type
}

// ...

_, err := db.Exec(ctx, `INSERT INTO equity.securities_price_history (
    symbol
    , history
) VALUES (
    $1
    , $2::my_composite_type[])
`, data.Symbol, data.History)
// replace my_composite_type with the name of the composite type in the database
NOTE#1: Depending on the exact definition of your composite type in Postgres, the above example may or may not work; if it doesn't, simply adjust the code to make it work.
NOTE#2: The general approach in the example above is valid, however it is likely not very efficient. If you need the code to be performant do not use the example verbatim.
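As a quick sanity check (not part of the original answer), you can call Value() directly and inspect the literal it produces before wiring it into the insert; the sample values below are made up, and the field order must match your composite type definition in Postgres:
    l := DayPriceModelList{
        {Date: time.Date(2023, 1, 2, 0, 0, 0, 0, time.UTC), High: 10, Low: 8, Open: 9, Close: 9.5},
    }
    v, err := l.Value()
    if err != nil {
        log.Fatal(err)
    }
    // Prints something like:
    // {"(2023-01-02 00:00:00 +0000 UTC,10.000000,8.000000,9.000000,9.500000)"}
    fmt.Println(string(v.([]byte)))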

Insert Array of Structs into DB using gorm

I need to insert into a Postgresql database using gorm. But the struct in question has fields containing arrays of custom types:
type VideoFile struct {
    UID      string
    FileName string
    Format   string
    Path     string
}

type myStruct struct {
    ID                string
    UserId            uint64
    UserType          int
    FaceVideoFiles    []VideoFile
    MeetingVideoFiles []VideoFile
}

func (r *videoRepo) CreateRecord(input *myStruct) error {
    tx := r.base.GetDB().Begin()
    err := tx.Create(input).Error
    if err != nil {
        tx.Rollback()
        return err
    }
    tx.Commit()
    return nil
}
The ID field is the primary key in the database. I am not sure which tags to add to the struct fields, so I have left them empty here (e.g. gorm:"primaryKey" json:"id", etc.). If FaceVideoFiles is not an array of structs, things seem to work, but that is not the case with an array of structs. What should I do so that I can insert these arrays of structs (not individual structs) into my database?
You have to use a has-many (one-to-many) association. For more information, please follow the documentation: https://gorm.io/docs/has_many.html#Has-Many
type Dog struct {
    ID   int
    Name string
    Toys []Toy `gorm:"polymorphic:Owner;"`
}

type Toy struct {
    ID        int
    Name      string
    OwnerID   int
    OwnerType string
}

db.Create(&Dog{Name: "dog1", Toys: []Toy{{Name: "toy1"}, {Name: "toy2"}}})
// INSERT INTO `dogs` (`name`) VALUES ("dog1")
// INSERT INTO `toys` (`name`,`owner_id`,`owner_type`) VALUES ("toy1","1","dogs"), ("toy2","1","dogs")
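Applied to the structs from the question, a minimal, untested sketch could look like the one below. It only wires up FaceVideoFiles, assumes an opened *gorm.DB named db, and the OwnerID/OwnerType columns on VideoFile (mirroring the Toy example above) are columns you would have to add to your schema:
    type VideoFile struct {
        ID        uint `gorm:"primaryKey"`
        UID       string
        FileName  string
        Format    string
        Path      string
        OwnerID   string // filled by gorm with myStruct.ID
        OwnerType string // filled by gorm, e.g. "my_structs"
    }

    type myStruct struct {
        ID             string `gorm:"primaryKey"`
        UserId         uint64
        UserType       int
        FaceVideoFiles []VideoFile `gorm:"polymorphic:Owner;"`
    }

    // Creating the parent also inserts the associated video files,
    // populating owner_id/owner_type on each row.
    err := db.Create(&myStruct{
        ID:             "rec-1",
        FaceVideoFiles: []VideoFile{{UID: "v1", FileName: "a.mp4"}},
    }).Error
Keeping two slices of the same VideoFile type (FaceVideoFiles and MeetingVideoFiles) in one table would additionally need either two separate types/tables or a discriminator column, which the documentation example above does not cover.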

How to range over bson.D primitive.A slice mongo-go-driver?

My data structure is
{
    _id: ObjectID,
    ...
    fields: [
        { name: "Aryan" },
        { books: [ 1, 2, 3 ] },
    ]
}
In our application a user can define his own fields data structure, but always as a key-value structure. So we had no way of knowing the structure of the data.
So in our document struct we had
type Document struct {
    Fields map[string]interface{}
}
As the second parameter returned by mongo was primitive.A ([]interface{} under the hood), each individual item could have been an array, a map, anything, so we couldn't simply range over it.
How can I get the individual values, like the book ids [1,2,3] or the name value "Aryan"?
After a couple of attempts at solving this, I was at a state where my current element itself was an interface{} ([1,2,3] in this case) and I couldn't get at the individual 1, 2, 3 values.
But I finally managed to solve it:
for k, val := range doc.Fields {
    v := reflect.ValueOf(val)
    switch v.Kind() {
    case reflect.Slice:
        // getting the individual ids
        for i := 0; i < v.Len(); i++ {
            fmt.Println(k, v.Index(i))
        }
    case reflect.String:
        fmt.Println(k, v.String())
    default:
        // handle other kinds as needed
    }
}
Note: v.Index(i) returns a reflect.Value. To get the underlying value back as a concrete type, use Interface() with a type assertion:
v.Index(i).Interface().(string)  // string
v.Index(i).Interface().(float64) // float64
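A reflection-free alternative (a sketch, not from the original post) is a plain type switch, since primitive.A is just []interface{}; this assumes the go.mongodb.org/mongo-driver/bson/primitive import:
    for k, val := range doc.Fields {
        switch v := val.(type) {
        case primitive.A: // e.g. the book ids
            for _, item := range v {
                fmt.Println(k, item)
            }
        case string: // e.g. the name
            fmt.Println(k, v)
        default:
            // handle other BSON types as needed
        }
    }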
Ideally you should avoid working with the interface{} type, since it's very error prone and the compiler can't help you. The idiomatic way is to define a struct for your model with BSON tags, like in this example:
type MyType struct {
    ID     primitive.ObjectID `bson:"_id,omitempty"`
    Fields []Field            `bson:"fields,omitempty"`
}

type Field struct {
    Name  string `bson:"name,omitempty"`
    Books []int  `bson:"books,omitempty"`
}
Field here is defined as the union of all possible fields, which again is not ideal, but at least the compiler can help you and developers know what to expect from the database document.
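For example, a minimal sketch (not from the original answer) of reading one document into that typed model; the collection handle, context and id are assumptions, and it relies on the go.mongodb.org/mongo-driver/mongo, .../bson and .../bson/primitive imports:
    // printFields decodes a single document into the typed struct and ranges
    // over its fields; no reflection or interface{} juggling required.
    func printFields(ctx context.Context, coll *mongo.Collection, id primitive.ObjectID) error {
        var doc MyType
        if err := coll.FindOne(ctx, bson.M{"_id": id}).Decode(&doc); err != nil {
            return err
        }
        for _, f := range doc.Fields {
            fmt.Println(f.Name, f.Books)
        }
        return nil
    }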

Map Mongo _id in two different struct fields in Golang

I am working on a project that uses a combination of Go and MongoDB. I am stuck at a place where I have a struct like:
type Booking struct {
    // booking fields
    Id             int `json:"_id,omitempty" bson:"_id,omitempty"`
    Uid            int `json:"uid,omitempty" bson:"uid,omitempty"`
    IndustryId     int `json:"industry_id,omitempty" bson:"industry_id,omitempty"`
    LocationId     int `json:"location_id,omitempty" bson:"location_id,omitempty"`
    BaseLocationId int `json:"base_location_id,omitempty" bson:"base_location_id,omitempty"`
}
In this struct, the field Id is of int type. But as we know, MongoDB's default _id is of ObjectId type. Sometimes the system generates the default MongoDB ObjectId in the Id field.
To overcome this I have modified the struct like this:
type Booking struct {
    // booking fields
    Id             int           `json:"_id,omitempty" bson:"_id,omitempty"`
    BsonId         bson.ObjectId `json:"bson_id" bson:"_id,omitempty"`
    Uid            int           `json:"uid,omitempty" bson:"uid,omitempty"`
    IndustryId     int           `json:"industry_id,omitempty" bson:"industry_id,omitempty"`
    LocationId     int           `json:"location_id,omitempty" bson:"location_id,omitempty"`
    BaseLocationId int           `json:"base_location_id,omitempty" bson:"base_location_id,omitempty"`
}
In the above struct, I have mapped the same _id field to two different struct fields: Id (type int) and BsonId (type bson.ObjectId). I want an id of integer type to be mapped to Id, and otherwise to BsonId.
But this is giving the following error:
Duplicated key '_id' in struct models.Booking
How can I implement this kind of thing with Go structs?
Update:
Here is the code I have written for custom marshaling/unmarshaling:
func (booking *Booking) SetBSON(raw bson.Raw) (err error) {
    type bsonBooking Booking
    if err = raw.Unmarshal((*bsonBooking)(booking)); err != nil {
        return
    }
    booking.BsonId, err = booking.Id
    return
}

func (booking *Booking) GetBSON() (interface{}, error) {
    booking.Id = Booking.BsonId
    type bsonBooking *Booking
    return bsonBooking(booking), nil
}
But this is giving a type mismatch error for the Id and BsonId fields. What should I do now?
In your original struct you used this tag for the Id field:
bson:"_id,omitempty"
This means if the value of the Id field is 0 (zero value for the int type), then that field will not be sent to MongoDB. But the _id property is mandatory in MongoDB, so in this case the MongoDB server will generate an ObjectId for it.
To overcome this, the easiest is to ensure Id is always non-zero; or, if 0 is a valid id, remove the omitempty option from the tag.
If your collection must allow ids of mixed type (int and ObjectId), then the easiest would be to define the Id field with a type of interface{} so it can accommodate both (actually all) types of key values:
Id interface{} `json:"_id,omitempty" bson:"_id,omitempty"`
Yes, working with this may be a little more cumbersome (e.g. if you explicitly need the id as int, you need to use type assertion), but this will solve your issue.
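For example, a small sketch (not part of the original answer) of reading such a mixed id back with a type switch, assuming booking was just decoded from MongoDB:
    switch id := booking.Id.(type) {
    case int:
        // depending on the stored size, an integer id may also come back
        // as int64, so you may want a case for that too
        fmt.Println("int id:", id)
    case bson.ObjectId:
        fmt.Println("ObjectId:", id.Hex())
    default:
        fmt.Println("unexpected id type")
    }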
If you do need 2 id fields, one with int type and another with ObjectId type, then your only option will be to implement custom BSON marshaling and unmarshaling. This basically means to implement the bson.Getter and / or bson.Setter interfaces (one method for each) on your struct type, in which you may do anything you like to populate your struct or assemble the data to be actually saved / inserted. For details and example, see Accessing MongoDB from Go.
Here's an example of how it may look using custom marshaling:
Leave out the Id and BsonId fields from marshaling (using the bson:"-" tag), and add a 3rd, "temporary" id field:
type Booking struct {
    Id     int           `bson:"-"`
    BsonId bson.ObjectId `bson:"-"`
    TempId interface{}   `bson:"_id"`
    // rest of your fields...
}
So whatever id you have in your MongoDB, it will end up in TempId, and only this id field will be sent and saved in MongoDB's _id property.
Use the GetBSON() method to set TempId from the other id fields (whichever is set) before your struct value gets saved / inserted, and use SetBSON() method to "copy" TempId's value to one of the other id fields based on its dynamic type after the document is retrieved from MongoDB:
func (b *Booking) GetBSON() (interface{}, error) {
    if b.Id != 0 {
        b.TempId = b.Id
    } else {
        b.TempId = b.BsonId
    }
    // Return a shadow type so the encoder doesn't call GetBSON() again recursively.
    type bookingAlias Booking
    return (*bookingAlias)(b), nil
}

func (b *Booking) SetBSON(raw bson.Raw) (err error) {
    // Unmarshal into a shadow type so the decoder doesn't call SetBSON() again recursively.
    type bookingAlias Booking
    if err = raw.Unmarshal((*bookingAlias)(b)); err != nil {
        return
    }
    if intId, ok := b.TempId.(int); ok {
        b.Id = intId
    } else if bsonId, ok := b.TempId.(bson.ObjectId); ok {
        b.BsonId = bsonId
    } else {
        err = errors.New("invalid or missing id")
    }
    return
}
Note: if you dislike the TempId field in your Booking struct, you may create a copy of Booking (e.g. tempBooking), and only add TempId into that, and use tempBooking for marshaling / unmarshaling. You may use embedding (tempBooking may embed Booking) so you can even avoid repetition.

Problems with generic Arrays in Swift [duplicate]

This question already has an answer here:
Swift: How to declare a 2d array (grid or matrix) in Swift to allow random insert
(1 answer)
Closed 7 years ago.
Since I'm pretty new to native programming in iOS, I need some help with the following situation. What I need is a class which holds an array or dictionary with a combination of 2 integers as the key (used for coordinates in a grid matrix) and an integer as the value. I found a solution in a tutorial somewhere, which looks like this:
class Array2D<T> {
    let columns: Int
    let rows: Int
    var array: Array<T?>

    init(columns: Int, rows: Int) {
        self.columns = columns
        self.rows = rows
        array = Array<T?>(count: rows * columns, repeatedValue: nil)
    }

    subscript(column: Int, row: Int) -> T! {
        get {
            return array[(row * columns) + column]
        }
        set(newValue) {
            array[(row * columns) + column] = newValue
        }
    }
}
Unfortunately, this type of array already has the length of rows*columns, as soon as the class is initialized, which I don't want. The array should be empty in the beginning. But if I initialize an empty Array, I get problems when it comes to getting or setting the value for a key which doesn't exist yet. Also, I need to be able to easily increment the values of a specific key (like array[2, 3]++). Does anybody have a good solution for my problem?
Just use optionals so that no initial values are set. In this example your columns and rows are optionals initialized with nil, and the array is initialized as an empty array.
class Array2D<T> {
    let columns: Int? = nil
    let rows: Int? = nil
    var array: Array<T?> = []
}
Why not create a struct for a coordinate, like
struct Coordinate {
    let column: Int
    let row: Int
}
... then provide the necessary implementations for the Equatable and Hashable protocols, and then use it as a key in a dictionary, like:
struct Array2D<T> {
    private var storage = [Coordinate: T]()
    // ...
}
... and then provide a subscript for that Array2D which, when a value is assigned, simply inserts a new entry into the underlying dictionary, and when the value is read, just reads it from there.