Saving Extra Data With Go - PostgreSQL

So I am using Go 1.10.3 with the Echo framework, GORM, and Postgres as my database.
I have three tables / structs: Profile, Address, and ProfileAddress.
The structs are as follows:
Profile
type Profile struct {
    gorm.Model
    Company string `gorm:"size:255" json:"company"`
    AddedDate time.Time `gorm:"type:date" json:"date_added"`
    ProfileAddresses []ProfileAddress `gorm:"foreignkey:profile_fk" json:"address_id"`
}
Address
type Address struct {
    gorm.Model
    AddressLine1 string `gorm:"size:255" json:"line1"`
    AddressLine2 string `gorm:"size:255" json:"line2"`
    AddressLine3 string `gorm:"size:255" json:"line3"`
    City string `gorm:"size:200" json:"city"`
    Country string `gorm:"size:255" json:"country"`
    Postcode string `gorm:"size:12" json:"postcode"`
    ProfileAddresses []ProfileAddress `gorm:"foreignkey:address_fk"`
}
And Profile Address
type ProfileAddress struct {
    gorm.Model
    Archive bool `json:"archive"`
    Profile Profile
    Address Address
    AddressFK int `gorm:"ForeignKey:id"`
    ProfileFK int `gorm:"ForeignKey:id"`
}
These tables all get created fine, but I am now trying to save an address ID to the ProfileAddress table when creating a new Profile. I am posting the following data to /profile/add (with Postman):
{
    "company": "testing-000001",
    "date_added": "3051-11-09T00:00:00+00:00",
    "address_id": 3
}
Now I can save a new profile and a new address, but not this linked data. I only just added the json:"address_id" option to the end of the Profile struct, but that did not work.
I have set it up like this because one profile might have many addresses, so there needs to be a link table holding all the address IDs. I'm sure I could do this in two stages, e.g. save the profile and then save the addresses I want to attach to it, but I would like to get this approach to work. I also don't need to create a new address; these will already have been added to the system.
So what am I doing wrong?
Any help most welcome.
Thanks,

This is how I got it to work; please let me know if this is not correct or if there is a better way.
So I added the Profile struct to a wrapper/nested struct, like so:
type ProfileWrapper struct {
    gorm.Model
    ProfileData Profile
    AddressID int `json:"address_id"`
}
I then changed the JSON data I send so that it matches this struct:
{
    "ProfileData": {
        "company": "testing-with-add-0001",
        "date_added": "9067-11-09T00:00:00+00:00"
    },
    "address_id": 3
}
So to save the profile I did this:
db.Create( &p.ProfileData )
I then built a new struct for the Profile Address information to be created/added, like so:
pd := structs.ProfileAddress{AddressFK: p.AddressID, ProfileFK: profileID}
db.Create( &pd )
Not sure if this is the best way of doing this, but it seems to work. If this is wrong please let me know.
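For context, here is a rough sketch of how those two Create calls might sit together in a single Echo handler; the handler name, the shared db variable and the JSON response are my assumptions, not part of the original post:
func addProfile(c echo.Context) error {
    p := new(structs.ProfileWrapper)
    if err := c.Bind(p); err != nil {
        return err
    }
    // Step 1: save the profile itself.
    if err := db.Create(&p.ProfileData).Error; err != nil {
        return err
    }
    // Step 2: link the existing address to the newly created profile.
    pa := structs.ProfileAddress{AddressFK: p.AddressID, ProfileFK: int(p.ProfileData.ID)}
    if err := db.Create(&pa).Error; err != nil {
        return err
    }
    return c.JSON(http.StatusOK, p.ProfileData)
}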
Thanks.

Related

GORM foreign key doesn't seem to add proper fields

I have the following model:
type Drink struct {
    gorm.Model // Adds some metadata fields to the table
    ID uuid.UUID `gorm:"type:uuid;primary key"`
    Name string `gorm:"index;not null;"`
    Volume float64 `gorm:"not null;type:decimal(10,2)"`
    ABV float64 `gorm:"not null;type:decimal(10,2);"`
    Price float64 `gorm:"not null;type:decimal(10,2);"`
    Location Location `gorm:"ForeignKey:DrinkID;"`
}
type Location struct {
    gorm.Model // Adds some metadata fields to the table
    ID uuid.UUID `gorm:"primary key;type:uuid"`
    DrinkID uuid.UUID
    Name string `gorm:"not null;"`
    Address string `gorm:"not null;type:decimal(10,2)"`
    Phone int `gorm:"not null;type:decimal(10,0);"`
}
However, when I run the program both tables are created, yet there is no location field in the Drink table.
My database looks like this after the migrations, regardless of whether I drop the tables first:
I have a sneaking feeling it might be because I am not using the default gorm ID. If that's the case, can anyone point me to the proper way to override the default uint ID with a UUID? And if that's not the issue, please help; I've been working on this for a few days now and I really don't want to take the "easy" road of just using the defaults gorm provides. I actually want to understand what is going on here and how to do what I'm trying to do properly. I get no errors when running the API, and the migration appears to run as well; it's just that the fields I have defined are not showing up in the database, which means the frontend won't be able to add data properly.
What I WANT to happen is that a list of stores will be available in the front-end, and when a user adds a drink they have to select from that list. Each drink should have only one store, as the prices of drinks differ between stores. So technically there would be many "repeated" drinks in the drink table, connected to different Locations.
First point: as you are using a custom primary key, you should not use gorm.Model, because it already contains an ID field. Reference
Second point: according to your description, a store (location) has a one-to-many relationship with drinks. That means a store can have multiple drinks, but a drink should belong to only one store. In a one-to-many relationship the reference (relation id) lives on the many side, which in your case is the drink table. Your structs should then look like this:
MyModel Struct
type MyModel struct {
    CreatedAt time.Time
    UpdatedAt time.Time
    DeletedAt gorm.DeletedAt `gorm:"index"`
}
Location Struct (Store)
type Location struct {
    MyModel
    ID uuid.UUID `gorm:"primaryKey;type:uuid"`
    // other columns...
    Drinks []Drink
}
Drink Struct
type Drink struct {
    MyModel
    ID uuid.UUID `gorm:"primaryKey;type:uuid"`
    // other columns...
    LocationID uuid.UUID // This is important
}
Then gorm will automatically treat LocationID in the drink table as a reference to the ID field of the Location table. You can also instruct gorm explicitly by adding gorm:"foreignKey:LocationID;references:ID" to the Location struct's Drinks field.
Reference
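To illustrate how these structs would be used, here is a minimal sketch assuming gorm v2 with its Postgres driver and github.com/google/uuid for the IDs; the dsn variable and the way the connection is opened are assumptions, since the question does not show them:
db, err := gorm.Open(postgres.Open(dsn), &gorm.Config{})
if err != nil {
    panic(err)
}
// Creates the locations and drinks tables; drinks gets a location_id column
// referring to locations.id.
if err := db.AutoMigrate(&Location{}, &Drink{}); err != nil {
    panic(err)
}
// Creating a location together with its drinks fills in LocationID automatically.
store := Location{
    ID:     uuid.New(),
    Drinks: []Drink{{ID: uuid.New()}},
}
db.Create(&store)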

JSON response showing empty values in Beego controller

I have two database tables. I want to get the values from those tables with MySQL queries and pass them to the controller. There I bind them to one struct and return the JSON values as the response from the controller. The problem is that the two tables share some columns but not others, yet I have to use all the fields in a common struct. When I get the response for one table, the other table's fields show up empty in the JSON object, and vice versa. How can I show only the relevant fields in each JSON object?
Code:
Model:
type Student struct {
    StudentId int
    Name string
    Address string
}
type Teacher struct {
    TeacherId int
    Name string
    Address string
    Department string
}
type TeacherStudent struct {
    StudentId int
    Name string
    Address string
    TeacherId int
    Department string
}
Controller:
json.Unmarshal([]byte(datas), &ts)
c.Data["json"] = ts
c.ServeJSON()
Response in Postman:
[
    {
        "StudentId": 501,
        "Name": "Mark",
        "Address": "Canberra",
        "TeacherId": 0,
        "Department": ""
    },
    {
        "StudentId": 0,
        "Name": "John",
        "Address": "Melbourne",
        "TeacherId": 101,
        "Department": "Science"
    }
]
For the Student response I don't want to show TeacherId and Department, and for the Teacher record I don't want to show StudentId in the JSON object.
The solution for this is to add ,omitempty to the json struct tags, e.g. json:",omitempty".
Referring to this article: https://www.sohamkamani.com/golang/omitempty/
Note: omitempty only works on plain fields. It will not work for structs nested inside structs, and it will not work for time.Time fields. See this article for handling the time.Time case: JSON omitempty With time.Time Field
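As a small, self-contained sketch of what that looks like with the shared struct from the question (the example data is made up):
package main

import (
    "encoding/json"
    "fmt"
)

// Fields tagged with ,omitempty are dropped from the output when they hold
// their zero value (0 for ints, "" for strings).
type TeacherStudent struct {
    StudentId  int    `json:",omitempty"`
    Name       string
    Address    string
    TeacherId  int    `json:",omitempty"`
    Department string `json:",omitempty"`
}

func main() {
    student := TeacherStudent{StudentId: 501, Name: "Mark", Address: "Canberra"}
    out, _ := json.Marshal(student)
    fmt.Println(string(out)) // {"StudentId":501,"Name":"Mark","Address":"Canberra"}
}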

How to scan into nested structs with sqlx?

Let's assume that I have two models,
type Customer struct {
    Id int `json:"id" db:"id"`
    Name string `json:"name" db:"name"`
    Address Address `json:"adress"`
}
type Address struct {
    Street string `json:"street" db:"street"`
    City string `json:"city" db:"city"`
}
// ...
customer := models.Customer{}
err := db.Get(&customer , `select * from users where id=$1 and name=$2`, id, name)
But this scan throws an error as: missing destination name street in *models.Customer
Am I doing something wrong? As you can see, I already updated the db tags to match the column names. I double-checked, so case sensitivity shouldn't be the problem.
Or is it not possible using https://github.com/jmoiron/sqlx?
I can see it in the documentation but still couldn't figure out how to solve it.
http://jmoiron.github.io/sqlx/#advancedScanning
The users table is declared as:
CREATE TABLE `users` (
    `id` varchar(256) NOT NULL,
    `name` varchar(150) NOT NULL,
    `street` varchar(150) NOT NULL,
    `city` varchar(150) NOT NULL
)
The very link you posted gives you a hint about how to do this:
StructScan is deceptively sophisticated. It supports embedded structs, and assigns to fields using the same precedence rules that Go uses for embedded attribute and method access
So given your DB schema, you can simply embed Address into Customer:
type Customer struct {
    Id int `json:"id" db:"id"`
    Name string `json:"name" db:"name"`
    Address
}
In your original code, Address was a field with its own db tag. This is not correct, and in any case your schema has no address column at all (it appears you edited it out of your code snippet).
By embedding the struct into Customer instead, Address fields including tags are promoted into Customer and sqlx will be able to populate them from your query result.
Warning: embedding the field will also flatten the output of any JSON marshalling. It will become:
{
    "id": 1,
    "name": "foo",
    "street": "bar",
    "city": "baz"
}
If you want to place street and city into a JSON address object as in your original struct tags, the easiest way is probably to remap the DB struct to your original type.
You could also scan the query result into a map[string]interface{} but then you have to be careful about how Postgres data types are represented in Go.
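For completeness, a rough sketch of that map-based approach with sqlx; the query and column names are the ones from the question, and the type assertion is only one way the returned values might need handling:
row := db.QueryRowx(`select * from users where id=$1 and name=$2`, id, name)
m := map[string]interface{}{}
if err := row.MapScan(m); err != nil {
    return err
}
// Values come back as driver types; text columns may arrive as []byte
// rather than string, so assert before use.
if street, ok := m["street"].([]byte); ok {
    fmt.Println(string(street))
}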
I had the same problem and came up with a slightly more elegant solution than #blackgreen's.
He's right: the easiest way is to embed the structs, but I do it in a temporary object instead of making the original messier.
You then add a function to convert your temp (flat) object into your real (nested) one.
type Customer struct {
    Id int `json:"id" db:"id"`
    Name string `json:"name" db:"name"`
    Address Address `json:"adress"`
}
type Address struct {
    Street string `json:"street" db:"street"`
    City string `json:"city" db:"city"`
}
type tempCustomer struct {
    Customer
    Address
}
func (c *tempCustomer) ToCustomer() Customer {
    customer := c.Customer
    customer.Address = c.Address
    return customer
}
Now you can scan into tempCustomer and simply call tempCustomer.ToCustomer before you return. This keeps your JSON clean and doesn't require a custom scan function.
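Usage might then look roughly like this; db, id and name are assumed to be the same variables as in the question:
var tmp tempCustomer
if err := db.Get(&tmp, `select * from users where id=$1 and name=$2`, id, name); err != nil {
    return Customer{}, err
}
customer := tmp.ToCustomer() // rebuilds the nested Address field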

Foreign key not created with one to many association

I have a Task type that has a list of Runner objects in it. I am trying to map it to the database using gorm, but no foreign key is created and I get an invalid association error during migration.
My Task struct:
type Task struct {
    gorm.Model
    Name string `gorm:"not null;unique_index"`
    Description string
    Runners []Runner
}
My Runner struct:
type Runner struct {
    gorm.Model
    Name string `gorm:"not null;unique"`
    Description string
}
My migration code:
func migrateSchema() (err error) {
    db, err := context.DBProvider()
    if err != nil {
        return
    }
    db.Model(&Task{}).Related(&Runner{})
    db.AutoMigrate(&Task{})
    db.AutoMigrate(&Runner{})
    return
}
On db.AutoMigrate(&Task{}) I get an "invalid association" message in the console, and when I check the database no foreign key or reference field has been created on the runners table.
What am I doing wrong?
I had a similar issue, and it took me forever to figure it out. I believe the GORM documentation could definitely be better. Here's the relevant code snippet from the GORM site:
// User has many emails, UserID is the foreign key
type User struct {
    gorm.Model
    Emails []Email
}
type Email struct {
    gorm.Model
    Email string
    UserID uint
}
db.Model(&user).Related(&emails)
//// SELECT * FROM emails WHERE user_id = 111; // 111 is user's primary key
Why your code isn't working:
First you need to add a TaskID field to your Runner struct.
db.Model(&Task{}).Related(&Runner{}) doesn't do what you think it does. If you look at the code snippet from GORM, the SELECT comment kind of explains it (though not very well). The example assumes that &user is already populated and has an ID of 111; it then fetches the emails whose UserID matches that user and stores them in &emails.
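Put together, a sketch of the fixed structs and migration might look like this, still using gorm v1 as in the question; the TaskID field is the only addition:
type Task struct {
    gorm.Model
    Name        string `gorm:"not null;unique_index"`
    Description string
    Runners     []Runner
}
// Runner now carries the foreign key back to Task; gorm derives the
// runners.task_id column from this field during AutoMigrate.
type Runner struct {
    gorm.Model
    Name        string `gorm:"not null;unique"`
    Description string
    TaskID      uint
}
func migrateSchema() (err error) {
    db, err := context.DBProvider()
    if err != nil {
        return
    }
    db.AutoMigrate(&Task{})
    db.AutoMigrate(&Runner{})
    // A database-level constraint would still need AddForeignKey in gorm v1, e.g.:
    // db.Model(&Runner{}).AddForeignKey("task_id", "tasks(id)", "CASCADE", "CASCADE")
    return
}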

How to work with multiple types in single collection with mgo

I'm very new to Go and am trying to write a simple event-sourcing user-management web API using MongoDB as the backing database. I have a User, which looks something like this:
type User struct {
    Id bson.ObjectId `json:"id" bson:"_id"`
    UserName string `json:"username" bson:"username"`
    Email string `json:"email" bson:"email"`
    PwdHash string `json:"pwd_hash" bson:"pwd_hash"`
    FullName string `json:"fullname" bson:"fullname"`
}
and three events that happen to a user when somebody uses the API:
type UserCreatedEvent struct {
    UserId bson.ObjectId `json:"id" bson:"_id"`
    //time when event was issued
    CreatedEpoch time.Time `json:"created_epoch" bson:"created_epoch"`
    //id of user that made a change
    IssuedByUserId bson.ObjectId `json:"issuedby_userid" bson:"issuedby_userid"`
}
type UserDeletedEvent struct {
    UserId bson.ObjectId `json:"id" bson:"_id"`
    CreatedEpoch time.Time `json:"created_epoch" bson:"created_epoch"`
    //id of user that made a change
    IssuedByUserId bson.ObjectId `json:"issuedby_userid" bson:"issuedby_userid"`
}
type UserUpdatedEvent struct {
    UserId bson.ObjectId `json:"id" bson:"_id"`
    CreatedEpoch time.Time `json:"created_epoch" bson:"created_epoch"`
    //id of user that made a change
    IssuedByUserId bson.ObjectId `json:"issuedby_userid" bson:"issuedby_userid"`
    ChangedFieldName string `json:"changed_field_name" bson:"changed_field_name"`
    NewChangedFieldValue string `json:"new_changed_field_value" bson:"new_changed_field_value"`
}
Now I'm stuck on saving and retrieving events from the database. The problem is that I want to store them in a single collection, so that I have a full, plain history of user modifications, but I can't find how to correctly store the event type name as a mongo document field and then use it in searches. What is the idiomatic Go way to do this?
I'll be grateful for any help.
What's nice about a non-relational database is that a little redundancy is OK and is faster for retrieval since you're not joining things together. You could just add your bottom three objects as properties on your User in whatever fashion makes sense for you and your data. Then you wouldn't need to store the UserId on those objects either.
If you need to search over the Events quickly, you could create another collection to hold them. You'd be inserting to two collections, but retrieval times/logic should be pretty good/easy to write.
Your new Event would be something like:
type UserEventType int

const (
    Created UserEventType = iota
    Deleted
    Updated
)

type UserEvent struct {
    UserId bson.ObjectId `json:"id" bson:"_id"`
    CreatedEpoch time.Time `json:"created_epoch" bson:"created_epoch"`
    //id of user that made a change
    IssuedByUserId bson.ObjectId `json:"issuedby_userid" bson:"issuedby_userid"`
    ChangedFieldName string `json:"changed_field_name,omitempty" bson:"changed_field_name,omitempty"`
    NewChangedFieldValue string `json:"new_changed_field_value,omitempty" bson:"new_changed_field_value,omitempty"`
    EventType UserEventType `json:"user_event_type" bson:"user_event_type"`
}
Notice the omitempty on the fields that are optional depending on the type of event.
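To give an idea of how this could be used with mgo, here is a rough sketch; the session, database and collection names, and the surrounding user variables are assumptions rather than anything from the question:
// Collection that holds every user event, regardless of type.
events := session.DB("app").C("user_events")
// Record an update to a user's email field.
err := events.Insert(UserEvent{
    UserId:               user.Id,
    CreatedEpoch:         time.Now(),
    IssuedByUserId:       admin.Id,
    ChangedFieldName:     "email",
    NewChangedFieldValue: "new@example.com",
    EventType:            Updated,
})
if err != nil {
    log.Fatal(err)
}
// Note: with UserId mapped to bson:"_id", a user can only ever have one document
// in this collection; a per-event id would be needed to keep a full history.
// The event type field can then be used in queries, e.g. all update events:
var updates []UserEvent
err = events.Find(bson.M{"user_event_type": Updated}).All(&updates)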
You really aren't supposed to store different objects in the same collection. Here's a line from the Mongo documentation:
MongoDB stores documents in collections. Collections are analogous to tables in relational databases.
In case you aren't familiar with relational databases, a table would essentially represent one type of object.