Using an opaque ID type in place of primitive.ObjectID - mongodb

In a db package, I don't want to expose MongoDB's primitive.ObjectID type as part of its public API. Instead, I want to define type Id interface{} within the db package and expose that. Since the driver doesn't know how to transform the database's ObjectID type into my custom ID type, I registered a type decoder like so:
type Id interface{}

var idType = reflect.TypeOf((*Id)(nil)).Elem()

type User struct {
    Id `bson:"_id,omitempty"`
}
func main() {
    reg := bson.NewRegistryBuilder().
        RegisterTypeDecoder(idType, bsoncodec.ValueDecoderFunc(idDecodeValue)).
        Build()
    client, err := mongo.Connect(context.TODO(), options.Client().
        ApplyURI("...").
        SetRegistry(reg))
    if err != nil {
        panic(err)
    }
    coll := client.Database("...").Collection("...")

    var u0 User
    coll.InsertOne(context.TODO(), u0)

    var u1 User
    coll.FindOne(context.TODO(), bson.D{}).Decode(&u1)
}
func idDecodeValue(dc bsoncodec.DecodeContext, vr bsonrw.ValueReader, val reflect.Value) error {
    if !val.IsValid() || val.Type() != idType {
        return bsoncodec.ValueDecoderError{
            Name:     "IdDecodeValue",
            Types:    []reflect.Type{idType},
            Received: val,
        }
    }
    oid, err := vr.ReadObjectID()
    if err != nil {
        return err
    }
    val.Set(reflect.ValueOf(oid))
    return nil
}
I saw that there exists a bsoncodec.NewEmptyInterfaceCodec() to use in place of idDecodeValue, but that only seems to work when User.Id's type is exactly interface{}. I wrote idDecodeValue based on the existing codecs, but I'm not entirely sure what's going on (mostly because I don't know when val.IsValid() would ever return false). Is all of this the best, most idiomatic way to support a custom ID type?

Related

Loop over collections in mongodb and decode to various go structs

I have a series of collections in a mongodb instance where I want to pull all the collections and decode all documents in the collections so I can mutate them as I go.
I am using go to list the collection names:
...
collections, err := db.ListCollectionNames(context.TODO(), bson.M{})
if err != nil {
    log.Fatalf("Failed to get coll names: %v", err)
}
for _, coll := range collections {
    collection := db.Collection(coll)
    ...
This works flawlessly and the next step would be to open a cursor to each collection and loop over the cursor and call decode on each object to its corresponding type.
I have defined all the types in a types package:
type Announcements struct {
    Id           string    `bson:"_id"`
    Content      string    `bson:"content"`
    Title        string    `bson:"title"`
    DateModified time.Time `bson:"dateModified,omitempty"`
    ModifiedBy   string    `bson:"modifiedBy,omitempty"`
}
The issue is that I can't define the variable used for decoding dynamically. This code would need to be repeated for every single collection:
var obj Announcements // This would need to be dynamic
for cursor.Next(context.Background()) {
    err := cursor.Decode(&obj)
    if err != nil {
        log.Fatal(err)
    }
    log.Printf("%#v\n", obj)
}
Is there a way to do this without repeating this many times? I realize a dynamically typed language would be better for this; I wanted to ask before I migrate the work to a scripting language.
I have tried using the reflect package and switch statements to instantiate the variable dynamically, and using the interface{} type, but that forces the decode to use bson types and not my defined structs.
-- edit --
I have tried to use a map to link collection names to types but to no avail.
var Collections = map[string]interface{}{
    "announcements":      Announcements{},
    "badges":             Badges{},
    "betaUser":           BetaUser{},
    "bonusLearningItems": BonusLearningItems{},
    "books":              Books{},
    ...
}
So that when I use the obj var, I tried an assignment like:
obj := types.Collections[coll]
hoping that would give me a variable of type Announcements if I had looped to the announcements collection. But when I call Decode, it still returns bson types.
I need to dynamically define the obj variable's type.
There are many ways you can do this. One of them is by using a callback function:
func Process(cursor *mongo.Cursor, cb func(), out interface{}) {
    for cursor.Next(context.Background()) {
        err := cursor.Decode(out)
        if err != nil {
            log.Fatal(err)
        }
        cb()
        log.Printf("%#v\n", out)
    }
}
Then define a map:
type collHandler struct {
    New     func() interface{}
    Process func(interface{})
}

var collections = map[string]collHandler{
    "announcements": collHandler{
        New: func() interface{} { return &Announcements{} },
        Process: func(data interface{}) {
            processAnnouncements(data.(*Announcements))
        },
    },
    ...
}
And then you can:
handler := collections[collectionName]
obj := handler.New()
Process(cursor, func() {
    handler.Process(obj)
}, obj)
This way, you can place the iteration into a common function, and use closures to deal with type specific logic.

Gorm Scan Nested Structs

I'm trying to query a view in golang with gorm and save the result in a struct EdgeType containing NodeType. (Basically trying to implement the graphql-relay connection specification).
The view contains 4 fields:
cursor bigint
id bigint
text text
user_id bigint
type NodeType struct {
    ID     int64
    Text   string
    UserID int64
}

type EdgeType struct {
    Cursor int64
    Edge   NodeType
}

func (receiver EdgeType) TableName() string {
    return "connection_view"
}
As this alone doesn't work, I tried to implement the Scanner/Valuer interfaces.
Unfortunately, Scan isn't executed at all with the following call:
connection := make([]EdgeType, 0)
err := db.GormDB.Find(&connection).Error
This leads to my next problem: I can't debug my Scan/Value functions if they're not called. I've seen another answer pretty much describing my issue, but with the advantage of being able to map the Child type to a specific geolocation type in Postgres.
func (receiver *NodeType) Scan(src interface{}) error {
    // raw := src.(string)
    // fmt.Printf("raw: %s", raw)
    // maybe parse the src string and set parsed data to receiver?
    return nil
}

func (receiver NodeType) Value() (driver.Value, error) {
    // what to return for multiple fields?
    return receiver, nil
}
What can I return in my Value method to handle multiple fields at once? Or: Is that even possible/supported? Have I declared the Scan function wrongly or why isn't it called?
As mkopriva pointed out, there's a gorm:"embedded" tag.
type NodeType struct {
    ID     int64
    Text   string
    UserID int64
}

type EdgeType struct {
    Cursor int64
    Edge   NodeType `gorm:"embedded"`
}
This works without any Scan/Value functions.

Addressing potential null fields in go's sqlx [duplicate]

Go types like int64 and string cannot store NULL values, so I found I could use sql.NullInt64 and sql.NullString for this. But when I use these in a struct and generate JSON from it with the json package, the format differs from when I use plain int64 and string types: the JSON has an additional level of nesting, because each sql.Null*** type is itself a struct.
Is there a good workaround for this, or should I not use NULLs in my SQL database?
Types like sql.NullInt64 do not implement any special handling for JSON marshaling or unmarshaling, so the default rules apply. Since the type is a struct, it gets marshaled as an object with its fields as attributes.
One way to work around this is to create your own type that implements the json.Marshaler / json.Unmarshaler interfaces. By embedding the sql.NullInt64 type, we get the SQL methods for free. Something like this:
type JsonNullInt64 struct {
    sql.NullInt64
}

func (v JsonNullInt64) MarshalJSON() ([]byte, error) {
    if v.Valid {
        return json.Marshal(v.Int64)
    } else {
        return json.Marshal(nil)
    }
}

func (v *JsonNullInt64) UnmarshalJSON(data []byte) error {
    // Unmarshalling into a pointer will let us detect null
    var x *int64
    if err := json.Unmarshal(data, &x); err != nil {
        return err
    }
    if x != nil {
        v.Valid = true
        v.Int64 = *x
    } else {
        v.Valid = false
    }
    return nil
}
If you use this type in place of sql.NullInt64, it should be encoded as you expect.
You can test this example here: http://play.golang.org/p/zFESxLcd-c
If you use the null.v3 package, you won't need to implement any of the marshal or unmarshal methods. It's a superset of the sql.Null structs and is probably what you want.
package main

import "gopkg.in/guregu/null.v3"

type Person struct {
    Name     string      `json:"name"`
    Age      int         `json:"age"`
    NickName null.String `json:"nickname"` // Optional
}
If you'd like to see a full Golang webserver that uses sqlite, nulls, and json you can consult this gist.

Trying to convert string to instance variable

I'm new to the Go language and trying to learn it by building a real web application.
I'm using the revel framework.
And here is my resource routes:
GET     /resource/:resource      Resource.ReadAll
GET     /resource/:resource/:id  Resource.Read
POST    /resource/:resource      Resource.Create
PUT     /resource/:resource/:id  Resource.Update
DELETE  /resource/:resource/:id  Resource.Delete
for example:
GET /resource/users calls Resource.ReadAll("users")
And this is my Resource controller (just dummy actions for now):
type Resource struct {
    *revel.Controller
}

type User struct {
    Id       int
    Username string
    Password string
}

type Users struct{}

func (u Users) All() string {
    return "All"
}

func (c Resource) ReadAll(resource string) revel.Result {
    fmt.Printf("GET %s", resource)
    model := reflect.New(resource)
    fmt.Println(model.All())
    return nil
}
I'm trying to get an instance of the Users struct by converting the resource string to a type, so I can call the All function.
and the error:
cannot use resource (type string) as type reflect.Type in argument to
reflect.New: string does not implement reflect.Type (missing Align
method)
I'm new to GO please don't judge me :)
Your problem is here:
model := reflect.New(resource)
You can't instantiate a type from a string that way. You need to either use a switch there and do stuff depending on the model:
switch resource {
case "users":
    model := &Users{}
    fmt.Println(model.All())
case "posts":
    // ...
}
Or use reflect correctly. Something like:
var types = map[string]reflect.Type{
    "users": reflect.TypeOf(Users{}), // Or &Users{}.
}

// ...

model := reflect.New(types[resource])
res := model.MethodByName("All").Call(nil)
fmt.Println(res)

Go compile error "undefined function"

I have the following pieces of code:
Interface & function definition:
package helper

import "gopkg.in/mgo.v2/bson"

// Identifiable is an interface for identifiable objects.
type Identifiable interface {
    GetIdentifier() bson.ObjectId
}

// IsInSlice checks if a slice contains an object with the given bson.ObjectId.
func IsInSlice(id bson.ObjectId, objects []Identifiable) bool {
    for _, single := range objects {
        if single.GetIdentifier() == id {
            return true
        }
    }
    return false
}
The definition of the user struct which satisfies 'Identifiable':
package models

import (
    "github.com/my/project/services"
    "gopkg.in/mgo.v2/bson"
)

type User struct {
    Id       bson.ObjectId   `json:"_id,omitempty" bson:"_id,omitempty"`
    Username string          `json:"username" bson:"username"`
    Email    string          `json:"email" bson:"email"`
    Password string          `json:"password" bson:"password"`
    Albums   []bson.ObjectId `json:"albums,omitempty" bson:"albums,omitempty"`
    Friends  []bson.ObjectId `json:"friends,omitempty" bson:"friends,omitempty"`
}

func GetUserById(id string) (err error, user *User) {
    dbConnection, dbCollection := services.GetDbConnection("users")
    defer dbConnection.Close()
    err = dbCollection.Find(bson.M{"_id": bson.ObjectIdHex(id)}).One(&user)
    return err, user
}

func (u *User) GetIdentifier() bson.ObjectId {
    return u.Id
}
A test which checks for the existence of an object inside a slice:
package controllers

import (
    "github.com/my/project/helper"
    "gopkg.in/mgo.v2/bson"
)

var testerId = bson.NewObjectId()
var users = []models.Users{}

/* Some code to get users from db */

if !helper.IsInSlice(testerId, users) {
    t.Fatalf("User is not saved in the database")
}
When I try to compile the test, I get the error: undefined: helper.IsInSlice. When I rewrite IsInSlice to take []models.User instead of []Identifiable, it works fine.
Any ideas?
Your problem is that you're trying to use a value of type []models.User as a value of type []Identifiable. While *models.User implements Identifiable, Go's type system is designed so that a slice of a concrete type cannot be used as (or converted to) a slice of an interface type.
See the Go specification's section on conversions for more details.
Apparently, Go did not rebuild my package and was looking for the function in an old build. It was therefore undefined. Performing rm -fr [GO-ROOT]/pkg/github.com/my/project/models did the trick.