How to insert math/big.Int in mongo via mgo in golang

I have a struct that contains math/big.Int fields. I would like to save the struct in MongoDB using mgo. Saving the numbers as strings is good enough in my situation.
I have looked at the available field tags and nothing seems to allow a custom serializer. I was expecting to implement an interface similar to encoding/json.Marshaler, but I found no such interface in the documentation.
Here is a trivial example of what I need.
package main

import (
    "labix.org/v2/mgo"
    "math/big"
)

type Point struct {
    X, Y *big.Int
}

func main() {
    session, err := mgo.Dial("localhost")
    if err != nil {
        panic(err)
    }
    defer session.Close()

    c := session.DB("test").C("test")
    err = c.Insert(&Point{big.NewInt(1), big.NewInt(1)})
    if err != nil { // does not panic
        panic(err)
    }
    // The code runs as expected, but the fields X and Y are empty in mongo.
}
Thanks!

The interface you are looking for is named bson.Getter:
http://labix.org/v2/mgo/bson#Getter
It can look similar to this:
func (point *Point) GetBSON() (interface{}, error) {
    return bson.D{{"x", point.X.String()}, {"y", point.Y.String()}}, nil
}
And there's also the counterpart interface on the setter side, if you're interested:
http://labix.org/v2/mgo/bson#Setter
To use it, note that the bson.Raw value provided as a parameter has an Unmarshal method, so you could have a type similar to:
type dbPoint struct {
    X string
    Y string
}
and unmarshal it conveniently:
var dbp dbPoint
err := raw.Unmarshal(&dbp)
and then use the dbp.X and dbp.Y strings to put the big ints back into the real (point *Point) being unmarshalled.
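Putting those pieces together, the setter side could look roughly like this (a sketch; the error handling for malformed strings is my own addition, and it assumes the Point and dbPoint types above plus the fmt package):

func (point *Point) SetBSON(raw bson.Raw) error {
    var dbp dbPoint
    if err := raw.Unmarshal(&dbp); err != nil {
        return err
    }
    // big.Int.SetString reports failure via its second return value.
    x, ok := new(big.Int).SetString(dbp.X, 10)
    if !ok {
        return fmt.Errorf("invalid big.Int string for x: %q", dbp.X)
    }
    y, ok := new(big.Int).SetString(dbp.Y, 10)
    if !ok {
        return fmt.Errorf("invalid big.Int string for y: %q", dbp.Y)
    }
    point.X, point.Y = x, y
    return nil
}

mgo's bson package lowercases Go field names when unmarshalling, so dbPoint's X and Y line up with the "x" and "y" keys written by GetBSON above.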

Related

[]string to jsonb with Gorm and postgres

I have a Go struct containing a slice of strings which I'd like to save as a jsonb object in Postgres with GORM.
I've come across a solution which requires using the GORM-specific type postgres.Jsonb, which I'd like to avoid.
When I try to run AutoMigrate with a slice in my model, it panics and won't start, although when I wrap this slice in a struct (which I'm okay with doing) it runs without error but won't create the column in Postgres.
type User struct {
    gorm.Model
    Data []string `sql:"type:jsonb" json:"data"`
} // Panics

type User struct {
    gorm.Model
    Data struct {
        NestedData []string
    } `sql:"type:jsonb" json:"data"`
} // Doesn't crash but doesn't create my column
Has anyone been able to manipulate jsonb with GORM without using the postgres.Jsonb type in models?
The simplest way to use JSONB in GORM is to use pgtype.JSONB.
GORM uses pgx as its driver, and pgx has a package called pgtype, which has a type named pgtype.JSONB.
If you have already installed pgx as instructed by GORM, you don't need to install any other package.
This method is arguably the best practice, since it uses the underlying driver and needs no custom code. It can also be used for any JSONB value beyond []string.
type User struct {
    gorm.Model
    Data pgtype.JSONB `gorm:"type:jsonb;default:'[]';not null"`
}
Get a value from the DB:

u := User{}
db.First(&u)

var data []string
err := u.Data.AssignTo(&data)
if err != nil {
    t.Fatal(err)
}

Set a value in the DB:

u := User{}
err := u.Data.Set([]string{"abc", "def"})
if err != nil {
    return
}
db.Updates(&u)
Maybe:

type DataJSONB []string

func (dj DataJSONB) Value() (driver.Value, error) {
    return json.Marshal(dj)
}

func (dj *DataJSONB) Scan(value interface{}) error {
    b, ok := value.([]byte)
    if !ok {
        return fmt.Errorf("[]byte assertion failed")
    }
    return json.Unmarshal(b, dj)
}

// Your bit
type User struct {
    gorm.Model
    Data DataJSONB `sql:"type:jsonb" json:"data"`
}
Define a new type:

type Data map[string]interface{}

And implement the Valuer and Scanner interfaces on it, which allow the field to be converted to a value for the database, and scanned back into the field, respectively:
// Value converts Data into []byte for the database.
func (d Data) Value() (driver.Value, error) {
    return json.Marshal(d)
}

// Scan puts the []byte from the database back into Data.
func (d *Data) Scan(src interface{}) error {
    source, ok := src.([]byte)
    if !ok {
        return errors.New("type assertion .([]byte) failed")
    }
    var i interface{}
    if err := json.Unmarshal(source, &i); err != nil {
        return err
    }
    *d, ok = i.(map[string]interface{})
    if !ok {
        return errors.New("type assertion .(map[string]interface{}) failed")
    }
    return nil
}
Then you can define your field in your model like this:

type User struct {
    gorm.Model
    Data Data `sql:"type:jsonb not null default '{}'::jsonb"`
}
Using the underlying map[string]interface{} type is nice too, as you can Unmarshal/Marshal any JSON to/from it.
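A quick usage sketch (assuming an open db handle and a migrated users table; note that numbers round-trip through JSON as float64):

u := User{Data: Data{"tags": []interface{}{"home", "green"}, "count": 2}}
if err := db.Create(&u).Error; err != nil {
    panic(err)
}

var loaded User
if err := db.First(&loaded, u.ID).Error; err != nil {
    panic(err)
}
fmt.Println(loaded.Data["tags"]) // [home green]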

Mocking MongoDB response in Go

I'm fetching a document from MongoDB and passing it into a function transform, e.g.
var doc map[string]interface{}
err := collection.FindOne(context.TODO(), filter).Decode(&doc)
result := transform(doc)
I want to write unit tests for transform, but I'm not sure how to mock a response from MongoDB. Ideally I want to set something like this up:
func TestTransform(t *testing.T) {
    byt := []byte(`
        {"hello": "world",
         "message": "apple"}
    `)
    var doc map[string]interface{}
    >>> Some method here to Decode byt into doc like the code above <<<
    out := transform(doc)
    expected := ...
    if diff := deep.Equal(expected, out); diff != nil {
        t.Error(diff)
    }
}
One way would be to json.Unmarshal into doc, but this sometimes gives different results. For example, if the document in MongoDB has an array in it, then that array is decoded into doc as a bson.A type, not a []interface{} type.
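One way to fill in the placeholder above while keeping the driver's types is bson.UnmarshalExtJSON from the official driver, which goes through the same BSON codecs as Decode (a sketch, assuming go.mongodb.org/mongo-driver/bson is imported):

var doc map[string]interface{}
if err := bson.UnmarshalExtJSON(byt, true, &doc); err != nil {
    t.Fatal(err)
}
// Arrays now come back as bson.A, just as they would from FindOne().Decode.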
A member of my team recently found a hidden gem inside the official MongoDB driver for Go: https://pkg.go.dev/go.mongodb.org/mongo-driver@v1.9.1/mongo/integration/mtest. Although the package is experimental and no backward compatibility is guaranteed for it, it can help you perform unit testing, at least with this version of the driver.
You can check this cool article with plenty of examples of how to use it: https://medium.com/@victor.neuret/mocking-the-official-mongo-golang-driver-5aad5b226a78. Additionally, here is the repository with the code samples from the article: https://github.com/victorneuret/mongo-go-driver-mock.
So, based on your example and the samples from the article, I think you could try something like the following (of course, you might need to tweak and experiment with this):
func TestTransform(t *testing.T) {
    mt := mtest.New(t, mtest.NewOptions().ClientType(mtest.Mock))
    defer mt.Close()

    mt.Run("find & transform", func(mt *mtest.T) {
        myCollection = mt.Coll
        expected := myStructure{...}

        mt.AddMockResponses(mtest.CreateCursorResponse(1, "foo.bar", mtest.FirstBatch, bson.D{
            {"_id", expected.ID},
            {"field-1", expected.Field1},
            {"field-2", expected.Field2},
        }))

        response, err := myFindFunction(expected.ID)
        if err != nil {
            t.Error(err)
        }

        out := transform(response)
        if diff := deep.Equal(expected, out); diff != nil {
            t.Error(diff)
        }
    })
}
Alternatively, you can do more realistic testing in an automated way via integration tests using Docker containers. There are a few good packages that can help you with this:
https://github.com/ory/dockertest
https://github.com/testcontainers/testcontainers-go
I have followed this approach with the dockertest library to automate a full integration-testing environment that can be set up and torn down via the go test -v -run Integration command. See a full example here: https://github.com/AnhellO/learn-dockertest/tree/master/mongo.
Hope this helps.
The best solution for writing testable code would be to extract your data access into a DAO or data repository. You would define an interface that returns what you need. This way, you can just use a mocked version for testing.
// repository.go
type ISomeRepository interface {
    Get(string) (*SomeModel, error)
}

type SomeRepository struct { ... }

func (r *SomeRepository) Get(id string) (*SomeModel, error) {
    // Handle real repository access and return your object
}
When you need to mock it, just create a mock struct and implement the interface:

// repository_test.go
type SomeMockRepository struct { ... }

func (r *SomeMockRepository) Get(id string) (*SomeModel, error) {
    return &SomeModel{...}, nil
}

func TestSomething() {
    // You can use your mock as an ISomeRepository
    var repo ISomeRepository = &SomeMockRepository{}
    someModel, err := repo.Get("123")
}
This is best used with some kind of dependency injection, i.e. passing the repository as an ISomeRepository into the consuming function.
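For illustration, a minimal sketch of that wiring (SomeService and NewSomeService are hypothetical names, not part of the answer):

type SomeService struct {
    repo ISomeRepository
}

func NewSomeService(repo ISomeRepository) *SomeService {
    return &SomeService{repo: repo}
}

func (s *SomeService) Process(id string) (*SomeModel, error) {
    return s.repo.Get(id)
}

// Production wiring: NewSomeService(&SomeRepository{...})
// Test wiring:       NewSomeService(&SomeMockRepository{})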
You can use the monkey library to hook any function of the mongo driver.
For example:
func insert(collection *mongo.Collection) (int, error) {
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    u := User{
        Name: "kevin",
        Age:  20,
    }
    res, err := collection.InsertOne(ctx, u)
    if err != nil {
        log.Printf("error: %v", err)
        return 0, err
    }
    id := res.InsertedID.(int)
    return id, nil
}

func TestInsert(t *testing.T) {
    var c *mongo.Collection
    var guard *monkey.PatchGuard
    guard = monkey.PatchInstanceMethod(reflect.TypeOf(c), "InsertOne",
        func(c *mongo.Collection, ctx context.Context, document interface{}, opts ...*options.InsertOneOptions) (*mongo.InsertOneResult, error) {
            guard.Unpatch()
            defer guard.Restore()

            log.Printf("record: %+v, collection: %s, database: %s", document, c.Name(), c.Database().Name())
            res := &mongo.InsertOneResult{
                InsertedID: 100,
            }
            return res, nil
        })

    collection := client.Database("db").Collection("person")
    id, err := insert(collection)

    require.NoError(t, err)
    assert.Equal(t, id, 100)
}

How to marshal json string to bson document for writing to MongoDB?

What I am looking for is the equivalent of Document.parse() in Golang: something that allows me to create BSON from JSON directly. I do not want to create intermediate Go structs for marshaling.
The gopkg.in/mgo.v2/bson package has a function called UnmarshalJSON which does exactly what you want.
The data parameter should hold your JSON string as a []byte value.
func UnmarshalJSON(data []byte, value interface{}) error
UnmarshalJSON unmarshals a JSON value that may hold non-standard syntax as defined in BSON's extended JSON specification.
Example:
var bdoc interface{}
err = bson.UnmarshalJSON([]byte(`{"id": 1, "name": "A green door", "price": 12.50, "tags": ["home", "green"]}`), &bdoc)
if err != nil {
    panic(err)
}
err = c.Insert(&bdoc)
if err != nil {
    panic(err)
}
The mongo-go-driver has a function bson.UnmarshalExtJSON that does the job.
Here's an example:
var doc interface{}
err := bson.UnmarshalExtJSON([]byte(`{"foo":"bar"}`), true, &doc)
if err != nil {
    // handle error
}
There is no longer a way to do this directly with supported libraries (e.g. the mongo-go-driver). You would need to write your own converter based on the BSON spec.
Edit: here's one that has by now seen a few terabytes of use in prod.
https://github.com/dustinevan/mongo/blob/main/bsoncv/bsoncv.go
I do not want to create intermediate Go structs for marshaling
If you do want/need to create intermediate Go BSON structs, you could use a conversion module such as github.com/sindbach/json-to-bson-go. For example:
import (
    "fmt"
    "github.com/sindbach/json-to-bson-go/convert"
    "github.com/sindbach/json-to-bson-go/options"
)

func main() {
    doc := `{"foo": "buildfest", "bar": {"$numberDecimal":"2021"} }`
    opt := options.NewOptions()
    result, _ := convert.Convert([]byte(doc), opt)
    fmt.Println(result)
}
This will produce the output:
package main

import "go.mongodb.org/mongo-driver/bson/primitive"

type Example struct {
    Foo string               `bson:"foo"`
    Bar primitive.Decimal128 `bson:"bar"`
}
This module is compatible with the official MongoDB Go driver, and as you can see it supports Extended JSON formats.
You can also visit https://json-to-bson-map.netlify.app to try the module in action. You can paste a JSON document, and see the Go BSON structs as output.
A simple converter that uses go.mongodb.org/mongo-driver/bson/bsonrw:
func JsonToBson(message []byte) ([]byte, error) {
    reader, err := bsonrw.NewExtJSONValueReader(bytes.NewReader(message), true)
    if err != nil {
        return []byte{}, err
    }
    buf := &bytes.Buffer{}
    writer, _ := bsonrw.NewBSONValueWriter(buf)
    err = bsonrw.Copier{}.CopyDocument(writer, reader)
    if err != nil {
        return []byte{}, err
    }
    marshaled := buf.Bytes()
    return marshaled, nil
}
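A possible usage sketch (the round-trip through bson.Unmarshal is only there to show that the produced bytes form a valid BSON document):

raw, err := JsonToBson([]byte(`{"foo": "bar"}`))
if err != nil {
    panic(err)
}

var doc bson.M
if err := bson.Unmarshal(raw, &doc); err != nil {
    panic(err)
}
fmt.Println(doc) // map[foo:bar]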

Go Interface/Container usage

Newbie Go programmer here. I'm writing a package that reads a JSON configuration file, using the built-in JSON decoding, of course. But I want it to be able to include other JSON files as well, by looking for an array of filenames under the key 'Includes'. I got it working as a plain function, passing in a struct for the JSON data that includes a slice of strings labeled 'Includes', but I don't know how to make this generic enough to work as a package.
Here's the function:
func ReadConfig(filename string, configuration *Configuration) error {
    log.Println("reading file", filename)
    file, err := os.Open(filename)
    if err != nil {
        log.Println("Can't read", filename)
        return err
    }
    decoder := json.NewDecoder(file)
    if err := decoder.Decode(&configuration); err != nil {
        log.Println(err)
        return err
    }
    includes := make([]string, len(configuration.Includes))
    copy(includes, configuration.Includes)
    configuration.Includes = configuration.Includes[0:0]
    for _, inc := range includes {
        log.Println(inc)
        if err := ReadConfig(inc, configuration); err != nil {
            return err
        }
    }
    return nil
}
Which works with:
type Configuration struct {
    Includes []string
    // ... other defs
}
But, in a package, I want ReadConfig to take any kind of Configuration struct, as long as one of its members is 'Includes []string'.
I believe I need to change the ReadConfig def to:
func ReadConfig(filename string, configuration interface{})
But what I don't know is how to access the Includes slice within that.
Just create an interface for it:

type Configurable interface {
    Configuration() []string
}
And then provide a Configuration method instead of a field for your structs, and change the signature of your function to func ReadConfig(filename string, configuration Configurable).
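A sketch of what ReadConfig could then look like (the slice copy mirrors the bookkeeping in the original function, since nested decodes overwrite the includes inside configuration):

func ReadConfig(filename string, configuration Configurable) error {
    file, err := os.Open(filename)
    if err != nil {
        log.Println("Can't read", filename)
        return err
    }
    defer file.Close()

    if err := json.NewDecoder(file).Decode(configuration); err != nil {
        return err
    }

    // Copy the includes before recursing; each nested decode replaces them.
    includes := append([]string(nil), configuration.Configuration()...)
    for _, inc := range includes {
        if err := ReadConfig(inc, configuration); err != nil {
            return err
        }
    }
    return nil
}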
It'd be much easier to just pass in the slice instead of the struct though.

Prevent mgo/bson Unmarshal to clear unexported fields

I am trying to populate the exported fields of a struct with content fetched from a MongoDB database using the labix.org/v2/mgo package.
mgo uses the labix.org/v2/mgo/bson package to unmarshal the data, but the unmarshaller sets all unexported fields to their zero value.
Is there any way to prevent this behavior?
Working example:
package main

import (
    "fmt"
    "labix.org/v2/mgo/bson"
)

type Sub struct{ Int int }

type Player struct {
    Name       string
    unexpInt   int
    unexpPoint *Sub
}

func main() {
    dta, err := bson.Marshal(bson.M{"name": "ANisus"})
    if err != nil {
        panic(err)
    }

    p := &Player{unexpInt: 12, unexpPoint: &Sub{42}}
    fmt.Printf("Before: %+v\n", p)

    err = bson.Unmarshal(dta, p)
    if err != nil {
        panic(err)
    }
    fmt.Printf("After: %+v\n", p)
}
Output:
Before: &{Name: unexpInt:12 unexpPoint:0xf84005f500}
After: &{Name:ANisus unexpInt:0 unexpPoint:<nil>}
This is not possible. As you can see in the source code, struct values are explicitly being set to their zero value before filling in any fields.
There is no option to disable this behaviour. It is presumably in place to make sure the result of Unmarshal() only depends on the BSON data and not any prior state.
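If you do need unexported state to survive an unmarshal, one possible workaround (my sketch, not part of the answer) is to implement the bson.Setter interface mentioned earlier on this page, so that you decide which fields get assigned:

func (p *Player) SetBSON(raw bson.Raw) error {
    var aux struct{ Name string }
    if err := raw.Unmarshal(&aux); err != nil {
        return err
    }
    p.Name = aux.Name // unexported fields are left untouched
    return nil
}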