Newbie Go programmer here. I'm writing a package that reads a JSON configuration file, using the built-in JSON decoding, of course. But I also want it to be able to include other JSON files, by looking for an array of filenames under the key 'Includes'. I got it working as a plain function, passing in a struct for the JSON data that includes a slice of strings labeled 'Includes', but I don't know how to specify this as a package.
Here's the function:
func ReadConfig(filename string, configuration *Configuration) error {
    log.Println("reading file", filename)
    file, err := os.Open(filename)
    if err != nil {
        log.Println("Can't read", filename)
        return err
    }
    defer file.Close()
    decoder := json.NewDecoder(file)
    if err := decoder.Decode(configuration); err != nil {
        log.Println(err)
        return err
    }
    // Set the includes aside and clear the field, so a recursive call
    // doesn't re-read the includes of a parent file.
    includes := make([]string, len(configuration.Includes))
    copy(includes, configuration.Includes)
    configuration.Includes = configuration.Includes[:0]
    for _, inc := range includes {
        log.Println(inc)
        if err := ReadConfig(inc, configuration); err != nil {
            return err
        }
    }
    return nil
}
Which works with:
type Configuration struct {
    Includes []string
    // ... other defs
}
But, in a package, I want ReadConfig to take any kind of Configuration struct, as long as one of its members is 'Includes []string'.
I believe I need to change the ReadConfig definition to:
func ReadConfig(filename string, configuration interface{})
But what I don't know is how to access the Includes slice within that.
Just create an interface for it:
type Configurable interface {
    Configuration() []string
}
And then provide a Configuration method instead of a field for your structs, and change the signature of your function to func ReadConfig(filename string, configuration Configurable).
It'd be much easier to just pass in the slice instead of the struct though.
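To make that concrete, here is a minimal sketch of the interface approach. The SetIncludes method is an addition not mentioned above: the recursion needs some way to clear the list, so an included file without an "Includes" key doesn't loop forever over its parent's entries.
type Configurable interface {
    Configuration() []string
    SetIncludes([]string)
}

func ReadConfig(filename string, configuration Configurable) error {
    file, err := os.Open(filename)
    if err != nil {
        return err
    }
    defer file.Close()
    if err := json.NewDecoder(file).Decode(configuration); err != nil {
        return err
    }
    // Grab the includes, then clear them before recursing.
    includes := configuration.Configuration()
    configuration.SetIncludes(nil)
    for _, inc := range includes {
        if err := ReadConfig(inc, configuration); err != nil {
            return err
        }
    }
    return nil
}
Any struct with an Includes field can then satisfy the interface with two small methods:
type MyConfig struct {
    Includes []string
    // ... other defs
}

func (c *MyConfig) Configuration() []string { return c.Includes }
func (c *MyConfig) SetIncludes(s []string)  { c.Includes = s }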
My application mostly consists of CRUD operations to/from MongoDB using the mongo-go-driver package. This function is one of the gRPC server's services, and all it does is call the database method action.GetProducts(ctx), which returns a *mongo.Cursor. The result is then decoded: for each document, I put the document's content into a singular product struct and append it to the products slice (the GetProductsResponse struct is generated from a gRPC proto with a repeated GetProductResponse field). After appending all products into GetProductsResponse, I return the response to the gRPC client.
I am also new to testing in general. How should I break down the function and do the mocking (how do I mock the cursor?) for unit testing? Is it even necessary to unit test this function when all it does is append results, or should I go straight for an integration test and skip the unit test, since it involves database I/O?
func (s *Server) GetProducts(ctx context.Context, in *pb.EmptyRequest) (*pb.GetProductsResponse, error) {
    cursor, err := action.GetProducts(ctx)
    if err != nil {
        return nil, err
    }
    products := pb.GetProductsResponse{}
    res := model.Product{}
    for cursor.Next(ctx) {
        // Convert document to above struct
        err := cursor.Decode(&res)
        if err != nil {
            return nil, fmt.Errorf("failed to decode document: %v", err)
        }
        product := &pb.GetProductResponse{ProductId: res.Product_id.Hex(), Name: res.Name, Price: res.Price, Qty: int32(res.Qty)}
        products.Products = append(products.Products, product)
    }
    return &products, nil
}
If you interact with the DB, it is not unit testing anymore, because you're integrating with another external system. Anyway, I usually define my "repository" layer functions this way:
package repo

var FetchUserById = func(id string) (*model.User, error) {
    // here the real logic
    return user, err
}
and then, when I have to test my "service" layer logic, I would mock the entire "repository" layer this way:
repo.FetchUserById = func(id string) (*model.User, error) {
    return myMockedUser, nil
}
I'm fetching a document from MongoDB and passing it into a function transform, e.g.
var doc map[string]interface{}
err := collection.FindOne(context.TODO(), filter).Decode(&doc)
result := transform(doc)
I want to write unit tests for transform, but I'm not sure how to mock a response from MongoDB. Ideally I want to set something like this up:
func TestTransform(t *testing.T) {
    byt := []byte(`
        {"hello": "world",
         "message": "apple"}
    `)
    var doc map[string]interface{}
    >>> Some method here to Decode byt into doc like the code above <<<
    out := transform(doc)
    expected := ...
    if diff := deep.Equal(expected, out); diff != nil {
        t.Error(diff)
    }
}
One way would be to json.Unmarshal into doc, but this sometimes gives different results. For example, if the document in MongoDB has an array in it, then that array is decoded into doc as a bson.A type, not a []interface{} type.
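One way to sidestep that mismatch is to decode the test fixture with the driver's own bson package instead of encoding/json; a minimal sketch, assuming go.mongodb.org/mongo-driver/bson is imported, since bson.UnmarshalExtJSON produces the same shapes (bson.A for arrays, and so on) that cursor.Decode does:
// Decode the JSON fixture with the driver's bson package so the
// resulting values match what cursor.Decode would produce.
var doc map[string]interface{}
if err := bson.UnmarshalExtJSON(byt, true, &doc); err != nil {
    t.Fatal(err)
}
out := transform(doc)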
A member of my team recently found a hidden gem inside the official MongoDB driver for Go: https://pkg.go.dev/go.mongodb.org/mongo-driver@v1.9.1/mongo/integration/mtest. Although the package is experimental and there is no backward compatibility guarantee for it, it can help you perform unit testing, at least with this version of the driver.
You can check this cool article with plenty of examples of how to use it: https://medium.com/@victor.neuret/mocking-the-official-mongo-golang-driver-5aad5b226a78. Additionally, here is the repository with the code samples for this article: https://github.com/victorneuret/mongo-go-driver-mock.
So, based on your example and the samples from the article, I think you could try something like the following (of course, you might need to tweak and experiment with this):
func TestTransform(t *testing.T) {
    mt := mtest.New(t, mtest.NewOptions().ClientType(mtest.Mock))
    defer mt.Close()

    mt.Run("find & transform", func(mt *mtest.T) {
        myCollection = mt.Coll
        expected := myStructure{...}

        mt.AddMockResponses(mtest.CreateCursorResponse(1, "foo.bar", mtest.FirstBatch, bson.D{
            {"_id", expected.ID},
            {"field-1", expected.Field1},
            {"field-2", expected.Field2},
        }))

        response, err := myFindFunction(expected.ID)
        if err != nil {
            t.Error(err)
        }

        out := transform(response)
        if diff := deep.Equal(expected, out); diff != nil {
            t.Error(diff)
        }
    })
}
Alternatively, you can do more realistic testing in an automated way via integration tests with Docker containers. There are a few good packages that can help you with this:
https://github.com/ory/dockertest
https://github.com/testcontainers/testcontainers-go
I have followed this approach with the dockertest library to automate a full integration-testing environment that can be set up and torn down via the go test -v -run Integration command. See a full example here: https://github.com/AnhellO/learn-dockertest/tree/master/mongo.
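For a flavor of what that looks like, here is a minimal sketch with dockertest; it assumes a local Docker daemon, imports of github.com/ory/dockertest/v3 and the official mongo driver, and an illustrative test name:
func TestMongoIntegration(t *testing.T) {
    pool, err := dockertest.NewPool("")
    if err != nil {
        t.Fatalf("could not connect to docker: %v", err)
    }
    // Start a throwaway MongoDB container for the duration of the test.
    resource, err := pool.Run("mongo", "5.0", nil)
    if err != nil {
        t.Fatalf("could not start container: %v", err)
    }
    defer pool.Purge(resource)

    var client *mongo.Client
    // Retry until the container is ready to accept connections.
    err = pool.Retry(func() error {
        var err error
        uri := fmt.Sprintf("mongodb://localhost:%s", resource.GetPort("27017/tcp"))
        client, err = mongo.Connect(context.Background(), options.Client().ApplyURI(uri))
        if err != nil {
            return err
        }
        return client.Ping(context.Background(), nil)
    })
    if err != nil {
        t.Fatalf("could not connect to mongo: %v", err)
    }
    // ... run real queries against client here ...
}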
Hope this helps.
The best solution for writing testable code would be to extract your data access into a DAO or data repository. You define an interface that returns what you need; this way, you can just use a mocked version for testing.
// repository.go
type ISomeRepository interface {
    Get(string) (*SomeModel, error)
}

type SomeRepository struct { ... }

func (r *SomeRepository) Get(id string) (*SomeModel, error) {
    // Handle the real repository access and return your object
}
When you need to mock it, just create a mock struct and implement the interface:
// repository_test.go
type SomeMockRepository struct { ... }

func (r *SomeMockRepository) Get(id string) (*SomeModel, error) {
    return &SomeModel{...}, nil
}

func TestSomething(t *testing.T) {
    // You can use your mock as an ISomeRepository
    var repo ISomeRepository = &SomeMockRepository{}
    someModel, err := repo.Get("123")
    // ...
}
This is best used with some kind of dependency injection, i.e. passing this repository as an ISomeRepository into the function that needs it.
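For instance, a minimal sketch of that injection (ProductService and its methods are illustrative names):
type ProductService struct {
    repo ISomeRepository
}

func NewProductService(repo ISomeRepository) *ProductService {
    return &ProductService{repo: repo}
}

func (s *ProductService) GetProduct(id string) (*SomeModel, error) {
    // The service only depends on the interface, so tests can hand it a mock.
    return s.repo.Get(id)
}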
Use the monkey library to hook any function from the mongo driver.
For example:
func insert(collection *mongo.Collection) (int, error) {
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()
    u := User{
        Name: "kevin",
        Age:  20,
    }
    res, err := collection.InsertOne(ctx, u)
    if err != nil {
        log.Printf("error: %v", err)
        return 0, err
    }
    id := res.InsertedID.(int)
    return id, nil
}
func TestInsert(t *testing.T) {
    var c *mongo.Collection
    var guard *monkey.PatchGuard
    guard = monkey.PatchInstanceMethod(reflect.TypeOf(c), "InsertOne",
        func(c *mongo.Collection, ctx context.Context, document interface{}, opts ...*options.InsertOneOptions) (*mongo.InsertOneResult, error) {
            guard.Unpatch()
            defer guard.Restore()
            log.Printf("record: %+v, collection: %s, database: %s", document, c.Name(), c.Database().Name())
            res := &mongo.InsertOneResult{
                InsertedID: 100,
            }
            return res, nil
        })

    collection := client.Database("db").Collection("person")
    id, err := insert(collection)
    require.NoError(t, err)
    assert.Equal(t, id, 100)
}
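One caveat: monkey patching rewrites function code at runtime, so the compiler must not inline the patched calls; running the tests with something like go test -gcflags=all=-l is usually needed, and the library only works on some platforms.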
What I am looking for is the equivalent of Document.parse() in Go, something that allows me to create BSON from JSON directly. I do not want to create intermediate Go structs for marshaling.
The gopkg.in/mgo.v2/bson package has a function called UnmarshalJSON which does exactly what you want.
The data parameter should hold your JSON string as a []byte value.
func UnmarshalJSON(data []byte, value interface{}) error
UnmarshalJSON unmarshals a JSON value that may hold non-standard syntax as defined in BSON's extended JSON specification.
Example:
var bdoc interface{}
err = bson.UnmarshalJSON([]byte(`{"id": 1,"name": "A green door","price": 12.50,"tags": ["home", "green"]}`), &bdoc)
if err != nil {
    panic(err)
}
err = c.Insert(&bdoc)
if err != nil {
    panic(err)
}
mongo-go-driver has a function bson.UnmarshalExtJSON that does the job.
Here's the example:
var doc interface{}
err := bson.UnmarshalExtJSON([]byte(`{"foo":"bar"}`), true, &doc)
if err != nil {
    // handle error
}
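If what you ultimately need is the raw BSON bytes rather than the decoded value, a short follow-up sketch: marshal the decoded document back out with bson.Marshal (the sample document is arbitrary).
// Decode extended JSON, then re-encode it as raw BSON bytes.
var doc interface{}
if err := bson.UnmarshalExtJSON([]byte(`{"foo":"bar"}`), true, &doc); err != nil {
    // handle error
}
raw, err := bson.Marshal(doc)
if err != nil {
    // handle error
}
// raw now holds the BSON-encoded document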
There is no longer a way to do this directly with supported libraries (e.g. the mongo-go-driver). You would need to write your own converter based on the bson spec.
Edit: here's one that by now has seen a few terabytes of use in prod.
https://github.com/dustinevan/mongo/blob/main/bsoncv/bsoncv.go
I do not want to create intermediate Go structs for marshaling
If you do want/need to create an intermediate Go BSON structs, you could use a conversion module such github.com/sindbach/json-to-bson-go. For example:
If you do want/need to create intermediate Go BSON structs, you could use a conversion module such as github.com/sindbach/json-to-bson-go. For example:
import (
    "fmt"

    "github.com/sindbach/json-to-bson-go/convert"
    "github.com/sindbach/json-to-bson-go/options"
)

func main() {
    doc := `{"foo": "buildfest", "bar": {"$numberDecimal":"2021"} }`
    opt := options.NewOptions()
    result, _ := convert.Convert([]byte(doc), opt)
    fmt.Println(result)
}
This will produce the following output:
package main

import "go.mongodb.org/mongo-driver/bson/primitive"

type Example struct {
    Foo string               `bson:"foo"`
    Bar primitive.Decimal128 `bson:"bar"`
}
This module is compatible with the official MongoDB Go driver, and as you can see it supports Extended JSON formats.
You can also visit https://json-to-bson-map.netlify.app to try the module in action. You can paste a JSON document, and see the Go BSON structs as output.
A simple converter that uses go.mongodb.org/mongo-driver/bson/bsonrw:
func JsonToBson(message []byte) ([]byte, error) {
    reader, err := bsonrw.NewExtJSONValueReader(bytes.NewReader(message), true)
    if err != nil {
        return []byte{}, err
    }
    buf := &bytes.Buffer{}
    writer, _ := bsonrw.NewBSONValueWriter(buf)
    err = bsonrw.Copier{}.CopyDocument(writer, reader)
    if err != nil {
        return []byte{}, err
    }
    marshaled := buf.Bytes()
    return marshaled, nil
}
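To check the round trip, a quick usage sketch (the sample document is arbitrary): decode the produced bytes back into a bson.D.
raw, err := JsonToBson([]byte(`{"foo": "bar"}`))
if err != nil {
    log.Fatal(err)
}
var doc bson.D
if err := bson.Unmarshal(raw, &doc); err != nil {
    log.Fatal(err)
}
fmt.Println(doc) // prints: [{foo bar}]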
I am using the GOB encoding for my project and I figured out (after a long fight) that empty strings are not encoded/decoded correctly. In my code I use an error message (string) to report any problems; this error message is empty most of the time. If I encode an empty string it becomes nothing, and that gives me a problem with decoding. I don't want to alter the encoding/decoding in general, because these parts are used the most.
How can I tell Go how to encode/decode empty strings?
Example:
Playground working code.
Playground not working code.
The problem isn't the encoding/gob module, but instead the custom MarshalBinary/UnmarshalBinary methods you've declared for Msg, which can't correctly round trip an empty string. There are two ways you could go here:
Get rid of the MarshalBinary/UnmarshalBinary methods and rely on GOB's default encoding for structures. This change alone won't be enough because the fields of the structure aren't exported. If you're happy to export the fields then this is the simplest option: https://play.golang.org/p/rwzxTtaIh2
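For illustration, a minimal sketch of this first option, assuming the struct's fields can simply be exported (the field names and types are placeholders for the ones in the question):
// With exported fields and no custom methods, encoding/gob round-trips
// the struct, including empty strings, on its own.
type Msg struct {
    X, Y, Z string
}

func roundTrip(in Msg) (Msg, error) {
    var buf bytes.Buffer
    if err := gob.NewEncoder(&buf).Encode(in); err != nil {
        return Msg{}, err
    }
    var out Msg
    err := gob.NewDecoder(&buf).Decode(&out)
    return out, err
}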
Use an encoding that can correctly round trip empty strings. One simple option would be to use GOB itself to encode the struct fields:
func (m Msg) MarshalBinary() ([]byte, error) {
    var b bytes.Buffer
    enc := gob.NewEncoder(&b)
    if err := enc.Encode(m.x); err != nil {
        return nil, err
    }
    if err := enc.Encode(m.y); err != nil {
        return nil, err
    }
    if err := enc.Encode(m.z); err != nil {
        return nil, err
    }
    return b.Bytes(), nil
}

// UnmarshalBinary modifies the receiver so it must take a pointer receiver.
func (m *Msg) UnmarshalBinary(data []byte) error {
    dec := gob.NewDecoder(bytes.NewBuffer(data))
    if err := dec.Decode(&m.x); err != nil {
        return err
    }
    if err := dec.Decode(&m.y); err != nil {
        return err
    }
    return dec.Decode(&m.z)
}
You can experiment with this example here: https://play.golang.org/p/oNXgt88FtK
The first option is obviously easier, but the second might be useful if your real example is a little more complex. Be careful with custom encoders though: GOB includes a few features that are intended to detect incompatibilities (e.g. if you add a field to a struct and try to decode old data), which are missing from this custom encoding.
I have a struct that contains math/big.Int fields. I would like to save the struct in MongoDB using mgo. Saving the numbers as strings is good enough in my situation.
I have looked at the available field tags and nothing seems to allow a custom serializer. I was expecting to implement an interface similar to encoding/json.Marshaler, but I have found no such interface in the documentation.
Here is a trivial example of what I need.
package main

import (
    "labix.org/v2/mgo"
    "math/big"
)

type Point struct {
    X, Y *big.Int
}

func main() {
    session, err := mgo.Dial("localhost")
    if err != nil {
        panic(err)
    }
    defer session.Close()
    c := session.DB("test").C("test")
    err = c.Insert(&Point{big.NewInt(1), big.NewInt(1)})
    if err != nil { // should not panic
        panic(err)
    }
    // The code runs as expected, but the fields X and Y are empty in mongo.
}
Thanks!
The similar interface is named bson.Getter:
http://labix.org/v2/mgo/bson#Getter
It can look similar to this:
func (point *Point) GetBSON() (interface{}, error) {
    return bson.D{{"x", point.X.String()}, {"y", point.Y.String()}}, nil
}
And there's also the counterpart interface in the setter side, if you're interested:
http://labix.org/v2/mgo/bson#Setter
To use it, note that the bson.Raw type provided as a parameter has an Unmarshal method, so you could have a type similar to:
type dbPoint struct {
    X string
    Y string
}
and unmarshal it conveniently:
var dbp dbPoint
err := raw.Unmarshal(&dbp)
and then use the dbp.X and dbp.Y strings to put the big ints back into the real (point *Point) being unmarshalled.
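Putting those pieces together, a sketch of the Setter side (the base-10 parsing and the error messages are illustrative choices):
func (point *Point) SetBSON(raw bson.Raw) error {
    var dbp dbPoint
    if err := raw.Unmarshal(&dbp); err != nil {
        return err
    }
    // Parse the stored strings back into big.Ints.
    point.X = new(big.Int)
    if _, ok := point.X.SetString(dbp.X, 10); !ok {
        return fmt.Errorf("invalid big.Int string for X: %q", dbp.X)
    }
    point.Y = new(big.Int)
    if _, ok := point.Y.SetString(dbp.Y, 10); !ok {
        return fmt.Errorf("invalid big.Int string for Y: %q", dbp.Y)
    }
    return nil
}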