How to get DecodeBytes() output without canonical extended JSON additions - mongodb

I use the DecodeBytes() function to get data from MongoDB (since the structure of the data can vary) with the mongo-driver for Go.
My problem arises when one of the values is an int/double (and not a string).
In that case it adds canonical extended JSON wrappers, for example 3 becomes "$numberDouble": "3.0".
How can I remove those canonical extended JSON additions?
func (m *Mongoclient) Find(collection string, filter interface{}) string {
    findResult := m.Db.Collection(collection).FindOne(m.Ctx, filter)
    if findResult.Err() != nil {
        fmt.Println(findResult.Err().Error())
        return ""
    }
    db, err := findResult.DecodeBytes()
    if err != nil {
        fmt.Println(err.Error())
        return ""
    }
    return db.String()
}

The solution was to use the Decode function into a bson.M and then json.Marshal it:
var document bson.M
err := findResult.Decode(&document)
if err != nil {
    fmt.Println(err.Error())
    return ""
}
resBytes, err := json.Marshal(document)
if err != nil {
    fmt.Println(err)
    return ""
}
return string(resBytes)
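An alternative, if you want to stay on DecodeBytes, is the driver's bson.MarshalExtJSON, which can emit relaxed (non-canonical) extended JSON straight from the bson.Raw result. A minimal sketch of the same Find body, assuming the imports from the question:

raw, err := findResult.DecodeBytes()
if err != nil {
    fmt.Println(err.Error())
    return ""
}
// canonical=false selects relaxed extended JSON, so numeric values come out
// as plain 3 instead of {"$numberDouble": "3.0"}.
resBytes, err := bson.MarshalExtJSON(raw, false, false)
if err != nil {
    fmt.Println(err.Error())
    return ""
}
return string(resBytes)

Keep in mind that relaxed extended JSON still wraps some types (dates, binary, etc.), so the Decode-into-bson.M approach above remains the way to get ordinary JSON for every field.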

Related

How to have mongodb decode struct passed into function

The Decode step of the following code does not populate the original document object correctly. It overwrites it with a bson object.
func main() {
    c := Call{}
    dbGetObject("collection", &c)
}

func dbGetObject(collectionName string, document interface{}) (err error) {
    uri, creds, auth := dbGetAuth()
    clientOpts := options.Client().ApplyURI(uri).SetAuth(creds)
    client, err := mongo.Connect(context.TODO(), clientOpts)
    if err != nil {
        log.Fatal(err)
        return err
    }
    defer client.Disconnect(context.TODO())
    collection := client.Database(auth.Database).Collection(collectionName)
    err = collection.FindOne(context.TODO(), bson.M{"number": "12345"}).Decode(&document)
    if err != nil {
        log.Fatal(err)
        return err
    }
    return nil
}
Yet the following code does work properly:
func dbGetObject(collectionName string) (err error) {
    uri, creds, auth := dbGetAuth()
    clientOpts := options.Client().ApplyURI(uri).SetAuth(creds)
    client, err := mongo.Connect(context.TODO(), clientOpts)
    if err != nil {
        log.Fatal(err)
        return err
    }
    defer client.Disconnect(context.TODO())
    collection := client.Database(auth.Database).Collection(collectionName)
    c := Call{}
    err = collection.FindOne(context.TODO(), bson.M{"number": "12345"}).Decode(&c)
    if err != nil {
        log.Fatal(err)
        return err
    }
    return nil
}
The only difference is that the instance of the struct is passed into the function in one case and instantiated inside dbGetObject in the other. What am I doing wrong?
In the first example, the type of document is interface{}.
If you fix the argument type as below, it will work correctly:
func dbGetObject(collectionName string, document *Call)
I actually realized what is going on. I pass a pointer into the function and then take its address again in the Decode call, so Decode receives a pointer to a pointer. The fix is to change the Decode call from:
err = collection.FindOne(context.TODO(), bson.M{"number": "12345"}).Decode(&document)
to
err = collection.FindOne(context.TODO(), bson.M{"number": "12345"}).Decode(document)
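For completeness, here is a rough corrected version of the original function that combines both observations (dbGetAuth, Call, and the hard-coded filter are taken from the question):

func dbGetObject(collectionName string, document interface{}) error {
    uri, creds, auth := dbGetAuth()
    clientOpts := options.Client().ApplyURI(uri).SetAuth(creds)
    client, err := mongo.Connect(context.TODO(), clientOpts)
    if err != nil {
        return err
    }
    defer client.Disconnect(context.TODO())
    collection := client.Database(auth.Database).Collection(collectionName)
    // document is already a pointer (e.g. *Call), so pass it through as-is
    // rather than taking its address again.
    return collection.FindOne(context.TODO(), bson.M{"number": "12345"}).Decode(document)
}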

Custom BSON marshal and unmarshal using mongo-driver

I have a struct like the one below. I also store a raw protobuf version of the same struct in the DB. Every time I fetch or save data to Mongo, I have to update ReallyBigRaw from the proto when saving, and when fetching I have to unmarshal ReallyBigRaw into ReallyBigObj to build responses. Is there a way I can implement some interface or provide some callback functions so that the mongo driver does this automatically before saving or fetching data from the DB?
Also, I am using the official Go mongo driver, not mgo; I have read some answers describing how this can be done with the mgo library.
import (
    "github.com/golang/protobuf/jsonpb"
    "go.mongodb.org/mongo-driver/bson"
    "go.mongodb.org/mongo-driver/bson/primitive"

    proto "github.com/dinesh/api/go"
)

type ReallyBig struct {
    ID      string `bson:"_id,omitempty"`
    DraftID string `bson:"draft_id,omitempty"`
    // Marshaled ReallyBigObj proto to map[string]interface{} stored in DB
    ReallyBigRaw map[string]interface{} `bson:"raw,omitempty"`
    ReallyBigObj *proto.ReallyBig       `bson:"-"`
    CreatedAt    primitive.DateTime     `bson:"created_at,omitempty"`
    UpdatedAt    primitive.DateTime     `bson:"updated_at,omitempty"`
}
func (r *ReallyBig) GetProto() (*proto.ReallyBig, error) {
    if r.ReallyBigObj != nil {
        return r.ReallyBigObj, nil
    }
    Obj, err := getProto(r.ReallyBigRaw)
    if err != nil {
        return nil, err
    }
    r.ReallyBigObj = Obj
    return r.ReallyBigObj, nil
}

func getRaw(r *proto.ReallyBig) (map[string]interface{}, error) {
    m := jsonpb.Marshaler{}
    b := bytes.NewBuffer([]byte{})
    // marshals proto to json format
    err := m.Marshal(b, r)
    if err != nil {
        return nil, err
    }
    var raw map[string]interface{}
    // unmarshal the raw data to an interface
    err = json.Unmarshal(b.Bytes(), &raw)
    if err != nil {
        return nil, err
    }
    return raw, nil
}
func getProto(raw map[string]interface{}) (*proto.ReallyBig, error) {
    b, err := json.Marshal(raw)
    if err != nil {
        return nil, err
    }
    u := jsonpb.Unmarshaler{}
    var reallyBigProto proto.ReallyBig
    err = u.Unmarshal(bytes.NewReader(b), &reallyBigProto)
    if err != nil {
        return nil, err
    }
    return &reallyBigProto, nil
}
I implemented the Marshaler and Unmarshaler interfaces. Since the mongo driver calls MarshalBSON and UnmarshalBSON when the type implements Marshaler and Unmarshaler, we would end up in an infinite loop. To avoid that, we create an alias of the type. An alias type in Go inherits only the fields, not the methods, so inside these methods we end up calling the normal bson.Marshal and bson.Unmarshal:
func (r *ReallyBig) MarshalBSON() ([]byte, error) {
    type ReallyBigAlias ReallyBig
    reallyBigRaw, err := getRaw(r.ReallyBigObj)
    if err != nil {
        return nil, err
    }
    r.ReallyBigRaw = reallyBigRaw
    return bson.Marshal((*ReallyBigAlias)(r))
}

func (r *ReallyBig) UnmarshalBSON(data []byte) error {
    type ReallyBigAlias ReallyBig
    err := bson.Unmarshal(data, (*ReallyBigAlias)(r))
    if err != nil {
        return err
    }
    reallyBigProto, err := getProto(r.ReallyBigRaw)
    if err != nil {
        return err
    }
    r.ReallyBigObj = reallyBigProto
    return nil
}
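With those two methods in place, the normal driver calls pick them up automatically. A rough usage sketch (ctx, the collection, and someProto are assumed here, not taken from the answer):

func roundTrip(ctx context.Context, collection *mongo.Collection, someProto *proto.ReallyBig) (*ReallyBig, error) {
    // Saving: MarshalBSON fills ReallyBigRaw from ReallyBigObj before encoding.
    doc := &ReallyBig{ID: "1", ReallyBigObj: someProto}
    if _, err := collection.InsertOne(ctx, doc); err != nil {
        return nil, err
    }
    // Loading: UnmarshalBSON rebuilds ReallyBigObj from the stored raw map.
    var out ReallyBig
    if err := collection.FindOne(ctx, bson.M{"_id": "1"}).Decode(&out); err != nil {
        return nil, err
    }
    return &out, nil
}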

Deserialize cursor into array with mongo-go-driver and interface

I am creating an API in Go and I would like to write some functional tests, so I created an interface to abstract my database. For that I need to be able to convert the cursor into an array without knowing the type.
func (self *KeyController) GetKey(c echo.Context) (err error) {
    var res []dto.Key
    err = db.Keys.Find(bson.M{}, 10, 0, &res)
    if err != nil {
        fmt.Println(err)
        return c.String(http.StatusInternalServerError, "internal error")
    }
    c.JSON(http.StatusOK, res)
    return
}

// THE FIND FUNCTION ON THE DB PACKAGE
func (s MongoCollection) Find(filter bson.M, limit int, offset int, res interface{}) (err error) {
    ctx := context.Background()
    var cursor *mongo.Cursor
    l := int64(limit)
    o := int64(offset)
    objectType := reflect.TypeOf(res).Elem()
    cursor, err = s.c.Find(ctx, filter, &options.FindOptions{
        Limit: &l,
        Skip:  &o,
    })
    if err != nil {
        return
    }
    defer cursor.Close(ctx)
    for cursor.Next(ctx) {
        result := reflect.New(objectType).Interface()
        err := cursor.Decode(&result)
        if err != nil {
            panic(err)
        }
        res = append(res.([]interface{}), result)
    }
    return
}
Does someone have an idea?
You can directly call the "All" method:
ctx := context.Background()
err = cursor.All(ctx, res)
if err != nil {
    fmt.Println(err.Error())
}
For reference:
https://godoc.org/go.mongodb.org/mongo-driver/mongo#Cursor.All
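Applied to the Find method from the question, the whole manual decode loop can go away. A sketch, assuming the same MongoCollection type and that res is a pointer to a slice (e.g. *[]dto.Key):

func (s MongoCollection) Find(filter bson.M, limit int, offset int, res interface{}) error {
    ctx := context.Background()
    l := int64(limit)
    o := int64(offset)
    cursor, err := s.c.Find(ctx, filter, &options.FindOptions{
        Limit: &l,
        Skip:  &o,
    })
    if err != nil {
        return err
    }
    // cursor.All decodes every remaining document into the slice that res
    // points to and closes the cursor when it is done.
    return cursor.All(ctx, res)
}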
I think you want to encapsulate the Find method for a mongo query.
Using the reflect package, I have improved your code by adding an additional parameter that serves as a template for instantiating new instances of the slice items.
func (m *MongoDbModel) FindAll(database string, colname string, obj interface{}, parameter map[string]interface{}) ([]interface{}, error) {
    var list = make([]interface{}, 0)
    collection, err := m.Client.Database(database).Collection(colname).Clone()
    objectType := reflect.TypeOf(obj).Elem()
    fmt.Println("objectype", objectType)
    if err != nil {
        log.Println(err)
        return nil, err
    }
    filter := bson.M{}
    filter["$and"] = []bson.M{}
    for key, value := range parameter {
        filter["$and"] = append(filter["$and"].([]bson.M), bson.M{key: value})
    }
    cur, err := collection.Find(context.Background(), filter)
    if err != nil {
        log.Fatal(err)
    }
    defer cur.Close(context.Background())
    for cur.Next(context.Background()) {
        result := reflect.New(objectType).Interface()
        err := cur.Decode(result)
        if err != nil {
            log.Println(err)
            return nil, err
        }
        list = append(list, result)
    }
    if err := cur.Err(); err != nil {
        return nil, err
    }
    return list, nil
}
The difference is that the FindAll method returns []interface{}, and err := cur.Decode(result) directly consumes a pointer such as the result variable.
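Because FindAll returns []interface{}, the caller has to assert each element back to the concrete type. A rough usage sketch (the database and collection names, the filter, and dto.Key are placeholders, not from the answer):

func loadKeys(m *MongoDbModel) ([]dto.Key, error) {
    results, err := m.FindAll("mydb", "keys", &dto.Key{}, map[string]interface{}{"active": true})
    if err != nil {
        return nil, err
    }
    keys := make([]dto.Key, 0, len(results))
    for _, r := range results {
        // Each element holds the *dto.Key created via reflect.New inside FindAll.
        keys = append(keys, *(r.(*dto.Key)))
    }
    return keys, nil
}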

Optional Find in .Find() MongoDB query

My listing may receive a filter parameter, but this parameter is not mandatory.
status := r.FormValue("status")

var bet []*Bet
if err := db.C(collectionName).Find(bson.M{"status": status}).Sort("-data-criacao").All(&bet); err != nil {
    http.Error(w, err.Error(), http.StatusInternalServerError)
    return
}
If the parameter is not provided, this query returns no results.
To return all the results, I used to do the following:
var bet []*Bet
if err := db.C(collectionName).Find(nil).Sort("-data-criacao").All(&bet); err != nil {
    http.Error(w, err.Error(), http.StatusInternalServerError)
    return
}
How can I handle both cases?
Simply use an if statement to construct your query based on whether the parameter is supplied.
Something like this:
status := r.FormValue("status")

var bet []*Bet
var filter bson.M
if status != "" {
    filter = bson.M{"status": status}
}

err := db.C(collectionName).Find(filter).Sort("-data-criacao").All(&bet)
if err != nil {
    http.Error(w, err.Error(), http.StatusInternalServerError)
    return
}
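Note that this answer targets mgo, where a nil selector means "match everything". If you are on the official mongo-driver instead, Find does not accept a nil filter, so the same idea is usually written with an empty document; a rough sketch (the collection, context, and Bet slice are assumed):

func findBets(ctx context.Context, coll *mongo.Collection, status string) ([]*Bet, error) {
    filter := bson.M{} // an empty document matches all documents
    if status != "" {
        filter["status"] = status
    }
    opts := options.Find().SetSort(bson.D{{Key: "data-criacao", Value: -1}})
    cursor, err := coll.Find(ctx, filter, opts)
    if err != nil {
        return nil, err
    }
    var bets []*Bet
    if err := cursor.All(ctx, &bets); err != nil {
        return nil, err
    }
    return bets, nil
}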

how to implement a quoted printable decoder into reading the body of an email?

The following code (part of the full code) creates a reader, then reads the body of an email and stores it in finalBody. This finalBody is then passed to MongoDB for archiving. However, the message body that is read is quoted-printable, and I want the body passed into MongoDB to be decoded into UTF-8. How do I implement the quoted-printable package in this code, and where exactly?
// Creates a reader.
mediaType, params, err := mime.ParseMediaType(contentType)
if err != nil {
    log.Println("Unable to read the type of the content.")
    log.Println(err)
    return
}
reader := multipart.NewReader(msg.Body, params["boundary"])

// Reads the body
finalBody := ""
if strings.HasPrefix(mediaType, "multipart/") {
    for {
        p, err := reader.NextPart()
        if err == io.EOF {
            break
        }
        if err != nil {
            log.Println(err)
            return
        }
        slurp, err := ioutil.ReadAll(p)
        if err != nil {
            log.Println(err)
            return
        }
        finalBody += string(slurp)
    }
} else {
    txt, err := ioutil.ReadAll(msg.Body)
    if err != nil {
        log.Fatal(err)
    }
    finalBody += string(txt)
}
And this segment passes the final body into MongoDB:
importMsg := &importer.Mail{
    Body: finalBody,
}

// Saves in MongoDB
dal := importer.NewMailDAO(c, mongo)
dal.Save(importMsg)
}
Use the quotedprintable package. When the encoding of a body element is quoted-printable, slurp up the text with:
slurp, err := ioutil.ReadAll(quotedprintable.NewReader(r))
where r is either the body or a part.
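For the non-multipart branch of the code above, one way to wire this in looks roughly like the following sketch (it assumes msg is a *mail.Message and needs the io, strings, and mime/quotedprintable imports; decoding is gated on the Content-Transfer-Encoding header):

var r io.Reader = msg.Body
// Only decode when the message declares a quoted-printable body.
if strings.EqualFold(msg.Header.Get("Content-Transfer-Encoding"), "quoted-printable") {
    r = quotedprintable.NewReader(r)
}
txt, err := ioutil.ReadAll(r)
if err != nil {
    log.Fatal(err)
}
finalBody += string(txt)

As the answer above notes, a part in the multipart loop can be wrapped the same way, passing the part as the reader.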