I have some text in a file like below:
#cat tmp
host = "192.168.2.80"
port = 5432
user = "pnmsuser"
password = "PNMS$$$$$$"
dbname = "pnms"
I just want the text like below after trimming:
"192.168.2.80"
5432
"pnmsuser"
"PNMS$$$$$$"
"pnms"
I tried to trim it like below:
func dbFileTrimming() {
    dat, err := ioutil.ReadFile("tmp")
    check(err)
    for key, line := range strings.Split(strings.TrimRight(string(dat), "\n"), "\n") {
        // println(key, line)
        if key == 3 {
            line := string([]rune(line)[11:])
            fmt.Println(line)
        } else if key == 4 {
            line := string([]rune(line)[9:])
            fmt.Println(line)
        } else {
            line := string([]rune(line)[7:])
            fmt.Println(line)
        }
    }
}
Is there a simple method for this?
Chop the line after the =:
for _, line := range strings.Split(strings.TrimRight(string(dat), "\n"), "\n") {
    line = line[strings.Index(line, " = ")+3:]
    fmt.Println(line)
}
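If a line might not contain " = " at all, strings.Index returns -1 and the code above would silently chop the first two characters instead; a slightly more defensive sketch (same idea, using strings.SplitN) skips such lines:
for _, line := range strings.Split(strings.TrimRight(string(dat), "\n"), "\n") {
    parts := strings.SplitN(line, " = ", 2)
    if len(parts) != 2 {
        continue // no " = " separator on this line; skip it
    }
    fmt.Println(parts[1])
}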
This looks like an INI file, or similar enough so that libraries like go-ini could work.
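For example, a minimal sketch using go-ini (assuming the gopkg.in/ini.v1 import path; keys that appear before any [section] header land in the default, unnamed section, and the library will typically strip the surrounding double quotes from quoted values):
package main

import (
    "fmt"
    "log"

    "gopkg.in/ini.v1"
)

func main() {
    cfg, err := ini.Load("tmp") // same file as in the question
    if err != nil {
        log.Fatal(err)
    }
    sec := cfg.Section("") // default section holds keys without a [section] header
    for _, name := range []string{"host", "port", "user", "password", "dbname"} {
        fmt.Println(sec.Key(name).String())
    }
}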
Alternatively, try strings.Split() and put the result in a map. Quick and dirty / untested:
result := map[string]string{}
for _, line := range strings.Split(string(dat), "\n") {
    split := strings.SplitN(line, "=", 2)
    if len(split) != 2 {
        continue // skip blank or malformed lines
    }
    key := strings.TrimSpace(split[0])
    value := strings.TrimSpace(split[1])
    result[key] = value
}
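Put together, a runnable sketch of that map-based approach might look like this (using os.ReadFile, the current replacement for ioutil.ReadFile; the values keep their surrounding quotes):
package main

import (
    "fmt"
    "os"
    "strings"
)

func main() {
    dat, err := os.ReadFile("tmp")
    if err != nil {
        panic(err)
    }
    result := map[string]string{}
    for _, line := range strings.Split(string(dat), "\n") {
        split := strings.SplitN(line, "=", 2)
        if len(split) != 2 {
            continue // skip blank or malformed lines
        }
        result[strings.TrimSpace(split[0])] = strings.TrimSpace(split[1])
    }
    fmt.Println(result["host"])   // "192.168.2.80"
    fmt.Println(result["dbname"]) // "pnms"
}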
I have an endpoint where users can filter a mongo collection using query parameters. If I have just one query parameter e.g. title, I can do this -
filter := bson.M{}
if params.Title != "" {
    filter = bson.M{"title": params.Title}
}
However, if I have more than one query parameter, I can't seem to get how to append to the bson object.
I tried this -
filter := []bson.M{}
if params.Title != "" {
    filter = append(filter, bson.M{"title": params.Title})
}
if params.Description != "" {
    filter = append(filter, bson.M{"description": params.Description})
}
but I got this error - cannot transform type []primitive.M to a BSON Document: WriteArray can only write a Array while positioned on a Element or Value but is positioned on a TopLevel
How do I solve this?
bson.M is defined as map[string]interface{} in the mongo-go-driver, so if you need to add more elements you cannot append; just assign the value to the map's key, as below.
filter := bson.M{}
if params.Title != "" {
    // filter = bson.M{"title": params.Title}
    filter["title"] = params.Title
}
if params.Description != "" {
    filter["description"] = params.Description
}
Consider a collection test with a document: { "_id" : 1, "Title" : "t-1", "Description" : "d-1" }
You can use the following:
title := "t-1"
description := "" // or "d-1"
filter := bson.M{}
if title != "" {
    filter["Title"] = title
}
if description != "" {
    filter["Description"] = description
}
// fmt.Println(filter)
var result bson.M
collection := client.Database("test").Collection("test")
err := collection.FindOne(context.TODO(), filter).Decode(&result)
if err != nil {
    log.Fatal(err)
}
fmt.Printf("Found a single document: %+v\n", result)
SQL:
CREATE TABLE public.tiantang_page (
    href varchar NOT NULL,
    status int4 NOT NULL,
    description varchar NOT NULL,
    urls url[] NULL
);
CREATE TYPE url AS (
    url varchar,
    status int4
);
Insert composite type array:
type url struct {
    url    string
    status int
}

var urls [1]url
urls[0] = url{
    url:    "",
    status: 0,
}

update := "UPDATE \"public\".\"tiantang_page\" SET \"urls\"=$1 where \"href\"=$2;"
r, err := db.Exec(update, pq.Array(urls), href)
if err != nil {
    log.Fatal(err)
}
Error:
sql: converting argument $1 type: unsupported type parsetest.url, a struct
Library:
https://godoc.org/github.com/lib/pq
Note that custom composite types are not fully supported by lib/pq.
If all you want is to be able to store the urls then the simplest approach would be to implement the driver.Valuer interface on the url type and then use it as you do with pq.Array:
func (u url) Value() (driver.Value, error) {
    return fmt.Sprintf("(%s,%d)", u.url, u.status), nil
}

// ...

r, err := db.Exec(update, pq.Array(urls), href)
more info on that can be found here: https://github.com/lib/pq/issues/544
Note that I haven't tried this with arrays, only with slices, so you may have to switch from using an array to using a slice, i.e. instead of var urls [1]url you would use var urls = make([]url, 1).
If you also want to be able to retrieve the array of urls back from the db, then you'll have to implement the sql.Scanner interface; however, pq.Array is not very reliable here, and you'll have to implement the scanner on the slice type and do all the parsing yourself.
The general format of composite types is (val1, val2, ...); note that you have to put double quotes around values that contain commas or parentheses. For example, to construct a value of the url type you would use the literal expression: (http://example.com,4). More info in the docs.
The format for an array of composite types is {"(val1, val2, ...)" [, ...]}; note that in this case, if you need to put double quotes around the values, you need to escape them. For example: {"(http://example.com,4)","(\"http://example.com/?list=foo,bar,baz\",3)"}
So as you can see the more complex the data in the composite type the more complex will be the parsing as well.
Here's a crude example (does not handle quoted values):
type urlslice []url

func (s *urlslice) Scan(src interface{}) error {
    var a []byte // the pq array as bytes
    switch v := src.(type) {
    case []byte:
        a = v
    case string:
        a = []byte(v)
    case nil:
        *s = nil
        return nil
    default:
        return fmt.Errorf("urlslice.Scan unexpected src type %T", src)
    }
    a = a[1 : len(a)-1] // drop curly braces
    for i := 0; i < len(a); i++ {
        if a[i] == '"' && (len(a) > (i+1) && a[i+1] == '(') { // element start?
            i += 2 // move past `"(`
            j := i // start of url.url
            u := url{}
            for ; i < len(a) && a[i] != ','; i++ {
            }
            u.url = string(a[j:i])
            i += 1 // move past `,`
            j = i  // start of url.status
            for ; i < len(a) && a[i] != ')'; i++ {
            }
            i64, err := strconv.ParseInt(string(a[j:i]), 10, 64)
            if err != nil {
                return err
            }
            u.status = int(i64)
            *s = append(*s, u)
            i += 2 // move past `)",`
        }
    }
    return nil
}
For completeness, here's the Valuer interface implemented by the slice type, again not handling proper quoting of values that may require it:
func (s urlslice) Value() (driver.Value, error) {
    data := []byte{'{'}
    for _, url := range s {
        data = append(data, '"', '(')
        data = append(data, []byte(url.url)...)
        data = append(data, ',')
        data = strconv.AppendInt(data, int64(url.status), 10)
        data = append(data, ')', '"', ',')
    }
    data[len(data)-1] = '}' // replace last ',' with '}' to close the array
    return data, nil
}
With the urlslice implementing the two interfaces directly you can stop using pq.Array.
var urls = urlslice{{
    url:    "http://example.com",
    status: 4,
}}
update := `UPDATE "public"."tiantang_page" SET "urls"=$1 where "href"=$2`
r, err := db.Exec(update, urls, href)
if err != nil {
    log.Fatal(err)
}

var urls2 urlslice
selurls := `SELECT "urls" FROM "public"."tiantang_page" where "href" = $1`
if err := db.QueryRow(selurls, href).Scan(&urls2); err != nil {
    log.Fatal(err)
}
Please keep in mind that both of the above examples should be considered only as hints of the direction to take in solving this problem. Not only are the two examples incomplete in that they don't handle quoted values, but they are also not very elegant implementations.
Reasonably complete composite literal parser:
type parseState int

const (
    state_initial     parseState = iota // start
    state_value_start                   // no bytes read from value yet
    state_value                         // unquoted value
    state_quoted                        // inside quote
    state_value_end                     // after a close quote
    state_end                           // after close paren
)

func parseComposite(in []byte) ([]string, error) {
    state := state_initial
    ret := []string{}
    val := []byte{}
    for _, b := range in {
        switch state {
        case state_initial:
            if b != '(' {
                return nil, fmt.Errorf("initial character not '(': %v", in)
            } else {
                state = state_value_start
            }
        case state_value_start:
            if b == '"' {
                state = state_quoted
                continue
            }
            fallthrough
        case state_value:
            if b == ',' {
                ret = append(ret, string(val))
                val = nil
                state = state_value_start
            } else if b == ')' {
                ret = append(ret, string(val))
                val = nil
                state = state_end
            } else {
                val = append(val, b)
            }
        case state_quoted:
            if b == '"' {
                ret = append(ret, string(val))
                val = nil
                state = state_value_end
            } else {
                val = append(val, b)
            }
        case state_value_end:
            if b == ',' {
                state = state_value_start
            } else if b == ')' {
                state = state_end
            } else {
                return nil, fmt.Errorf("invalid delimiter after closing quote: %v", in)
            }
        case state_end:
            return nil, fmt.Errorf("trailing bytes: %v", in)
        }
    }
    if state != state_end {
        return nil, fmt.Errorf("unterminated value: %v", in)
    }
    return ret, nil
}
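For example, a quick sketch of how parseComposite could be used to decode a single url value (a hypothetical literal; scanning a whole urls array would still require splitting the outer {...} into its "(...)" elements first):
fields, err := parseComposite([]byte(`("http://example.com/?list=foo,bar,baz",3)`))
if err != nil {
    log.Fatal(err)
}
// fields is now []string{"http://example.com/?list=foo,bar,baz", "3"}
status, err := strconv.Atoi(fields[1])
if err != nil {
    log.Fatal(err)
}
u := url{url: fields[0], status: status}
fmt.Printf("%+v\n", u)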
Using a Go API I'm retrieving an array of objects, like given below:
[
{0 1 Sunday 1 21600 25200 1}
{0 1 Sunday 2 28800 32400 2}
{0 1 Sunday 3 36000 39600 1}
]
This data is arranged using this struct:
type ProviderSpot struct {
    Id        int    `json:"_id" bson:"_id"`
    PId       int    `json:"pid" bson:"pid"`
    Day       string `json:"day" bson:"day"`
    TimeSlug  int    `json:"time_slug" bson:"time_slug"`
    StartTime int64  `json:"start_time" bson:"start_time"`
    EndTime   int64  `json:"end_time" bson:"end_time"`
    Count     int    `json:"count" bson:"count"`
}

type ProviderSpots []ProviderSpot
In the array above, each object has a count value (1, 2, 1). Every record should first be saved once, with a count of 1, in the available_spot collection; that leaves remaining counts of 0, 1, 0. Any record whose remaining count is still greater than 0 should then be saved that many more times in the addition_spot collection. The Go code I'm using is this:
func SaveProviderSpot(c *gin.Context) {
    response := ResponseController{}
    values := c.PostForm("array")
    var err error
    byt := []byte(values)
    var result models.ProviderSpots
    if err = json.Unmarshal(byt, &result); err != nil {
        fmt.Println(err)
    }
    fmt.Println(result)
    for i := 0; i < len(result); i++ {
        lastValue := result[i].Count - 1
        if lastValue != -1 {
            providerspot.PId = result[i].PId
            providerspot.Day = result[i].Day
            providerspot.TimeSlug = result[i].TimeSlug
            providerspot.StartTime = result[i].StartTime
            providerspot.EndTime = result[i].EndTime
            providerspot.Count = result[i].Count - lastValue
            id, _ := models.GetAutoIncrementCounter(config.ProvidersSpotsCounterId, config.ProvidersSpotsCollection)
            providerspot.Id = id
            fmt.Println("Here We go now :- ", &providerspot)
            err = models.AddProviderSpot(&providerspot)
        }
    }
}
Please give an example that would solve this. Thanks for spending your valuable time on this question.
I found an answer myself, but can anyone tell me whether this is right for my code or not:
func SaveProviderSpot(c *gin.Context) {
    response := ResponseController{}
    values := c.PostForm("array")
    var err error
    byt := []byte(values)
    var result models.ProviderSpots
    if err = json.Unmarshal(byt, &result); err != nil {
        fmt.Println(err)
    }
    fmt.Println(result)
    for i := 0; i < len(result); i++ {
        for j := 1; j <= result[i].Count; j++ {
            // lastValue := result[i].Count - 1
            // if lastValue != -1 {
            if j == 1 {
                providerspot.PId = result[i].PId
                providerspot.Day = result[i].Day
                providerspot.TimeSlug = result[i].TimeSlug
                providerspot.StartTime = result[i].StartTime
                providerspot.EndTime = result[i].EndTime
                providerspot.Count = 1 // result[i].Count - lastValue
                id, _ := models.GetAutoIncrementCounter(config.ProvidersSpotsCounterId, config.ProvidersSpotsCollection)
                providerspot.Id = id
                fmt.Println("Here We go now :- ", &providerspot)
                err = models.AddProviderSpot(&providerspot)
            } else {
                providerspot.PId = result[i].PId
                providerspot.Day = result[i].Day
                providerspot.TimeSlug = result[i].TimeSlug
                providerspot.StartTime = result[i].StartTime
                providerspot.EndTime = result[i].EndTime
                providerspot.Count = 1 // result[i].Count - lastValue
                id, _ := models.GetAutoIncrementCounter(config.AdditionalProviderCounterSpot, config.AdditionalProviderSpot)
                providerspot.Id = id
                err = models.AddAdditionalProviderSpot(&providerspot)
            }
        }
    }
}
This does what I want, but I'm not sure whether it is the right approach.
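Your logic looks correct for what you describe. As a sketch only (reusing the same providerspot, models, and config identifiers from your code), the duplication can be reduced by filling the struct once per iteration and branching only on which counter and collection to use:
for i := 0; i < len(result); i++ {
    for j := 1; j <= result[i].Count; j++ {
        providerspot.PId = result[i].PId
        providerspot.Day = result[i].Day
        providerspot.TimeSlug = result[i].TimeSlug
        providerspot.StartTime = result[i].StartTime
        providerspot.EndTime = result[i].EndTime
        providerspot.Count = 1

        if j == 1 {
            // first occurrence goes into the provider spots collection
            id, _ := models.GetAutoIncrementCounter(config.ProvidersSpotsCounterId, config.ProvidersSpotsCollection)
            providerspot.Id = id
            err = models.AddProviderSpot(&providerspot)
        } else {
            // remaining occurrences go into the additional provider spots collection
            id, _ := models.GetAutoIncrementCounter(config.AdditionalProviderCounterSpot, config.AdditionalProviderSpot)
            providerspot.Id = id
            err = models.AddAdditionalProviderSpot(&providerspot)
        }
        if err != nil {
            fmt.Println(err)
        }
    }
}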
I'm parsing a form in Go and I frequently find groups of checkboxes which need to be processed into text like so:
[ ] Foo
[x] Bar
[ ] Baz
[x] Bat
where the output should be a comma-separated list "BarText, BatText" corresponding to the checked items, or "None" if none of the items are checked. What is a good way to handle this situation? Repeating the logic each time seems like a bad idea.
In the spirit of YAGNI there's no need to handle possible future changes like translations into other languages (actually this example is highly unlikely to be useful in the present context).
Efficiency is unimportant for this application.
Edit: code looks like this (source):
func handleCheckboxesForm(w http.ResponseWriter, r *http.Request) {
    b := func(name string) bool { // helper function for boolean values in the form
        return r.FormValue(name) == "on"
    }
    text := "Header stuff here"
    mytext := ""
    if b("nfpa-alk") {
        mytext += ", alkaline"
    }
    if b("nfpa-acid") {
        mytext += ", acid"
    }
    if b("nfpa-w") {
        mytext += ", reacts violently with water"
    }
    if b("nfpa-alk") || b("nfpa-acid") || b("nfpa-w") {
        text += mytext[2:] + "\n"
    } else {
        text += "none\n"
    }
    // lots of other checkbox groups here
    // do stuff with text
}
There is a lot of repeated code in yours which can be factored out.
Your code must contain at least the following "fragments":
The mappings from entry name to entry text, which can be stored in a map, e.g.
var mappings = map[string]string{
    "Foo": "Foo text",
    "Bar": "Bar text",
    "Baz": "Baz text",
    "Bat": "Bat text",
    // ... other mappings
}
And the list of keys belonging to a group, which can be stored in a slice, e.g.
var group1 = []string{"Foo", "Bar", "Baz", "Bat"}
Once you have defined these, you can write a helper function which handles a group:
func handleGroup(r *http.Request, group []string) (res string) {
    for _, v := range group {
        if r.FormValue(v) == "on" {
            res += ", " + mappings[v]
        }
    }
    if res == "" {
        return "none\n"
    }
    return res[2:] + "\n"
}
That's all. After this your handler can be this simple:
func checkboxHandler(w http.ResponseWriter, r *http.Request) {
    // Handle group1:
    res1 := handleGroup(r, group1)
    // Handle group2:
    res2 := handleGroup(r, group2)
}
Notes:
It wasn't your requirement, but this solution handles translations very easily: each translation can have its own mappings map, and that's all. Nothing else needs to be changed.
Performance also wasn't your concern, but appending strings this way isn't very efficient. If performance is even a slight concern, you can improve it without adding complexity by using bytes.Buffer:
func handleGroup(r *http.Request, group []string) string {
    buf := &bytes.Buffer{}
    for _, v := range group {
        if r.FormValue(v) == "on" {
            buf.WriteString(", ")
            buf.WriteString(mappings[v])
        }
    }
    if buf.Len() == 0 {
        return "none\n"
    }
    buf.WriteString("\n")
    return string(buf.Bytes()[2:])
}
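Another option with the same behavior (a sketch; it needs the strings import) is to collect the matched texts in a slice and let strings.Join insert the separators:
func handleGroup(r *http.Request, group []string) string {
    var checked []string
    for _, v := range group {
        if r.FormValue(v) == "on" {
            checked = append(checked, mappings[v])
        }
    }
    if len(checked) == 0 {
        return "none\n"
    }
    return strings.Join(checked, ", ") + "\n"
}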
This will store the checked form values in a slice.
Then it will iterate over the slice, building a string by appending ", " after each name.
Finally, if the string is longer than 2 characters it drops the trailing ", " (2 bytes); otherwise it prints "None".
func(w http.ResponseWriter, r *http.Request) {
    r.ParseMultipartForm(0)
    arr := []string{}
    if r.FormValue("Foo") == "on" {
        arr = append(arr, "Foo")
    }
    if r.FormValue("Bar") == "on" {
        arr = append(arr, "Bar")
    }
    if r.FormValue("Baz") == "on" {
        arr = append(arr, "Baz")
    }
    if r.FormValue("Bat") == "on" {
        arr = append(arr, "Bat")
    }
    out := ""
    for _, title := range arr {
        out += title + ", "
    }
    if len(out) > 2 {
        out = out[0 : len(out)-2]
    } else {
        out = "None"
    }
    fmt.Println(out)
}
If you want to iterate over all submitted form values:
for k, vs := range r.Form {
    for _, v := range vs {
        fmt.Println(k, v)
    }
}
I have the following text file structure (the text file is pretty big, around 100,000 lines):
A|a1|111|111|111
B|111|111|111|111
A|a2|222|222|222
B|222|222|222|222
B|222|222|222|222
A|a3|333|333|333
B|333|333|333|333
...
I need to extract a piece of text related to a given key. For example, if my key is A|a2, I need to save the following as a string:
A|a2|222|222|222
B|222|222|222|222
B|222|222|222|222
For my C++ and Objective-C projects, I used the C++ getline function as follows:
std::ifstream ifs(dataPathStr.c_str());
NSString* searchKey = @"A|a2";
std::string search_string([searchKey cStringUsingEncoding:NSUTF8StringEncoding]);

// read and discard lines from the stream till we get to a line starting with the search_string
std::string line;
while (getline(ifs, line) && line.find(search_string) != 0);

// check if we have found such a line, if not report an error
if (line.find(search_string) != 0)
{
    data = DATA_DEFAULT;
}
else {
    // we need to form a string that would include the whole set of data based on the selection
    dataStr = line + '\n'; // result initially contains the first line
    // now keep reading line by line till we get an empty line or eof
    while (getline(ifs, line) && !line.empty())
    {
        dataStr += line + '\n'; // append this line to the result
    }
    data = [NSString stringWithUTF8String:dataStr.c_str()];
}
As I am doing a project in Swift, I am trying to get rid of getline and replace it with something "Cocoaish". But I cannot find a good Swift solution to address the above problem. If you have an idea, I would really appreciate it. Thanks!
Using the StreamReader class from Read a file/URL line-by-line in Swift, you could do that in Swift like this:
let searchKey = "A|a2"
let bundle = NSBundle.mainBundle()
let pathNav = bundle.pathForResource("data_apt", ofType: "txt")

if let aStreamReader = StreamReader(path: pathNav!) {
    var dataStr = ""
    while let line = aStreamReader.nextLine() {
        if line.rangeOfString(searchKey, options: nil, range: nil, locale: nil) != nil {
            dataStr = line + "\n"
            break
        }
    }
    if dataStr == "" {
        dataStr = "DATA_DEFAULT"
    } else {
        while let line = aStreamReader.nextLine() {
            if countElements(line) == 0 {
                break
            }
            dataStr += line + "\n"
        }
    }
    aStreamReader.close()
    println(dataStr)
} else {
    println("cannot open file")
}