http://play.golang.org/p/SKtaPFtnKO
func md(str string) []byte {
    h := md5.New()
    io.WriteString(h, str)
    fmt.Printf("%x", h.Sum(nil)) // base 16, with lower-case letters for a-f
    return h.Sum(nil)
}
All I need is a hash-key string converted from an input string. I was able to get it in bytes format using h.Sum(nil) and to print the hash-key in %x format. But I want to return the %x format from this function so that I can convert an email address to a hash-key and use it to access Gravatar.com.
How do I get the %x format hash-key using the md5 function in Go?
Thanks,
If I understood correctly, you want to return the %x format:
You can import "encoding/hex" and use the EncodeToString function:
str := hex.EncodeToString(h.Sum(nil))
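For example, here is the question's function rewritten that way as a minimal, self-contained sketch (the sample input is made up):
package main

import (
    "crypto/md5"
    "encoding/hex"
    "fmt"
    "io"
)

// md returns the MD5 digest of str as a lower-case hex string.
func md(str string) string {
    h := md5.New()
    io.WriteString(h, str)
    return hex.EncodeToString(h.Sum(nil))
}

func main() {
    fmt.Println(md("someone@example.com")) // 32 hex characters
}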
or just Sprintf the value:
func md(str string) string {
    h := md5.New()
    io.WriteString(h, str)
    return fmt.Sprintf("%x", h.Sum(nil))
}
Note that Sprintf is slower because it needs to parse the format string and then reflect on the type it finds.
http://play.golang.org/p/vsFariAvKo
You should avoid using the fmt package for this. The fmt package uses reflection, and it is expensive for anything other than debugging. You know what you have, and what you want to convert to, so you should be using the proper conversion package.
For converting from binary to hex, and back, use the encoding/hex package.
To Hex string:
str := hex.EncodeToString(h.Sum(nil))
From Hex string:
b, err := hex.DecodeString(str)
There are also Encode / Decode functions for []byte.
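For instance, a small sketch of those []byte variants, which write into buffers sized with EncodedLen / DecodedLen (the sample bytes are arbitrary):
package main

import (
    "encoding/hex"
    "fmt"
)

func main() {
    src := []byte{0xde, 0xad, 0xbe, 0xef}

    // Encode into a buffer sized with EncodedLen.
    dst := make([]byte, hex.EncodedLen(len(src)))
    hex.Encode(dst, src)
    fmt.Println(string(dst)) // deadbeef

    // Decode back into a buffer sized with DecodedLen.
    back := make([]byte, hex.DecodedLen(len(dst)))
    if _, err := hex.Decode(back, dst); err != nil {
        fmt.Println("decode failed:", err)
        return
    }
    fmt.Println(back) // [222 173 190 239]
}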
When you need to convert to / from a decimal string, use the strconv package.
From int to string:
str := strconv.Itoa(100)
From string to int:
num, err := strconv.Atoi(str)
There are several other functions in this package that do other conversions (base, etc.).
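A short sketch combining the decimal helpers with the base-aware FormatInt / ParseInt functions (values are arbitrary):
package main

import (
    "fmt"
    "strconv"
)

func main() {
    // Decimal conversions.
    s := strconv.Itoa(100)        // "100"
    n, err := strconv.Atoi("100") // 100
    if err != nil {
        fmt.Println("not a number:", err)
        return
    }
    fmt.Println(s, n)

    // Base-aware conversions, e.g. hexadecimal.
    h := strconv.FormatInt(255, 16)          // "ff"
    v, err := strconv.ParseInt("ff", 16, 64) // 255
    if err != nil {
        fmt.Println("not a hex number:", err)
        return
    }
    fmt.Println(h, v)
}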
So unless you're debugging or formatting an error message, use the proper conversions. Please.
I am trying to insert/update data in PostgreSQL using jackc/pgx, into a table that has a column of a custom type. This is the table type written as a Go struct:
// Added this struct as a type in PSQL
type DayPriceModel struct {
    Date  time.Time `json:"date"`
    High  float32   `json:"high"`
    Low   float32   `json:"low"`
    Open  float32   `json:"open"`
    Close float32   `json:"close"`
}

// The 2 columns in my table
type SecuritiesPriceHistoryModel struct {
    Symbol  string          `json:"symbol"`
    History []DayPriceModel `json:"history"`
}
I have written this code for inserting data:
func insertToDB(data SecuritiesPriceHistoryModel) {
    DBConnection := config.DBConnection
    _, err := DBConnection.Exec(context.Background(), "INSERT INTO equity.securities_price_history (symbol, history) VALUES ($1, $2)", data.Symbol, data.History)
    if err != nil {
        fmt.Println(err)
    }
}
But I am unable to insert the custom data type (DayPriceModel).
I am getting an error
Failed to encode args[1]: unable to encode
The error is very long and mostly shows my data so I have picked out the main part.
How do I INSERT data into PSQL with such custom data types?
PS: An implementation using jackc/pgx is preferred, but database/sql would do just fine.
I'm not familiar enough with pgx to know how to set up support for arrays of composite types. But, as already mentioned in the comments, you can implement the driver.Valuer interface and have that implementation produce a valid literal. This also applies if you are storing slices of structs: you just need to declare a named slice type, have it implement the valuer, and then use it instead of the unnamed slice.
// named slice type
type DayPriceModelList []DayPriceModel

// The syntax for an array-of-composites literal looks like this:
// '{"(foo,123)", "(bar,987)"}'. So the implementation below must
// return the slice contents in that format.
func (l DayPriceModelList) Value() (driver.Value, error) {
    // nil slice? produce NULL
    if l == nil {
        return nil, nil
    }
    // empty slice? produce an empty array
    if len(l) == 0 {
        return []byte{'{', '}'}, nil
    }

    out := []byte{'{'}
    for _, v := range l {
        // This assumes that the date field in the pg composite
        // type accepts the default time.Time format. If that is
        // not the case, provide v.Date in a format that the
        // composite's field understands, e.g.
        // v.Date.Format("<layout that the pg composite understands>")
        x := fmt.Sprintf(`"(%s,%f,%f,%f,%f)",`,
            v.Date,
            v.High,
            v.Low,
            v.Open,
            v.Close)
        out = append(out, x...)
    }
    out[len(out)-1] = '}' // replace the trailing "," with "}"
    return out, nil
}
And when you are writing the insert query, make sure to add an explicit cast right after the placeholder, e.g.
type SecuritiesPriceHistoryModel struct {
    Symbol  string            `json:"symbol"`
    History DayPriceModelList `json:"history"` // use the named slice type
}

// ...

_, err := db.Exec(ctx, `INSERT INTO equity.securities_price_history (
    symbol
    , history
) VALUES (
    $1
    , $2::my_composite_type[])
`, data.Symbol, data.History)
// replace my_composite_type with the name of the composite type in the database
NOTE#1: Depending on the exact definition of your composite type in postgres, the above example may or may not work; if it doesn't, simply adjust the code to make it work.
NOTE#2: The general approach in the example above is valid; however, it is likely not very efficient. If you need the code to be performant, do not use the example verbatim.
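As one possible direction, here is a rough, untested sketch of a lower-allocation variant that builds the literal with strconv.AppendFloat instead of fmt.Sprintf. It assumes the "database/sql/driver" and "strconv" imports, and the "2006-01-02" date layout is an assumption that must match whatever the composite's date field expects:
// Sketch of a lower-allocation Value implementation. The date layout
// and the field order are assumptions and must match the composite type.
func (l DayPriceModelList) Value() (driver.Value, error) {
    if l == nil {
        return nil, nil // nil slice produces NULL
    }
    if len(l) == 0 {
        return []byte{'{', '}'}, nil // empty slice produces an empty array
    }
    out := make([]byte, 0, len(l)*64) // rough per-element size estimate
    out = append(out, '{')
    for i, v := range l {
        if i > 0 {
            out = append(out, ',')
        }
        out = append(out, '"', '(')
        out = append(out, v.Date.Format("2006-01-02")...)
        out = append(out, ',')
        out = strconv.AppendFloat(out, float64(v.High), 'f', -1, 32)
        out = append(out, ',')
        out = strconv.AppendFloat(out, float64(v.Low), 'f', -1, 32)
        out = append(out, ',')
        out = strconv.AppendFloat(out, float64(v.Open), 'f', -1, 32)
        out = append(out, ',')
        out = strconv.AppendFloat(out, float64(v.Close), 'f', -1, 32)
        out = append(out, ')', '"')
    }
    out = append(out, '}')
    return out, nil
}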
I need to print a Unicode character whose value is not hard-coded.
This is how to print Unicode in general:
print!("\u{2518}");
Now the 2518 should not be hard-coded; I need to provide it like this:
print!("\u{}", 0x2518);
I've tried print!("\u{{}}", 0x2518); but it didn't work.
Thanks in advance
You can use std::char::from_u32 for this. Because not all u32 values represent valid Unicode scalar values, you need to handle the error case:
fn main() {
    let i = 0x2518;
    println!(
        "{}",
        match std::char::from_u32(i) {
            Some(c) => c,
            None => '�',
        }
    );
}
I'm receiving, via a REST API, a string that contains Unicode-encoded characters in the form \uXXXX,
e.g. Ain\u2019t, which should be Ain’t.
Is there a nice way to convert these?
You can use \u{my_unicode}:
print("Ain\u{2019}t this a beautiful day")
// Prints "Ain’t this a beautiful day"
From the Language Guide - Strings and Characters - Unicode:
String literals can include the following special characters:
...
An arbitrary Unicode scalar, written as \u{n}, where n is a 1–8 digit
hexadecimal number with a value equal to a valid Unicode code point
You can apply a StringTransform:
extension String {
    var decodingUnicodeCharacters: String { applyingTransform(.init("Hex-Any"), reverse: false) ?? "" }
}
let string = #"Ain\u2019t"#
print(string.decodingUnicodeCharacters) // "Ain’t\n"
The sqlx package has a MapScan function that's quite handy in that it returns a row as a map (map[string]interface{}) but all string columns come out as runes (if I'm not mistaken). Is there a way to have it just return as strings instead?
sqlx - github.com/jmoiron/sqlx
I have encountered a similar issue when dealing with SQL in Go. Some googling led me to the Go driver docs. Here is what they have for a Value type returned from a query.
Value is a value that drivers must be able to handle. It is either nil or an instance of one of these types:
int64
float64
bool
[]byte
string [*] everywhere except from Rows.Next.
time.Time
Strings are returned as byte slices (and when you encode a []byte into JSON it gets base64-encoded). I have not found a way within sqlx or database/sql to return strings instead of []byte slices, but I did come up with a quick and dirty conversion for the slices in a map. There is limited type checking, but it is a good start.
func convertStrings(in map[string]interface{}) {
    for k, v := range in {
        t := reflect.TypeOf(v)
        if t != nil {
            switch t.Kind() {
            case reflect.Slice:
                in[k] = fmt.Sprintf("%s", v)
            default:
                // do nothing
            }
        }
    }
}
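A usage sketch with sqlx's MapScan, assuming the convertStrings helper above plus the sqlx, log, and fmt imports; db is an *sqlx.DB, and the query and column names here are hypothetical:
func printRows(db *sqlx.DB) {
    rows, err := db.Queryx("SELECT id, name FROM users")
    if err != nil {
        log.Fatal(err)
    }
    defer rows.Close()

    for rows.Next() {
        m := map[string]interface{}{}
        if err := rows.MapScan(m); err != nil {
            log.Fatal(err)
        }
        convertStrings(m) // string columns are now string, not []byte
        fmt.Println(m)
    }
}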
I am trying to read and write data from a net.Conn, but since I have only the Read([]byte) and Write([]byte) methods, I am finding it quite hard to find helper functions to do this job.
I need to read and write the following types:
uint64
byte
uint32
UTF-8 encoded string (first a uint32 length, then the string data)
In short:
Is there anything like Java's DataInputStream and DataOutputStream in Go's packages?
Thanks and regards
You need to decide on a format to marshal to and from. Your choices are to either roll your own format or to use one that was already made. I highly recommend the latter.
I have previously posted about many of the formats supported in the go standard library here: https://stackoverflow.com/a/13575325/727643
If you decide to roll your own, then uints can be encoded and decoded from []byte using encoding/binary. It gives you the option of both little and big endian. Strings can be converted directly to []byte using []byte(str). Finally, bytes can just be sent as bytes. No magic needed.
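For example, here is a rough sketch of a length-prefixed UTF-8 string codec built on encoding/binary; the helper names writeString / readString are made up, and any io.Writer / io.Reader (including a net.Conn) will work in place of the bytes.Buffer:
package main

import (
    "bytes"
    "encoding/binary"
    "fmt"
    "io"
)

// writeString writes a uint32 length followed by the raw UTF-8 bytes.
func writeString(w io.Writer, s string) error {
    if err := binary.Write(w, binary.BigEndian, uint32(len(s))); err != nil {
        return err
    }
    _, err := w.Write([]byte(s))
    return err
}

// readString reads a uint32 length and then that many bytes.
func readString(r io.Reader) (string, error) {
    var n uint32
    if err := binary.Read(r, binary.BigEndian, &n); err != nil {
        return "", err
    }
    buf := make([]byte, n)
    if _, err := io.ReadFull(r, buf); err != nil {
        return "", err
    }
    return string(buf), nil
}

func main() {
    var buf bytes.Buffer // stands in for a net.Conn
    if err := writeString(&buf, "hello"); err != nil {
        fmt.Println("write failed:", err)
        return
    }
    s, err := readString(&buf)
    if err != nil {
        fmt.Println("read failed:", err)
        return
    }
    fmt.Println(s) // hello
}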
I will stress that making up your own format is normally a bad idea. I tend to use JSON by default and use others only when I can get a significant performance increase and I believe it worth the time to do it.
One little secret of binary encoding is that you can write and read entire data structures:
From the Playground
// MyString and MyMessage are a reconstruction based on the question's field
// list (uint64, byte, uint32, length-prefixed string); see the Playground link
// for the original definitions.
type MyString struct {
    Size uint32
    Str  [10]byte
}

type MyMessage struct {
    First   uint64
    Second  byte
    Third   uint32
    Message MyString
}

buf := new(bytes.Buffer)
err := binary.Write(buf, binary.LittleEndian, &MyMessage{
    First:   100,
    Second:  0,
    Third:   100,
    Message: MyString{0, [10]byte{'H', 'e', 'l', 'l', 'o', '\n'}},
})
if err != nil {
    fmt.Println("binary.Write failed:", err)
    return
}

// <<--- CONN -->>

msg := new(MyMessage)
err2 := binary.Read(buf, binary.LittleEndian, msg)
if err2 != nil {
    fmt.Println("binary.Read failed:", err2)
    return
}
Pay attention to the kinds of types that you can use:
From the encoding/binary docs:
A fixed-size value is either a fixed-size arithmetic type (int8, uint8, int16, float32, complex64, ...) or an array or struct containing only fixed-size values.
Notice that you have to use [10]byte and can't use []byte.
Fabrizio's answer is good and I would like to add that you should probably wrap your socket with a buffered reader and buffered writer from the bufio package:
http://golang.org/pkg/bufio/
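For example, a minimal sketch (the dial address is just a placeholder):
package main

import (
    "bufio"
    "fmt"
    "net"
)

func main() {
    conn, err := net.Dial("tcp", "example.com:7") // placeholder address
    if err != nil {
        fmt.Println("dial failed:", err)
        return
    }
    defer conn.Close()

    r := bufio.NewReader(conn)
    w := bufio.NewWriter(conn)

    // binary.Read / binary.Write (or helpers like the ones above) can be
    // pointed at r and w instead of the raw conn.
    fmt.Fprintln(w, "hello")
    // Flush, or the buffered bytes never reach the wire.
    if err := w.Flush(); err != nil {
        fmt.Println("flush failed:", err)
        return
    }

    line, err := r.ReadString('\n')
    if err != nil {
        fmt.Println("read failed:", err)
        return
    }
    fmt.Print(line)
}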