I need to print a Unicode character whose value should not be hard-coded.
This is how to print Unicode in general:
print!("\u{2518}");
Now the 2518 should not be hard-coded; I need to provide it like this:
print!("\u{}", 0x2518);
I've tried print!("\u{{}}", 0x2518); but it didn't work.
Thanks in advance
You can use std::char::from_u32 for this. Because not all values of a u32 represent valid Unicode scalar values, you need to handle the error case:
fn main() {
    let i = 0x2518;
    println!(
        "{}",
        match std::char::from_u32(i) {
            Some(c) => c,
            None => '�',
        }
    );
}
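If the replacement character is an acceptable fallback, the same idea can be written more compactly with unwrap_or (a sketch along the same lines, not a different mechanism):
fn main() {
    let i = 0x2518;
    // from_u32 returns Option<char>; substitute U+FFFD for invalid scalar values
    let c = std::char::from_u32(i).unwrap_or('\u{FFFD}');
    println!("{}", c);
}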
I have a set of strings I need to sort in an order which is not Latin alphabetic.
Specifically, I have a string "AiyawbpfmnrhHxXzsSqkgtTdD" which specifies the sorting order, i.e., "y" comes before "a", but after "A". In case you are interested, this is the sort order for ancient Egyptian hieroglyphs as specified in the Manuel de Codage.
In Swift, is there a convenient way to specify a predicate or other approach for this type of collation order?
First, turn your alphabet into a Dictionary that maps each Character to its integer position in the alphabet:
import Foundation
let hieroglyphAlphabet = "AiyawbpfmnrhHxXzsSqkgtTdD"
let hieroglyphCodes = Dictionary(
    uniqueKeysWithValues: hieroglyphAlphabet
        .enumerated()
        .map { (key: $0.element, value: $0.offset) }
)
Next, extend StringProtocol with a property that returns an array of such alphabetic positions:
extension StringProtocol {
    var hieroglyphEncoding: [Int] { map { hieroglyphCodes[$0] ?? -1 } }
}
I'm turning non-alphabetic characters into -1, so they sort before every alphabetic character. You could turn them into .max to make them sort after instead, or use a more complex type than Int if they need more special treatment.
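For instance, the .max variant is a one-line change (a sketch that assumes the hieroglyphCodes dictionary above; the property name is only for illustration):
extension StringProtocol {
    // Unknown characters map to Int.max, so they sort after every alphabetic character
    var hieroglyphEncodingMaxVariant: [Int] { map { hieroglyphCodes[$0] ?? .max } }
}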
Now you can sort an array of strings by hieroglyphEncoding, using the lexicographicallyPrecedes method of Sequence:
let unsorted = "this is the sort order".components(separatedBy: " ")
let sorted = unsorted.sorted {
    $0.hieroglyphEncoding.lexicographicallyPrecedes($1.hieroglyphEncoding)
}
print(sorted)
Output:
["order", "is", "sort", "the", "this"]
It is not efficient to recompute the hieroglyphEncoding of each string on demand during the sort, so if you have many strings to sort, you should pair each string with its precomputed encoding before sorting (a Schwartzian transform) or wrap both in a small helper type.
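A minimal sketch of that decorate-sort-undecorate idea, assuming the unsorted array and the hieroglyphEncoding property from above (the variable names are only illustrative):
let sortedEfficiently = unsorted
    .map { (string: $0, encoding: $0.hieroglyphEncoding) }          // decorate: compute each encoding once
    .sorted { $0.encoding.lexicographicallyPrecedes($1.encoding) }  // sort on the cached encodings
    .map { $0.string }                                              // undecorate: keep only the strings
print(sortedEfficiently)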
I'd like to assume a given type implements some trait (e.g. Default) with a method (e.g. default()). I want to call that method and store its value in a local variable. Here is the general idea:
macro_rules! get_default {
    ( $x:ty = $alias:ident ) => {
        let $alias = $x::default();
    };
}
fn main() {
    // get_default!(i32 = z);
    // println!("get_default! {:?} ", z);
    println!("i32 default {:?} ", i32::default());
}
Playground link.
When I try that I get an error:
error: expected expression, found `i32`
 --> <anon>:3:22
  |>
3 |>         let $alias = $x::default();
  |>                      ^^
I understand it's because it expects an expression, but I want to limit the input to types only. Is there a way to turn $x from ty to expr, or a way to call a method on a type (even if the method is potentially missing)?
You were almost there. You can hint the expected default type to the compiler and then just use the universal function call syntax:
macro_rules! get_default {
    ( $x:ty = $alias:ident ) => {
        let $alias = <$x as Default>::default();
    };
}
fn main() {
    get_default!(i32 = z);
    println!("get_default! {:?} ", z);
    println!("i32 default {:?} ", i32::default());
}
(Playground link)
The key bit is this:
let $alias = <$x as Default>::default();
This treats $x as an implementer of the Default trait and then invokes its default() method, as you needed.
You can also use a shorthand when you don't need to disambiguate between traits:
let $alias = <$x>::default();
(Playground link)
More General Usage of UFCS
Using UFCS as shown above, you can disambiguate between traits that define methods with the same name. This is the 'angle-bracket form', which is useful if, say, a default() method is provided by two different traits.
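A minimal sketch of that situation, with made-up trait and type names:
trait Noise {
    fn speak(&self) -> &'static str;
}
trait Greeting {
    fn speak(&self) -> &'static str;
}

struct Robot;

impl Noise for Robot {
    fn speak(&self) -> &'static str { "beep" }
}
impl Greeting for Robot {
    fn speak(&self) -> &'static str { "hello" }
}

fn main() {
    let r = Robot;
    // r.speak() alone would be ambiguous; the angle-bracket form names the trait explicitly
    println!("{}", <Robot as Noise>::speak(&r));    // prints "beep"
    println!("{}", <Robot as Greeting>::speak(&r)); // prints "hello"
}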
In the get_default! scenario specifically, you can also skip the angle brackets and let the type annotation on the binding drive inference, like so:
let $alias: $x = Default::default();
That alone provides enough information for Rust to infer the correct impl.
(Playground link)
Macro variables are escaped in Rust macros by default. Is there any way to have them not escaped?
macro_rules! some {
    ( $var:expr ) => ( "$var" );
}
some!(1) // returns "$var", not "1"
This is useful for concatenating compile-time strings and such.
It sounds like you want stringify!:
macro_rules! some {
    ( $var:expr ) => ( stringify!($var) );
}
fn main() {
    let s = some!(1);
    println!("{}", s);
}
And you will probably want concat! too.
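For instance, a small sketch that combines the two to build a labelled string at compile time (the labelled! macro name is made up for illustration):
macro_rules! labelled {
    ( $e:expr ) => ( concat!(stringify!($e), " = ") );
}

fn main() {
    println!("{}{}", labelled!(1 + 2), 1 + 2); // prints "1 + 2 = 3"
}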
See also:
How to create a static string at compile time
I find myself reading large CSV files and collecting the numerical elements into a Vec<&str>. Thereafter I have to convert them to numeric types and the simplest way I've found to do that is to implement a function like this:
fn to_u32(str: &str) -> u32 {
    let str_num: Option<u32> = str.parse();
    match str_num {
        Some(num) => num,
        None => panic!("Failed to read number"),
    }
}
This seems like a fairly common operation so I've sifted through the reference docs but haven't found anything that matches it. Is there a cleaner way to do this?
The Option type has a large variety of adapter methods which can be used to massage the data more conveniently than repeated match expressions.
For example, there's unwrap and expect for just extracting the data out of a Some, panicking if the Option is None. The expect method is actually the closest to your written code: str.parse().expect("Failed to read number.").
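As a sketch, the whole to_u32 helper then collapses to a one-liner (on current Rust, parse returns a Result rather than an Option, but expect behaves the same way):
fn to_u32(s: &str) -> u32 {
    // Panics with the given message if the string is not a valid u32
    s.parse().expect("Failed to read number")
}

fn main() {
    println!("{}", to_u32("42")); // prints 42
}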
However, it often makes sense to use the other functions listed there to propagate errors, avoiding the hard "crash" of a panic and allowing users (or yourself) to handle errors more centrally and with more control. It also often makes sense to use Result for this, which gives you the chance to pass along more information in the error case and also allows you to use the try! macro. That said, one can easily define an equivalent of try! for Option:
macro_rules! option_try {
    ($e:expr) => {
        match $e {
            Some(x) => x,
            None => return None,
        }
    };
}
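A small usage sketch, assuming the option_try! macro above is in scope (parse_pair is a made-up helper; on current Rust, parse returns a Result, so .ok() turns it into an Option first):
fn parse_pair(a: &str, b: &str) -> Option<(u32, u32)> {
    // Early-returns None if either parse fails, just like try! does for Result
    let x = option_try!(a.parse::<u32>().ok());
    let y = option_try!(b.parse::<u32>().ok());
    Some((x, y))
}

fn main() {
    println!("{:?}", parse_pair("1", "2"));  // Some((1, 2))
    println!("{:?}", parse_pair("1", "no")); // None
}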
Well, you can use unwrap() to avoid pattern matching, but you should do it sparingly - with unwrap() you can't handle the actual parse error, so if the string does not represent a number, it'll panic:
let str_num: u32 = str.parse().unwrap();
if let is also an option:
if let Some(str_num) = str.parse::<u32>() {
    // ...
}
You can also use unwrap_or() if you want to specify some default value:
let str_num: u32 = str.parse().unwrap_or(42);
Or you can use unwrap_or_default() which employs Default instance for u32:
let str_num: u32 = str.parse().unwrap_or_default();
http://play.golang.org/p/SKtaPFtnKO
func md(str string) []byte {
    h := md5.New()
    io.WriteString(h, str)
    fmt.Printf("%x", h.Sum(nil))
    // base 16, with lower-case letters for a-f
    return h.Sum(nil)
}
All I need is a hash string converted from an input string. I was able to get it in bytes format using h.Sum(nil) and to print the hash in %x format. But I want to return the %x format from this function so that I can use it to convert an email address to a hash and access Gravatar.com with it.
How do I get the %x-format hash using the md5 function in Go?
Thanks,
If I understood correctly you want to return the %x format:
you can import "encoding/hex" and use the EncodeToString function
str := hex.EncodeToString(h.Sum(nil))
or just Sprintf the value:
func md(str string) string {
    h := md5.New()
    io.WriteString(h, str)
    return fmt.Sprintf("%x", h.Sum(nil))
}
Note that Sprintf is slower because it needs to parse the format string and then use reflection based on the type it finds.
http://play.golang.org/p/vsFariAvKo
You should avoid using the fmt package for this. The fmt package uses reflection, and it is expensive for anything other than debugging. You know what you have, and what you want to convert to, so you should be using the proper conversion package.
For converting from binary to hex, and back, use the encoding/hex package.
To Hex string:
str := hex.EncodeToString(h.Sum(nil))
From Hex string:
b, err := hex.DecodeString(str)
There are also Encode / Decode functions for []byte.
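For example, a short sketch of the buffer-based variants, using hex.EncodedLen and hex.DecodedLen to size the destination slices:
package main

import (
    "encoding/hex"
    "fmt"
)

func main() {
    src := []byte{0xde, 0xad, 0xbe, 0xef}

    // Encode into a preallocated buffer
    dst := make([]byte, hex.EncodedLen(len(src)))
    hex.Encode(dst, src)
    fmt.Println(string(dst)) // deadbeef

    // Decode back into raw bytes
    back := make([]byte, hex.DecodedLen(len(dst)))
    if _, err := hex.Decode(back, dst); err != nil {
        fmt.Println("decode error:", err)
        return
    }
    fmt.Println(back) // [222 173 190 239]
}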
When you need to convert to/from a decimal string, use the strconv package.
From int to string:
str := strconv.Itoa(100)
From string to int:
num, err := strconv.Atoi(str)
There are several other functions in this package that do other conversions (base, etc.).
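For example, a brief sketch of the base-aware functions:
package main

import (
    "fmt"
    "strconv"
)

func main() {
    // string -> int64, reading the text as base 16
    n, err := strconv.ParseInt("ff", 16, 64)
    if err != nil {
        fmt.Println("parse error:", err)
        return
    }
    fmt.Println(n) // 255

    // int64 -> string in base 2
    fmt.Println(strconv.FormatInt(n, 2)) // 11111111
}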
So unless you're debugging or formatting an error message, use the proper conversions. Please.