PureScript: convert Char to lowercase version

What is the best way to convert a given Char to its lowercase version?
One method is to go through String:
import Data.Maybe as Maybe
import Data.String as String
import Data.String.CodeUnits as CodeUnits

charToLower :: Char -> Char
charToLower char =
  Maybe.fromMaybe char
    $ CodeUnits.charAt 0
    $ String.toLower
    $ CodeUnits.singleton char
Is there a more appropriate way in terms of performance/simplicity?
There is also a toLower in the Data.CodePoint.Unicode module (from the unicode package), but I wonder if it is the preferred way.
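For reference, this is roughly what that route looks like, assuming its toLower has the type CodePoint -> Array CodePoint (lowercasing a single code point can produce several); charToLowerUnicode is my own name for the wrapper:

import Data.Char (fromCharCode)
import Data.CodePoint.Unicode as Unicode
import Data.Enum (fromEnum)
import Data.Maybe (fromMaybe)
import Data.String.CodePoints (codePointFromChar)

-- Keep the original Char unless the result is exactly one
-- code point that fits back into a Char.
charToLowerUnicode :: Char -> Char
charToLowerUnicode c =
  case Unicode.toLower (codePointFromChar c) of
    [cp] -> fromMaybe c (fromCharCode (fromEnum cp))
    _ -> c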

I guess the fastest way is just to use FFI. One can easily write toLower this way:
// Foreign/Char.js
module.exports.toLower = function charToLower(c) {
  return c.toLowerCase(); // JS strings have toLowerCase, not toLower
};
-- Foreign/Char.purs
module Foreign.Char where

foreign import toLower :: Char -> Char
Or, if you want to express it in PureScript only:
import Unsafe.Coerce (unsafeCoerce)
import Data.String as S
toLower :: Char -> Char
toLower = unsafeCoerce >>> S.toLower >>> unsafeCoerce
That eventually boils down to the same JS code. You cannot break Char's invariants, as a Char is essentially just a subset of String (the single-character strings) with the same runtime representation.
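A quick usage check of either version, with results as comments:

toLower 'A' -- 'a'
toLower '!' -- '!'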

Related

D language unsigned hash of string

I am a complete beginner with the D language.
How do I get, as a uint (an unsigned 32-bit integer), some hash of a string in D?
I need a quick and dirty hash code (I don't care much about randomness or lack of collisions; I care slightly more about performance).
import std.digest.crc;

uint string_hash(string s) {
    return crc32Of(s); // does not compile: crc32Of returns ubyte[4], not uint
}
is not good...
(using gdc-5 on Linux/x86-64 with phobos-2)
While Adam's answer does exactly what you're looking for, you can also use a union to do the casting.
This is a pretty useful trick, so it may as well go here:
/**
 * Returns a crc32Of hash of a string.
 * Uses a union to store the ubyte[4],
 * then simply reads that memory as a uint.
 */
uint string_hash(string s) {
    import std.digest.crc;
    union hashUnion {
        ubyte[4] hashArray;
        uint hashNumber;
    }
    hashUnion x;
    x.hashArray = crc32Of(s); // stores the result of crc32Of in the array
    return x.hashNumber;      // reads the exact same memory as hashArray,
                              // but reads it as a uint
}
A really quick thing could just be this:
uint string_hash(string s) {
    import std.digest.crc;
    auto r = crc32Of(s);
    return *(cast(uint*) r.ptr);
}
Since crc32Of returns a ubyte[4] instead of the uint you want, a conversion is necessary; but since ubyte[4] and uint are the same thing to the machine, we can just do a reinterpret cast with the pointer trick shown above to convert between the types for free at runtime.
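Note that both tricks read the four bytes in the host's native byte order. If you want the result to be reproducible across platforms, a sketch (untested) using std.bitmanip to pin down the byte order could look like this:

import std.bitmanip : littleEndianToNative;
import std.digest.crc;
import std.stdio;

uint string_hash(string s) {
    ubyte[4] r = crc32Of(s);
    // interpret the four CRC bytes as a little-endian uint,
    // so the result is the same on any host
    return littleEndianToNative!uint(r);
}

void main() {
    writefln("%08x", string_hash("hello"));
}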

Chisel poke() print format

Is it possible to configure the print format of the poke() function in a Chisel test class?
I want to poke() an unsigned long (64-bit) value, but Chisel prints it as a signed long when I launch this code:
poke(c.io.masterwrite.wdata, 0xbebecacacafedecaL)
The result:
POKE AvlMasterWrite.io_masterwrite_wdata <- -0x4141353535012136
I can't add the suffix 'U' as in C to force it unsigned:
0xbebecacacafedecaUL
That doesn't compile.
The following should work:
import java.math._
poke(c.io.masterwrite.wdata, new BigInteger("bebecacacafedeca", 16))
The input port c.io.masterwrite.wdata should be of type UInt and 64 bits wide.
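If your tester's poke accepts a Scala BigInt (the classic PeekPokeTester does), Scala's own BigInt constructor also takes a radix, so the Java import can be avoided; a sketch of that variant:

poke(c.io.masterwrite.wdata, BigInt("bebecacacafedeca", 16))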

store string in char array: assignment makes integer from pointer without a cast

#include <stdio.h>

int main(void) {
    char c[8];
    *c = "hello";
    printf("%s\n", *c);
    return 0;
}
I have been learning pointers recently. The above code gives me an error: assignment makes integer from pointer without a cast [enabled by default].
I read a few posts on SO about this error but was not able to fix my code.
I declared c as an array of 8 chars, so c has the address of the first element. I thought that if I do *c = "hello", it would store one char per byte, using as many consecutive bytes as needed for the characters of "hello".
Please help me identify and fix the issue.
mark
"I declared c as an array of 8 chars; c has the address of the first element." - Yes.
"So if I do *c = "hello", it will store one char in one byte and use as many consecutive bytes as needed for the other characters." - No. The value of "hello" (a pointer to the static string "hello") would be assigned to *c (a single byte). The value of "hello" is a pointer to the string, not the string itself.
You need to use strcpy (from <string.h>) to copy one array of characters into another.
const char* hellostring = "hello";
char c[8];
*c = hellostring;       // cannot assign a pointer to a char
c[0] = hellostring;     // same as above
strcpy(c, hellostring); // OK
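Putting that together, a corrected version of the original program:

#include <stdio.h>
#include <string.h>

int main(void) {
    char c[8];
    strcpy(c, "hello"); // copies 'h','e','l','l','o','\0' into c
    printf("%s\n", c);  // pass the array itself, not *c
    return 0;
}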
#include <stdio.h>

int main(void) {
    char c[8]; // creating an array of char
    /*
       c holds the address of index 0, i.e. &c[0], and *c is c[0].
       The next statement (*c = "hello";) therefore tries to assign
       a whole string (really a pointer) to a single char. If you
       read *c as "the value at c" (index 0), the error becomes
       clearer. For an assignment like *c = "hello" to be legal,
       c would have to be a pointer to char (char *c), not an
       array of char.
    */
    *c = "hello";
    printf("%s\n", *c);
    return 0;
}
Hope it helps. :)
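For completeness, here is the pointer variant that comment describes (a hypothetical rewrite; it points at the literal rather than filling the array):

#include <stdio.h>

int main(void) {
    const char *c = "hello"; // c points at the string literal
    printf("%s\n", c);
    return 0;
}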

How to convert a single character into Scala's Char object?

What is the simplest and most concise method of converting a single character into Scala's Char object?
I have found the following solution, but it seems unsatisfying to me, as I believe it should be possible to solve this problem in a more elegant Scala way, without unnecessary conversions and array-specific operations:
scala> "A".toCharArray.head
res0: Char = A
There are a huge number of ways of doing this. Here are a few:
'A' // Why not just write a char to begin with?
"A"(0) // Think of "A" like an array--this is fast
"A".charAt(0) // This is what you'd do in Java--also fast
"A".head // Think of "A" like a list--a little slower
"A".headOption // Produces Option[Char], useful if the string might be empty
If you use Scala much, the .head version is the cleanest and clearest; no messy stuff with numbers that might be off by one or that have to be thought about. But if you really need to do this a lot, note that head runs through a generic interface that has to box the char, while .charAt(0) and (0) do not, so the latter two are about 3x faster.
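A small illustration of the headOption point, since the other variants throw on an empty string (results as comments):

"A".headOption // Some(A)
"".headOption  // None
"".head        // throws NoSuchElementException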
You can use Scala's charAt:
scala> var a = "this"
a: String = this
scala> a.charAt(0)
res3: Char = t
Additionally, the following is valid and may be what you are looking for:
scala> "a".charAt(0)
res4: Char = a

Logical to char

I have a char array representing a binary number, for example:
bit <1x8 char> '00110001'
I want to replace the last char with a logical value, but the following error is triggered: Conversion to char from logical is not possible.
This is my code:
bit(end:end) = hiddenImg(i,j);
I checked that hiddenImg(i,j) is in fact a logical value.
This may not be optimal, but it should do what you want (convert the logical to a char):
>> bit = '10010100'
bit =
10010100
>> bit(end) = num2str(true)
bit =
10010101
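An alternative sketch (untested) that avoids num2str, relying on the fact that true and false behave as 1 and 0 in arithmetic:

bit(end) = char('0' + hiddenImg(i,j)); % '0' + 1 gives 49, i.e. '1'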