I understand that System.Security.Cryptography has an MD5 hashing method in MD5.ComputeHash. However, the method takes and returns bytes. I don't understand how to work with this method using string keys and hashes. I tried to work around it by doing this:
var hash = MD5.Create().ComputeHash(Encoding.UTF8.GetBytes(@"text".ToCharArray()));
foreach (byte h in hash)
{
    Console.Write((char)h);
}
However, the resulting output is a gibberish string. For comparison, on this website, entering "text" results in "1cb251ec0d568de6a929b520c4aed8d1".
Writing this code will give the same result as the website:
var hash = MD5.Create().ComputeHash(Encoding.UTF8.GetBytes(@"text".ToCharArray()));
foreach (byte h in hash)
{
    Console.Write(h.ToString("x2"));
}
The trick is to print each byte as two hexadecimal digits (hence the "x2" format string).
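As an aside, if you are on .NET 5 or later, the byte-by-byte loop can be replaced with built-in helpers. A minimal sketch, assuming the newer MD5.HashData and Convert.ToHexString APIs are available:
using System;
using System.Security.Cryptography;
using System.Text;

class Md5Demo
{
    static void Main()
    {
        // Hash the UTF-8 bytes of the input in one call (.NET 5+).
        byte[] hash = MD5.HashData(Encoding.UTF8.GetBytes("text"));

        // Convert.ToHexString returns uppercase hex; lower-case it to
        // match the output of typical online MD5 tools.
        Console.WriteLine(Convert.ToHexString(hash).ToLowerInvariant());
        // prints: 1cb251ec0d568de6a929b520c4aed8d1
    }
}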
I'm trying to take each character (individual number, letter, or symbol) from a string file name, without the extension, and put each one into an array as an integer of its UTF-8 code (i.e., if the file name is "A1" without the extension, I would want "A" as the Int 0x41 (65) at the first index, and "1" as the Int 0x31 (49) at the second).
Here is the code I have, but I'm getting the error "No exact matches in call to instance method 'append'". My guess is that .utf8 still keeps it as a string type:
for i in allNoteFiles {
    var CharacterArray : [Int] = []
    for character in i {
        var utf8Character = String(character).utf8
        CharacterArray.append(utf8Character) // error is here
    }
    // ...more code down here within the for-in loop, using CharacterArray indexes
}
I'm sure the answer is probably simple, but I'm very new to Swift.
I've tried appending var number instead with:
var number = Int(utf8Character)
and
var number = (utf8Character).IntegerValue
but I get the errors "No exact matches in call to initializer" and "Value of type 'String.UTF8View' has no member 'IntegerValue'".
Any help at all would be greatly appreciated. Thanks!
The reason
var utf8Character = String(character).utf8
CharacterArray.append(utf8Character)
doesn't work for you is that utf8Character is not a single integer, but a UTF8View: a lightweight way to iterate over the UTF-8 code units in a string. Every Character in a String can be made up of any number of UTF-8 bytes (individual integers): while ASCII characters like "A" and "1" map to a single UTF-8 byte, the vast majority of characters do not. Every Unicode code point maps to between 1 and 4 individual UTF-8 bytes. The Encoding section of UTF-8 on Wikipedia has a few very illustrative examples of how this works.
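You can see this directly in a playground:
let s = "\u{00E9}"    // "é" as a single Unicode scalar
print(s.count)        // 1 -- one Character...
print(Array(s.utf8))  // [195, 169] -- ...but two UTF-8 bytes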
Now, assuming that you do want to split a string into individual UTF-8 bytes (either because you can guarantee your original string is ASCII-only, so the assumption that "character = byte" holds, or because you actually care about the bytes [though this is rarely the case]), there's a short and idiomatic solution to what you're looking for.
String.UTF8View is a Sequence of UInt8 values (individual bytes), and as such, you can use the Array initializer which takes a Sequence:
let characterArray: [UInt8] = Array(i.utf8)
If you need an array of Int values instead of UInt8, you can map the individual bytes ahead of time:
let characterArray: [Int] = Array(i.utf8.lazy.map { Int($0) })
(The .lazy avoids creating and storing an array of values in the middle of the operation.)
However, do note that if you aren't careful (e.g., your original string is not ASCII), you're bound to get very unexpected results from this operation, so keep that in mind.
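Putting it all together, with a hypothetical file name like the one from the question:
let allNoteFiles = ["A1"]  // hypothetical input

for i in allNoteFiles {
    let characterArray: [Int] = Array(i.utf8.lazy.map { Int($0) })
    print(characterArray)  // [65, 49] -- "A" is 0x41, "1" is 0x31
}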
I'm trying to create a password digest with the formula below, and my code is just not matching the expected result. Not sure what I'm doing wrong, but I'll admit when I need help. Hopefully someone out there can help.
Formula from documentation: Base64(SHA1(NONCE + TIMESTAMP + SHA1(PASSWORD)))
Correct Password Digest Answer: +LzcaRc+ndGAcZIXmq/N7xGes+k=
ColdFusion Code:
<cfSet PW = "AMADEUS">
<cfSet TS = "2015-09-30T14:12:15Z">
<cfSet NONCE = "secretnonce10111">
<cfDump var="#ToBase64(Hash(NONCE & TS & Hash(PW,'SHA-1'),'SHA-1'))#">
My code outputs:
Njk0MEY3MDc0NUYyOEE1MDMwRURGRkNGNTVGOTcyMUI4OUMxM0U0Qg==
I'm clearly doing something wrong, but for the life of me cannot figure out what. Anyone? Bueller?
The fun thing about hashing is that even if you start with the right string(s), the result can still be completely wrong if those strings are combined/encoded/decoded incorrectly.
The biggest gotcha is that most of these functions actually work with the binary representation of the input strings, so how those strings are decoded makes a big difference. Notice how the same string produces totally different binary when decoded as UTF-8 versus hex? That means the results of Hash(), ToBase64(), etc. will be totally different as well.
// Result: UTF-8: 65-65-68-69
writeOutput("<br>UTF-8: "& arrayToList(charsetDecode("AADE", "UTF-8"), "-"));
// Result: HEX: -86--34
writeOutput("<br>HEX: "& arrayToList(binaryDecode("AADE", "HEX"), "-"));
Possible Solution:
The problem with the current code is that ToBase64() assumes the input string is encoded as UTF-8, whereas Hash() actually returns a hexadecimal string, so ToBase64() decodes it incorrectly. Instead, use binaryDecode() and binaryEncode() to convert the hash from hex to base64:
resultAsHex = Hash( NONCE & TS & Hash(PW,"SHA-1"), "SHA-1");
resultAsBase64 = binaryEncode(binaryDecode(resultAsHex, "HEX"), "base64");
writeDump(resultAsBase64);
More Robust Solution:
Having said that, be very careful with string concatenation and hashing, as it does not always yield the expected results. Without knowing more about this specific API, I cannot be completely certain what it expects. However, it is usually safer to work only with the binary values. Unfortunately, CF's ArrayAppend() function lacks support for binary arrays, but you can easily use Apache's ArrayUtils class, which is bundled with CF.
ArrayUtils = createObject("java", "org.apache.commons.lang.ArrayUtils");
// Combine binary of NONCE + TS
nonceBytes = charsetDecode(NONCE, "UTF-8");
timeBytes = charsetDecode(TS, "UTF-8");
combinedBytes = ArrayUtils.addAll(nonceBytes, timeBytes);
// Combine with the binary of SHA1(PASSWORD)
secretBytes = binaryDecode( Hash(PW,"SHA-1"), "HEX");
combinedBytes = ArrayUtils.addAll(combinedBytes, secretBytes);
// Finally, HASH the binary and convert to base64
resultAsHex = hash(combinedBytes, "SHA-1");
resultAsBase64 = binaryEncode(binaryDecode(resultAsHex, "hex"), "base64");
writeDump(resultAsBase64);
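Either way, you can check the result against the documented digest from the question:
// Expected digest for these inputs, per the documentation
expected = "+LzcaRc+ndGAcZIXmq/N7xGes+k=";
writeOutput("Matches expected: " & (resultAsBase64 eq expected));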
I have an array. For example:
array('a'=>'abc','b'=>'pqr','c'=>'xyz');
From this I have encoded the value at key 'c', and I am now getting the encoded value for this key.
Next, I need to put this encoded value in place of the original value of the encoded key...
For example, I want output like this:
array('a'=>'abc','b'=>'pqr','c'=>'H162');
Please, can anybody help me?
Just assign a value to element 'c' of the array, e.g.:
$arr = array('a'=>'abc','b'=>'pqr','c'=>'xyz');
print_r($arr);
$arr['c'] = 'H162';
print_r($arr);
array_merge — Merge one or more arrays
If you just want to change the value of 'c' and you know the key, you can simply do something like this:
$your_array['c'] = NEW_VALUE;
But this has nothing to do with combining arrays. If you want to combine two arrays, have a look at http://php.net/manual/de/function.array-combine.php
I believe this is what you are trying to achieve:
$array1 = array('a'=>'abc','b'=>'pqr','c'=>'xyz');
$array2 = array('xyz'=>'test');
foreach ($array1 as $key => $element) {
    if (array_key_exists($element, $array2)) {
        $array1[$key] = $array2[$element];
    }
}
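With those two arrays, the loop rewrites only the entries whose values appear as keys in $array2:
print_r($array1);
// Array ( [a] => abc [b] => pqr [c] => test )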
Very broad question. I'll give an example with numbers; several encoding/decoding methods exist for strings.
First, define your encode/decode functions. (Note: in this example I work with positive numbers; you could write your own encoding methods for strings.) When you access your items, you must always know whether a value is encoded or not, so we always represent encoded numbers as negative numbers, and we assume negative numbers are encoded numbers. (For strings you could, for example, precede normal strings with "0" and encoded strings with "1"; other methods exist.)
// Very simple example functions; real ones would be more complex.
function encode($x) { return -$x * 2; }
function decode($x) { return -$x / 2; }
Now imagine an array:
$arr = array('a'=>123,'b'=>456,'c'=>789);
To encode the 'c':
$arr['c'] = encode($arr['c']);
...or encoding all items in your array:
foreach ($arr as $key => $val)
    $arr[$key] = encode($arr[$key]);
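After this loop, the array holds only (negative) encoded values:
print_r($arr);
// Array ( [a] => -246 [b] => -912 [c] => -1578 )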
For accessing the array members (note that the array must be passed in, since a PHP function does not see variables from the outer scope):
function getArrayMember($arr, $key)
{
    if ($arr[$key] < 0)             // This is an encoded number...
        return decode($arr[$key]);  // ...decode it.
    else                            // Normal numbers...
        return $arr[$key];          // ...return as is.
}
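For example, with the array encoded as above:
echo getArrayMember($arr, 'c'); // 789 -- decoded transparently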
This was for numbers. You could implement or find suitable encoding/decoding methods for strings.
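As a minimal sketch of the string variant described above (base64 here is only a stand-in for a real encoding scheme; the "0"/"1" prefix convention is the one suggested earlier):
// Plain strings are prefixed with "0", encoded strings with "1".
function encodeString($s) { return '1' . base64_encode($s); }
function markPlain($s)    { return '0' . $s; }

function readString($s) {
    // The first character tells us whether the value is encoded.
    return $s[0] === '1' ? base64_decode(substr($s, 1)) : substr($s, 1);
}

$arr = array('a' => markPlain('abc'), 'b' => markPlain('pqr'), 'c' => encodeString('xyz'));

foreach ($arr as $key => $val) {
    echo $key, ' => ', readString($val), PHP_EOL; // a => abc, b => pqr, c => xyz
}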
I'm trying to calculate the SHA1 hash of a Unicode string using T-SQL. The code below works fine with ASCII strings:
declare @input varchar(50)
set @input = 'some text'
print 'SHA1 Hash: ' + UPPER(master.dbo.fn_varbintohexsubstring(0, HashBytes('SHA1', @input), 1, 0))
but it calculates the wrong hash when I replace the first line of code with declare @input nvarchar(50).
Calculated hash (nvarchar): BBA91B680CE2685E9465DE24967E425CF055B10F
Calculated hash by a tool:  37AA63C77398D954473262E1A0057C1E632EDA77
How can I calculate the SHA1 hash of an nvarchar?
[EDIT]:
The C# code below generates the same hash as the tool I use for hashing:
// Computes the SHA1 hash of a given string
string ComputeHash(string input)
{
    string result = string.Empty;
    byte[] hash;
    // Encoding assumed here: UTF-8 bytes match the tool's output for this text.
    byte[] bytes = Encoding.UTF8.GetBytes(input);
    using (var sha = SHA1Managed.Create())
        hash = sha.ComputeHash(bytes);
    foreach (var b in hash)
        result += b.ToString("X2");
    return result;
}
Are you sure that the tool you are comparing against uses UTF-16 (Unicode) encoding? SHA1, like any hash, operates on bytes, so the same characters produce different hashes depending on the data type: varchar and nvarchar have different byte representations. Take a look at this link for a more detailed explanation.
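To see the effect directly in T-SQL (the hash values in the comments are the ones reported in the question):
-- varchar is one byte per ASCII character; nvarchar is UTF-16LE (two bytes
-- per character). HashBytes operates on those raw bytes, so the same
-- characters hash differently depending on the declared type.
declare @a varchar(50)  = 'some text'
declare @u nvarchar(50) = N'some text'

select HashBytes('SHA1', @a) as varchar_hash   -- 0x37AA63... (matches the tool)
select HashBytes('SHA1', @u) as nvarchar_hash  -- 0xBBA91B... (UTF-16LE bytes)

-- To reproduce the tool's hash from an nvarchar value, convert it first
-- (safe only when the text is representable in the target code page):
select HashBytes('SHA1', convert(varchar(50), @u)) as converted_hash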
The sha1 hash of "abc" is
a9993e364706816aba3e25717850c26c9cd0d89d
The only way to get Mathematica to tell you that with its Hash function is
Hash[abc, "SHA"] // IntegerString[#, 16]&
(The IntegerString thing is just to output it in hex like most implementations do.)
Note that
Hash["abc", "SHA"]
gives the hash of "\"abc\"" -- not what you want!
In fact, the only reason we could get the correct hash of "abc" was because the Mathematica representation of the symbol abc happens to be the string "abc".
For the vast majority of strings, this will not be the case.
So how do you take the hash of an arbitrary string in Mathematica?
You can do it less kludgily by using StringToStream and the fact that FileHash can take an input stream as an argument. Then your sha1 function becomes:
sha1[s_String] := Module[{stream = StringToStream[s], hash},
  hash = FileHash[stream, "SHA"];
  Close[stream];
  hash]
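Usage, with the same hex formatting as before:
sha1["abc"] // IntegerString[#, 16] &
(* "a9993e364706816aba3e25717850c26c9cd0d89d" *)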
Here's a kludge that works. Write the string to a temp file and use FileHash:
sha1[s_String] := Module[{stream, file, hash},
  stream = OpenWrite[];
  WriteString[stream, s];
  file = Close[stream];
  hash = FileHash[file, "SHA"];
  DeleteFile[file];
  hash]
You might also want to define
hex = IntegerString[#, 16]&;
and return hex@hash in the above function.