It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center.
Closed 11 years ago.
How can I decode Unicode characters from a URL? I specified response.charset="UTF-8" in my request, and I received percent-encoded characters like %e3%81%a4%e3%82%8c%e3%. How can I convert these to something I can display on my form?
RFC 3986 specifies how to interpret this. You first decode the percent-escaped byte values in the standard way. Then you interpret the byte stream as UTF-8 to reconstruct the characters. You can find more information here.
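As a sketch of those two steps in Python (using a complete two-character sample, since the string in the question is truncated), `urllib.parse.unquote` percent-decodes and applies UTF-8 in one call:

```python
from urllib.parse import unquote

# Step 1: decode the %XX escapes to raw bytes; step 2: interpret
# those bytes as UTF-8. unquote() performs both steps at once.
encoded = "%e3%81%a4%e3%82%8c"  # a complete sample; the question's string is cut off
decoded = unquote(encoded, encoding="utf-8")
print(decoded)  # つれ
```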
Closed 9 years ago.
La> ila>ha illAlla>hu wah}dahu> la> shari>ka lahu, lahul mulku wa lahul h}amdu, wa huwa ‘ala> kulli shai’in nadir.
This is a transliteration of an ayah. The font used is Times New Arabic. After applying this font I didn't see any changes; characters like ">" didn't disappear. Any solution?
Use:
NSString *ayah = @"La> ila>ha illAlla>hu wah}dahu> la> shari>ka lahu, lahul mulku wa lahul h}amdu, wa huwa ‘ala> kulli shai’in kadir.";
ayah = [ayah stringByReplacingOccurrencesOfString:@">"
                                       withString:@""];
ayah = [ayah stringByReplacingOccurrencesOfString:@"}"
                                       withString:@""];
Closed 10 years ago.
I am working on an iOS application. In this application I am trying to convert an NSString to hexadecimal. But in some cases the NSString contains special characters such as $, ¥, etc. This is where I am facing a problem: these characters don't convert to hexadecimal.
Is there any way to convert special characters to hexadecimal?
Convert the string to NSData first:
[myString dataUsingEncoding:NSUTF8StringEncoding]
Then, using the data object's bytes, you can output any base you want: octal, hex, decimal, binary.
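The same two-step idea (encode the string to bytes, then format each byte) can be sketched in Python; the characters the questioner mentions are simply multi-byte in UTF-8:

```python
def to_hex(s: str) -> str:
    # Encode to UTF-8 first; a character like ¥ becomes two bytes.
    return s.encode("utf-8").hex()

print(to_hex("$"))  # 24
print(to_hex("¥"))  # c2a5
```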
Closed 10 years ago.
I have in my database records which contains Polish characters like: ś, ć, ż, ź.
This happens to be a problem for me when I try to execute some of the SELECT statements, because I get my text but, instead of the characters I wrote above, I get: <c4><85>.
I bet there is a way to change the encoding, for example to UTF-8, but how can I do that for a simple query like select * from table?
As you've indicated this is on the console, you must first check your console encoding before starting psql.
See Unicode characters in Windows command line - how? for details of how to do this in Windows.
This must be done because even if you do get psql to read / write in UTF8 your console won't necessarily understand the characters and will not display them correctly.
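Incidentally, the <c4><85> in the question is the raw UTF-8 byte pair for the Polish letter ą; a quick Python check illustrates what the console was actually showing:

```python
raw = b"\xc4\x85"           # the bytes the console printed as <c4><85>
print(raw.decode("utf-8"))  # ą — the console was showing undecoded UTF-8 bytes
```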
Once you've confirmed that your console can accept UTF-8 encoding, make sure that psql has picked this encoding up:
show client_encoding;
client_encoding
-----------------
UTF8
(1 row)
If that doesn't show UTF8, then you can use:
set client_encoding = UTF8;
As a general rule: if your program is expecting to use UTF8, there is no harm in setting the client encoding blindly (without checking what it is to start with).
http://www.postgresql.org/docs/current/static/multibyte.html
Note:
The above link is for the current version. As the OP has asked for version 8.0, here is the link for the 8.0 manual:
See http://www.postgresql.org/docs/8.0/static/multibyte.html
Closed 10 years ago.
As seen here, they have the line ZG9udGJlYWhhdGVyc3RhcnR1cCtoYWNrZXJuZXdzQGdtYWlsLmNvbQ==.
How would one go about decoding this line of Base64?
Base64-decode it. For example, put it in this online decoder: http://www.opinionatedgeek.com/dotnet/tools/base64decode/
BTW this is not encryption, it's encoding.
This is simple base64 encoding; one way to decode it is to use openssl:
echo 'ZG9udGJlYWhhdGVyc3RhcnR1cCtoYWNrZXJuZXdzQGdtYWlsLmNvbQ==' | openssl base64 -d
Use a base64 decoder. Or - specify a language you would like to use and I can give you some example code.
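For instance, a minimal Python version (assuming the decoded bytes are plain ASCII text, as they are here):

```python
import base64

encoded = "ZG9udGJlYWhhdGVyc3RhcnR1cCtoYWNrZXJuZXdzQGdtYWlsLmNvbQ=="
decoded = base64.b64decode(encoded).decode("ascii")
print(decoded)  # dontbeahaterstartup+hackernews@gmail.com
```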
BTW: I decoded this using this online decoder:
http://www.convertstring.com/EncodeDecode/Base64Decode
It decodes to
dontbeahaterstartup+hackernews@gmail.com
Base64 encoding is explained here: Base64 decoder
ZG9udGJlYWhhdGVyc3RhcnR1cCtoYWNrZXJuZXdzQGdtYWlsLmNvbQ== => dontbeahaterstartup+hackernews@gmail.com
Closed 9 years ago.
What's 8BITMIME? What's the difference between 7bit and 8bit?
How should I understand them?
SMTP was originally specified using the pure ASCII character set. What a lot of people forget (or never get taught) is that the original ASCII is a 7-bit character set.
With much of the computing world using octets (8-bit bytes) or multiples thereof, some applications started, very unwisely, using the 8th bit for internal use, and so SMTP never got the chance to easily move to an 8-bit character set.
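The practical meaning of "7-bit" is just that every byte stays below 0x80; a short sketch makes the distinction concrete:

```python
def is_7bit_clean(data: bytes) -> bool:
    # 7-bit ASCII only uses values 0-127; any byte with the high bit
    # set needs 8BITMIME (or an encoding like quoted-printable).
    return all(b < 0x80 for b in data)

print(is_7bit_clean("plain ASCII mail".encode("utf-8")))  # True
print(is_7bit_clean("café".encode("utf-8")))              # False
```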
8BITMIME, which you can read about in excruciating detail in RFC 1652 (since obsoleted by RFC 6152), or in a decent summary at wikipedia, is a way for SMTP servers that support it to transmit email using 8-bit character sets in a standards-compliant way that won't break old servers.
In practice, most of the concerns that led to this sort of thing are obsolete, and a lot of SMTP servers will even happily send/receive 8-bit character sets in "plain" SMTP mode (though that's not exactly the wisest decision in the world, either), but we're left with this legacy because, well, "if it ain't broke" (for very strict definitions of "broke")...