How to represent a letter not supported by iOS in Swift

I am trying to represent the Amharic letter "ሀ" like this:
let letter = "\u{1200}"
but when I run the app it shows me a question mark.
Does anyone know how to represent Unicode characters not supported by iOS in Swift?

If the rendering of the character is the problem, make sure you use a font that actually contains a glyph for that character.
If this is not a duplicate, it's about the font, not Swift or the representation in code.
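If you set the text in code, a minimal sketch of the idea, assuming a UILabel and a font that covers Ethiopic (the font name "Kefa" is only an assumed example, not something from the question):
import UIKit
let label = UILabel()
// "\u{1200}" is ETHIOPIC SYLLABLE HA ("ሀ"); the escape itself is valid Swift.
label.text = "\u{1200}"
// Explicitly pick a font that has Ethiopic glyphs. "Kefa" is an assumed example;
// if the chosen font lacks the glyph, iOS normally falls back to another font,
// and you only see a placeholder box when no installed font covers it at all.
label.font = UIFont(name: "Kefa", size: 24) ?? UIFont.systemFont(ofSize: 24)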

Related

Converting emoji from hex code to unicode

I want to use emojis in my iOS and Android app. I checked the list of emojis here and it lists the hex code for each emoji. When I try to use a hex code such as U+1F600 directly, I don't see the emoji within the app. I found one other way of representing an emoji, which looks like \uD83D\uDE00. When using this notation, the emoji appears within the app without any extra code. I think this is a Unicode string for the emoji, and I think this is more of a general question than one specific to emojis. How can I convert an emoji hex code to the Unicode string shown above? I didn't find any list where that form of the Unicode for the emojis is given.
It seems that your question is really one of "how do I display a character, knowing its code point?"
This question turns out to be rather language-dependent! Modern languages have little trouble with this. In Swift, we do this:
$ swift
Welcome to Apple Swift version 3.0.2 (swiftlang-800.0.63 clang-800.0.42.1). Type :help for assistance.
1> "\u{1f600}"
$R0: String = "😀"
In JavaScript, it is the same:
$ node
> "\u{1f600}"
'😀'
In Java, you have to do a little more work. If you want to use the code point directly you can say:
new StringBuilder().appendCodePoint(0x1f600).toString();
The sequence "\uD83D\uDE00" also works in all three languages. This is because those "characters" are actually what Unicode calls surrogates and when they are combined together a certain way they stand for a single character. The details of how this all works can be found on the web in many places (look for UTF-16 encoding). The algorithm is there. In a nutshell you take the code point, subtract 10000 hex, and spread out the 20 bits of that difference like this: 110110xxxxxxxxxx110111xxxxxxxxxx.
But rather than worrying about this translation, you should use the code point directly if your language supports it well. You might also be able to copy and paste the emoji character into a good text editor (make sure the encoding is set to UTF-8). If you need to use the surrogates, your best bet is to look up a Unicode chart that shows you something called the "UTF-16 encoding."
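As a rough Swift sketch of that algorithm (variable names here are just for illustration), deriving the surrogate pair for U+1F600:
let codePoint: UInt32 = 0x1F600
let offset = codePoint - 0x10000          // subtract 10000 hex
let high = 0xD800 + (offset >> 10)        // 110110 followed by the top 10 bits
let low  = 0xDC00 + (offset & 0x3FF)      // 110111 followed by the bottom 10 bits
print(String(high, radix: 16), String(low, radix: 16))   // d83d de00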
In Delphi XE, #$1F600 is equivalent to #55357#56832, i.e. the surrogate pair D83D DE00 for the grinning-face emoji.
Within a program, I use it in the following way:
const smilepage : array [1..3] of WideString =(#$1F600,#$1F60A,#$2764);
JavaScript - two-way conversion:
// character -> hex code point
let hex = "😀".codePointAt(0).toString(16);
// hex code point -> character
let emo = String.fromCodePoint(parseInt(hex, 16));
console.log(hex, emo);   // 1f600 😀
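For completeness, a comparable round trip sketched in Swift:
// character -> hex code point
let scalar = "😀".unicodeScalars.first!            // U+1F600
let hex = String(scalar.value, radix: 16)          // "1f600"
// hex code point -> character
if let value = UInt32(hex, radix: 16), let back = Unicode.Scalar(value) {
    print(hex, String(Character(back)))            // 1f600 😀
}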

Why does Julia return "\uf8ff" when I use the  (Apple logo) Unicode character?

I thought Julia supports raw unicode input, such as:
julia> test = "π£¢∞§"
"π£¢∞§"
julia> 😘 = 1 ;
julia> print(😘 )
1
However, it seems Julia does not support the  (Apple logo) character.
julia>  = 123
ERROR: syntax: invalid character ""
julia> test = ""
"\uf8ff"
I wonder what the underlying reason for that is, and whether there is a way I can use the  character in Julia?
I believe this link more properly explains the case of the Unicode character that you see as Apple's logo.
The problem is that the Unicode value used is one of several that are set aside for private use. That means that each operating system, application, or implementation is free to use those Unicode characters for anything it wants. It just so happens that Apple has chosen to use the Unicode character U+F8FF (decimal value 63743, or on the web as either &#63743; or &#xF8FF;) as the Apple logo. But some Windows fonts put in a Windows logo. And some other fonts put in a Klingon Mummification glyph. Or elven script. Or anything they want. And if it isn't defined in your local font, you'll just see a square.
My opinion is that Julia simply doesn't use this special value for anything. This also explains why your "π£¢∞§" characters work nicely - they are proper Unicode characters, more widely supported across platforms.
As a side note, I too see a simple square instead of the Apple logo on this instance.
Edit
Here is a list of Unicode characters supported by Julia.
To expand on Alex's answer...
Apple's logo () isn't an official Unicode symbol. I think there are very few commercial logos and symbols in the main Unicode tables.
However, Unicode provides some 'anything goes' areas (called PUAs - private use areas) that companies and individuals can fill with their own symbols, so that their users can access certain special glyphs. The main PUA is U+E000 to U+F8FF. Depending on which font you're using, you'll find all kinds of stuff assigned to these codes. On a Mac, I can usually get the Apple logo at "\uf8ff", with the right font selected, but not the Ubuntu symbol or the Windows logo, unless I choose another font. (There's also a fallback mechanism, whereby if you request a code point that the current font doesn't have, the OS will find a suitable substitute in another font and use that.)
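If you want to check in code whether a scalar falls into a private use area, here is a small Swift sketch (how the character renders still depends entirely on the font):
let appleLogo = Unicode.Scalar(0xF8FF as UInt32)!   // U+F8FF, the last code point of the main PUA
// PUA code points carry the general category "privateUse".
print(appleLogo.properties.generalCategory == .privateUse)   // true
// Renders as the Apple logo, a box, or something else, depending on the current font.
print(String(Character(appleLogo)))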
In Julia, you can only use certain Unicode characters for variable names. Julia wouldn't allow anything from the private use area anyway, unless some fonts were distributed to every computer and everyone agreed on who had which Unicode point. (Mathematica makes extensive use of PUA symbols in their notebooks, because they can and do install their own fonts, and can then access various glyphs from the PUA in the notebook with guaranteed results.)
You are allowed to use emoji characters as variable names, so you could try the emoji apple (🍎), rather than the Apple apple:
julia> 🍎 = 3
3

Does iOS support all Unicode emojis?

Hello All,
I have a problem regarding Unicode characters. I'm able to display Apple artwork Unicode characters in a UITextView.
Like this:
self.textView.text = @"\ue00A";
That is okay.
But now I have many Unicode characters which are not in the Apple artwork set.
One of them is U+1F3C7.
Now I'm trying to show it in the UITextView:
self.textView.text = @"\u1f3c7";
Then it shows me a special character instead of the emoji.
This is supposed to be the emoji icon for that code point, but it is showing me Ἴ7.
Doesn't Apple support all Unicode characters?
How can I add my own emojis to my application?
Let me know if my question is not clear to you.
Doesn't Objective-C use UTF-16 internally, like Java and C#?
If so, then U+1F3C7 wouldn't be "\u1f3c7", but the surrogate-pair, "\uD83C\uDFC7".
Otherwise, there has to be some way to indicate a higher character, because "\u1f3c7" is the same as "\u1f3c" + "7", which is Ἴ7 (capital iota with psili and oxia, then 7).
Edit: After some discussion between the OP and me, we figured out that the way to do this in Objective-C is the one I know as the C++ way:
"\U0001F3C7"
(\uXXXX with a lowercase u and 4 hex digits works if the code point fits in those 4 hex digits; \UXXXXXXXX with a capital U and 8 hex digits works for everything, but is longer to type.)
Now our friend just needs to deal with the matter of font support, which alas is another problem in getting this to actually look as he wants.
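For reference, Swift's \u{...} escape accepts 1 to 8 hex digits, so a sketch like the following should work there without surrogates or the 8-digit form:
// U+1F3C7 (HORSE RACING) works directly with Swift's braced escape:
let horseRacing = "\u{1F3C7}"
print(horseRacing)   // 🏇, provided the font has a glyph for it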

How to convert Unicode escape codes to characters in Objective-C (on iPhone)

I have a string that contains Unicode escape codes, e.g. @"D\u017cem" (\u017c is the code for ż). I would like to convert that string to one containing the actual characters. In this example that would be @"Dżem".
Is there any method in the SDK or a library that can do such a replacement AND work on the iPhone?
(Obviously I could do the replacement myself, changing the characters one by one, but that is rather cumbersome.)
According to Apple,
It is not safe to include high-bit characters in your source code
Note that the "universal character name" \u017c is replaced at compile time with an implementation-defined value which in practice is the UTF-8 representation, so the end result is the same as what you would get if you (correctly) did the replacement you are talking about. If you're having a problem with some other source-processing tool, you might be better served by teaching that tool to recognize C99 universal character names.
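If the escape sequences only turn up at run time (for example in loaded data rather than in compiled literals), one possible approach is Foundation's ICU string transforms; a hedged Swift sketch, assuming the input uses the Java-style \uXXXX form:
import Foundation
// The string contains a literal backslash-u sequence, as it might arrive from a file.
let escaped = "D\\u017cem"
// "Any-Hex/Java" turns characters into \uXXXX escapes; reverse: true goes the other way.
if let unescaped = escaped.applyingTransform(StringTransform(rawValue: "Any-Hex/Java"), reverse: true) {
    print(unescaped)   // Dżem
}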
I suggest starting to use NSLocalizedString():
http://www.pushplay.net/2009/08/developing-localized-iphone-applications/
http://developer.apple.com

How to safely encode µ or the micro-symbol into an NSString programmatically?

I need a representation of µ or "micro". That funny small u with the long tail on the left side. Maybe you can see it here: µ
Some weeks ago I was reading in the docs that it's a bad idea to type special characters into the source code. So to prevent problems, could I encode that special character µ somehow, the way web folks do with an entity like &micro;, in an NSString? And if so, is there an overview of these codes or a way to get the correct code?
Take a look at this thread:
How do I escape a Unicode character in my Objective-C source code?
NSString *stuff = @"The Greek letter Beta looks like this: \u03b2";
For the iPhone, and for Mac OS X using the Xcode 3.x tool chain (targeting 10.2 or later, which you must be if you're using Xcode 3.x), it is safe and supported to use a literal µ in the string constant. The only caveat is that you must set the -finput-charset command-line option if your source files are not UTF-8 or UTF-16.
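If you would rather keep the source file plain ASCII, the escape form works too; a small Swift sketch (the equivalent Objective-C literal would be @"\u00b5"):
// U+00B5 is MICRO SIGN, so \u{00B5} avoids putting the character itself in the source.
let micro = "\u{00B5}"
print("1 \(micro)s = 0.000001 s")   // 1 µs = 0.000001 s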