I have an SQLite text column whose value I'm assigning to a string, and I need to make sure it isn't nil before assigning. I'm doing this:
char *isNil = sqlite3_column_text(selectstmt, 2);
if(isNil != nil){
    myName = [NSString stringWithUTF8String:(char *)sqlite3_column_text(selectstmt, 2)];
}
which gives the warning:
warning: initialization discards qualifiers from pointer target type
What is the proper way to do it?
You're getting the warning because you're ignoring the const. The API is declared as:
const unsigned char *sqlite3_column_text(sqlite3_stmt*, int iCol);
You're assigning the return to a char*, so you're dropping the const. That is why you get the warning. You should respect the const.
const unsigned char *isNil = ...
I'm not really a huge Objective-C guy, but I think stylistically it's common practice to compare plain C pointers against NULL rather than nil. Also, there is no need to call sqlite3_column_text twice.
const char *columnText = (const char *)sqlite3_column_text(selectstmt, 2);
if(columnText != NULL)
{
    myName = [NSString stringWithUTF8String:columnText];
}
You can see above that I've cast the const unsigned char pointer to a const signed char pointer. Whenever you silence a warning with a cast, make sure the cast is actually the right thing to do. In this case it is safe to cast to a signed char. In general, though, never cast away const: whoever made that API could be doing something that requires you to treat the data as const.
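For context, here is a minimal sketch of the whole pattern with a NULL check, assuming an open database handle named db; the SQL and column index are made up for illustration:

#import <Foundation/Foundation.h>
#import <sqlite3.h>

// Minimal sketch: read column 2 of each row, treating SQL NULL as an empty string.
sqlite3_stmt *selectstmt = NULL;
if (sqlite3_prepare_v2(db, "SELECT id, age, name FROM people", -1, &selectstmt, NULL) == SQLITE_OK) {
    while (sqlite3_step(selectstmt) == SQLITE_ROW) {
        const char *columnText = (const char *)sqlite3_column_text(selectstmt, 2);
        // sqlite3_column_text returns NULL for a SQL NULL, so guard before building an NSString.
        NSString *myName = columnText ? [NSString stringWithUTF8String:columnText] : @"";
        NSLog(@"name = %@", myName);
    }
}
sqlite3_finalize(selectstmt);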
I'd like to return a String back to Swift using this code:
MyFile.h:
+ (char *) myCoolCode;
MyFile.mm:
+(string)myCoolCode {
string myCoolString = "";
myCoolString += "123";
return myCoolString;
}
MyFile.swift:
let superCoolString = MyBridge.myCoolCode()
print(superCoolString)
But it obviously doesn't work the right way, because it's crashing somewhere deep inside.
As others have already pointed out in the comments, you should fix the return type in your .mm file to char *, not string; the declaration and the definition must use the same return type. An example implementation could be:
+ (char *)myCoolCode
{
    // Static storage so the buffer outlives this call; returning a pointer to a
    // local (automatic) array would be undefined behavior.
    static char str[] = "foobar";
    // You can do whatever you want with str here. Just make sure it's null terminated.
    return str;
}
Then in your swift code:
let myCoolString = String(cString: MyBridge.myCoolCode())
print(myCoolString)
Reference on the string constructor is here.
The reason your code was crashing is probably that you were returning an instance of std::string, which doesn't really work in Swift. You can use std::string internally, but you have to convert it to char * when returning it. You can do so as is shown here.
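A minimal sketch of that conversion, assuming the method lives in an Objective-C++ (.mm) file as in the question (the choice of who frees the copy is up to you):

#include <string>
#include <cstring>

+ (const char *)myCoolCode
{
    std::string myCoolString = "";
    myCoolString += "123";
    // strdup copies the bytes onto the heap so the pointer stays valid after the
    // std::string goes out of scope; the caller is responsible for free()ing it.
    return strdup(myCoolString.c_str());
}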
I just started with Swift this week, specifically Swift 4, and I'm using a C library through a bridging header, liblo, which handles sending/receiving OSC (Open Sound Control) formatted messages over a network socket.
I'm able to start a server thread, receive OSC messages via C callback->Swift closure, and read the numeric argument values with Swift just fine. I'm running into trouble, however, with reading string values.
The liblo message argument type lo_arg is a C typedef for a union and the string argument types are declared as simple chars which are mapped to Swift as Int8.
In C, you can grab the string via &argv[i]->s from your callback's lo_arg **argv array. In an Obj-C project with liblo, I use:
// get string value of first argument
lo_arg *arg = argv[0];
NSString *s = [NSString stringWithUTF8String:&arg->s];
// do something with s
In Swift, I've tried getting the address of the Int8 and feeding it to String which works, but only grabs the first character:
// get string value of first argument
if var arg : lo_arg = argv?.pointee![0] {
withUnsafePointer(to: &arg.s) {
let s = String(cString: $0)
// so something with s
}
}
Am I doing something wrong? I would think these would be equivalent, but passing $0 to strlen(), a la print("strlen: \(strlen($0))"), only prints a length of 1. I've verified with a non-Swift test program that a multi-character string is indeed being sent. I'm wondering now if Swift is somehow assuming the string is a single character instead of the head address of a C string, and/or I need some further pointer conversion.
After some digging, I can confirm Swift truncates the lo_arg->s & lo_arg->S string values to 8 bytes on my 64-bit system. This happens when reading the string from an lo_arg in Swift; reading the same value in C works fine, so Swift seems to allow reading only from the storage reserved for the union member itself. Forwarding the lo_arg from Swift to C and printing the string via printf() also shows strings truncated to 8 characters.
The quick fix is to avoid reading the strings from the lo_arg as imported by Swift: grab the lo_arg from the raw lo_message in C and cast the char "pointer" to a const char *, which Swift will understand as a variable-length string. Here are some working utility functions I added to my bridging header:
/// return an lo_message argv[i]->s in a format Swift can understand as a String
const char* lo_message_get_string(lo_message message, int at) {
return (const char *)&lo_message_get_argv(message)[at]->s;
}
/// return an lo_message argv[i]->S in a format Swift can understand as a String
const char* lo_message_get_symbol(lo_message message, int at) {
return (const char *)&lo_message_get_argv(message)[at]->S;
}
In Swift, I can then convert to a String:
let s = String(cString: lo_message_get_string(msg, 0))
// do something with s
I am getting a casting error. My app reads a text file from a webpage using the stringWithContentsOfURL method, and I want to parse the individual lines into separate components. This is a snippet of the code:
int parameterFive_1 = 0;
parameterFive_1_range = NSMakeRange(0, 10);
lines = [response componentsSeparatedByString:@"\r"];
parameterFive_1 = CFStringGetIntValue([[lines objectAtIndex:i] substringWithRange:parameterFive_1_range]);
I am getting the following error message:
" Implicit conversion of an Objective-C pointer to 'CFStringRef' (aka 'const struct __CFString *') is disallowed with ARC"
I thought it might be the compiler option but changing it to the default is not making a difference. Can anyone provide any insight?
Just cast the NSString* to CFStringRef to satisfy ARC:
parameterFive_1 = CFStringGetIntValue((__bridge CFStringRef)[[lines objectAtIndex:i] substringWithRange:parameterFive_1_range]);
The __bridge keyword here lets ARC know that it doesn't need to transfer ownership of the string.
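Putting it together, a minimal sketch of the corrected snippet (response and i are assumed to exist, as in the question):

NSArray *lines = [response componentsSeparatedByString:@"\r"];
NSRange parameterFive_1_range = NSMakeRange(0, 10);
NSString *field = [[lines objectAtIndex:i] substringWithRange:parameterFive_1_range];
// NSString and CFString are toll-free bridged, so the __bridge cast just tells
// ARC that no ownership changes hands.
int parameterFive_1 = (int)CFStringGetIntValue((__bridge CFStringRef)field);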
I'm trying to extract a string (which contains an integer) from an array and then use it as an int in a function. I'm trying to convert it to an int using intValue.
Here's the code I've been trying.
NSArray *_returnedArguments = [serverOutput componentsSeparatedByString:@":"];
[_appDelegate loggedIn:usernameField.text:passwordField.text:(int)[[_returnedArguments objectAtIndex:2] intValue]];
I get this error:
passing argument 3 of 'loggedIn:::' makes pointer from integer without a cast
What's wrong?
I really don't know what was so hard about this question, but I managed to do it this way:
[myStringContainingInt intValue];
It should be noted that you can also do:
myStringContainingInt.intValue;
You can just convert the string with [str intValue] or [str integerValue].
integerValue
Returns the NSInteger value of the receiver’s text.
- (NSInteger)integerValue
Return Value
The NSInteger value of the receiver’s text, assuming a decimal representation and skipping whitespace at the beginning of the string. Returns 0 if the receiver doesn’t begin with a valid decimal text representation of a number.
For more information, refer here.
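As a quick illustration of that parsing behavior (the example strings are made up):

NSString *s = @"  42abc";
NSLog(@"%d", [s intValue]);              // 42: leading whitespace skipped, parsing stops at 'a'
NSLog(@"%ld", (long)[s integerValue]);   // 42
NSLog(@"%d", [@"abc" intValue]);         // 0: no leading decimal digits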
NSArray *_returnedArguments = [serverOutput componentsSeparatedByString:@":"];
_returnedArguments is an array of NSStrings, which is what the UITextField text property expects. No need to convert.
Syntax error:
[_appDelegate loggedIn:usernameField.text:passwordField.text:(int)[[_returnedArguments objectAtIndex:2] intValue]];
If your _appDelegate has a passwordField property, then you can set the text using the following
[[_appDelegate passwordField] setText:[_returnedArguments objectAtIndex:2]];
Basically, the third parameter in loggedIn should not be an integer; it should be an object of some kind, but we can't know for sure because you did not name the parameters in the method call. Provide the method signature so we can see for sure. Perhaps it takes an NSNumber or something.
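For illustration only, here is a hypothetical signature that would make the call work by boxing the int; neither the name userId nor the NSNumber type is from the question:

// Hypothetical delegate method matching the selector loggedIn::: from the error message.
- (void)loggedIn:(NSString *)username :(NSString *)password :(NSNumber *)userId;

// Call site: box the parsed int with @( ) so an object is passed as the third argument.
[_appDelegate loggedIn:usernameField.text
                      :passwordField.text
                      :@([[_returnedArguments objectAtIndex:2] intValue])];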
Keep in mind that international users may be using a decimal separator other than ., in which case values can get mixed up or silently truncated when calling intValue on a string.
For example, in many European locales 1.23 is written 1,23, so a user there would enter the number 1.777 as 1,777; intValue stops parsing at the comma and returns 1, while naively stripping the comma as a thousands separator would turn it into 1777, neither of which may be what the user meant.
I've made a macro that converts input text to an NSNumber based on a locale argument, which can be nil (if nil, it uses the device's current locale).
#define stringToNumber(__string, __nullable_locale) (\
    (^NSNumber *(void){\
        NSLocale *__locale = __nullable_locale;\
        if (!__locale) {\
            __locale = [NSLocale currentLocale];\
        }\
        NSString *__string_copy = [__string stringByReplacingOccurrencesOfString:__locale.groupingSeparator withString:@""];\
        __string_copy = [__string_copy stringByReplacingOccurrencesOfString:__locale.decimalSeparator withString:@"."];\
        return @([__string_copy doubleValue]);\
    })()\
)
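A hypothetical usage of the macro above (the locale identifier and input strings are made up):

NSLocale *german = [NSLocale localeWithLocaleIdentifier:@"de_DE"];
NSNumber *a = stringToNumber(@"1.234,56", german);   // grouping "." stripped, "," becomes "." => 1234.56
NSNumber *b = stringToNumber(@"1,777", nil);         // parsed with the device's current locale
int i = [a intValue];                                // 1234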
If I understood you correctly, you need to convert your NSString to an int? Try this piece of code:
NSString *stringWithNumberInside = [_returnedArguments objectAtIndex:2];
int number;
// %d parses a decimal integer into the variable declared above
sscanf([stringWithNumberInside UTF8String], "%d", &number);
I am trying to convert an unsigned char * to an int * in Objective-C on the iPhone. Is there any API that can help with the conversion?
Here's my best attempt:
-(BOOL)MyFunc:Version:(int *)nVer
{
unsigned char * uszVerSize;
//trying to assign string to int
nVer = uszVerSize[0] ;
}
Dear Lord, I think you have bigger problems than the one stated above.
You need to convert the chars to an int and return that.
return [[NSNumber numberWithUnsignedChar:uszVerSize[0]] intValue];
You should also learn about pointers and how ints and chars differ in memory. You can't just assign a pointer to a char to a pointer to an int.
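If the goal is to hand the value back through the int * parameter, a minimal sketch might look like this (the method name and the example data are assumptions, not from the question):

- (BOOL)myVersion:(int *)nVer
{
    unsigned char uszVerSize[4] = { 3, 0, 0, 0 };   // assumed example data
    if (nVer == NULL) {
        return NO;
    }
    // Dereference the out-parameter and let the unsigned char promote to int.
    *nVer = uszVerSize[0];
    return YES;
}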
NSMutableArray *asciiMArray = [NSMutableArray array];
const char *asciiCString = [@"abc-zABC-Z" UTF8String];
int cStringLen = (int)[@"abc-zABC-Z" length];
int i;
for (i = 0; i < cStringLen; i++) {
    [asciiMArray addObject:[[NSNumber alloc] initWithInteger:asciiCString[i]]];
    printf("%d\n", asciiCString[i]);
}
for (i = 0; i < cStringLen; i++) {
    NSLog(@"%@", [asciiMArray objectAtIndex:i]);
    printf("%d\n", asciiCString[i]);
}
This is some code I wrote yesterday to test things during my learning phase.
It may look naive... but if it helps you...
asciiCString[i] gives you the ASCII value of the char at that index.
asciiMArray is an NSMutableArray object.