Why is C99's bool a macro rather than a typedef?

Why does the boolean type support introduced in C99 use the preprocessor rather than the language's own facilities? Specifically, why do we have:
#define bool _Bool
#define true 1
#define false 0
in <stdbool.h> rather than:
typedef _Bool bool;
enum { false = 0, true = 1 };
I guess the enum can be seen as a matter of taste. But why not have a typedef?

From section 7.18/3 of the C11 specification:
The remaining three macros are suitable for use in #if preprocessing directives.
The specification lists true, false and __bool_true_false_are_defined.
The specification also states (in 7.18/4) that a program may undefine and perhaps then redefine the bool, true and false macros.
The last part, about undefining them, is (I guess) there because much legacy code that existed when C99 was published used its own definitions and variations of boolean types and values; making the names removable macros means the new header does not invalidate that existing code.
So they are macros so they can be used in preprocessor conditions, and so they can be undefined by a program.
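As a small illustration (my own example, not taken from the standard), both points are easy to demonstrate: the names are visible to the preprocessor, and a program is allowed to remove them again:

#include <stdbool.h>

/* Works because true/false are macros the preprocessor can expand;
   an enum constant would silently evaluate to 0 in an #if instead. */
#if defined(__bool_true_false_are_defined) && true && !false
/* ... code that relies on <stdbool.h> semantics ... */
#endif

/* Permitted by 7.18/4: a program may undefine (and perhaps redefine)
   the macros, e.g. to keep legacy code with its own bool/true/false
   definitions compiling. */
#undef bool
#undef true
#undef false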

Related

Converting Pascal's variant record to Swift

I am converting a program written in Pascal to Swift and some Pascal features do not have direct Swift equivalents such as variant records and defining sets as types. A variant record in Pascal enables you to assign different field types to the same area of memory in a record. In other words, one particular location in a record could be either of type A or of type B. This can be useful in either/or cases, where a record can have either one field or the other field, but not both. What are the Swift equivalents for a variant record and a set type like setty in the Pascal fragment?
The Pascal code fragment to be converted is:
const
  strglgth = 16;
  sethigh = 47;
  setlow = 0;
type
  setty = set of setlow..sethigh;
  cstclass = (reel, pset, strg);
  csp = ^constant; { pointer to constant type }
  constant = record case cclass: cstclass of
    reel: (rval: packed array [1..strglgth] of char);
    pset: (pval: setty);
    strg: (slgth: 0..strglgth;
           sval: packed array [1..strglgth] of char)
  end;
var
  lvp: csp;
My partial Swift code is
let strglgth = 16
let sethigh = 47
let setlow = 0
enum cstclass : Int {case reel = 0, pset, strg}
var lvp: csp
Any advice is appreciated. Thanks in advance.
Variant records in Pascal are very similar to unions in C.
So this link will probably be helpful:
https://developer.apple.com/documentation/swift/imported_c_and_objective-c_apis/using_imported_c_structs_and_unions_in_swift
In case the link ever goes dead, here's the relevant example:
union SchroedingersCat {
    bool isAlive;
    bool isDead;
};
In Swift, it’s imported like this:
struct SchroedingersCat {
    var isAlive: Bool { get set }
    var isDead: Bool { get set }
    init(isAlive: Bool)
    init(isDead: Bool)
    init()
}
That would be more like a functional port. It does not take care of the fact that variant records are actually meant to use the same piece of memory in different ways, so if you have some low-level code that reads from a stream, or you have a pointer to such a structure, this might not help you.
In that case you might want to try just reserving some bytes and writing different getters/setters to access them. That would even work if you have to port more complex structures, like nested variant types.
But overall, if possible, I'd recommend avoiding a too-literal port of such structures and using idioms that match Swift better, as sketched below.
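A minimal sketch of that idiomatic route (the type and case names are my own, chosen to mirror the Pascal fragment): model the record as an enum with associated values, so the tag and the payload can never disagree.

enum Constant {
    case reel(rval: [Character])              // packed array [1..strglgth] of char
    case pset(pval: Set<Int>)                 // set of setlow..sethigh
    case strg(slgth: Int, sval: [Character])
}

// Usage: the tag (cstclass) and the variant data are handled together.
var lvp: Constant = .strg(slgth: 5, sval: Array("hello"))
switch lvp {
case .reel(let rval):
    print("reel:", String(rval))
case .pset(let pval):
    print("pset:", pval)
case .strg(let slgth, let sval):
    print("strg:", slgth, String(sval))
}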

How to implement Custom Log Levels in CocoaLumberJack from Swift?

I am using CocoaLumberjack for a Swift project. I would like to implement custom log levels/flags, as I would like to use 6 rather than the default 5, and would prefer different names.
The documentation for doing this is not helpful: it only shows a solution for Objective-C.
The fact that DDLogFlag is defined as NS_OPTIONS means I actually could simply ignore the pre-defined values here, create my own constants, and just write some wrapping code to convert from one to the other.
However, DDLogLevel is defined as NS_ENUM, which means Swift won't be very happy with me trying to instantiate something to say 0b111111, which isn't an existing value in the enum. If it were an NS_OPTIONS, like DDLogFlag, I could just ignore the pre-existing definitions from the library and use whatever valid UInt values I wanted to.
As far as I can tell, I just have to write some Objective-C code to define my own replacements for DDLogLevel, DDLogFlag, and write a custom function to pass this in to and access these properties on DDLogMessage. But this feels bad.
How can I use my own custom logging levels in Swift with CocoaLumberjack?
This is indeed only possible in Objective-C right now, and even there only for the #define log macros. Even then, I could imagine that the "modern" ObjC compiler will warn about the types that are passed to DDLogMessage.
The docs are indeed a bit outdated here and stem from a time when Objective-C was closer to C than it is to Swift nowadays... :-)
Nevertheless, in the end DDLogLevel and DDLogFlag are both stored as NSUInteger, which means they can theoretically take any NSUInteger value (aka UInt in Swift).
To define your own levels, you would simply create an enum MyLogLevel: UInt { /*...*/ } and then write your own logging functions.
Those functions can actually forward to the existing functions:
extension DDLogFlag {
    public static let fatal = DDLogFlag(rawValue: 0x0001)
    public static let failure = DDLogFlag(rawValue: 0x0010)
}

public enum MyLogLevel: UInt {
    case fatal = 0x0001
    case failure = 0x0011
}

extension MyLogLevel {
    public static var defaultLevel: MyLogLevel = .fatal
}

@inlinable
public func LogFatal(_ message: @autoclosure () -> Any,
                     level: MyLogLevel = .defaultLevel,
                     context: Int = 0,
                     file: StaticString = #file,
                     function: StaticString = #function,
                     line: UInt = #line,
                     tag: Any? = nil,
                     asynchronous async: Bool = asyncLoggingEnabled,
                     ddlog: DDLog = .sharedInstance) {
    _DDLogMessage(message(),
                  level: unsafeBitCast(level.rawValue, to: DDLogLevel.self),
                  flag: .fatal,
                  context: context,
                  file: file,
                  function: function,
                  line: line,
                  tag: tag,
                  asynchronous: async,
                  ddlog: ddlog)
}
The unsafeBitCast of the raw value works because, in the end, the level is just a UInt, and _DDLogMessage does not switch over the level but instead does a bit-mask check against the flag.
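A quick sketch of that idea (assuming the definitions above; the resulting value is deliberately not one of the declared DDLogLevel cases):

let customLevel = unsafeBitCast(MyLogLevel.failure.rawValue, to: DDLogLevel.self)
// customLevel.rawValue == 0x0011: both the fatal and the failure flag bits are set,
// so a message logged with either flag passes the level's bit-mask check.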
Disclaimer: I'm a CocoaLumberjack maintainer myself.
We don't recommend using a custom log level in Swift. There's not much benefit from it and logging frameworks like swift-log also use predefined log levels.
However, I personally could also imagine declaring DDLogLevel with NS_OPTIONS instead of NS_ENUM. The OSLog Swift overlay also uses an extensible OSLogType.
If this is something you'd like to see, please open a PR so we can discuss it with the team. We need to be a bit careful with API compatibility, but like I said it's totally doable.
On a side-note: May I ask what you need custom levels for?

Swift short syntax of execution

I am looking for a way to write this with a shorter syntax.
For instance, in JS, PHP, etc.:
var a = 1;
function Foo() {}
a && Foo();
If a is truthy, run Foo.
a and Foo() themselves already express the existence check, and the syntax looks much better.
However, in Swift, the type checking is rather strict.
var a = 1
func Foo() -> Void {}
a && Foo()
generates an error because neither operand is a Bool.
a != nil && Foo()
resolves the variable condition, but is there a better shortcut for the function condition? I just don't want to write something like
if( a != nil ) { Foo() }
And what is the shorter syntax for "does not exist"?
if ( !a ) or just !a // is easy and looks better...
I found nothing similar in Swift...
if( a == nil ) // will throw an error when it's not an optional type
guard var b = xxx else {} // checks existence only, and the syntax is very long
Thank you for your advice!
As mentioned by other contributors, Swift emphasizes readability and thus, explicit syntax. It would be sacrilege for the Swift standard library to support Python-style truth value testing.
That being said, Swift’s extensibility allows us to implement such functionality ourselves—if we really want to.
prefix func !<T>(value: T) -> Bool {
    switch T.self {
    case is Bool.Type:
        return value as! Bool
    default:
        guard Double(String(describing: value)) != 0
        else { return false }
        return true
    }
}

prefix func !<T>(value: T?) -> Bool {
    guard let unwrappedValue = value
    else { return false }
    return !unwrappedValue
}
var a = 1
func foo() -> Void { }
!a && !foo()
Or even define our own custom operator:
prefix operator ✋

prefix func ✋<T>(value: T) -> Bool {
    // Same body as the previous example.
}

prefix func ✋<T>(value: T?) -> Bool {
    guard let unwrappedValue = value
    else { return false }
    return ✋unwrappedValue
}
var a = 1
func foo() -> Void { }
✋a && ✋foo()
The expectations you've developed from dynamic languages like PHP and JS (and Ruby, Python for that matter) are almost universally inapplicable to static languages like Swift.
Swift is a statically compiled language. If you reference a variable that doesn't exist, it's not legal Swift code, and the compiler will fail your build. Given that, the question of "how do I check if a variable is undefined?" is completely moot in Swift. If you have a successfully compiling program that references a variable a, then a exists. There's absolutely no reason for a check, and so a mechanism for it doesn't even exist.
Static vs Dynamic typing
Static type systems are like mathematical proof systems. They produce rigorous proofs that certain aspects of your program are valid. This has trade-offs. The rigidity buys you many guarantees. For example, you'll never have a valid Swift program where you accidentally pass an Int where a Bool is expected. The static type system makes that class of error literally impossible, so it's not something you have to remember to check for yourself.
On the other hand, many truths are easier to intuit than to prove. Thus, there's great utility in scripting and dynamic languages, because they don't demand the rigorous proofs of your claims that static languages require. On the down side, their type systems "do" much less. For example, JS happily lets you reference an undefined variable. To remedy this, JS provides a way for you to do a run-time check to see whether a variable is defined or not. But this isn't a problem Swift has, so the "solution" is absent.
When static typing is too hard
Swift actually takes a middle-ground position. If you find yourself with a statement that's obviously true, but hard to prove to the compiler, various "escape hatches" exist that allow you to leave the safety of the type system and go into dynamic land. For example, if you look at an IBOutlet and see that it's connected to an element in a storyboard, you can intuitively be sure that the IBOutlet is not nil. But that's not something you can prove to the compiler, and hence why you see implicitly unwrapped optionals being used for IBOutlets.
Implicitly unwrapped optionals are one such "escape hatch". The Any type is another, as are unsafeBitCast(_:to:), withoutActuallyEscaping(_:do:), as!, try!, etc.
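For example, a typical outlet declaration looks like this (UIKit types are used purely for illustration); the trailing ! is the escape hatch:

import UIKit

class ProfileViewController: UIViewController {
    // Connected in the storyboard: we assert "this is set before first use"
    // with an implicitly unwrapped optional instead of proving it to the compiler.
    @IBOutlet var nameLabel: UILabel!
}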
Swift takes type safety very seriously. Unlike C or JS, we cannot use anything that doesn't resolve to a Bool in an if statement in Swift, so there is no shorthand for that (at least none that I know of). Regarding the code below:
if( a == nil ) // will throw an error when it's not an optional type
Swift doesn't allow you to assign nil to non-optional types, so there is no need to check for nil. By the way, both Obj-C and Swift use verbose syntax; we need to get used to that.
In this case you are trying to force Swift to work in a way that you are used to with other languages like JavaScript or PHP, as you say in your comment. There are a few reasons why your code won't compile, but it mainly falls on the fact that Swift doesn't do the same truthy and falsy stuff JS does.
var a = 1
if a {
    print("won't compile")
}
// error: 'Int' is not convertible to 'Bool'
In Swift it's better to use an actual Bool value if that's what it's supposed to be, or if it's truly supposed to be an Int you're just going to have to check the value:
var a = true
if a {
    print("this compiles")
}
or
var a = 1
if a > 0 {
    print("this compiles too")
}
Swift really isn't meant to be as loose as JS, so you should just embrace that and take advantage of the safety and readability.
Here is one way most similar to what you designed.
You may have to set the type of a to Int?:
var a: Int? = 1
func foo ()-> Void {}
a.map{_ in foo()}

What is DarwinBoolean type in Swift

I have written Boolean instead of Bool in some Swift code, and Xcode offered to replace it with DarwinBoolean.
The question is, what exactly is DarwinBoolean?
What are the differences compared to the Bool and ObjCBool types?
What is its purpose?
Short answer:
Bool is the native Swift type for truth values.
DarwinBoolean is the Swift mapping of the "historic" C type Boolean.
ObjCBool is the Swift mapping of the Objective-C type BOOL.
You would use Bool in your Swift code unless one of the other types
is required for interoperability with existing Core Foundation or
Objective-C functions.
More about DarwinBoolean:
DarwinBoolean is defined in Swift as
/// The `Boolean` type declared in MacTypes.h and used throughout Core
/// Foundation.
///
/// The C type is a typedef for `unsigned char`.
public struct DarwinBoolean : BooleanType, BooleanLiteralConvertible {
    public init(_ value: Bool)

    /// The value of `self`, expressed as a `Bool`.
    public var boolValue: Bool { get }

    /// Create an instance initialized to `value`.
    public init(booleanLiteral value: Bool)
}
and is the Swift mapping of the "historic" C type Boolean from
MacTypes.h:
/********************************************************************************
Boolean types and values
Boolean Mac OS historic type, sizeof(Boolean)==1
bool Defined in stdbool.h, ISO C/C++ standard type
false Now defined in stdbool.h
true Now defined in stdbool.h
*********************************************************************************/
typedef unsigned char Boolean;
See also the Xcode 7 Release Notes:
The type Boolean in MacTypes.h is imported as Bool in contexts that
allow bridging between Swift and Objective-C types.
In cases where the representation is significant, Boolean is imported
as a distinct DarwinBoolean type, which is BooleanLiteralConvertible
and can be used in conditions (much like the ObjCBool type).
(19013551)
As an example, the functions
void myFunc1(Boolean b);
void myFunc2(Boolean *b);
are imported to Swift as
public func myFunc1(b: Bool)
public func myFunc2(b: UnsafeMutablePointer<DarwinBoolean>)
In myFunc1 there is an automatic conversion between the native
Swift type Bool and the Mac Type Boolean.
This is not possible in myFunc2 because the address of a variable
is passed around, here DarwinBoolean is exactly the Mac Type Boolean.
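A small usage sketch (myFunc1/myFunc2 are the hypothetical imported functions from above):

myFunc1(true)                     // Bool is converted to Boolean automatically

var flag = DarwinBoolean(false)
myFunc2(&flag)                    // the callee writes into a real MacTypes Boolean
if flag.boolValue {
    print("flag was set by the C function")
}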
In previous versions of Swift – if I remember correctly – this mapped
type was called Boolean, and it was later renamed to DarwinBoolean.
More about ObjCBool:
ObjCBool is the Swift mapping of the Objective-C type BOOL,
this can be signed char or the C/C++ bool type, depending on the
architecture. For example, the NSFileManager method
- (BOOL)fileExistsAtPath:(NSString *)path
isDirectory:(BOOL *)isDirectory
is imported to Swift as
func fileExistsAtPath(_ path: String,
isDirectory isDirectory: UnsafeMutablePointer<ObjCBool>) -> Bool
Here the BOOL return value is converted to Bool automatically,
but the (BOOL *) is kept as UnsafeMutablePointer<ObjCBool>
because it is the address of a variable.
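A usage sketch (shown with the current Swift spelling, fileExists(atPath:isDirectory:)); the pointee is read back through boolValue:

import Foundation

var isDir: ObjCBool = false
if FileManager.default.fileExists(atPath: "/tmp", isDirectory: &isDir), isDir.boolValue {
    print("/tmp exists and is a directory")
}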
Unfortunately, there is no real documentation of DarwinBoolean beyond what was posted in the previous answer.
Generally, the Darwin module, which can be imported separately or accessed through the Foundation package, is used when you want to use mathematical functions.
The value of a DarwinBoolean is the same as a normal Bool: true or false.

Is there a correct way to determine that an NSNumber is derived from a Bool using Swift?

An NSNumber containing a Bool is easily confused with other types that can be wrapped in the NSNumber class:
NSNumber(bool:true).boolValue // true
NSNumber(integer: 1).boolValue // true
NSNumber(integer: 1) as? Bool // true
NSNumber(bool:true) as? Int // 1
NSNumber(bool:true).isEqualToNumber(1) // true
NSNumber(integer: 1).isEqualToNumber(true) // true
However, information about its original type is retained, as we can see here:
NSNumber(bool:true).objCType.memory == 99 // true
NSNumber(bool:true).dynamicType.className() == "__NSCFBoolean" // true
NSNumber(bool:true).isEqualToValue(true) || NSNumber(bool:true).isEqualToValue(false) //true
The question is: which of these approaches is the best (and/or safest) approach to determining when a Bool has been wrapped within an NSNumber rather than something else? Are all equally valid? Or, is there another, better solution?
You can ask the same question for Objective-C, and here is an answer in Objective-C - which you can call from, or translate into, Swift.
NSNumber is toll-free bridged to CFNumberRef, which is another way of saying an NSNumber object is in fact a CFNumber one (and vice-versa). Now CFNumberRef has a specific type for booleans, CFBooleanRef, and this is used when creating a boolean CFNumberRef aka NSNumber *... So all you need to do is check whether your NSNumber * is an instance of CFBooleanRef:
- (BOOL) isBoolNumber:(NSNumber *)num
{
    CFTypeID boolID = CFBooleanGetTypeID(); // the type ID of CFBoolean
    CFTypeID numID = CFGetTypeID((__bridge CFTypeRef)(num)); // the type ID of num
    return numID == boolID;
}
Note: You may notice that NSNumber/CFNumber objects created from booleans are actually pre-defined constant objects; one for YES, one for NO. You may be tempted to rely on this for identification. However, though this currently appears to be true and is shown in Apple's source code, to our knowledge it is not documented, so it should not be relied upon.
HTH
Addendum
Swift code translation (by GoodbyeStackOverflow):
func isBoolNumber(num: NSNumber) -> Bool
{
    let boolID = CFBooleanGetTypeID() // the type ID of CFBoolean
    let numID = CFGetTypeID(num) // the type ID of num
    return numID == boolID
}
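For example, using the same Swift-2-era initializers as the question, the check separates the two cases that boolValue alone cannot:

isBoolNumber(NSNumber(bool: true))   // true
isBoolNumber(NSNumber(integer: 1))   // false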
The first one is the correct one.
NSNumber is an Objective-C class. It is built for Objective-C, and it stores the type using the type encodings of Objective-C. So in Objective-C the best solution would be:
number.objCType[0] == @encode(BOOL)[0] // or a string compare, which is not necessary here
This ensures that the check still works after a re-compile, even if the type encoding changes.
AFAIK you do not have @encode() in Swift, so you have to use a literal. However, this will not break, because @encode() is resolved at compile time, and changing the encodings would break already compiled code. Unlikely.
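A sketch of that literal-based check in current Swift spelling (note it reports true for anything encoded as signed char, whether it started life as a BOOL or as a char; see the caveat below):

import Foundation

func hasCharEncoding(_ num: NSNumber) -> Bool {
    // "c" is the Objective-C type encoding for signed char (and classic BOOL)
    return String(cString: num.objCType) == "c"
}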
The second approach uses an internal identifier, which is likely subject to change.
I think the third approach will have false positives.
Don't rely on the class name as it likely belongs to a class cluster, and it is an implementation detail (and therefore subject to change).
Unfortunately, the Objective-C BOOL type was originally just a typedef for a signed char in C, which is always encoded as c (this is the 99 value you are seeing, since c in ASCII is 99).
In modern Objective-C, I believe the BOOL type is an actual Boolean type (i.e. no longer just a typedef for signed char), but for compatibility it still encodes as c when given to @encode().
So, there's no way to tell whether the 99 originally referred to a signed char or a BOOL, as far as NSNumber is concerned they are the same.
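To illustrate the point (current Swift spelling):

String(cString: NSNumber(value: true).objCType)     // "c" (created from a Bool)
String(cString: NSNumber(value: Int8(1)).objCType)  // "c" (created from a signed char)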
Maybe if you explain why you need to know whether the NSNumber was originally a BOOL, there may be a better solution.