How does memcached C source code use bool?

The memcached source code is written in C but uses the bool type and boolean literals (https://github.com/memcached/memcached/blob/master/memcached.c#L155):
static volatile bool allow_new_conns = true;
How is boolean supported in C?
CLion complains about the use of bool. Is there any recommendation on how to configure CLion for memcached?
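For reference, here is a minimal standalone sketch of the usual way C code gets bool, via the C99 header <stdbool.h> (memcached's own headers presumably pull it in the same way); only the variable name is reused from the memcached line above, the rest is illustrative:
#include <stdbool.h>   /* C99: defines bool (an alias for _Bool), plus true and false */
#include <stdio.h>

static volatile bool allow_new_conns = true;   /* same pattern as the memcached line quoted above */

int main(void) {
    if (allow_new_conns) {
        puts("accepting new connections");
    }
    allow_new_conns = false;
    return 0;
}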

Related

Access control in Swift 4

While upgrading from Swift 3 to Swift 4, I ran into some issues related to access control.
Here is the sample code, which compiled fine in Swift 3:
open class MyClass {
    private let value: Int
    static var defaultValue: Int { return 10 }
    public init(value: Int = MyClass.defaultValue) {
        self.value = value
    }
}
To make the code compile in Swift 4, I had to change the access control of defaultValue to public.
Here is the version that compiles in Swift 4:
open class MyClass {
    private let value: Int
    static public var defaultValue: Int { return 10 }
    public init(value: Int = MyClass.defaultValue) {
        self.value = value
    }
}
While I was wondering what was going on, I tried removing the open access modifier from MyClass; this allowed me to remove the access modifier from defaultValue entirely, and I could even make it private.
class MyClass {
    private let value: Int
    private static var defaultValue: Int { return 10 }
    public init(value: Int = MyClass.defaultValue) {
        self.value = value
    }
}
I understand the access modifiers themselves, but I am not able to understand this behaviour, especially the first case, where Xcode forced me to change the access control of defaultValue to public.
Please help.
My original answer (shown below) is now mostly outdated – the beginnings of the resilience model are to be implemented in Swift 4.2 with the introduction of the @inlinable and @usableFromInline attributes, corresponding to the old @_inlineable and @_versioned attributes.
In addition, and more importantly, the rule for what default arguments of publicly accessible functions can reference has changed again. To recap the previous rules:
In Swift 3 there was no enforcement of what access level such default argument expressions could reference (allowing your first example where defaultValue is internal).
In Swift 4, such a default argument could only refer to declarations exposed as a part of the module's interface, including those that aren't otherwise directly visible to users in another module (i.e. @_versioned internal).
However, in Swift 4.2, with the implementation of SE-0193, the rule is now that the default argument expression of a publicly accessible function can only refer to publicly accessible declarations (not even @inlinable internal or @usableFromInline internal).
I believe this is paving the way for the displaying of default argument expressions in a module's generated interface file. Currently Swift just shows an unhelpful = default, but I believe this will change to actually show the default argument. This can only realistically happen with this new access-control restriction in place (Edit: This is now happening).
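As an illustration, here's my own sketch (reusing the question's names, not code from the proposal) of how that plays out under the Swift 4.2 rule, where the default argument is accepted only because defaultValue is itself public:
open class MyClass {
    private let value: Int

    // Swift 4.2: this must be public to be referenced from a public
    // function's default argument; even @usableFromInline internal is rejected.
    public static var defaultValue: Int { return 10 }

    public init(value: Int = MyClass.defaultValue) {
        self.value = value
    }
}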
Old answer (Swift 4)
This change is due to the work towards a resilience model that is already available via underscored attributes (@_inlineable, @_versioned, @_fixed_layout), but is yet to be officially finalised (so you probably shouldn't be using these attributes yourself yet). You can read about the full proposed details of the resilience model here, as well as the Swift evolution discussion on it here.
In short, an inlineable function is one whose implementation, as well as declaration, is exposed as a part of a module's interface and can therefore be inlined when called from another module. An inlineable function must therefore also be publicly accessible to begin with (i.e. public or higher).
What you're running into is a change that makes default argument expressions for publicly accessible functions inlineable, meaning that they must be available to be evaluated directly in the calling module's binary. This reduces the overhead of calling a function with default parameter values from another module, as the compiler no longer needs to do a function call for each default argument; it already knows the implementation.
I don't believe this change is officially documented in the release of Swift 4 itself, but it is confirmed by Swift compiler engineer Slava Pestov, who says:
Swift 3.1 added resilience diagnostics for inlineable code, which is not an officially supported feature, but in Swift 4 we switched these checks on for default argument expressions as well.
So if you have a publicly accessible function with a default argument expression (such as MyClass.defaultValue in your case), that expression can now only refer to things that are also a part of that module's interface. So you need to make defaultValue publicly accessible.
Unfortunately, there's currently no way to make a private function's declaration part of a module's interface (which would allow for your usage of it in a default argument expression). The attribute that would facilitate this is @_versioned, but it is forbidden with (file)private for the following reasons given by Slava Pestov:
It would be a trivial change to allow @_versioned on private and fileprivate declarations, but there are two pitfalls to keep in mind:
Private symbols are mangled with a ‘discriminator’ which is basically a hash of the file name. So now it would be part of the ABI, which seems fragile; you can’t move the private function to another source file, or rename the source file.
Similarly, right now a @_versioned function becoming public is an ABI-compatible change. This would no longer work if you could have private @_versioned functions, because the symbol name would change if it became public.
For these reasons we decided against “private versioned” as a concept. I feel like internal is enough here.
You could achieve this with a @_versioned var defaultValue though:
open class MyClass {
    private let value: Int

    @_versioned static var defaultValue: Int {
        return 10
    }

    public init(value: Int = MyClass.defaultValue) {
        self.value = value
    }
}
The declaration of MyClass.defaultValue is now exported as a part of the module's interface, but still cannot be directly called from another module's code (as it's internal). However, the compiler of that module can now call it when evaluating the default argument expression. But, as said earlier, you probably shouldn't be using an underscored attribute here; you should wait until the resilience model has been finalised.
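To make that concrete, here's a quick sketch of my own showing what client code sees (the module name MyModule is made up for illustration):
// In another module that imports the framework:
import MyModule

let obj = MyClass()              // OK: the default argument expression is evaluated here by the client's compiler
// let n = MyClass.defaultValue  // error: 'defaultValue' is inaccessible due to 'internal' protection level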

Swift - Bool vs Boolean

I am teaching myself to program using Swift 3, and I am currently learning about booleans. I noticed that if I want to explicitly declare my variable of type bool, I have two options
Bool
or
Boolean
I was wondering why we have these two options if they are the same? Well, are they the same? This is what I'm confused about.
Thanks in advance.
Bool is Swift's native boolean data type, and it's the one you should use. The Boolean you're seeing is not part of the Swift standard library; it's a legacy Mac type (an alias for UInt8) that only shows up because it's imported from Apple's frameworks.
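A minimal sketch of ordinary Bool usage (the names are just examples, not from your code):
let isFinished: Bool = false     // explicit type annotation
let isLoggedIn = true            // type inferred as Bool

if isLoggedIn && !isFinished {
    print("still working")
}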

How to make an alias for an existing type in Apple Swift?

In C++, I do using A = B; to make an alias.
How would I do this in Swift?
From Apple's manual:
typealias AudioSample = UInt16
Exactly the same as the C++ version.
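For example, a short usage sketch of my own building on Apple's AudioSample alias:
typealias AudioSample = UInt16

let silence: AudioSample = 0
let maxAmplitude = AudioSample.max   // identical to UInt16.max, i.e. 65535
print(silence, maxAmplitude)         // prints "0 65535"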

What are "standard C++ types" and "C++/CX constructs"?

Bear with me if this is a dumb question, as I've recently started learning C++/CX. I was going through the MSDN documentation on value classes and ref classes and I came across these excerpts:
Because all members of a value class or value struct are public and are emitted into metadata, standard C++ types are not allowed.
and
[A ref class] may contain as members C++/CX constructs or scalar types such as enum class, ref class, float64, and so on. It may also contain standard C++ types. C++/CX constructs may have public, protected, internal, private, or protected private accessibility. Public or protected members are emitted to metadata. Standard C++ types must have private, internal, or protected private accessibility, which prevents them from being emitted to metadata.
My question is: what are the definitions of "C++/CX constructs" and "standard C++ types"?
If my guess is correct, C++/CX constructs include ref classes and structs and enum classes and structs, and standard C++ types include int, bool, float, double, etc. Is that right?
When the documentation says "C++/CX constructs," it means Windows Runtime types. When programming using C++/CX, there are two categories of types:
C++ Types: The set of C++ types includes all of the types that you can use in ordinary C++ code: fundamental types (like int or double), enumerations, pointers, references, class types, etc.
Windows Runtime Types: These are types that can be used across the Windows Runtime ABI boundary. These include reference types (ref class), Windows Runtime value types (value class, numeric types, Windows Runtime enumerations, etc.) and delegates.
Note that there is a bit of overlap between these categories: numeric types are in both.
You can use C++ types anywhere in your code except in the public surface of any public components that you write. Only Windows Runtime types can be used across the Windows Runtime ABI boundary. For example:
public ref class C sealed
{
public:
    // Ok: int is a fundamental WinRT type
    void F(int x) { }

    // Not ok: std::string is not a WinRT type
    void G(std::string s) { }

private:
    // Ok: _s is private; private members are implementation details, so you
    // may use ordinary C++ types for private members.
    std::string _s;
};
These two categories of types are not unique to building Windows Runtime components in C++: if you build a component in .NET, you can use .NET-specific types (e.g., concrete generic types) and .NET-specific constructs (e.g. generic methods), which are not valid Windows Runtime types.

Using C++ struct with iPhone app

Since I've heard it is possible to use C++ code within an iPhone (Objective-C) project, I want to use an encryption library that is written in C++. However, the library uses a C++ struct with a constructor, and I can't get that part right.
The struct looks like this:
struct SBlock
{
    // Constructors
    SBlock(unsigned int l=0, unsigned int r=0) : m_uil(l), m_uir(r) {}
    // Copy constructor
    SBlock(const SBlock& roBlock) : m_uil(roBlock.m_uil), m_uir(roBlock.m_uir) {}

    SBlock& operator^=(SBlock& b) { m_uil ^= b.m_uil; m_uir ^= b.m_uir; return *this; }

    unsigned int m_uil, m_uir;
};
The full source is available here: http://www.codeproject.com/KB/security/blowfish.aspx
What's the easiest way to get around this issue?
I've read the article about using C++ code on Apple's developer site, but that didn't help much.
It's definitely possible, and the trick is extremely simple: when you want to use C++ code from your Objective-C code, name the files .mm instead of .m.
So if you have YourViewController.h and YourViewController.m, rename the latter to YourViewController.mm. This makes Xcode compile the file as Objective-C++ (with the C++ compiler) instead of plain Objective-C.
YourViewController.mm:
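// In a .mm file you can include the library's C++ header directly.
// ("Blowfish.h" is a placeholder for whichever header actually declares SBlock.)
#include "Blowfish.h"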
- (void)yourMessage {
    // will compile just fine and call the appropriate C++ constructor
    SBlock sb(1, 1);
}
Just change the filename extension of your .m file to .mm and include the C++ headers. Wow, I type too slow, lol.