How to make an alias for an existing type in Apple Swift? - swift

In C++, I do using A = B; to make an alias.
How would I do this in Swift?

From Apple's manual:
typealias AudioSample = UInt16
Exactly the same as the C++ construct.
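A minimal usage sketch — the alias and the original type are fully interchangeable, since a typealias introduces a new name, not a new type:

```swift
// A type alias in Swift, analogous to C++ `using AudioSample = UInt16;`.
typealias AudioSample = UInt16

let sample: AudioSample = 32_000
let raw: UInt16 = sample   // no conversion needed: AudioSample *is* UInt16
```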

Related

What is the equivalent in Swift of offsetof(struct, member) in C?

Here's a struct in Swift:
struct A {
    var x
    var y
    var z
}
What should I do to get the offset of y in the struct A, just like offsetof(A, y) in C?
Thanks :)
MemoryLayout.offset(of:)
was added in Swift 4.2, with the implementation of
SE-0210 Add an offset(of:) method to MemoryLayout
Example:
struct A {
    var x: Int8
    var y: Int16
    var z: Int64
}
MemoryLayout.offset(of: \A.x) // 0
MemoryLayout.offset(of: \A.y) // 2
MemoryLayout.offset(of: \A.z) // 8
Remark: This should work as well, but (as of Swift 4.2) does not compile (bug SR-8335):
MemoryLayout<A>.offset(of: \.y)
// error: Expression type 'Int?' is ambiguous without more context
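Note also that offset(of:) returns an optional Int?: it yields nil for key paths that don't refer to a directly addressable stored property, such as computed properties. A small sketch (the struct B here is invented for illustration, not from the question):

```swift
struct B {
    var stored: Int = 0
    var computed: Int { return stored + 1 }
}

MemoryLayout<B>.offset(of: \B.stored)   // Optional(0)
MemoryLayout<B>.offset(of: \B.computed) // nil: not a stored property
```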
Swift doesn't currently define a standard memory layout for structs defined in Swift. That means any code you write to decompose a struct via pointer math is not guaranteed to work on future releases of, or even minor updates to, the Swift compiler. (As @robmayoff notes, this is part of Swift getting a stable ABI, one of the Swift project's goals for version 5 this year.)
If your struct is defined in C and imported via a bridging header, though, Swift respects the C binary layout of its memory. In such cases, the MemoryLayout type provides equivalents to the C sizeof operator and similar utilities. There's still no offsetof equivalent, but you can do the pointer math yourself by adding the strides of the member types in C declaration order.
For example, the Apple docs for SCNGeometrySource have an example of creating a 3D object from a buffer of interleaved vertex/normal/texture-coordinate data. But it's (currently) written in ObjC, using sizeof and strideof to tell SceneKit where to find each kind of data in the buffer. The Swift equivalent would use MemoryLayout<Float>.size in place of sizeof(float), etc, and in place of offsetof(MyVertex, nx) you'd need to observe that there are three floats in the struct before the nx field and write MemoryLayout<Float>.stride * 3.
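The stride arithmetic described above can be sketched like this. The interleaved layout (3 position floats, 3 normal floats, 2 texture-coordinate floats) is assumed from the SceneKit example, not something defined in Swift:

```swift
// Assumed interleaved vertex layout: 3 position floats, 3 normal floats, 2 texcoord floats.
let floatStride = MemoryLayout<Float>.stride

let positionOffset = 0
let normalOffset   = floatStride * 3   // three floats (x, y, z) precede nx
let texcoordOffset = floatStride * 6   // six floats precede the texture coordinates
let vertexStride   = floatStride * 8   // total bytes per interleaved vertex
```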
You can't get the offset of y in a reliable way, because Swift does not yet support that operation. Eventually, there will be support after “ABI stability” is implemented.
The current recommended approach is to define the struct in C, and also define C functions to return the offsets of the fields.

Are Int, String etc. considered to be 'primitives' in Swift?

Types representing numbers, characters, and strings are implemented using structures in Swift.
An excerpt from the official documentation:
Data types that are normally considered basic or primitive in other
languages—such as types that represent numbers, characters, and
strings—are actually named types, defined and implemented in the Swift
standard library using structures.
Does that mean the following:
Int
Float
String
// etc
... are not considered as primitives?
Yes and no...
As other answers have noted, in Swift there's no difference at the language level between the things one thinks of as "primitives" in other languages and the other struct types in the standard library or the value types you can create yourself. For example, it's not like Java, where there's a big difference between int and Integer and it's not possible to create your own types that behave semantically like the former. In Swift, all types are "non-primitive" or "user-level": the language features that define the syntax and semantics of, say, Int are no different from those defining CGRect or UIScrollView or your own types.
However, there is still a distinction. A CPU has native instructions for tasks like adding integers, multiplying floats, and even taking vector cross products, but not for those like insetting rects or searching lists. One of the things people mean when they call some of a language's types "primitive" is that those are the types for which the compiler provides hooks into the underlying CPU architecture, so that the things you do with those types map directly to basic CPU instructions. (That is, so operations like "add two integers" don't get bogged down in object lookups and function calls.)
Swift still has that distinction — certain standard library types like Int and Float are special in that they map to basic CPU operations. (And in Swift, the compiler doesn't offer any other means to directly access those operations.)
The difference with many other languages is that for Swift, the distinction between "primitive" types and otherwise is an implementation detail of the standard library, not a "feature" of the language.
Just to ramble on this subject some more...
When people talk about strings being a "primitive" type in many languages, that's a different meaning of the word — strings are a level of abstraction further away from the CPU than integers and floats.
Strings being "primitive" in other languages usually means something like it does in C or Java: The compiler has a special case where putting something in quotes results in some data getting built into the program binary, and the place in the code where you wrote that getting a pointer to that data, possibly wrapped in some sort of object interface so you can do useful text processing with it. (That is, a string literal.) Maybe the compiler also has special cases so that you can have handy shortcuts for some of those text processing procedures, like + for concatenation.
In Swift, String is "primitive" in that it's the standard string type used by all text-related functions in the standard library. But there's no compiler magic keeping you from making your own string types that can be created with literals or handled with operators. So again, there's much less difference between "primitives" and user types in Swift.
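As a sketch of that last point, here's a hypothetical user-defined string-like type (`Identifier` is invented for illustration) that can be created from a literal and concatenated with +, just like String:

```swift
// A user-defined type that behaves like a "primitive" string:
// it can be written as a literal and supports + for concatenation.
struct Identifier: ExpressibleByStringLiteral, CustomStringConvertible {
    let value: String
    init(stringLiteral value: String) { self.value = value }
    var description: String { return value }
}

func + (lhs: Identifier, rhs: Identifier) -> Identifier {
    return Identifier(stringLiteral: lhs.value + rhs.value)
}

let id: Identifier = "user"   // built from a string literal, no compiler magic
let full = id + "name"        // full.description == "username"
```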
Swift does not have primitive types.
All programming languages have basic types that are part of the language. In Swift, these types are provided through the Swift standard library and are created using structures. They include the types for numbers, characters, and strings.
No, they're not primitives in the sense other languages define primitives. But they behave just like primitives, in the sense that they're passed by value. This is a consequence of the fact that they're internally implemented as structs. Consider:
import Foundation
struct ByValueType {
    var x: Int = 0
}

class ByReferenceType {
    var x: Int = 0
}

var str: String = "no value"
var byRef: ByReferenceType = ByReferenceType()
var byVal: ByValueType = ByValueType()

// `var` parameters were removed in Swift 3; make a mutable local copy instead.
func foo(_ type: ByValueType) {
    var type = type
    type.x = 10
}

func foo(_ type: ByReferenceType) {
    type.x = 10
}

func foo(_ type: String) {
    var type = type
    type = "foo was here"
}

foo(byRef)
foo(byVal)
foo(str)

print(byRef.x)
print(byVal.x)
print(str)
The output is
10
0
no value
Just like Ruby, Swift does not have primitive types.
Int, for example, is implemented as a struct and conforms to the following protocols:
BitwiseOperationsType
CVarArgType
Comparable
CustomStringConvertible
Equatable
Hashable
MirrorPathType
RandomAccessIndexType
SignedIntegerType
SignedNumberType
You can check the source code of Bool.swift where Bool is implemented.
There are no primitives in Swift.
However, there is a distinction between "value types" and "reference types", which doesn't map cleanly onto either the C++ or the Java usage.
They are partially primitive.
Swift does not expose primitives the way some other languages (such as Java) do. But Int, Int16, Float, and Double are defined using structures, so they behave like value types, not object pointers.
Points in the Swift documentation in favour of the primitive view:
1. They conform to Hashable, with the axiom: x == y implies x.hashValue == y.hashValue, where hashValue is derived from the currently assigned value.
2. Most of their init methods perform type conversion and overflow checking (remember, Swift is type safe), e.g.:
Int.init(_ other: Float)
3. See the code below: an optional Int initialized with Int() has the default value 0, while an optional NSNumber variable prints nil:
var myInteger: Int? = Int.init()
print(myInteger) // Optional(0)
var myNumber: NSNumber? = NSNumber.init()
print(myNumber) // nil
Points in the Swift documentation against the primitive view:
As per the documentation, there are two kinds of types: named types and compound types. There is no concept of primitive types.
Int supports extensions:
extension Int : Hashable {}
All these types are made available through the Swift standard library, not as language keywords.

How can I call NSStringFromProtocol with a Swift protocol?

Assuming some
protocol MyCoolProtocol {
....
}
the following code refuses to compile (as of Swift 2.1):
let protocolName = NSStringFromProtocol(MyCoolProtocol)
because MyCoolProtocol is not of type Protocol. (This, at first glance, seems really weird, but if you dig enough it [unfortunately] makes sense)
How can I get the name of my Swift protocol in a String?
There are two ways:
The most common suggestion I can find is to declare your protocol with @objc. This seems odd when you have no intention of referring to the protocol from Objective-C code.
You can use let protocolName = String(MyCoolProtocol). As of the current version of Swift, this gives exactly what you'd expect ("MyCoolProtocol") and is still checked at compile-time.
This is how I accomplished it:
let protocolString = String("\(MyProtocol.self)")
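In more recent Swift versions, String(describing:) applied to the protocol's metatype gives the same result without any Objective-C involvement (a sketch, not part of the original answers):

```swift
protocol MyCoolProtocol {
    func doSomethingCool()
}

// String(describing:) on the metatype yields the bare type name.
let protocolName = String(describing: MyCoolProtocol.self)
// protocolName == "MyCoolProtocol"
```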

Does Vala have typedefs?

Does Vala have some capability to do something similar to typedef in C or alias in D? I've checked its sporadic documentation, and couldn't find anything related to this.
Not a typedef as such, but you can extend most types, including the primitives:
public struct foo : int8 { }
This will generate a typedef in the generated C, but, strictly speaking, it isn't one in Vala because it isn't a type alias (i.e., int8 and foo are not automatically interconvertible).
This doesn't work for delegates.

Using C++ struct with iPhone app

As I've heard, it is possible to use C++ code within an iPhone (Objective-C) project. I want to use an encryption library written in C++; however, the library uses a C++ struct with constructors that I can't get right.
The Struct looks like this:
struct SBlock
{
    //Constructors
    SBlock(unsigned int l=0, unsigned int r=0) : m_uil(l), m_uir(r) {}
    //Copy Constructor
    SBlock(const SBlock& roBlock) : m_uil(roBlock.m_uil), m_uir(roBlock.m_uir) {}
    SBlock& operator^=(SBlock& b) { m_uil ^= b.m_uil; m_uir ^= b.m_uir; return *this; }

    unsigned int m_uil, m_uir;
};
full source is available here: http://www.codeproject.com/KB/security/blowfish.aspx
What's the easiest way to get around this issue?
I've read the article about using C++ code on Apple's developer site, but that didn't help much.
It's definitely possible, and the trick is extremely simple: when you are going to use C++ code in your Objective-C application, name your files .mm instead of .m.
So if you have YourViewController.h and YourViewController.m, rename the latter to YourViewController.mm. This causes Xcode to compile the file as Objective-C++ (using the C++ compiler) instead of plain Objective-C.
YourViewController.mm:
- (void)yourMessage {
    // will compile just fine and call the appropriate C++ constructor
    SBlock sb(1, 1);
}
Just change the filename extension of your .m file to .mm and include the C++ headers. Wow, I type too slow, lol.