Why is the enum size different in Swift?

I am making an app that communicates between iPhones.
It uses an enum with a UInt32 raw type, but sizeof() reports a different size.
Why is it not equal to UInt32?
enum Message: UInt32 {
    case A = 1
    case B = 2
    case C = 3
}
sizeof(UInt32)  // 4
sizeof(Message) // 1

They must be optimising the storage; the following works:
sizeofValue(Message.A.toRaw()) // 4

When you call toRaw() on an enum case, you get back a value of the enum's raw type (UInt32, in this case), which is why sizeofValue reports its full size.
In reference to this answer, Types in objective-c on iPhone, an unsigned int is 32 bits (4 bytes).
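For anyone checking this in Swift 3 or later, where sizeof was replaced by MemoryLayout, here is a quick sketch of the same measurement (the enum is the one from the question):
enum Message: UInt32 {
    case A = 1
    case B = 2
    case C = 3
}
MemoryLayout<Message>.size                       // 1 -- three cases fit in a single byte
MemoryLayout<UInt32>.size                        // 4
MemoryLayout.size(ofValue: Message.A.rawValue)   // 4 -- the raw value is a real UInt32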

Related

Swift Variable Declaration and Initialization

Is there a difference between how the following bits of code work?
let x: Int = 4
and
let x: Int
x = 4
This one:
let x: Int = 4
creates a non-optional constant x and initialises it to 4. x can be used without issue.
This one:
let x: Int
// Cannot do anything with x yet
x = 4
creates a non-optional constant x with no defined value. It cannot be used without first assigning it a value, either directly (as in your example) or as the result of some other statement. If you do try to use it, you'll get a compile-time error.
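As a small illustration of the second form (the condition here is made up), the compiler only lets you read x once it has been assigned exactly once on every path:
let someCondition = true
let x: Int
if someCondition {
    x = 4
} else {
    x = 8
}
print(x)    // fine: x is definitely initialized on both branches
// x = 10   // error: immutable value 'x' may only be initialized once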
The only difference is that in the first one you declare the constant and assign it at the same time, while in the second one you declare it first and then assign it.
But there is no major difference.

Does Swift guarantee the storage order of fields in classes and structs?

In C, the order in which you define fields in a struct is the order in which they will be instantiated in memory. Taking into account memory alignment, the following struct will have a size of 8 bytes in memory as shown, but only 6 bytes if the fields are reversed as there doesn't need to be any alignment padding.
struct s {
    int16_t a;
    /* 2 bytes of padding to align a 32 bit integer */
    int32_t b;
};
This ordering guarantee is present in C structs, C++ classes (and structs), and Objective-C classes.
Is the order of storage similarly guaranteed for fields in Swift classes and structs? Or (given that the language doesn't support pointers in the same way as the others listed), does the compiler optimally re-arrange them for you at compile-time?
Yes, the order of the struct elements in memory is the order of their declaration. The details can be found in Type Layout (emphasis added). Note however the use of "currently", so this may change in a future version of Swift:
Fragile Struct and Tuple Layout
Structs and tuples currently share the same layout algorithm, noted as the "Universal" layout algorithm in the compiler implementation. The algorithm is as follows:
Start with a size of 0 and an alignment of 1.
Iterate through the fields, in element order for tuples, or in var declaration order for structs. For each field:
Update size by rounding up to the alignment of the field, that is, increasing it to the least value greater than or equal to size and evenly divisible by the alignment of the field.
Assign the offset of the field to the current value of size.
Update size by adding the size of the field.
Update alignment to the max of alignment and the alignment of the field.
The final size and alignment are the size and alignment of the aggregate. The stride of the type is the final size rounded up to alignment.
The padding/alignment is different from C:
Note that this differs from C or LLVM's normal layout rules in that size and stride are distinct; whereas C layout requires that an embedded struct's size be padded out to its alignment and that nothing be laid out there, Swift layout allows an outer struct to lay out fields in the inner struct's tail padding, alignment permitting.
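To make the quoted algorithm concrete, here is a small sketch (not from the answer) that mechanically applies it to a list of (size, alignment) pairs; the pairs below are those of the fields of struct A defined further down (UInt8, UInt32, UInt8):
func universalLayout(_ fields: [(size: Int, alignment: Int)]) -> (size: Int, alignment: Int, stride: Int) {
    var size = 0
    var alignment = 1
    for field in fields {
        // Round size up to the field's alignment, place the field there, then add its size.
        size = (size + field.alignment - 1) / field.alignment * field.alignment
        size += field.size
        alignment = max(alignment, field.alignment)
    }
    // The stride is the final size rounded up to the aggregate's alignment.
    let stride = (size + alignment - 1) / alignment * alignment
    return (size, alignment, stride)
}
universalLayout([(size: 1, alignment: 1), (size: 4, alignment: 4), (size: 1, alignment: 1)])
// (size: 9, alignment: 4, stride: 12) -- matching the values printed below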
Only if a struct is imported from C is it guaranteed to have the same memory layout. Joe Groff from Apple writes at
[swift-users] Mapping C semantics to Swift:
If you depend on a specific layout, you should define the struct in C and import it into Swift for now.
and later in that discussion:
You can leave the struct defined in C and import it into Swift. Swift will respect C's layout.
Example:
struct A {
    var a: UInt8 = 0
    var b: UInt32 = 0
    var c: UInt8 = 0
}
struct B {
    var sa: A
    var d: UInt8 = 0
}
// Swift 2:
print(sizeof(A), strideof(A)) // 9, 12
print(sizeof(B), strideof(B)) // 10, 12
// Swift 3:
print(MemoryLayout<A>.size, MemoryLayout<A>.stride) // 9, 12
print(MemoryLayout<B>.size, MemoryLayout<B>.stride) // 10, 12
Here var d: UInt8 is laid out in the tail padding of var sa: A.
If you define the same structures in C
struct CA {
    uint8_t a;
    uint32_t b;
    uint8_t c;
};
struct CB {
    struct CA ca;
    uint8_t d;
};
and import them into Swift, then
// Swift 2:
print(sizeof(CA), strideof(CA)) // 9, 12
print(sizeof(CB), strideof(CB)) // 13, 16
// Swift 3:
print(MemoryLayout<CA>.size, MemoryLayout<CA>.stride) // 12, 12
print(MemoryLayout<CB>.size, MemoryLayout<CB>.stride) // 16, 16
because uint8_t d is laid out after the tail padding of struct CA ca.
As of Swift 3, both size and stride return the same value
(including the struct padding) for structures imported from C,
i.e. the same value as sizeof in C would return.
Here is a simple function which helps to demonstrate the above (Swift 3):
func showMemory<T>(_ ptr: UnsafePointer<T>) {
    let data = Data(bytes: UnsafeRawPointer(ptr), count: MemoryLayout<T>.size)
    print(data as NSData)
}
The structures defined in Swift:
var a = A(a: 0xaa, b: 0xbbbbbbbb, c: 0xcc)
showMemory(&a) // <aa000000 bbbbbbbb cc>
var b = B(sa: a, d: 0xdd)
showMemory(&b) // <aa000000 bbbbbbbb ccdd>
The structures imported from C:
var ca = CA(a: 0xaa, b: 0xbbbbbbbb, c: 0xcc)
showMemory(&ca) // <aa000000 bbbbbbbb cc000000>
var cb = CB(ca: ca, d: 0xdd)
showMemory(&cb) // <aa000000 bbbbbbbb cc000000 dd000000>
It seems that the order is guaranteed.
Given the following struct:
struct s {
    var a: UInt32
    var c: UInt8
    var b: UInt64
}
sizeof(s) // 16
Alignment is still happening. There are 3 bytes of padding lost there between c and b.
Of course, it's not clear whether those bytes actually sit between c and b or are just tacked on at the end... until we rearrange:
struct s {
    var a: UInt32
    var b: UInt64
    var c: UInt8
}
sizeof(s) // 17
I believe this behavior matches the other languages you mentioned in your question.
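In Swift 4.2 and later you can also inspect the offsets directly with MemoryLayout.offset(of:), which makes the declaration-order layout and the padding visible (offset(of:) returns an Int?); a sketch using the first arrangement above, renamed S:
struct S {
    var a: UInt32
    var c: UInt8
    var b: UInt64
}
MemoryLayout<S>.offset(of: \S.a)   // 0
MemoryLayout<S>.offset(of: \S.c)   // 4
MemoryLayout<S>.offset(of: \S.b)   // 8 -- the 3 padding bytes sit between c and b
MemoryLayout<S>.size               // 16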

How to obtain the exponent value of a number?

Basically I would like to know if it is possible to get the exponent value of a number, e.g.:
number = 2.6e3
I want to get the value 3 of the exponent. I have been searching for quite a while now and have not found the answer to this. I am new to programming so I may not know exactly what to look for (which methods, etc).
Any help is much appreciated! Thanks!
Assuming I am interpreting your question correctly, this is what you want to do:
B = A^X where A and B are known values. Solve for X.
1000 = 10^X (In this case, X = 3.)
The code below will work for any base. It requires either Foundation or UIKit. The function arguments "value" and "base" are B and A, respectively. Try the code out in an Xcode playground!
func getExponentForValueAndBase(value: Double, base: Double) -> Double {
    return log(value) / log(base)
}
getExponentForValueAndBase(1000, base: 10) // = 3
Assuming this is your question: given a number as an integer, find the integer value of the log base 10 of it.
import Foundation
func log(_ number: Int) -> Int {
    return Int(floor(log10(Double(number))))
}

Swift: Compare generic Types in generic class

I'm creating a Matrix class in Swift, which I want to be generic, so I can use it like this:
let matrix: Matrix<Character>; // Or any other type
I created my class like this:
class Matrix<Template>: NSObject {}
I am creating a function that applies gravity to the matrix: it takes an emptyKey of type Template and drags every element not equal to emptyKey to the bottom of the matrix.
// For example emptyKey is "_" and Template is String.
1 _ 2 1 _ _
3 4 5 == To ==> 3 _ 2
6 _ _ 6 4 5
The problem is: when I try to compare the value in the matrix at a specific location (which is of type Template) with the emptyKey (which is also of type Template), it fails to compile and gives me the error:
Binary operator '==' cannot be applied to two 'Template?' operands
I am using Xcode 7.3.1 with Swift 2.2.
You need to constrain Template to Equatable.
class Matrix<Template:Equatable> ...
(Also I would advise that you avoid Optionals. I don't know where you're using them, but your error message suggests that you are, and they are going to get in your way.)
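A minimal sketch of the constrained class, in current Swift syntax; the nested-array storage and the isEmpty helper are made up for illustration, the NSObject superclass is omitted for brevity, and only the Equatable constraint comes from the answer:
class Matrix<Template: Equatable> {
    var grid: [[Template]]

    init(grid: [[Template]]) {
        self.grid = grid
    }

    func isEmpty(row: Int, column: Int, emptyKey: Template) -> Bool {
        // '==' compiles here because Template is constrained to Equatable.
        return grid[row][column] == emptyKey
    }
}

let matrix = Matrix(grid: [["1", "_", "2"],
                           ["3", "4", "5"],
                           ["6", "_", "_"]])
matrix.isEmpty(row: 0, column: 1, emptyKey: "_")   // true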

Storing iOS UI Component internal Usage Flags

In most of Apple's implementations, I can see that they use a struct to store flag bits. Why do they do it like that? Why can't we use BOOL to handle this instead?
See Apple's sample code below, from a table view:
struct {
    unsigned int delegateheightForHeaderInSection:1;
    unsigned int dataSourceCellForRow:1;
    unsigned int delegateHeightForRow:1;
    unsigned int style:1;
} _tableFlags;
and internally they may be using them in something similar to this:
_tableFlags.delegateheightForHeaderInSection = [delegate respondsToSelector:@selector(tableView:heightForHeaderInSection:)];
and then use _tableFlags.delegateheightForHeaderInSection everywhere to check whether the user has implemented this delegate method.
So instead of having the struct to store the flags, can't we implement it like below?
BOOL delegateheightForHeaderInSection;
and use it like this:
delegateheightForHeaderInSection = [delegate respondsToSelector:@selector(tableView:heightForHeaderInSection:)];
What difference do these two approaches have?
unsigned int delegateheightForHeaderInSection:1;
defines a bit field of length 1 in the structure (see e.g. http://en.wikipedia.org/wiki/Bit_field).
Bit fields can be used to save space. So in this case, the four members delegateheightForHeaderInSection, ..., style are stored in contiguous bits of one integer.
Note that no space is saved in this particular case. The size of _tableFlags is the size of unsigned int, which is 4. The size of four BOOL (aka signed char) members is also 4.
But for example, 32 bit fields of length 1 also take 4 bytes, whereas 32 BOOL members would take 32 bytes.
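Swift has no C-style bit fields, but the same packing idea can be sketched with masks; the names and layout below are hypothetical, purely to show the space saving:
struct TableFlags {
    private var bits: UInt8 = 0   // all four flags share this single byte

    private func get(_ bit: UInt8) -> Bool {
        return bits & (1 << bit) != 0
    }
    private mutating func set(_ bit: UInt8, _ on: Bool) {
        if on { bits |= 1 << bit } else { bits &= ~(1 << bit) }
    }

    var delegateHeightForHeaderInSection: Bool {
        get { return get(0) }
        set { set(0, newValue) }
    }
    var dataSourceCellForRow: Bool {
        get { return get(1) }
        set { set(1, newValue) }
    }
}

MemoryLayout<TableFlags>.size   // 1 byte, versus 1 byte per stored Bool property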