How to interpret documentState == 12 (iCloud) - iPhone

Whenever I load a UIDocument from iCloud, I check its state like so:
NSLog(@"Library loadFromContents: state = %d", (int)self.documentState);
In some cases I have received documentState 8 or 12, which has caused crashes. I am now wondering what exactly the 8 and 12 stand for. As far as I'm aware, documentState is a bit field, so it can carry several flags at once. The docs reveal that:
enum {
    UIDocumentStateNormal          = 0,
    UIDocumentStateClosed          = 1 << 0,
    UIDocumentStateInConflict      = 1 << 1,
    UIDocumentStateSavingError     = 1 << 2,
    UIDocumentStateEditingDisabled = 1 << 3
};
typedef NSInteger UIDocumentState;
However, I have no idea how to interpret this in my situation. How do I find out what 8 and 12 stand for?

Inside the enum they do some bit-shifting. They could have also written it like this:
enum {
    UIDocumentStateNormal          = 0,
    UIDocumentStateClosed          = 1,
    UIDocumentStateInConflict      = 2,
    UIDocumentStateSavingError     = 4,
    UIDocumentStateEditingDisabled = 8
};
typedef NSInteger UIDocumentState;
Shifting 1 to the left by n bits gives 2 to the power of n: 1 << 1 means 2^1 = 2, 1 << 2 means 2^2 = 4, and so on.
A state of 8 means UIDocumentStateEditingDisabled, and 12 (8 + 4) means UIDocumentStateEditingDisabled combined with UIDocumentStateSavingError.

The suggested way to deal with these states is not to check with if (state == UIDocumentStateInConflict) but to test the flag with a bitwise AND, like this:
if (state & UIDocumentStateInConflict) {
    // do something...
}
see "An Example: Letting the User Pick the Version" in "Document-based app programming guide" from the official docs

Related

EditorGUILayout.MaskField issue with large enums

I'm working on an input system that would allow the user to translate input mappings between different input devices and operating systems and potentially define their own.
I'm trying to create a MaskField for an editor window where the user can select from a list of RuntimePlatforms, but selecting individual values results in multiple values being selected.
Mainly for debugging I set it up to generate an equivalent enum RuntimePlatformFlags that it uses instead of RuntimePlatform:
[System.Flags]
public enum RuntimePlatformFlags: long
{
OSXEditor=(0<<0),
OSXPlayer=(0<<1),
WindowsPlayer=(0<<2),
OSXWebPlayer=(0<<3),
OSXDashboardPlayer=(0<<4),
WindowsWebPlayer=(0<<5),
WindowsEditor=(0<<6),
IPhonePlayer=(0<<7),
PS3=(0<<8),
XBOX360=(0<<9),
Android=(0<<10),
NaCl=(0<<11),
LinuxPlayer=(0<<12),
FlashPlayer=(0<<13),
LinuxEditor=(0<<14),
WebGLPlayer=(0<<15),
WSAPlayerX86=(0<<16),
MetroPlayerX86=(0<<17),
MetroPlayerX64=(0<<18),
WSAPlayerX64=(0<<19),
MetroPlayerARM=(0<<20),
WSAPlayerARM=(0<<21),
WP8Player=(0<<22),
BB10Player=(0<<23),
BlackBerryPlayer=(0<<24),
TizenPlayer=(0<<25),
PSP2=(0<<26),
PS4=(0<<27),
PSM=(0<<28),
XboxOne=(0<<29),
SamsungTVPlayer=(0<<30),
WiiU=(0<<31),
tvOS=(0<<32),
Switch=(0<<33),
Lumin=(0<<34),
BJM=(0<<35),
}
In this linked screenshot, only the first 4 options were selected. The integer next to "Platforms: " is the mask itself.
I'm not a bitwise wizard by a large margin, but my assumption is that this occurs because EditorGUILayout.MaskField returns a 32-bit int value, and there are over 32 enum options. Are there any workarounds for this, or is something else causing the issue?
The first thing I've noticed is that all the values inside that enum are the same (zero), because you are shifting 0 to the left; 0 shifted by any number of bits is still 0. You can observe this by logging your values with this script:
// Shifts 0 to the left: 0 << i is always 0, so this prints "0" 36 times.
for (int i = 0; i < 36; i++) {
    Debug.Log(System.Convert.ToString((0 << i), 2));
}
// Shifts 1 to the left. Note that with a 32-bit int the shift count wraps,
// so 1 << 32 prints 1 again rather than 2^32.
for (int i = 0; i < 36; i++) {
    Debug.Log(System.Convert.ToString((1 << i), 2));
}
The reason giving the enum an underlying type of long does not help on its own is the bit shifting itself. Check out this example I found about the issue:
UInt32 x = ....;
UInt32 y = ....;
UInt64 result = (x << 32) + y;
The programmer intended to form a 64-bit value from two 32-bit ones by shifting 'x' by 32 bits and adding the most significant and the least significant parts. However, as 'x' is a 32-bit value at the moment when the shift operation is performed, shifting by 32 bits will be equivalent to shifting by 0 bits, which will lead to an incorrect result.
So you should also make the value being shifted a long, either with a literal suffix or a cast, like this:
public enum RuntimePlatformFlags : long {
OSXEditor = (1 << 0),
OSXPlayer = (1 << 1),
WindowsPlayer = (1 << 2),
OSXWebPlayer = (1 << 3),
// With literals.
tvOS = (1L << 32),
Switch = (1L << 33),
// Or with casts.
Lumin = ((long)1 << 34),
BJM = ((long)1 << 35),
}

Decomposing a number into powers of 2

Hi, I need to decompose a number into powers of 2 in Swift 5 for an iOS app I'm writing for a click-and-collect system.
The backend of this system is written in C# and uses the following to save a multi-pick list of options as a single number in the database, e.g.
choosing salads for a filled roll on an ordering system works thus:
lettuce = 1
cucumber = 2
tomato = 4
sweetcorn = 8
onion = 16
By using this method, the choice made (lettuce + tomato + onion) is saved into the database as 21 (1 + 4 + 16).
At the other end I use a C# loop to decode it, like this:
for (int j = 0; j < 32; j++)
{
    int mask = 1 << j;
    // e.g. if ((value & mask) != 0) then the option with value 'mask' was chosen
}
I need to convert this function into Swift 5 to integrate the decoder into my iOS app.
Any help would be greatly appreciated.
In Swift, these bit fields are expressed as option sets, which are types that conform to the OptionSet protocol. Here is an example for your use case:
struct Veggies: OptionSet {
let rawValue: UInt32
static let lettuce = Veggies(rawValue: 1 << 0)
static let cucumber = Veggies(rawValue: 1 << 1)
static let tomato = Veggies(rawValue: 1 << 2)
static let sweetcorn = Veggies(rawValue: 1 << 3)
static let onion = Veggies(rawValue: 1 << 4)
}
let someVeggies: Veggies = [.lettuce, .tomato]
print(someVeggies) // => Veggies(rawValue: 5)
print(Veggies.onion.rawValue) // => 16
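To decode the integer the C# backend stored (e.g. 21), construct the option set from its raw value and test membership. This sketch assumes the backend uses the same bit assignments as the struct above:
let stored: UInt32 = 21                 // 1 + 4 + 16 from the backend
let chosen = Veggies(rawValue: stored)

print(chosen.contains(.lettuce))        // true  (bit 0)
print(chosen.contains(.cucumber))       // false (bit 1)
print(chosen.contains(.tomato))         // true  (bit 2)
print(chosen.contains(.onion))          // true  (bit 4)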
OptionSets are better than just using their raw values, for two reasons:
1) They standardize the names of the cases, and give a consistent and easy way to interact with these values
2) OptionSet derives from the SetAlgebra protocol, and provides default implementations for many useful methods like union, intersection, subtract, contains, etc.
I would caution against this design, however. Option sets are useful only when there's a really small number of flags (less than 64) that you can't foresee expanding. They're really basic, can't store any payload besides "x exists, or it doesn't", and they're primarily intended for use cases that are very sensitive to performance and memory use, which are quite rare these days. I would recommend using regular objects (a Veggie class storing a name and any other relevant data) instead.
You can just use a while loop, like this:
var j = 0
while j < 32 {
    let mask = 1 << j   // test the stored value against mask here, e.g. (value & mask) != 0
    j += 1
}
Here is a link about loops and control flow in Swift 5.
Hi, I figured it out; this is my final solution:
var salads = ""
let value = 127
var j = 0
while j < 256 {
    let mask = 1 << j
    if (value & mask) != 0 {
        salads.append(String(mask) + ",")
    }
    j += 1
}
salads = String(salads.dropLast()) // removes the final ","
print(salads)
This now feeds nicely into the IN clause of my SQL query. Thank you all for your help! :)

Issue with circular shift BITMASK

I have an enum:
typedef NS_ENUM(NSInteger, Node) {
    NodeTop,
    NodeLeft,
    NodeBottom,
    NodeRight,
};
and a property:
@property Node node;
Now in my controller I'm assigning node multiple values using the pipe (bitwise OR) operator:
node = top | left | bottom | right;
(Q1: does node hold bit patterns like 0000, 0011 from NodeTop, NodeLeft, or simply the final result of ORing top | left | bottom | right?)
This way NSLog(@"%d", node); gives the result 1.
Now if node contains 0001, I want to left-shift it by 1, so I tried
node << 1;
which changes node's value from 1 to 2, but it does not seem to really change 0001 to 0010?
In short, I want node to have a value like 0001,
and later on I want to shift its value like:
0010
0100
1000
0001
0010
...
Any help? Let me know if the question is not clear!
You can use NS_OPTIONS to create a bit mask where each value is defined as 1 << n. This NSHipster post explains the usage of both NS_ENUM and NS_OPTIONS.
typedef NS_OPTIONS(NSInteger, Node) {
    NodeTop    = 1 << 0, // 1
    NodeLeft   = 1 << 1, // 2
    NodeBottom = 1 << 2, // 4
    NodeRight  = 1 << 3  // 8
};
That way when you write
NodeTop | NodeBottom
it will be the same as
...0001 | ...0100 = ...0101
And shifting the bit will move it to the next option
Node node = NodeBottom << 1; // = NodeRight
since ...0100 << 1 = ...1000.
which changes node's value from 1 to 2, but it does not seem to really change 0001 to 0010?
It does. 0010 binary is 2 decimal.
You can only bitwise OR these values together in a meaningful way if you make each value correspond to a different bit, e.g.
typedef NS_ENUM(NSInteger, Node) {
    NodeTop = 1,
    NodeLeft = 2,
    NodeBottom = 4,
    NodeRight = 8
};

node = NodeTop; // 0b0001 = 0x01 = 1
node <<= 1;     // 0b0010 = 0x02 = 2 = NodeLeft
node <<= 1;     // 0b0100 = 0x04 = 4 = NodeBottom
Well as for the circular bit shifting, I think the best you can do is something like this (quickly thrown together C code)
enum test {
    test1 = 1,
    test2 = 2,
    test4 = 4,
    test8 = 8
};

static enum test val(enum test input) {
    while (input >= 16)        // or whatever max bit you want
        input = input >> 4;    // be sure to shift by the appropriate number
    return input;
}
Then when you run this code:
enum test foo = test1;
foo = val(foo << 5);
foo will be test2
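If you want the wrap-around without a loop, the usual trick is to combine the left shift with the bit that falls off the top, masked to the width of the field. Here is a sketch in Swift (the rotatedLeft helper is just illustrative; the same expression works in C/Objective-C):
// Rotate a 4-bit node mask left by one position, wrapping bit 3 back to bit 0.
func rotatedLeft(_ node: Int) -> Int {
    let shifted = (node << 1) & 0b1111   // shift and keep only the low 4 bits
    let wrapped = (node >> 3) & 0b0001   // the bit that fell off the top
    return shifted | wrapped
}

var node = 0b0001          // NodeTop
node = rotatedLeft(node)   // 0b0010 = NodeLeft
node = rotatedLeft(node)   // 0b0100 = NodeBottom
node = rotatedLeft(node)   // 0b1000 = NodeRight
node = rotatedLeft(node)   // 0b0001 = back to NodeTop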

iPhone SDK << meaning?

Hi, another silly simple question: I have noticed that certain typedefs in Apple's frameworks use the symbol "<<". Can anyone tell me what that means?
enum {
    UIViewAutoresizingNone                 = 0,
    UIViewAutoresizingFlexibleLeftMargin   = 1 << 0,
    UIViewAutoresizingFlexibleWidth        = 1 << 1,
    UIViewAutoresizingFlexibleRightMargin  = 1 << 2,
    UIViewAutoresizingFlexibleTopMargin    = 1 << 3,
    UIViewAutoresizingFlexibleHeight       = 1 << 4,
    UIViewAutoresizingFlexibleBottomMargin = 1 << 5
};
typedef NSUInteger UIViewAutoresizing;
Edit: Alright, so I now understand how and why you would use the left bit-shift. My next question is: how would I test to see if the value had a certain trait, using an if/then statement or a switch/case?
This is a way to create constants that would be easy to mix. For example you can have an API to order an ice cream and you can choose any of vanilla, chocolate and strawberry flavours. You could use booleans, but that’s a bit heavy:
- (void) iceCreamWithVanilla: (BOOL) v chocolate: (BOOL) ch strawberry: (BOOL) st;
A nice trick to solve this is using numbers, where you can mix the flavours using simple adding. Let’s say 1 for vanilla, 2 for chocolate and 4 for strawberry:
- (void) iceCreamWithFlavours: (NSUInteger) flavours;
Now if the number has its rightmost bit set, it’s got vanilla flavour in it, another bit stands for chocolate and the third bit from right is strawberry. For example vanilla + chocolate would be 1+2=3 decimal (011 in binary).
The bitshift operator x << y takes the left number (x) and shifts its bits y times. It’s a good tool to create numeric constants:
1 << 0 = 001 // vanilla
1 << 1 = 010 // chocolate
1 << 2 = 100 // strawberry
Voila! Now when you want a view with flexible left margin and flexible right margin, you can mix the flags using bitwise OR: FlexibleRightMargin | FlexibleLeftMargin → 1<<2 | 1<<0 → 100 | 001 → 101. On the receiving end the method can mask the interesting bit using bitwise AND:
// 101 & 100 = 100 or 4 decimal, which boolifies as YES
BOOL flexiRight = givenNumber & FlexibleRightMargin;
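The same masking test reads naturally in Swift as well; here is a small sketch using the flavour values from above (the names are illustrative, not an Apple API):
let vanilla    = 1 << 0            // 001
let chocolate  = 1 << 1            // 010
let strawberry = 1 << 2            // 100

let order = vanilla | strawberry   // 101 binary, 5 decimal

if order & chocolate != 0 {
    print("has chocolate")         // not printed: 101 & 010 == 000
}
if order & strawberry != 0 {
    print("has strawberry")        // printed: 101 & 100 == 100
}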
Hope that helps.
The << means that all bits in the expression on the left side are shifted left by the amount on the right side of the operator
so 1 << 1 means:
0001 becomes 0010 (those are binary numbers)
another example:
0001 0100 << 2 = 0101 0000
Most of the time a shift left by one bit is the same as multiplying by 2 (a shift by n multiplies by 2^n).
Exception:
when high bits are set and you shift them left (in a 16-bit integer, 1000 0000 0000 0000 << 1), they will be discarded or wrapped around (how this is handled depends on the language).
It's a bit shift.
In C-inspired languages, the left and right shift operators are "<<" and ">>", respectively. The number of places to shift is given as the second argument to the shift operators.
Bit Shift!!!
For example
500 >> 4 = 31
Original:  111110100
1st shift: 011111010
2nd shift: 001111101
3rd shift: 000111110
4th shift: 000011111, which equals 31.
Same as:
500 / 16 = 31
500 / 2^4 = 31
Bitwise shift left. For more info see the Wikipedia article.

Three boolean values saved in one tinyint

Probably a simple question, but I seem to be suffering from programmer's block. :)
I have three boolean values: A, B, and C. I would like to save the state combination as an unsigned tinyint (max 255) into a database and be able to derive the states from the saved integer.
Even though there are only a limited number of combinations, I would like to avoid hard-coding each state combination to a specific value (something like if A=true and B=true has the value 1).
I tried to assign values to the variables (A=1, B=2, C=3) and then add them, but I can't differentiate between A and B being true and, for example, only C being true.
I am stumped but pretty sure that it is possible.
Thanks
Binary maths, I think. Choose a value that's a power of 2 (1, 2, 4, 8, etc.), then you can use the bitwise AND operator & to determine the state.
Say A = 1, B = 2, C = 4:
00000111 => A, B and C => 7
00000101 => A and C    => 5
00000100 => C          => 4
Then to determine them:
if( val & 4 ) // same as if (C)
if( val & 2 ) // same as if (B)
if( val & 1 ) // same as if (A)
if((val & 4) && (val & 2) ) // same as if (C and B)
No need for a state table.
Edit (to reflect the comment): if the tinyint has a maximum value of 255, you have 8 bits to play with and can store 8 boolean values in there.
Binary math, as others have said.
Encoding:
myTinyInt = A*1 + B*2 + C*4   (assuming you convert A, B, C to 0 or 1 beforehand)
Decoding:
bool A = (myTinyInt & 1) != 0   (& is the bitwise AND operator in many languages)
bool B = (myTinyInt & 2) != 0
bool C = (myTinyInt & 4) != 0
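As a concrete sketch of that encode/decode round trip (written in Swift here; the operators are the same in most languages):
// Encode three booleans into one small integer.
func encode(a: Bool, b: Bool, c: Bool) -> UInt8 {
    return (a ? 1 : 0) | (b ? 2 : 0) | (c ? 4 : 0)
}

// Decode them back with bitwise AND.
func decode(_ value: UInt8) -> (a: Bool, b: Bool, c: Bool) {
    return (value & 1 != 0, value & 2 != 0, value & 4 != 0)
}

let stored = encode(a: true, b: false, c: true)   // 5
print(decode(stored))                             // (a: true, b: false, c: true)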
I'll add that you should find a way to not use magic numbers. You can build masks into constants using the Left Logical/Bit Shift with a constant bit position that is the position of the flag of interest in the bit field. (Wow... that makes almost no sense.) An example in C++ would be:
enum Flags {
    kBitMask_A = (1 << 0),
    kBitMask_B = (1 << 1),
    kBitMask_C = (1 << 2),
};

uint8_t byte = 0;        // byte = 0b00000000
byte |= kBitMask_A;      // Set A, byte = 0b00000001
byte |= kBitMask_C;      // Set C, byte = 0b00000101
if (byte & kBitMask_A) { // Test A, (0b00000101 & 0b00000001) = T
    byte &= ~kBitMask_A; // Clear A, byte = 0b00000100
}
In any case, I would recommend looking for Bitset support in your favorite programming language. Many languages will abstract the logical operations away behind normal arithmetic or "test/set" operations.
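For instance, Swift's OptionSet wraps exactly this pattern; here is a small sketch mirroring the C++ example above (the names are illustrative):
struct Flags: OptionSet {
    let rawValue: UInt8
    static let a = Flags(rawValue: 1 << 0)
    static let b = Flags(rawValue: 1 << 1)
    static let c = Flags(rawValue: 1 << 2)
}

var byte: Flags = []    // 0b00000000
byte.insert(.a)         // set A -> 0b00000001
byte.insert(.c)         // set C -> 0b00000101
if byte.contains(.a) {  // test A
    byte.remove(.a)     // clear A -> 0b00000100
}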
Need to use binary...
A = 1,
B = 2,
C = 4,
D = 8,
E = 16,
F = 32,
G = 64,
H = 128
This means A + B = 3, but C alone = 4, so no two combinations ever collide. I've listed the maximum you can have for a single byte: 8 values (bits).