Rust calling failure::fail_bounds_check with no-landing-pads flag enabled

I have been trying to write a basic kernel in Rust, and the link step fails with the following error:
roost.rs:(.text.kmain+0x12a): undefined reference to 'failure::fail_bounds_check::hee3207bbe41f708990v::v0.11.0'
I compile the rust source files with the following flags:
-O --target i686-unknown-linux-gnu -Z no-landing-pads --crate-type lib --emit=obj
If I understand the Rust compiler correctly, the -Z no-landing-pads option should stop the compiler from generating the failure functions. From testing I can tell that the failure function is only generated when the kmain function calls my function io::write_char(c: char).
This is the definition of io::write_char(c: char):
pub fn write_char(c: char) {
    unsafe {
        vga::place_char_at(c, vga::cursor_x, vga::cursor_y);
        vga::cursor_y =
            if vga::cursor_x >= vga::VGA_WIDTH {
                vga::cursor_y + 1
            } else {
                vga::cursor_y
            };
        vga::cursor_x =
            if vga::cursor_x >= vga::VGA_WIDTH {
                0
            } else {
                vga::cursor_x + 1
            };
        vga::set_cursor_location(vga::cursor_x, vga::cursor_y);
    }
}
How can I stop Rust from trying to call the nonexistent function failure::fail_bounds_check?
Edit: further testing indicates that the vga::place_char_at function is the cause. Here is the code:
pub fn place_char_at(c: char, x: u8, y: u8) {
    let tmpx =
        if x >= VGA_WIDTH {
            VGA_WIDTH - 1
        } else {
            x
        };
    let tmpy =
        if y >= VGA_HEIGHT {
            VGA_HEIGHT - 1
        } else {
            y
        };
    unsafe {
        (*SCREEN)[(tmpy as uint) * 80 + (tmpx as uint)].char = c as u8;
    }
}
From what I can tell, the issue is that Rust wants to bounds-check the array access I'm doing. Is there a way to assure the compiler that the checks have already been done, or to turn off the feature for that function?
Edit 2: I solved it after some work. After digging around in the docs I found that Rust has a function for vector access that bypasses bounds checking. To use it I changed the place_char_at function to this:
pub fn place_char_at(c: char, x: u8, y: u8) {
    let tmpx =
        if x >= VGA_WIDTH {
            VGA_WIDTH - 1
        } else {
            x
        };
    let tmpy =
        if y >= VGA_HEIGHT {
            VGA_HEIGHT - 1
        } else {
            y
        };
    unsafe {
        (*SCREEN).unsafe_mut_ref((tmpy as uint) * 80 + (tmpx as uint)).char = c as u8;
    }
}

Make sure you're linking to libcore. Also libcore has one dependency: a definition of failure. Make sure you mark a function #[lang="begin_unwind"] somewhere in your exception code. The requirement is that begin_unwind not return. See here for my example.
is there a way to ... turn off the feature for that function?
Nope. In the words of bstrie, if there were a compiler flag to eliminate array bounds checks, then bstrie would fork the language and make the flag delete your hard drive. In other words, safety is paramount.

You haven't described the type of SCREEN, but if it implements the MutableVector trait, what you probably want is unsafe_set (http://doc.rust-lang.org/core/slice/trait.MutableVector.html#tymethod.unsafe_set):
unsafe fn unsafe_set(self, index: uint, val: T)
This performs no bounds checks, and it is undefined behaviour if index is larger than the length of self. However, it does run the destructor at index. It is equivalent to self[index] = val.


Type 'Int' does not conform to protocol 'BooleanType'?

I know there is another thread with the same question, but it doesn't tell what is actually causing the problem.
I'm new to Swift, so I'm a bit confused by this.
I wrote a very simple program that is supposed to start with a default number of followers (0) held in 'followerdeafault', and once that becomes 1 it is supposed to be assigned to 'followers', but I get the error "Type 'Int' does not conform to protocol 'BooleanType'". What is causing this, and why?
var followerdeafault = 0
var followers = 0
if (followerdeafault++) {
    var followers = followerdeafault
}
In Swift you can't implicitly substitute Int instead of Bool. This was done to prevent confusion and make code more readable.
So instead of this
let x = 10
if x { /* do something */ }
You have to write this:
let x = 10
if x != 0 { /* do something */ }
Also you can't pass an Optional instead of Bool to check if it's nil, as you would do in Objective-C. Use explicit comparison instead:
if myObject != nil { /* do something */ }
As the comments said, you're trying to use an Int in a Bool comparison statement. What you're looking for is probably something like this:
if followerdeafault++ == 1 { ... }
Also, a side note: the ++ operator is deprecated; Swift is moving towards += instead.
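If it helps, here is a minimal sketch of the same check rewritten with += (my sketch, reusing the variable names from the question):

var followerdeafault = 0
var followers = 0

followerdeafault += 1            // replaces followerdeafault++
if followerdeafault == 1 {
    followers = followerdeafault
}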

Simple Pointer Operations in Swift?

Let's say I do the following in C++:
int i = 1;
int* ptr = &i;
*ptr = 2;
cout << i << '\n';
And I want to do something similar in swift. Could I do the following?
var i : Int = 1
var iptr : UnsafeMutablePointer<Int> = &i
iptr.memory = 2
print(i)
And achieve the same result?
Yes-ish.
You can't do it exactly as you've attempted in the question. It won't compile. Swift won't let you directly access the address of a value like this. At the end of the day, the reason is mostly because there's simply no good reason to do so.
We do see the & operator in Swift however.
First of all, there is the inout keyword when declaring function parameters:
func doubleIfPositive(inout value: Float) -> Bool {
    if value > 0 {
        value *= 2
        return true
    }
    return false
}
And to call this method, we'd need the & operator (given a Float variable to pass in):
var pi: Float = 3.14159
let weMadeARadian = doubleIfPositive(&pi)
We can see it similarly used when we have a function which takes an argument of type UnsafeMutablePointer (and other variants of these pointer structs). In this specific case, it's primarily for interoperability with C & Objective-C, where we could declare a method as such:
bool doubleIfPositive(float *value) {
    if (*value > 0) {
        *value *= 2;
        return true;
    }
    return false;
}
The Swift interface for that method ends up looking something like this:
func doubleIfPositive(value: UnsafeMutablePointer<Float>) -> Bool
And calling this method from Swift actually looks just like it did before when using the inout approach:
let weMadeARadian = doubleIfPositive(&pi)
But these are the only two uses of this & operator I can find in Swift.
With that said, we can write a function that makes use of the second form of passing an argument into a method with the & operator and returns that variable wrapped in an unsafe mutable pointer. It looks like this:
func addressOf<T>(value: UnsafeMutablePointer<T>) -> UnsafeMutablePointer<T> {
    return value
}
And it behaves about as you'd expect from your original code snippet:
var i: Int = 1
var iPtr = addressOf(&i)
iPtr.memory = 2
print(i) // prints 2
As noted by Kevin in the comments, we can also directly allocate memory if we want.
var iPtr = UnsafeMutablePointer<Int>.alloc(1)
The argument 1 here is effectively the amount of space to allocate. This says we want to allocate enough memory for a single Int.
This is roughly equivalent to the following C code:
int * iPtr = malloc(1 * sizeof(int));
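One caveat worth adding (my sketch, not part of the answer above): memory you allocate this way must also be initialized and released by hand, much like pairing malloc with free:

let iPtr = UnsafeMutablePointer<Int>.alloc(1)
iPtr.initialize(2)   // give the raw memory a value before reading it
print(iPtr.memory)   // 2
iPtr.destroy()       // run the destructor (a no-op for a trivial Int)
iPtr.dealloc(1)      // hand the memory back, like free()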
BUT...
If you're doing any of this for anything other than interoperability with C or Objective-C, you're most likely not Swifting correctly. So before you start running around town with pointers to value types in Swift, please, make sure it's what you absolutely need to be doing. I've been writing Swift since release, and I've never found the need for any of these shenanigans.
Like this (not the only way, but it's clear):
var i: Int = 1
withUnsafeMutablePointer(&i) { iptr -> () in
    iptr.memory = 2
}
print(i)
Not a very interesting example, but it is completely parallel to your pseudo-code, and we really did reach right into the already allocated memory and alter it, which is what you wanted to do.
This sort of thing gets a lot more interesting when what you want to do is something like cycling through memory just as fast as doing pointer arithmetic in C.
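For instance, here is a rough sketch of that kind of C-style walk (same Swift 2 era pointer API as above; the values are arbitrary):

let count = 4
let base = UnsafeMutablePointer<Int>.alloc(count)
for i in 0..<count {
    (base + i).initialize(i * 10)  // like base[i] = i * 10 in C
}
var p = base                       // a moving cursor, as in C
var sum = 0
for _ in 0..<count {
    sum += p.memory                // like sum += *p
    p = p.successor()              // like p++
}
print(sum)                         // 60
base.dealloc(count)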

Why doesn't this base type extension work?

Trying to play with extensions, but am having issues getting the following to work:
let value = -13
abs(value)
extension Int {
    var abs: Int {
        return abs(self) // -> Cannot invoke 'abs' with an argument list of type '(Int)'
    }
}
value.abs
The compile error is weird, because it demonstrably runs the abs() function directly above with an Int as an argument. I've still got some light bulbs to trigger for generics I guess. Enlighten me.
The Swift compiler thinks you are using the abs property as a function, which it cannot do. Now you could look at all the answers and rename your variable, but that would not give insight into how Swift resolves names.
Swift automatically imports the Swift module (the standard library), which defines the global abs function. You usually do not need to qualify these functions with the module name, but in cases like this you should specify that you want the abs function from the Swift module.
So after all the explanation, here's your code, which will work:
let value = -13
abs(value)
extension Int {
    var abs: Int {
        return Swift.abs(self)
    }
}
value.abs
It appears to be just a call-resolution problem. This will work:
let value = -13
abs(value)
extension Int {
    var abs1: Int {
        return abs(self)
    }
}
value.abs1
And this will work too:
extension Int {
    var abs: Int {
        return self < 0 ? -self : self
    }
}
value.abs
The problem here is that you are extending Int to add a variable named abs -- which is also the name of the function you are calling.
When you try to call the function abs() on the Int, it sees the variable abs that you created and it is confused because it thinks you are trying to return that variable and doesn't understand why you are sending it a parameter.
If you rename your variable to absoluteValue or anything else really, it should work.
let value = -13
abs(value)
extension Int {
    var absoluteValue: Int {
        return abs(self)
    }
}

value.absoluteValue
Update: As others have stated, you can also solve the disambiguation of the use of abs by explicitly calling the function within the Swift framework. This should work just as well as the above solution.
let value = -13
abs(value)
extension Int {
    var abs: Int {
        return Swift.abs(self)
    }
}
value.abs
Though, personally, I would still rename my new property to absoluteValue as in the first example, so that it's clear you aren't calling Swift.abs() when you use your abs variable.
Thanks to the direction of the original two answers (a clash between the global free function and the var I was defining), the two have to be disambiguated. Rather than write my own inline implementation of abs or be forced to use a different name, I can properly scope the inner abs() call using the Swift namespace.
extension Int {
    var absoluteValue: Int {
        return Swift.abs(self)
    }
}
This gives me the best of both worlds (IMO).

swift, optional unwrapping, reversing if condition

Let's say I have a function which returns an optional: nil on error and a value on success:
func foo() -> Bar? { ... }
I can use following code to work with this function:
let fooResultOpt = foo()
if let fooResult = fooResultOpt {
    // continue correct operations here
} else {
    // handle error
}
However, there are a few problems with this approach in any non-trivial code:
Error handling is performed at the end, and it's easy to miss something. It's much better when the error-handling code follows the function call.
The correct-operations code is indented by one level. If we have another function to call, we have to indent one more time.
With C one usually could write something like this:
Bar *fooResult = foo();
if (fooResult == NULL) {
    // handle error and return
}
// continue correct operations here
I found two ways to achieve similar code style with Swift, but I don't like either.
let fooResultOpt = foo()
if fooResultOpt == nil {
    // handle error and return
}
// use fooResultOpt! from here
let fooResult = fooResultOpt! // or define another variable
If I write "!" everywhere, it just looks bad for my taste. I could introduce another variable, but that doesn't look good either. Ideally I would like to see the following:
if !let fooResult = foo() {
    // handle error and return
}
// fooResult has type Bar and can be used at the top level
Did I miss something in the specification, or is there another way to write good-looking Swift code?
Your assumptions are correct—there isn't a "negated if-let" syntax in Swift.
I suspect one reason for that might be grammar integrity. Throughout Swift (and commonly in other C-inspired languages), if you have a statement that can bind local symbols (i.e. name new variables and give them values) and that can have a block body (e.g. if, while, for), those bindings are scoped to said block. Letting a block statement bind symbols to its enclosing scope instead would be inconsistent.
It's still a reasonable thing to think about, though — I'd recommend filing a bug and seeing what Apple does about it.
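For what it's worth, Swift 2.0 later added the guard statement, which is essentially the "negated if-let" asked for here: it binds into the enclosing scope, and the compiler requires its else branch to exit.

guard let fooResult = foo() else {
    // handle error and return
    return
}
// fooResult has type Bar and can be used at the top level from here on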
This is what pattern matching is all about, and is the tool meant for this job:
let x: String? = "Yes"
switch x {
case .Some(let value):
    println("I have a value: \(value)")
case .None:
    println("I'm empty")
}
The if-let form is just a convenience for when you don't need both legs.
If what you are writing is a set of functions performing the same sequence of transformation, such as when processing a result returned by a REST call (check for response not nil, check status, check for app/server error, parse response, etc.), what I would do is create a pipeline that at each steps transforms the input data, and at the end returns either nil or a transformed result of a certain type.
I chose the custom operator >>>, which visually indicates the data flow, but of course feel free to choose your own:
infix operator >>> { associativity left }
func >>> <T, V> (params: T?, next: T -> V?) -> V? {
    if let params = params {
        return next(params)
    }
    return nil
}
The operator is a function that receives as input a value of a certain type, and a closure that transforms the value into a value of another type. If the value is not nil, the function invokes the closure, passing the value, and returns its return value. If the value is nil, then the operator returns nil.
An example is probably needed, so let's suppose I have an array of integers, and I want to perform the following operations in sequence:
sum all the elements of the array
square the result
divide by 5, keeping the integer part and the remainder
sum those two numbers together
These are the 4 functions:
func sumArray(array: [Int]?) -> Int? {
    if let array = array {
        return array.reduce(0, combine: +)
    }
    return nil
}

func powerOf2(num: Int?) -> Int? {
    if let num = num {
        return num * num
    }
    return nil
}

func module5(num: Int?) -> (Int, Int)? {
    if let num = num {
        return (num / 5, num % 5)
    }
    return nil
}

func sum(params: (num1: Int, num2: Int)?) -> Int? {
    if let params = params {
        return params.num1 + params.num2
    }
    return nil
}
and this is how I would use them:
let res: Int? = [1, 2, 3] >>> sumArray >>> powerOf2 >>> module5 >>> sum
The result of this expression is either nil or a value of the type as defined in the last function of the pipeline, which in the above example is an Int.
If you need to do better error handling, you can define an enum like this:
enum Result<T> {
    case Value(T)
    case Error(MyErrorType)
}
and replace all optionals in the above functions with Result<T>, returning Result.Error() instead of nil.
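For instance, here is a sketch of the >>> operator adapted to Result<T> (assuming the enum above; an Error value short-circuits the rest of the pipeline):

func >>> <T, V> (params: Result<T>, next: T -> Result<V>) -> Result<V> {
    switch params {
    case .Value(let value):
        return next(value)     // still on the happy path: keep transforming
    case .Error(let error):
        return .Error(error)   // propagate the first error unchanged
    }
}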
I've found a way that looks better than the alternatives, but it uses language features in an unrecommended way.
Example using code from the question:
let fooResult: Bar! = foo()
if fooResult == nil {
    // handle error and return
}
// continue correct operations here
fooResult can be used as a normal variable, and there is no need for "?" or "!" suffixes.
Apple documentation says:
Implicitly unwrapped optionals are useful when an optional’s value is confirmed to exist immediately after the optional is first defined and can definitely be assumed to exist at every point thereafter. The primary use of implicitly unwrapped optionals in Swift is during class initialization, as described in Unowned References and Implicitly Unwrapped Optional Properties.
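The flip side of that convenience is worth spelling out: if the error branch above ever fails to return, later uses of fooResult still compile but trap at runtime. A tiny illustration of the failure mode (with String standing in for Bar):

let fooResult: String! = nil     // pretend foo() failed
if fooResult == nil {
    // suppose we forget to return here...
}
let oops = fooResult.isEmpty     // ...this traps: unexpectedly found nil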
How about the following:
func foo(i: Int) -> Int? {
    switch i {
    case 0: return 0
    case 1: return 1
    default: return nil
    }
}

var error: Int {
    println("Error")
    return 99
}

for i in 0...2 {
    var bob: Int = foo(i) ?? error
    println("\(i) produces \(bob)")
}
Results in the following output:
0 produces 0
1 produces 1
Error
2 produces 99

What does "error: conflicting types for '___'" mean?

I got an error that said "error: conflicting types for '___'". What does that mean?
Quickfix:
Make sure that your functions are declared once and only once before they are called. For example, change:
main(){ myfun(3.4); }
double myfun(double x){ return x; }
To:
double myfun(double x){ return x; }
main(){ myfun(3.4); }
Or add a separate function declaration:
double myfun(double x);
main(){ myfun(3.4); }
double myfun(double x){ return x; }
Possible causes for the error
Function was called before being declared
Function defined overrides a function declared in an included header.
Function was defined twice in the same file
Declaration and definition don't match
Declaration conflict in the included headers
What's really going on
error: conflicting types for ‘foo’ means that a function was defined more than once with different type signatures.
A file that includes two functions with the same name but different return types would throw this error, for example:
int foo(){return 1;}
double foo(){return 1.0;}
Indeed, when compiled with GCC we get the following errors:
foo.c:5:8: error: conflicting types for ‘foo’
double foo(){return 1.0;}
^
foo.c:4:5: note: previous definition of ‘foo’ was here
int foo(){return 1;}
^
Now, if instead we had a file with two function definitions with the same name
double foo(){return 1;}
double foo(){return 1.0;}
We would get a 'redefinition' error instead:
foo.c:5:8: error: redefinition of ‘foo’
double foo(){return 1.0;}
^
foo.c:4:8: note: previous definition of ‘foo’ was here
double foo(){return 1;}
^
Implicit function declaration
So why does the following code throw error: conflicting types for ‘foo’?
main(){ foo(); }
double foo(){ return 1.0; }
The reason is implicit function declaration.
When the compiler first encounters foo() in the main function, it will assume a type signature for the function foo of int foo(). By default, implicit functions are assumed to return integers, and the input argument types are derived from what you're passing into the function (in this case, nothing).
Obviously, the compiler is wrong to make this assumption, but the specs for the C (and thus Objective-C) language are old, cranky, and not very clever. Maybe implicitly declaring functions saved some development time by reducing compiler complexity back in the day, but now we're stuck with a terrible feature that should have never made it into the language. In fact, implicit declarations were made illegal in C99.
That said, once you know what's going on, it should be easy to dig out the root cause of your problem.
It's probably because a function with that name already exists in a library you're including. It happened to me with this function; I was using stdio.h:
int getline(char s[], int lim)
{
    int c, i;
    for (i = 0; i < lim - 1 && (c = getchar()) != EOF && c != '\n'; ++i)
        s[i] = c;
    if (c == '\n') {
        s[i] = c;
        ++i;
    }
    s[i] = '\0';
    return i;
}
When I changed "getline" to "getlinexxx" and gcc compiled it:
int getlinexxx(char s[], int lim)
{
    int c, i;
    for (i = 0; i < lim - 1 && (c = getchar()) != EOF && c != '\n'; ++i)
        s[i] = c;
    if (c == '\n') {
        s[i] = c;
        ++i;
    }
    s[i] = '\0';
    return i;
}
And the problem was gone
What datatype is '___'?
My guess is that you're trying to initialize a variable of a type that can't accept the initial value. Like saying int i = "hello";
If you're trying to assign it from a call that returns an NSMutableDictionary, that's probably your trouble. Posting the line of code would definitely help diagnose warnings and errors in it.