Passing the correct type to addch() [ncurses, Linux] - swift

While following a tutorial on how to use ncurses with Swift, I have been facing this error:
main.swift:31:15: error: cannot convert value of type 'UInt?' to expected argument type 'chtype' (aka 'UInt32')
addch(UInt("*"))
^~~~~~~~~
It complains about the type, but when I change it to UInt32, the error changes to:
error: ambiguous use of 'init'
addch(UInt32("*"))
Question
How do I pass the correct value type to addch?
The entire code for reference:
import Foundation
import CNCURSES
import Glibc

enum Signal: Int32 {
    case INT = 2
    case WINCH = 28
}

typealias SignalHandler = __sighandler_t

func trap(signum: Signal, action: @escaping SignalHandler) {
    signal(signum.rawValue, action)
}

func getmaxyx(window: UnsafeMutablePointer<WINDOW>, y: inout Int32, x: inout Int32) {
    x = getmaxx(window)
    y = getmaxy(window)
}

func getcuryx(window: UnsafeMutablePointer<WINDOW>, y: inout Int32, x: inout Int32) {
    x = getcurx(window)
    y = getcury(window)
}

func drawbox(numlines: Int32, numcols: Int32) {
    for y in 0...numlines-1 {
        for x in 0...numcols {
            move(y, x)
            if y == 0 || y == numlines-1 {
                addch(UInt("*"))
            } else if x == 0 || x == numcols {
                addch(UInt("*"))
            }
        }
    }
    refresh()
}

[...]

initscr()
noecho()
curs_set(0)
getmaxyx(window: stdscr, y: &maxy, x: &maxx)
drawbox(numlines: maxy, numcols: maxx)
center(text: "Hello world!", numlines: maxy, numcols: maxx)
while true {
    select(0, nil, nil, nil, nil)
}

As the error message says, you need to pass a UInt32 value to addch(_:).
The return type of UInt("*") is UInt?, and its actual value is always nil. (The plain String-to-UInt initializer tries to interpret the String as a decimal integer.)
When you want a character code as UInt32, you may need to write something like this:
addch(("*" as UnicodeScalar).value)
If your code has many more addch(_:) calls, you can define a simple wrapper for it.
For example:
func addCh(_ us: UnicodeScalar) {
    addch(us.value)
}
addCh("*")
By explicitly annotating the parameter as UnicodeScalar, string literals are interpreted as UnicodeScalar, whose value property is of type UInt32.
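To see the difference concretely, here is a quick check you can run in a plain Swift script, outside ncurses:

```swift
// UInt's String initializer parses decimal digits, so "*" yields nil:
let parsed = UInt("*")
print(parsed as Any)                    // nil

// A UnicodeScalar gives the character's code point as UInt32:
let code = ("*" as UnicodeScalar).value
print(code)                             // 42, the ASCII code for '*'
```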

The correct (and documented) type is chtype:
int addch(const chtype ch);
int waddch(WINDOW *win, const chtype ch);
int mvaddch(int y, int x, const chtype ch);
int mvwaddch(WINDOW *win, int y, int x, const chtype ch);
int echochar(const chtype ch);
int wechochar(WINDOW *win, const chtype ch);
The number of bits in chtype depends upon the system. In ncurses' header files, it is by default declared as unsigned, but that can be overridden when configuring/building ncurses.
X/Open (see addch and <curses.h>) says nothing more explicit than that.
Of course, whatever it happens to be in Swift is an implementation detail of the binding and, unless documented, is subject to change.

Related

When declaring static variable for conformance to AdditiveArithmetic, cannot call instance member from same type

I know this sounds crazy for a 10-year-old, but because S4TF doesn't work for me, I'm building my own neural network library in Swift. (I haven't gotten that far.) I'm creating a structure that conforms to AdditiveArithmetic. It also uses Philip Turner's Differentiable, but that's unimportant.
Anyway, when defining the zero variable, I call another variable dimen, defined in the same structure. This raises an error: instance member 'dimen' cannot be used on type 'Electron<T>'
note: the structure I am creating is going to be used to create a multi-dimensional array for neural networks.
Total code (stripped down to remove unimportant bits):
public struct Electron<T> where T: ExpressibleByFloatLiteral, T: AdditiveArithmetic {
    var energy: [[Int]: T] = [:]
    var dimen: [Int]

    public init(_ dim: [Int], with: ElectronInitializer) {
        self.dimen = dim
        self.energy = [:]
        var curlay = [Int](repeating: 0, count: dimen.count)
        curlay[curlay.count-1] = -1
        while true {
            var max: Int = -1
            for x in 0..<curlay.count {
                if curlay[curlay.count-1-x] == dimen[curlay.count-1-x]-1 {
                    max = curlay.count-1-x
                } else { break }
            }
            if max == 0 { break }
            else if max != -1 {
                for n in max..<curlay.count {
                    curlay[n] = -1
                }
                curlay[max-1] += 1
            }
            curlay[curlay.count-1] += 1
            print(curlay)
            energy[curlay] = { () -> T in
                switch with {
                case ElectronInitializer.repeating(let value):
                    return value as! T
                case ElectronInitializer.random(let minimum, let maximum):
                    return Double.random(in: minimum..<maximum) as! T
                }
            }()
        }
    }

    subscript(_ coordinate: Int...) -> T {
        var convertList: [Int] = []
        for conversion in coordinate {
            convertList.append(conversion)
        }
        return self.energy[convertList]!
    }

    public mutating func setQuantum(_ replace: T, at: [Int]) {
        self.energy[at]! = replace
    }
}

extension Electron: AdditiveArithmetic {
    public static func - (lhs: Electron<T>, rhs: Electron<T>) -> Electron<T> where T: AdditiveArithmetic, T: ExpressibleByFloatLiteral {
        var output: Electron<T> = lhs
        for value in lhs.energy {
            output.energy[value.key] = output.energy[value.key]!-rhs.energy[value.key]!
        }
        return output
    }

    public static var zero: Electron<T> {
        return Electron.init(dimen, with: ElectronInitializer.repeating(0.0))
    }

    static prefix func + (x: Electron) -> Electron {
        return x
    }

    public static func + (lhs: Electron<T>, rhs: Electron<T>) -> Electron<T> where T: AdditiveArithmetic, T: ExpressibleByFloatLiteral {
        var output: Electron<T> = lhs
        for value in lhs.energy {
            output.energy[value.key] = output.energy[value.key]!+rhs.energy[value.key]!
        }
        return output
    }
}

public enum ElectronInitializer {
    case random(Double, Double)
    case repeating(Double)
}
Error:
NeuralNetwork.xcodeproj:59:30: error: instance member 'dimen' cannot be used on type 'Electron'
return Electron.init(dimen, with: ElectronInitializer.repeating(0.0))
I don't know what's happening, but thanks in advance. I'm new to Stack Overflow, so sorry if I did something wrong.
The root of the problem is that dimen is an instance property, while zero is a static property. In a static context, you don't have an instance from which to access dimen, so the compiler gives you the error. static properties and methods are a lot like global variables and free functions with respect to accessing instance properties and methods. You'd have to make an instance available somehow. For a static function, you could pass it in, but for a static computed property, you'd either have to store an instance in a stored static property, which isn't allowed for generics, or you'd have to store it in a global variable, which isn't good either, and would be tricky to make work for all the possible T.
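A stripped-down sketch of the same error, independent of the Electron code (Sample is a hypothetical type): the static context simply has no instance from which to read an instance property.

```swift
struct Sample {
    var instanceProp = 1

    static var typeProp: Int {
        // return instanceProp  // error: instance member 'instanceProp'
        //                      // cannot be used on type 'Sample'
        return 0
    }
}
```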
There are ways to do what you need though. They all involve implementing some special behavior for a zero Electron rather than relying on access to an instance property in your static .zero implementation. I made some off-the-cuff suggestions in comments, which would work; however, I think a more elegant solution is to solve the problem by creating a custom type for energy, which would require very few changes to your existing code. Specifically you could make an Energy type nested in your Electron type:
internal struct Energy: Equatable, Sequence {
    public typealias Value = T
    public typealias Key = [Int]
    public typealias Element = (key: Key, value: Value)
    public typealias Storage = [Key: Value]
    public typealias Iterator = Storage.Iterator
    public typealias Keys = Storage.Keys
    public typealias Values = Storage.Values

    private var storage = Storage()

    public var keys: Keys { storage.keys }
    public var values: Values { storage.values }
    public var count: Int { storage.count }

    public init() { }

    public subscript (key: Key) -> Value? {
        get { storage.isEmpty ? .zero : storage[key] }
        set { storage[key] = newValue }
    }

    public func makeIterator() -> Iterator {
        storage.makeIterator()
    }
}
The idea here is that when energy.storage is empty, it returns 0 for any key, which allows you to use it as a .zero value. I've made it internal, because energy defaults to internal, and so I've done a minimalist job of wrapping a Dictionary, mainly providing subscript operator, and making it conform to Sequence, which is all that is needed by code you provided.
The only change needed in the rest of your code is to the definition of energy:
var energy: Energy
Then set it in your initializer, bypassing the bulk of your init when dim is empty:
public init(_ dim: [Int], with: ElectronInitializer) {
    self.dimen = dim
    self.energy = Energy() // <- Initialize `energy`

    // Empty dim indicates a zero electron which doesn't need the
    // rest of the initialization
    guard dim.count > 0 else { return }

    var curlay = [Int](repeating: 0, count: dimen.count)
    curlay[curlay.count-1] = -1
    while true {
        var max: Int = -1
        for x in 0..<curlay.count {
            if curlay[curlay.count-1-x] == dimen[curlay.count-1-x]-1 {
                max = curlay.count-1-x
            } else { break }
        }
        if max == 0 { break }
        else if max != -1 {
            for n in max..<curlay.count {
                curlay[n] = -1
            }
            curlay[max-1] += 1
        }
        curlay[curlay.count-1] += 1
        print(curlay)
        energy[curlay] = { () -> T in
            switch with {
            case ElectronInitializer.repeating(let value):
                return value as! T
            case ElectronInitializer.random(let minimum, let maximum):
                return Double.random(in: minimum..<maximum) as! T
            }
        }()
    }
}
And then, of course, change how you create it in your zero property:
public static var zero: Electron<T> {
    return Electron.init([], with: ElectronInitializer.repeating(0.0))
}
ElectronInitializer isn't actually used in this case; it's just a required parameter of your existing init. This suggests an opportunity to refactor initialization, so you could have an init() that creates a zero Electron in addition to your existing init(_:with:).
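One possible shape for that refactor, sketched on a simplified, hypothetical MiniElectron (only the parts needed to show the idea, not the full type above):

```swift
struct MiniElectron<T: AdditiveArithmetic> {
    var dimen: [Int]
    var energy: [[Int]: T]

    // A dedicated zero initializer: no throwaway ElectronInitializer needed.
    init() {
        dimen = []
        energy = [:]
    }

    init(_ dim: [Int], repeating value: T) {
        dimen = dim
        energy = [[0]: value]  // stand-in for the real fill loop
    }

    static var zero: MiniElectron<T> { MiniElectron() }
}

let z = MiniElectron<Double>.zero
// z.dimen and z.energy are both empty
```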

Cannot convert value of type 'A' to expected argument type 'A' when using generics

I'm running into an issue using generics in Swift - I have experience with generics in Java and am working on translating my knowledge as I go. I have a method that takes a generic parameter type, defined in the protocol like so:
protocol Board {
    func getPlace<T : Position>(position: T) -> Place
}
The idea is that the Board can have its own type of Position, like an XYPosition for a SquareBoard, but different types of positions for a hex board.
However, the below playground snippet has a very strange error:
/Users/Craig/projects/MyModule/Sources/SquareBoard.swift:16:39: error: cannot convert value of type 'XYPosition' to expected argument type 'XYPosition'
let index = toIndex(position: position)
                              ^~~~~~~~
                              as! XYPosition
If I try to force cast the position, it gets even weirder:
/Users/Craig/projects/MyModule/Sources/SquareBoard.swift:16:48: warning: forced cast of 'XYPosition' to same type has no effect
let index = toIndex(position: position as! XYPosition)
                              ^~~~~~~~~~~~~~
/Users/Craig/projects/MyModule/Sources/SquareBoard.swift:16:48: error: cannot convert value of type 'XYPosition' to expected argument type 'XYPosition'
let index = toIndex(position: position as! XYPosition)
                     ~~~~~~~~~^~~~~~~~~~~~~~
                              as! XYPosition
Is it redefining the type a second time with a different identity? I can't seem to determine what I'm doing wrong. The issue is reproducible in the below playground:
import Cocoa

protocol Position : Equatable {
}

struct XYPosition : Position {
    let x : Int
    let y : Int
}

func ==(lhs: XYPosition, rhs: XYPosition) -> Bool {
    return lhs.x == rhs.x && lhs.y == rhs.y
}

public class Test {
    private func toIndex(position: XYPosition) -> Int {
        return (position.y * 10) + position.x
    }

    func getPlace<XYPosition>(position: XYPosition) -> Int {
        let index = toIndex(position: position as! XYPosition)
        return 4
    }
}
Since you're not posting your actual code, it's a bit confusing: I have no clue what getPlace has to do with your issue, and I'm unsure what exactly you're trying to accomplish.
Either way, I have your playground working, hopefully you can work from there:
protocol Position : Equatable {
    var x: Int { get }
    var y: Int { get }
}

struct XYPosition : Position {
    let x : Int
    let y : Int
}

func ==(lhs: XYPosition, rhs: XYPosition) -> Bool {
    return lhs.x == rhs.x && lhs.y == rhs.y
}

public class Test {
    private func toIndex<T: Position>(position: T) -> Int {
        return (position.y * 10) + position.x
    }

    func getPlace<T: Position>(position: T) -> Int {
        let index = toIndex(position: position)
        return index
    }
}
First, in your original getPlace<XYPosition>, XYPosition is a locally defined type parameter, unrelated to your struct, so when you write as! XYPosition you're casting to the local type parameter, not your struct.
Second, I'm guessing you're misunderstanding how generic constraints work with structs: structs cannot be subclassed, so you cannot use a struct as a generic constraint, only a protocol or a class. If you're expecting one specific struct, you can just take that struct itself as the parameter type.
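A minimal demonstration of the shadowing: inside the angle brackets, XYPosition is a brand-new type parameter unrelated to the struct, so the function accepts any type at all.

```swift
struct XYPosition {
    let x: Int
    let y: Int
}

// "XYPosition" here declares a NEW generic parameter that shadows the
// struct above — the name could just as well be T:
func getPlace<XYPosition>(position: XYPosition) -> String {
    return String(describing: type(of: position))
}

print(getPlace(position: 42))       // "Int" — proof the parameter is generic
print(getPlace(position: "hello"))  // "String"
```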

What is analogue of Objective C static variable in Swift?

It was very convenient to have static variables in Objective C (static variable's value is maintained throughout all function/method calls), however I couldn't find anything like this in Swift.
Is there anything like this?
This is an example of static variable in C:
#include <stdio.h>

void func() {
    /* x is initialized only once across the four calls of func(),
       and is incremented on each call, so its final value is 4. */
    static int x = 0;
    x++;
    printf("%d\n", x); // outputs the value of x
}

int main() { // int argc, char *argv[] is optional in this particular program
    func(); // prints 1
    func(); // prints 2
    func(); // prints 3
    func(); // prints 4
    return 0;
}
After seeing your updated question, here is the equivalent for your C example:
func staticFunc() {
    struct MyStruct {
        static var x = 0
    }
    MyStruct.x += 1
    print("Static Value of x: \(MyStruct.x)")
}
Call it anywhere in your class:
staticFunc() //Static Value of x: 1
staticFunc() //Static Value of x: 2
staticFunc() //Static Value of x: 3
Alternatively, declare the variable at the top level of a file (outside any class); this is called a global variable.
Variables at the top level of a file are initialized lazily! So you can set the default value for your variable to be the result of reading a file, and the file won't actually be read until your code first asks for the variable's value.
Reference from HERE.
UPDATE:
From your C example, you can achieve the same thing in Swift this way:
var x = 0 // this is a global variable

func staticVar() {
    x += 1
    print(x)
}

staticVar()
x // 1
staticVar()
x // 2
staticVar()
x // 3
Tested with playground.
From the Apple documentation:
In C and Objective-C, you define static constants and variables
associated with a type as global static variables. In Swift, however,
type properties are written as part of the type’s definition, within
the type’s outer curly braces, and each type property is explicitly
scoped to the type it supports.
You define type properties with the static keyword. For computed type
properties for class types, you can use the class keyword instead to
allow subclasses to override the superclass’s implementation. The
example below shows the syntax for stored and computed type
properties:
struct SomeStructure {
    static var storedTypeProperty = "Some value."
    static var computedTypeProperty: Int {
        // return an Int value here
    }
}

enum SomeEnumeration {
    static var storedTypeProperty = "Some value."
    static var computedTypeProperty: Int {
        // return an Int value here
    }
}

class SomeClass {
    static var storedTypeProperty = "Some value."
    static var computedTypeProperty: Int {
        // return an Int value here
    }
    class var overrideableComputedTypeProperty: Int {
        // return an Int value here
    }
}
NOTE
The computed type property examples above are for read-only computed type properties, but you can also define read-write computed type properties with the same syntax as for computed instance properties.
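A quick usage sketch (Counter is a hypothetical type): type properties are read and written on the type itself, never on an instance, which gives you the same "one shared value across all calls" behavior as a C static variable.

```swift
struct Counter {
    // One shared value for the whole type, like a C static variable:
    static var total = 0

    static func bump() {
        total += 1
    }
}

Counter.bump()
Counter.bump()
print(Counter.total)  // 2
```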

cannot convert expression type void to type integer in Swift using XCTAssertEqual

I am very new to the Swift language and Xcode. I get an error message from this code:
Class Deck
class Deck {
    var decks : Integer = 0

    init() {
        decks = 1
    }

    init(amountOfDecks : Integer) {
        decks = amountOfDecks
    }

    func getAmountOfCards() -> Integer {
        return 0
    }
}
Test Class
import XCTest
import helloWorldv2

class helloWorldv2Tests: XCTestCase {
    override func setUp() {
        super.setUp()
    }

    override func tearDown() {
        super.tearDown()
    }

    func testDeckConstructor() {
        var deck = Deck(amountOfDecks: 1)
        var amount : Integer = deck.getAmountOfCards()
        let expected : Integer = 52
        // ERROR: Cannot convert the expression type 'void' to type 'integer'
        XCTAssertEqual(expected, amount)
    }
}
I set the two variables to type Integer, so I don't understand why I can't compare the two values...
The type you should be using is Int. (Integer is a protocol, not a type, and Swift has no implementation of == that accepts arguments conforming to the Integer protocol.)
Specifying the type the way you are doing is unnecessary, thanks to Swift's type inference: when you declare and assign a value to a variable, it automatically gives that variable the same type as the value being assigned (and literal integers in Swift are typed as Int by default). So let expected = 52 will automatically give your expected constant the type Int without you having to declare it as such.
Integer is a protocol; you should use Int instead, as that is an actual struct.
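Applying that, the Deck class compiles with Int. Here it is with getAmountOfCards assumed (my assumption, not from the question) to return 52 cards per deck, so the test's expectation of 52 holds:

```swift
class Deck {
    var decks: Int = 0

    init() {
        decks = 1
    }

    init(amountOfDecks: Int) {
        decks = amountOfDecks
    }

    func getAmountOfCards() -> Int {
        return decks * 52  // assumption: a standard 52-card deck
    }
}

let deck = Deck(amountOfDecks: 1)
let amount = deck.getAmountOfCards()  // 52, so XCTAssertEqual(expected, amount) passes
```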

Automatic Type Conversion with extension: What is happening here?

I'm going through the first chapter of The Swift Programming Language book and I'm at the part where it's describing the extension keyword.
I had a go at the "Experiment":
“Write an extension for the Double type that adds an absoluteValue property.”
I got it working like this:
extension Double {
    var absoluteValue: Double {
        if self < 0 {
            return self * -1
        }
        return self
    }
}

(-10.5).absoluteValue // 10.5
But it also seems to work for integers:
(-4).absoluteValue // 4.0
What is happening here? Is the compiler changing the type from Int to Double because it sees that there is a absoluteValue extension on Double but not Int?
This appears to be the case because if I add another extension of the same name on Int like so:
extension Int {
    var absoluteValue: Int {
        return 42
    }
}
That wins over the extension on Double for integer literals, and (-4).absoluteValue returns 42.
Is there a way to add an extension that only works on Doubles but not Ints?
Edit: It looks like the conversion happens at compile time, and since I didn't give my literal a type, the compiler converted it. The following produces an error:
var i: Int = -4
i.absoluteValue
"Playground execution failed: error: :12:1: error: 'Int' does not have a member named 'absoluteValue'
i.absoluteValue
^ ~~~~~~~~~~~~~"
Edit 2: It appears to apply only to literals; the following also produces an error:
var i = -4
i.absoluteValue
Yes, the extension you wrote is actually only for Doubles, not for Ints. Take a look at this example:
extension Double {
    var absoluteValue: Double {
        if self < 0 {
            return self * -1
        }
        return self
    }
}

var double: Int = 10
double.absoluteValue // error: Int does not have a member named absoluteValue
But, in your code the compiler is implicitly converting your Int to a Double.
In case anyone would like an answer that conforms to the example protocol:
import Foundation // for fabs and round

protocol ExampleProtocol {
    var simpleDescription: String { get }
    mutating func adjust()
}

extension Double: ExampleProtocol {
    var simpleDescription: String {
        return "The number \(self)"
    }

    var absoluteValue: Double {
        return fabs(self)
    }

    mutating func adjust() {
        self = round(self)
    }
}
var double: Double = -12.34
double.simpleDescription
double.absoluteValue
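For completeness, later Swift versions ship this behavior in the standard library, so no extension is needed (the names below are standard library members, not part of the book's experiment):

```swift
// abs(_:) works for any SignedNumeric type, Int and Double alike:
abs(-10.5)         // 10.5
abs(-4)            // 4

// magnitude exists on all Numeric types (note: Int.magnitude is UInt):
(-10.5).magnitude  // 10.5
(-4).magnitude     // 4
```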