I want to achieve in Swift (macOS) the equivalent of the following dead-simple Perl script:
use strict;
use warnings;
for my $arg (@ARGV) {              # loop over arguments
    next if $arg !~ /^\d+$/;       # skip if not all digits
    print("arg:", $arg + 2, "\n"); # print the number + 2...
}
so invoking this script as perl script.pl 10 20 30 prints:
arg:12
arg:22
arg:32
My "experiment" in swift script called arg.swift:
import Darwin // otherwise "exit" won't work...

if CommandLine.argc < 2 {
    print("Error", CommandLine.arguments[0], " No arguments are passed.")
    exit(1)
}

for i in 1 ..< CommandLine.argc { // index "0" holds the program name (as above), so skip it
    print("arg:", CommandLine.arguments[i] + 2) // add 2 and print the result
}
Running it as swift arg.swift 10 20 30 prints the following errors:
arg.swift:9:44: error: cannot subscript a value of type '[String]' with an index of type 'Int32'
print("arg:", CommandLine.arguments[i] + 2) // add 2 and print the result
^
arg.swift:9:44: note: overloads for 'subscript' exist with these partially matching parameter lists: (Int), (Range<Int>), (Range<Self.Index>), ((UnboundedRange_) -> ())
print("arg:", CommandLine.arguments[i] + 2) // add 2 and print the result
Honestly, I absolutely don't understand what is wrong: why does CommandLine.arguments[0] work, while it complains about CommandLine.arguments[i]? And what are these "overloads" about?
In case someone needs it, I'm using:
$ swift --version
Apple Swift version 4.2.1 (swiftlang-1000.11.42 clang-1000.11.45.1)
Target: x86_64-apple-darwin17.7.0
All arguments are passed as Strings, so if you want to use them as Ints, you need to add the conversion.
Also, argc is an Int32 (as the error message shows) and needs to be converted to Int as well so that you can use it as a subscript index.
For example, like so:
for i in 1 ..< Int(CommandLine.argc) {
    if let intValue = Int(CommandLine.arguments[i]) {
        print("arg:", intValue + 2)
    }
}
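Putting the pieces together, the whole script can also be written without touching argc at all, by iterating over the arguments array directly. A minimal sketch, assuming the same invocation swift arg.swift 10 20 30 (note that Int(arg) also accepts signed values such as "-5", which the Perl /^\d+$/ would skip):

// arg.swift: dropFirst() skips arguments[0] (the program name), and the
// `if let` silently skips anything that doesn't parse as an integer,
// much like the Perl `next`.
for arg in CommandLine.arguments.dropFirst() {
    if let intValue = Int(arg) {
        print("arg:\(intValue + 2)") // interpolation matches Perl's "arg:12" exactly
    }
}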
I need to write a GPA calculator that uses the provided dictionary to output the GPA based on 4 letter-grade arguments. I can get the code to run in Google Colab and other IDEs, but I get no output on the command line. Can someone point me to what I am missing?
import sys

#print (sys.argv[1])
#print (sys.argv[2])
#print (sys.argv[3])
#print (sys.argv[4])

def gpa_calculator():
    grades = [sys.argv[1], sys.argv[2], sys.argv[3], sys.argv[4]]
    grades_upper = [each_string.upper() for each_string in grades]
    points = 0
    grade_chart = {'A':4.0, 'A-':3.66, 'B+':3.33, 'B':3.0, 'B-':2.66,
                   'C+':2.33, 'C':2.0, 'C-':1.66, 'D+':1.33, 'D':1.00, 'D-':.66, 'F':0.00}
    for grade in grades_upper:
        points += grade_chart[grade]
    gpa = points / len(grades)
    rounded_gpa = round(gpa, 2)
    return rounded_gpa
    print(rounded_gpa)

gpa_calculator()
return rounded_gpa
print (rounded_gpa)
You seem to be returning the value from the function before you reach the print statement. I'm guessing the value is returned correctly, but you don't do anything with the return value when calling the function, so nothing is output to the screen.
You should move the print(...) call above the return statement, or print the result when calling the function:
print(gpa_calculator())
That's because you return before you print.
In a Jupyter notebook environment such as Google Colab, each cell prints whatever its last expression returns (if anything). That's why you get output in such environments.
Corrected code:
import sys

#print (sys.argv[1])
#print (sys.argv[2])
#print (sys.argv[3])
#print (sys.argv[4])

def gpa_calculator():
    grades = [sys.argv[1], sys.argv[2], sys.argv[3], sys.argv[4]]
    grades_upper = [each_string.upper() for each_string in grades]
    points = 0
    grade_chart = {'A':4.0, 'A-':3.66, 'B+':3.33, 'B':3.0, 'B-':2.66,
                   'C+':2.33, 'C':2.0, 'C-':1.66, 'D+':1.33, 'D':1.00, 'D-':.66, 'F':0.00}
    for grade in grades_upper:
        points += grade_chart[grade]
    gpa = points / len(grades)
    rounded_gpa = round(gpa, 2)
    print(rounded_gpa)
    return rounded_gpa

gpa_calculator()
output:
C:\Users\XXXXX\Desktop>python3 a.py A B C D
2.5
C:\Users\XXXXX\Desktop>python3 a.py A A A A
4.0
When doing so I get the error:
Cannot assign value of type '(String, String)' to type 'String?'
As you can see below, I have already converted the numbers to String values, which didn't help.
Ultimately I want to have a UI label which displays a value plus a unit string, depending on a switch statement. The value is going to switch units, and the UI label should say e.g. "256 Days" or "3.5 Hours" (and, for the last example, not "0.14 Days").
Assigning the units when printing to the Xcode console is easy. If the other method doesn't work, what I would need is to copy the console output and paste it into the UI label.
Here is the code:
switch Differenz {
case 1...9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999:
    print(Differenzstring, "Years")
case 0.0027...0.99:
    print(Differenz * 365, "Days")
case 0.000114...0.00269:
    print(Differenz * 365 * 24, "Hours")
case 0.0000019...0.0001139:
    print(Differenz * 365 * 24 * 60, "Minutes")
case 0.000000000001...0.00000189:
    print(Differenz * 365 * 24 * 60 * 60, "Seconds")
case 0...0:
    print("Time (almost) stops")
default:
    print("Calculation failed")
}
// Units are not integrated yet, so the result will be in years. Ideal would be
// the console output produced by the switch statement also ending up in the
// "Output" label.
Output.text = String(Differenz)
}
Could you try this out?
var usedTimeLabel = "seconds" // reassign this value in the switch
Output.text = "\(Differenz) \(usedTimeLabel)"
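Building on that, here is a minimal sketch (assuming Differenz is a Double measured in years and Output is the label from the question) that picks both the scaled value and the unit inside the switch, so the label text is built in exactly one place:

// Hypothetical sketch: the thresholds are taken from the question's cases.
let value: Double
let unit: String
switch Differenz {
case 0.0027...0.99:
    value = Differenz * 365
    unit = "Days"
case 0.000114...0.00269:
    value = Differenz * 365 * 24
    unit = "Hours"
case 0.0000019...0.0001139:
    value = Differenz * 365 * 24 * 60
    unit = "Minutes"
default:
    value = Differenz
    unit = "Years"
}
Output.text = "\(value) \(unit)"

Since every case assigns both constants exactly once, the compiler's definite-initialization check guarantees the label is never built from a stale value.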
I am trying to build a binary-to-decimal calculator for the Apple Watch using Swift 4.
The code I am having trouble with is this:
var i = 0
var labelInputInt = 0
let labelOutputString = "10010" // Random number in binary
let reverse = String(labelOutputString.reversed()) // Reversing the original string

while i <= reverse.count {
    let indexOfString = reverse.index(reverse.startIndex, offsetBy: i)
    if reverse[indexOfString] == "1" {
        labelInputInt += 2^i * 1
    }
    i += 1
}
I am using a while loop to get the index indexOfString and to check whether the character of the string reverse at that specific index equals "1".
The problem is that I get a runtime error when the if statement is executed.
The error looks like this:
2  libpthread.so.0  0x00007fc22f163390
3  libswiftCore.so  0x00007fc22afa88a0 _T0s18_fatalErrorMessages5NeverOs12StaticStringV_A2E4fileSu4lines6UInt32V5flagstFTfq4nnddn_n + 96
4  libswiftCore.so  0x00007fc22afb3323
5  libswiftCore.so  0x00007fc22afdf9a2
6  libswiftCore.so  0x00007fc22aedca19 _T0SS9subscripts9CharacterVSS5IndexVcfg + 9
7  libswiftCore.so  0x00007fc22f591294 _T0SS9subscripts9CharacterVSS5IndexVcfg + 74139780
8  swift            0x0000000000f2925f
9  swift            0x0000000000f2d402
10 swift            0x00000000004bf516
11 swift            0x00000000004ae461
12 swift            0x00000000004aa411
13 swift            0x0000000000465424
14 libc.so.6        0x00007fc22d88d830 __libc_start_main + 240
15 swift            0x0000000000462ce9
Stack dump:
0. Program arguments: /home/drkameleon/swift4/usr/bin/swift -frontend -interpret tmp/XfwP0oM7FJ.swift -disable-objc-interop -suppress-warnings -module-name XfwP0oM7FJ
Illegal instruction (core dumped)
So, how can I get a specific character of a String and compare it with another character without getting this crash?
Your approach to getting a specific character from a string is actually correct; there are two other problems in your code:
The index i should run up to, and excluding, reverse.count.
This is conveniently done with the "half-open range" operator (..<).
^ is the bitwise-XOR operator, not exponentiation (for example, 2 ^ 3 evaluates to 1, not 8). Exponentiation is done with the pow() function, in your case
labelInputInt += Int(pow(2.0, Double(i)))
or with the "shift-left" operator << if the base is 2.
So this would be a working variant (note that the manual i += 1 is gone; a for-in loop advances its own counter):

for i in 0 ..< reverse.count {
    let indexOfString = reverse.index(reverse.startIndex, offsetBy: i)
    if reverse[indexOfString] == "1" {
        labelInputInt += 1 << i
    }
}
But you can simply enumerate the characters of the string in reverse order instead of subscripting (which is also more efficient):
let binaryString = "10010"
var result = 0
for (i, char) in binaryString.reversed().enumerated() {
    if char == "1" {
        result += 1 << i
    }
}
print(result)
Even simpler with forward iteration, no reversed() or << needed:
let binaryString = "10010"
var result = 0
for char in binaryString {
    result = 2 * result
    if char == "1" {
        result += 1
    }
}
print(result)
Which suggests using reduce():
let binaryString = "10010"
let result = binaryString.reduce(0) { 2 * $0 + ($1 == "1" ? 1 : 0) }
print(result)
But why reinvent the wheel? Just use init?(_:radix:) from the Swift standard library (with error-checking for free):
let binaryString = "10010"
if let result = Int(binaryString, radix: 2) {
    print(result)
} else {
    print("invalid input")
}
I'm attempting to submit the HackerRank Day 6 Challenge for 30 Days of Code.
I'm able to complete the task without issue in an Xcode playground, but HackerRank's site says there is no output from my method. I encountered an issue yesterday due to browser flakiness, but cleaning caches, switching from Safari to Chrome, etc. don't seem to resolve the issue I'm encountering here. I think my problem lies in inputString.
Task
Given a string, S, of length N that is indexed from 0 to N-1, print its even-indexed and odd-indexed characters as 2 space-separated strings on a single line (see the Sample below for more detail).
Input Format
The first line contains an integer, T (the number of test cases).
Each of the T subsequent lines contains a string, S.
Constraints
1 <= T <= 10
2 <= length of S < 10,000
Output Format
For each string S_j (where 0 <= j <= T-1), print S_j's even-indexed characters, followed by a space, followed by S_j's odd-indexed characters. For example, the string Hacker yields Hce akr (even-indexed H, c, e; odd-indexed a, k, r).
This is the code I'm submitting:
import Foundation

let inputString = readLine()!

func tweakString(string: String) {
    // split string into an array of lines based on char set
    var lineArray = string.components(separatedBy: .newlines)
    // extract the number of test cases
    let testCases = Int(lineArray[0])
    // remove the first line containing the number of unit tests
    lineArray.remove(at: 0)
    // satisfy constraints specified in the task
    guard lineArray.count >= 1 && lineArray.count <= 10 && testCases == lineArray.count else { return }
    for line in lineArray {
        switch line.characters.count {
        // to match constraint specified in the task
        case 2...10000:
            let characterArray = Array(line.characters)
            let evenCharacters = characterArray.enumerated().filter({ $0.0 % 2 == 0 }).map({ $0.1 })
            let oddCharacters = characterArray.enumerated().filter({ $0.0 % 2 == 1 }).map({ $0.1 })
            print(String(evenCharacters) + " " + String(oddCharacters))
        default:
            break
        }
    }
}

tweakString(string: inputString)
I think my issue lies in inputString: I'm taking it "as-is" and formatting it within my method. I've found solutions for Day 6, but I can't seem to find any current ones in Swift.
Thank you for reading. I welcome thoughts on how to get this thing to pass.
readLine() reads a single line from standard input, which means that your inputString contains only the first line of the input data. You have to call readLine() in a loop to get the remaining input data.
So your program could look like this:
func tweakString(string: String) -> String {
    // For a single input string, compute the output string according to the challenge rules ...
    return result
}

let N = Int(readLine()!)! // Number of test cases

// For each test case:
for _ in 1...N {
    let input = readLine()!
    let output = tweakString(string: input)
    print(output)
}
(The forced unwraps are acceptable here because the format of the input data is documented in the challenge description.)
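In case it helps, here is one possible body for tweakString (a sketch; enumerated() pairs each Character with its integer offset, which is all that is needed to split even and odd positions):

func tweakString(string: String) -> String {
    var even = ""
    var odd = ""
    for (i, char) in string.enumerated() {
        if i % 2 == 0 {
            even.append(char)
        } else {
            odd.append(char)
        }
    }
    return even + " " + odd
}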
Hi Adrian, you should call readLine()! for every row. Here is an example answer for that challenge:
import Foundation

func letsReview(str: String) {
    var evenCharacters = ""
    var oddCharacters = ""
    var index = 0
    for char in str.characters {
        if index % 2 == 0 {
            evenCharacters += String(char)
        } else {
            oddCharacters += String(char)
        }
        index += 1
    }
    print(evenCharacters + " " + oddCharacters)
}

let rowCount = Int(readLine()!)!

for _ in 0..<rowCount {
    letsReview(str: readLine()!) // readLine()! already returns a String; no extra wrapping needed
}
I have the following predefined codes that represent an index in a binary bitmap:
0 = standard
1 = special
2 = regular
3 = late
4 = early
5 = on time
6 = generic
7 = rfu
An example input value would be 213, which is 11010101 in binary. Bits 0, 2, 4, 6, and 7 are set, indicating that this record is:
standard + regular + early + generic + rfu.
I am trying to figure out how, in Perl, to take that binary data and build a string like the one above, with code + code + code, etc.
Any help would be greatly appreciated. Thanks.
Edit: my thoughts on how I might approach this are:
Convert the decimal to binary
Find the length of the binary string
Using substr, get the value (0 or 1) index by index
If the index value is 1, add the relevant code to the string
Is there a better way to go about this?
You can test the bits of the input from 0 to 7, and keep only those that are set:

my $in = 213;
my @r = ("standard", "special", "regular", "late", "early", "on time", "generic", "rfu");

print join " + ", @r[ grep { $in & (1 << $_) } 0 .. $#r ];

# or
# print join " + ", map { $in & (1 << $_) ? $r[$_] : () } 0 .. $#r;
output
standard + regular + early + generic + rfu
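Since most of this page is about Swift, the same bit-testing idea translates almost one-to-one. A sketch for comparison (the codes array and the input value mirror the Perl example above):

// Keep the names whose bit is set in the input bitmap.
let codes = ["standard", "special", "regular", "late",
             "early", "on time", "generic", "rfu"]
let input = 213
let parts = codes.indices
    .filter { input & (1 << $0) != 0 }
    .map { codes[$0] }
print(parts.joined(separator: " + "))
// prints: standard + regular + early + generic + rfu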