Removing files with Build Phase? - iphone

Is it possible to remove a file using a build phase in Xcode 4, based on whether the build configuration is Release or Debug (dev)?
If so, does anyone have an example?
I have tried:
if [ "${CONFIGURATION}" = "Debug" ]; then
find "$TARGET_BUILD_DIR" -name '*-live.*' -print0 | xargs -0 rm
fi
This prints:
CopyStringsFile "build/Debug-iphonesimulator/Blue Sky.app/PortalText-live.strings" CDL/PortalText-live.strings
    cd "/Users/internet/Desktop/iPhone Template/iPhonePortalTemplate/CDL.Labs"
    setenv PATH "/Developer/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/usr/bin:/Developer/Applications/Xcode.app/Contents/Developer/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin"
    builtin-copyStrings --validate --inputencoding utf-8 --outputencoding binary --outdir "/Users/internet/Desktop/iPhone Template/iPhonePortalTemplate/CDL.Labs/build/Debug-iphonesimulator/Blue Sky.app" -- CDL/PortalText-live.strings
But it does not actually remove the file from the bundle; the CopyStringsFile step appears to run after my script and copies the file back in.

The only way I've ever had different files per configuration is to have a separate target, and only include certain files in certain targets.
EDIT WITH AN EXAMPLE
Ok, I've done exactly the same in another project. We had a DefaultProperties.plist file, which was included in the target.
We then had 3 copies of this, NOT included in the target, ProdProperties.plist, TestProperties.plist, UatProperties.plist.
We built for environments on the command line, using xcodebuild, as it was built using an automated build server (Bamboo).
Prior to executing xcodebuild, we would run this:
cp -vf "./Properties/Environments/${environment}Properties.plist" ./Properties/shared/DefaultProperties.plist
touch Properties/shared/DefaultProperties.plist
with ${environment} being passed into the script.
You could do something like this with a Run Script build phase in Xcode.
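For instance, here is a minimal sketch of such a Run Script phase; the configuration-to-environment mapping and the ${SRCROOT}-relative paths are assumptions, not from the original project (${CONFIGURATION} and ${SRCROOT} are set by Xcode):
if [ "${CONFIGURATION}" = "Debug" ]; then
    environment="Test"    # hypothetical mapping from build configuration to environment
else
    environment="Prod"
fi
cp -vf "${SRCROOT}/Properties/Environments/${environment}Properties.plist" \
    "${SRCROOT}/Properties/shared/DefaultProperties.plist"
touch "${SRCROOT}/Properties/shared/DefaultProperties.plist"
The phase would need to run before the Copy Bundle Resources phase so that the copied plist is what ends up in the bundle.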

Related

Stale file is located outside of the allowed root path when using Cuckoo

I wanted to mock some of my files, so I used the Cuckoo framework. I am using Swift Package Manager, so I followed every step shown in the framework's README.
I tried to use this script:
# Define output file. Change "${PROJECT_DIR}/${PROJECT_NAME}Tests" to your
# test's root source folder, if it's not the default name.
OUTPUT_FILE="${PROJECT_DIR}/${PROJECT_NAME}Tests/GeneratedMocks.swift"
echo "Generated Mocks File = ${OUTPUT_FILE}"
# Define input directory. Change "${PROJECT_DIR}/${PROJECT_NAME}" to your project's root source folder, if it's not the default name.
INPUT_DIR="${PROJECT_DIR}/${PROJECT_NAME}"
echo "Mocks Input Directory = ${INPUT_DIR}"
# Generate mock files, include as many input files as you'd like to create mocks for.
"${PROJECT_DIR}/run" --download generate --testable "${PROJECT_NAME}" \
--output "${OUTPUT_FILE}" \
"${INPUT_DIR}/Common/Repository/LatestNewsRepository/LatestNewsRepositoryImpl.swift" \
# ... and so forth, the last line should never end with a backslash
# After running once, locate `GeneratedMocks.swift` and drag it into your Xcode test target group.
I also downloaded the latest run script, and I had to check "For install builds only".
When the app is launched I get this error:
Stale file '.../LibraryTests/GeneratedMocks.swift' is located outside of the allowed root paths.
Things I tried -
Clean Xcode derived data
Clean build folder
Reset Xcode
Reset Packages Cache
and I am still not getting the output file. Is there anything else I should try?

Standalone shell script works fine, but when used in srcs of sh_binary it's not working

My project structure is as follows:
PROJECT_STRUCTURE
Now my_shbin.sh is as below:
#!/bin/bash
find ../../ \( -name "*.java" -o -name "*.xml" -o -name "*.html" -o -name "*.js" -o -name "*.css" \) | grep -vE "/node_modules/|/target/|/dist/" >> temp-scan-files.txt
# scan project files for offensive terms
IFS=$'\n'
for file in $(cat temp-scan-files.txt); do
grep -iF -f temp-scan-regex.txt $file >> its-scan-report.txt
done
This script works completely fine when invoked on its own and gives the required results. But when I add the sh_binary below to my BUILD file, I do not see anything in temp-scan-files.txt, and thus nothing in its-scan-report.txt:
sh_binary(
    name = "findFiles",
    srcs = ["src/test/resources/my_shbin.sh"],
    data = glob(["temp-scan-files.txt", "temp-scan-regex.txt", "its-scan-report.txt"]),
)
I ran the sh_binary from IntelliJ using the play icon, and also tried running it from a terminal using bazel run :findFiles. No error is shown, but I see no data in temp-scan-files.txt.
Any help on this issue would be appreciated. Bazel's documentation is very sparse, with almost no information beyond the basic use case.
When a binary is run using bazel run, it's run from the "runfiles tree" for that binary. The runfiles tree is a directory tree that bazel creates that contains symlinks to the binary's inputs. Try putting pwd and tree at the beginning of the shell script to see what this looks like. The reason that the runfiles tree doesn't contain any of the files in src/main is that they're not declared as inputs to the sh_binary (e.g. using the data attribute). See https://docs.bazel.build/versions/master/user-manual.html#run
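For example, temporarily adding something like this at the top of my_shbin.sh makes the layout visible (find stands in for tree, which may not be installed everywhere):
#!/bin/bash
# Temporary debugging: show the working directory and the runfiles tree
# that `bazel run` gives the script.
pwd
find . -maxdepth 3 -print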
Another thing to note is that the glob in data = glob(["temp-scan-files.txt", "temp-scan-regex.txt", "its-scan-report.txt"]) won't match anything, because those files live under src/test/resources relative to the BUILD file. The script also tries to modify these files, and it's not typically possible to modify input files: if this sh_binary were run as a build action, the inputs would be effectively read-only. It only has a chance of working because bazel run is similar to running the final binary by itself outside bazel, e.g. bazel build //target && bazel-bin/target.
The most straight-forward way to do this might be something like this:
genrule(
    name = "gen_report",
    srcs = [
        # This must be the first element of srcs so that
        # the regex file gets passed to the "-f" of grep in cmd below.
        "src/test/resources/temp-scan-regex.txt",
    ] + glob(
        [
            "src/main/**/*.java",
            "src/main/**/*.xml",
            "src/main/**/*.html",
            "src/main/**/*.js",
            "src/main/**/*.css",
        ],
        exclude = [
            "**/node_modules/**",
            "**/target/**",
            "**/dist/**",
        ],
    ),
    outs = ["its-scan-report.txt"],
    # The first element of $(SRCS) will be the regex file, passed to -f.
    cmd = "grep -iF -f $(SRCS) > $@",
)
$(SRCS) expands to the files in srcs delimited by spaces, and $@ means "the output file, if there's only one". $(SRCS) will contain the temp-scan-regex.txt file, which you probably don't want to include as part of the scan, but if it's the first element, then it becomes the argument to -f. This is maybe a bit hacky and a little fragile, but it's also kind of annoying to try to separate the file out (e.g. using grep or sed or array slicing).
Then bazel build //project/root/myPackage:its-scan-report.txt

Can we wget with file list and renaming destination files?

I have this wget command:
sudo wget --user-agent='some-agent' --referer=http://some-referrer.html -N -r -nH --cut-dirs=x --timeout=xxx --directory-prefix=/directory/for/downloaded/files -i list-of-files-to-download.txt
-N will check if there is actually a newer file to download.
-r will turn the recursive retrieving on.
-nH will disable the generation of host-prefixed directories.
--cut-dirs=X will avoid the generation of the host's subdirectories.
--timeout=xxx will, well, timeout :)
--directory-prefix will store files in the desired directory.
This works nice, no problem.
Now, to the issue:
Let's say my files-to-download.txt has these kind of files:
http://website/directory1/picture-same-name.jpg
http://website/directory2/picture-same-name.jpg
http://website/directory3/picture-same-name.jpg
etc...
You can see the problem: on the second download, wget will see we already have a picture-same-name.jpg, so it won't download the second or any of the following ones with the same name. I cannot mirror the directory structure because I need all the downloaded files to be in the same directory. I can't use the -O option because it clashes with -N, and I need that. I've tried -nd, but it doesn't seem to work for me.
So, ideally, I need to be able to:
a.- wget from a list of URLs the way I do now, keeping my parameters.
b.- get all files in the same directory and be able to rename each file.
Does anybody have any solution to this?
Thanks in advance.
I would suggest 2 approaches -
Use the "-nc" or the "--no-clobber" option. From the man page -
-nc
--no-clobber
    If a file is downloaded more than once in the same directory, Wget's behavior depends on a few options, including -nc. In certain cases, the local file will be clobbered, or overwritten, upon repeated download. In other cases it will be preserved.
    When running Wget without -N, -nc, -r, or -p, downloading the same file in the same directory will result in the original copy of file being preserved and the second copy being named file.1. If that file is downloaded yet again, the third copy will be named file.2, and so on. (This is also the behavior with -nd, even if -r or -p are in effect.) When -nc is specified, this behavior is suppressed, and Wget will refuse to download newer copies of file. Therefore, "no-clobber" is actually a misnomer in this mode---it's not clobbering that's prevented (as the numeric suffixes were already preventing clobbering), but rather the multiple version saving that's prevented.
    When running Wget with -r or -p, but without -N, -nd, or -nc, re-downloading a file will result in the new copy simply overwriting the old. Adding -nc will prevent this behavior, instead causing the original version to be preserved and any newer copies on the server to be ignored.
    When running Wget with -N, with or without -r or -p, the decision as to whether or not to download a newer copy of a file depends on the local and remote timestamp and size of the file. -nc may not be specified at the same time as -N.
    A combination with -O/--output-document is only accepted if the given output file does not exist.
    Note that when -nc is specified, files with the suffixes .html or .htm will be loaded from the local disk and parsed as if they had been retrieved from the Web.
As you can see from this man page entry, the behavior might be unpredictable/unexpected. You will need to see if it works for you.
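For reference, approach 1 applied to the command from the question would just swap -N for -nc, since the two can't be combined (whether the resulting behavior suits your case is what you'd need to test; the placeholder values are kept from the question):
sudo wget --user-agent='some-agent' --referer=http://some-referrer.html \
    -nc -r -nH --cut-dirs=x --timeout=xxx \
    --directory-prefix=/directory/for/downloaded/files \
    -i list-of-files-to-download.txt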
Another approach would be to use a bash script. I am most comfortable using bash on *nix, so forgive the platform dependency. However, the logic is sound, and with a bit of modification you can get it to work on other platforms/shells as well.
Sample pseudocode bash script:
for i in `cat list-of-files-to-download.txt`; do
    wget <all your flags except the -i flag> "$i" -O /path/to/custom/directory/filename
done
You can modify the script to download each file to a temporary file, parse $i to get the filename from the URL, check if the file exists on the disk, and then take a decision to rename the temp file to the name that you want.
This offers much more control over your downloads.
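As a concrete sketch of that temp-file-and-rename logic (the name-flattening scheme and the fixed flag values are assumptions; -N has to be dropped since it can't be combined with -O):
#!/bin/bash
# Sketch: download each URL to a temp file, then move it into place under a
# unique name derived from the URL path (scheme stripped, slashes -> underscores).
list="list-of-files-to-download.txt"
dest="/directory/for/downloaded/files"

while IFS= read -r url; do
    # e.g. http://website/directory2/picture-same-name.jpg
    #   -> website_directory2_picture-same-name.jpg
    name=$(printf '%s' "$url" | sed -e 's|^[a-z]*://||' -e 's|/|_|g')
    tmp=$(mktemp)
    if wget --user-agent='some-agent' --referer=http://some-referrer.html \
            --timeout=30 -O "$tmp" "$url"; then
        mv -f "$tmp" "$dest/$name"
    else
        rm -f "$tmp"
    fi
done < "$list"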

GNU make: Can I delete obsolete files?

In my workflow, I have lots of xxx.smr files in a folder, and I need to convert them into another file format, xxx_step3.mat, importing some data from xxx_info.xlsx. I learned that GNU make is powerful for keeping all the files up to date.
In a very simple "explicit" form (without sophisticated wildcard usage), the Makefile for this process would look like the one below. To handle multiple xxx.smr files and their descendants, I should be able to extend this file.
.PHONY: all clean

all: xxx_step3.mat

xxx_step3.mat: xxx_step2.mat xxx_info.xlsx
	matlab -r "merge2files('xxx_step2.mat', 'xxx_info.xlsx')"

xxx_step2.mat: xxx_step1.mat
	matlab -r "convertmat('xxx_step1.mat')"

xxx_info.xlsx: master.xlsx
	matlab -r "extractfromMasterxlsx('master.xlsx', 'xxx_info.xlsx')"

xxx_step1.mat: xxx_step0.smr
	@echo "\nCreate " $@
	# I can't do this step from the command line, so I just leave a message

clean:
	rm -f xxx_step1.mat xxx_step2.mat xxx_step3.mat xxx_info.xlsx
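(As a sketch of how the explicit rules might later be generalized for many files, not part of the question itself, the step-2 conversion could become a pattern rule using make's $< automatic variable for the prerequisite:
%_step2.mat: %_step1.mat
	matlab -r "convertmat('$<')"
The other rules would generalize the same way.)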
However, I realized that when some of the xxx.smr files turn out to be surplus and are deleted at some point, running GNU make with this Makefile does not delete the obsolete descendant files (all the intermediate files and the final xxx_step3.mat) that depend on the deleted xxx.smr files.
For example, I start with the three xxx.smr files and run Make.
A.smr, B.smr, C.smr
It will create all the descendants, including the final target files:
A_step3.mat, B_step3.mat, C_step3.mat
Later, say, I find that B.smr contained a fatal error and decide to delete it from the folder:
A.smr, C.smr
Running Make at this stage will result in no change, because both A_step3.mat and C_step3.mat are newer than their direct prerequisites (and than A.smr and C.smr). However, I actually need to remove all the descendants of B.smr, such as B_step1.mat, B_step2.mat, B_step3.mat, and B_info.xlsx. If those obsolete files are kept, the final target B_step3.mat will be included in subsequent analyses and affect the results.
I wonder if there is a "smart" way of removing xxx_step1.mat, xxx_step2.mat, xxx_step3.mat, xxx_info.xlsx files, when their corresponding xxx.smr files have been deleted.
Or should I just implement this with MATLAB or Python etc?
Since a Makefile is a collection of shell commands, in your clean: target you can collect and remove all the files that correspond to your xxx.smr files using a for loop and parameter expansion/substring matching. Find all xxx.smr files; then, for each one, extract xxx and remove all xxx_step?.* and xxx_info.* files; after the step? and info files are removed, remove xxx.smr itself. In multi-line form it would look like:
for i in *.smr; do
    j="${i%.*}"
    rm -f "${j}"_step?.* "${j}"_info.*
    rm -f "$i"
done
Or, in a single line:
for i in *.smr; do for j in ${i%.*}; do rm -f "${j}_step?.*" "${j}_info.*"; done; rm -f "$i"; done
Note this will remove all xxx_step... and xxx_info... files for every xxx.smr file. Make sure this is what you intend, and run it on a test directory first. You can tighten the patterns above, e.g. to remove just xxx_info.xlsx, replace xxx_info.* with xxx_info.xlsx, and so on.
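If you want that loop in the Makefile itself, the recipe might look like this sketch (recipe lines must be tab-indented, and $ is doubled to $$ so make passes it through to the shell; like the loop above, it also deletes the .smr files themselves):
clean:
	for i in *.smr; do \
	    j="$${i%.*}"; \
	    rm -f "$$j"_step?.* "$$j"_info.*; \
	    rm -f "$$i"; \
	done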

downloading lastfinished build from teamcity

I'm using Perl's File::Fetch to download a file from the lastFinished build in TeamCity. This works fine, except that the file is versioned and I'm not getting the version number.
sub GetTeamcityFiles {
    my $latest_version = "C:/downloads";
    my $uri = "http://<teamcity>/guestAuth/repository/download/bt11/.lastFinished/MyApp.{build.number}.zip";

    # fetch the uri into the download directory
    my $ff = File::Fetch->new(uri => "$uri");
    my $where = $ff->fetch( to => "$latest_version" );
}
This gives me the file:
C:\downloads\MyApp.{build.number}.zip
However, the real artifact name has the build number in it, e.g.:
c:\downloads\MyApp.12345.zip
Unfortunately there is no version file within the zip, so the name is the only way I have of telling which build I've downloaded. Is there any way to get this build number?
With build configs modification
If you have the ability to modify the build configs in TeamCity, you can easily embed the build number into the zip file.
Create a new build step - choose command line
For the script, do something like: echo %build.number% > version.txt
That will put version.txt at the root directory of your build folder in TeamCity, which you can include in your zip later when you create it.
You can later read that file in.
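For example, once version.txt is inside the zip, something like this reads it back after download (a sketch; unzip -p prints a zip member to stdout, and the literal file name and path are taken from the question, assuming a shell with Unix tools on Windows as mentioned below):
# Read the embedded build number out of the downloaded zip.
version=$(unzip -p 'C:/downloads/MyApp.{build.number}.zip' version.txt)
echo "Build number: $version"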
I'm not able to access my servers right now so I don't have the exact name of the parameter, but typing %build will pull up a list of TeamCity parameters to choose from, and I think it is %build.number% that you're after.
Without build configs modification
If you're not able to modify the configs, you're going to need something like egrep:
$ echo MyApp.12.3.4.zip | egrep -o '([0-9]+\.){2}[0-9]+'
> 12.3.4
$ echo MyApp.1234.zip | egrep -o '[0-9]+'
> 1234
It looks like you're running on Windows; in those cases I use UnxUtils & UnxUpdates to get access to utilities like this. They're very lightweight and do not install to the registry, just add them to your system PATH.