I followed the Vapor website docs (VaporDocs) and created the hello project.
In the hello project directory I ran swift build in the terminal. But when I open Package.swift, Xcode starts to fetch Vapor again.
Why? Aren't all the dependency repositories already in the .build path? Why does Xcode begin fetching Vapor all over again?
swift package generate-xcodeproj can generate an Xcode project, but then the dependencies no longer appear as packages, just as groups.
What's the difference between just opening Package.swift and using swift package generate-xcodeproj and then opening the generated xcodeproj file?
Opening Package.swift with Xcode fetches Vapor very slowly. Is there a better way,
like telling Xcode that everything is already in the .build path?
When you work with a project in the terminal you use commands like:
swift package update
swift build
swift run
swift package generate-xcodeproj
These commands work with the hidden .build folder, and the Xcode project generated by swift package generate-xcodeproj works with that .build folder too. The only disadvantage of the xcodeproj is that if you change Package.swift or manually (outside of Xcode) add or delete files in the Sources folder, then you have to run either just swift package generate-xcodeproj or swift package update && swift package generate-xcodeproj.
When you open the project by double-clicking Package.swift it no longer uses a classic xcodeproj; it is fully dynamic, and you can edit Package.swift and the source files on the fly, whether in Xcode or in Finder, and it will track all the changes. Though it sounds cool, it doesn't work perfectly, and I still prefer the classic xcodeproj because it isn't really hard to run swift package generate-xcodeproj when needed. As far as I know there is no way to tell Xcode to use the .build folder in the dynamic mode; it uses the DerivedData folder to store dependencies.
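As a minimal sketch of that classic workflow, run from the package root (these are standard SwiftPM commands; "hello.xcodeproj" is assumed from the question's project name):

swift package update                  # resolve/refresh dependencies into the hidden .build folder
swift package generate-xcodeproj      # regenerate the xcodeproj after Package.swift or Sources changes made outside Xcode
open hello.xcodeproj                  # reopen the regenerated project in Xcode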
I am developing several projects in a single workspace (monorepo). The workspace contains a shared framework that contains code shared between the projects. Some of the code depends on external packages that I import using the Swift Package Manager. Everything is working except that the packages aren't recognized when I use the UI testing target. When I run the UI tests for one of the projects it complains that the packages cannot be found (in the framework). Another solution that suits my needs is also welcome. Anyway, I'm using Xcode 11.3. To reproduce:
1. Create a new workspace.
2. Add a new project A and a new framework B to the workspace.
3. Add any dependency (for example SDWebImage) to the framework.
4. Add a Swift file to the framework that just does import SDWebImage.
5. Now add framework B as a dependency of project A.
If you build project A or unit test project A, there is no problem. However, when you run UI tests on project A, it complains that it cannot find the module SDWebImage in the Swift file you added in step 4 above. Any idea how to solve this?
Edit: When I use CocoaPods instead, it gives me the same error. When I use use_frameworks! it doesn't give me the error, but it crashes with "SDWebImage: image not found".
You have to manually add your B framework as a linked library on your UI Tests target under Build Phases -> Link Binary With Libraries.
From Gradle's documentation:
The scripts generated by this task are intended to be committed to
your version control system. This task also generates a small
gradle-wrapper.jar bootstrap JAR file and properties file which should
also be committed to your VCS. The scripts delegates to this JAR.
From: What should NOT be under source control?
I think generated files should not be in the VCS.
When are gradlew and gradle/gradle-wrapper.jar needed?
Why not store a gradle version in the build.gradle file?
Because the whole point of the Gradle wrapper is to be able, without having ever installed Gradle, and without even knowing how it works, where to download it from, which version, to clone the project from the VCS, to execute the gradlew script it contains, and to build the project without any additional step.
If all you had was a Gradle version number in a build.gradle file, you would need a README explaining to everyone that Gradle version X must be downloaded from URL Y and installed, and you would have to do it every time the version is incremented.
Because the whole point of the Gradle wrapper is to be able, without having ever installed Gradle
The same argument goes for the JDK: do you want to commit that as well? Do you also commit all your dependency libraries?
The dependencies should be upgraded continuously as new versions are released to get security and other bug fixes, and because if you get too far behind it can be a very time-consuming task to get up to date again.
If the Gradle wrapper is incremented for every new release and committed, the repo will grow very large. The problem is obvious when working with a distributed VCS, where a clone downloads every version of everything.
and without even knowing how it works
Create a build script that downloads the wrapper and uses it to build (a sketch of such a script appears below). Nobody needs to know how the script works; they only need to agree that the project is built by executing it.
where to download it from, which version
task wrapper(type: Wrapper) {
    gradleVersion = 'X.X'
}
For Gradle version >= 5:

wrapper {
    gradleVersion = 'X.X'
}
and then
gradle wrapper
to download the correct version.
to clone the project from the VCS, to execute the gradlew script it contains, and to build the project without any additional step.
Solved by the steps above. Downloading the Gradle wrapper is no different from downloading any other dependency. The script could be smart enough to check the current Gradle wrapper and only download it if there is a new version.
If the developer has never used Gradle before, and maybe doesn't even know the project is built with Gradle, it is more obvious to run a build.sh than to run gradlew build.
If all you had was a Gradle version number in a build.gradle file, you would need a README explaining to everyone that Gradle version X must be downloaded from URL Y and installed,
No, you would not need a README. You could have one, but we are developers and we should automate as much as possible. Creating a script is better.
and you would have to do it every time the version is incremented.
If the developers agree that the correct process is to:
Clone repo
Run build script
then upgrading to the latest Gradle wrapper is no problem. If the version has been incremented since the last run, the script can download the new version.
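As a rough illustration of that process, here is a minimal build.sh sketch. The pinned version number is hypothetical, and it assumes a local gradle installation is available to generate the wrapper (generating it locally is one way to realize the "download the wrapper" idea):

#!/usr/bin/env bash
# build.sh - sketch only: pin a Gradle version, (re)generate the wrapper if
# it is missing or pinned to a different version, then delegate the build.
set -euo pipefail

GRADLE_VERSION="7.6"   # hypothetical pinned version

if [ ! -f gradlew ] || ! grep -q "gradle-${GRADLE_VERSION}-" gradle/wrapper/gradle-wrapper.properties 2>/dev/null; then
    gradle wrapper --gradle-version "${GRADLE_VERSION}"
fi

# Delegate the actual build to the wrapper.
./gradlew build "$@"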
I would like to recommend a simple approach.
In your project's README, document that an installation step is required, namely:
gradle wrapper --gradle-version 3.3
This works with Gradle 2.4 or higher. This creates a wrapper without requiring a dedicated task to be added to "build.gradle".
With this option, ignore (do not check in) these files/folders for version control:
./gradle
gradlew
gradlew.bat
The key benefit is that you don't have to check a downloaded file into source control. It costs one extra step on installation; I think it is worth it.
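For example, a sketch of that setup as a one-time script after cloning (the ignore entries match the list above; it assumes Gradle 2.4+ is installed locally):

# One-time setup: generate the wrapper locally and keep the generated
# files out of version control.
gradle wrapper --gradle-version 3.3

cat >> .gitignore <<'EOF'
gradle/
gradlew
gradlew.bat
EOF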
According to the Gradle docs, adding gradle-wrapper.jar to the VCS is expected, since making the Gradle Wrapper available to developers is part of the Gradle approach:
To make the Wrapper files available to other developers and execution environments you’ll need to check them into version control. All Wrapper files including the JAR file are very small in size. Adding the JAR file to version control is expected. Some organizations do not allow projects to submit binary files to version control. At the moment there are no alternative options to the approach.
What is the "project"?
Maybe there is a technical definition of this idiom that excludes build scripts. But if we accept that definition, then we must say your "project" is not all of the things that you need to version!
But if we say "your project" is everything you have done, then we can say you must include it, and only it, in the VCS.
This is very theoretical and maybe not practical for our development work. So let's change it to: "your project is every file (or folder) you need to edit directly".
"Directly" means "not indirectly", and "indirectly" means editing another file whose effect is then reflected in this file.
So we reach the same conclusion the OP stated (and that is said here):
I think Generated files should not be in the VCS.
Yes, because you haven't created them; they are not part of "your project" according to the second definition.
So what is the verdict on these files?
build.gradle: Yes. We need to edit it. Our work should be versioned.
Note: it doesn't matter where you edit it, whether in your text editor or in the Project Structure GUI environment; either way you are doing it directly.
gradle-wrapper.properties: Yes. We need to at least determine the Gradle version in this file.
gradle-wrapper.jar and gradlew[.bat]: I haven't created or edited them in any of my development work, up to this moment! So the answer is "No". If you have done so, the answer is "Yes" for you on that project (and for the specific file you edited).
The important note about the last case is that the user who clones your repo needs to execute this command in the repo's <root-directory> to auto-generate the wrapper files:
> gradle wrapper --gradle-version=$v --distribution-type=$distType
$v and $distType are determined from gradle-wrapper.properties:
distributionUrl=https\://services.gradle.org/distributions/gradle-{$v}-{$distType}.zip
See https://gradle.org/install/ for more information.
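A sketch (not part of the original answer) of deriving $v and $distType from a committed gradle-wrapper.properties with plain shell, then regenerating the wrapper files:

#!/usr/bin/env bash
# Sketch only: read distributionUrl from the committed properties file and
# split out the version and distribution type, e.g. gradle-4.10-bin.zip.
props=gradle/wrapper/gradle-wrapper.properties
url=$(grep '^distributionUrl=' "$props" | cut -d= -f2-)

file=${url##*/}        # gradle-4.10-bin.zip
file=${file%.zip}      # gradle-4.10-bin
distType=${file##*-}   # bin
v=${file#gradle-}      # 4.10-bin
v=${v%-*}              # 4.10

gradle wrapper --gradle-version="$v" --distribution-type="$distType"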
The gradle executable is bin/gradle[.bat] in the local distribution. The local distribution doesn't have to be the same as the one determined in the repo. After the wrapper files are created, gradlew[.bat] can download the determined Gradle distribution automatically (if it doesn't already exist locally). He/she should then probably regenerate the wrapper files using the new gradle executable (in the downloaded distribution), following the instructions above.
Note: the instructions above assume the user has at least one Gradle distribution locally (e.g. ~/.gradle/wrapper/dists/gradle-4.10-bin/bg6py687nqv2mbe6e1hdtk57h/gradle-4.10). That covers almost all real cases. But what happens if the user doesn't have any distribution yet?
He/she can download it manually using the URL in the .properties file. But if he/she doesn't place it in the path the wrapper expects, the wrapper will download it again! The expected path is completely predictable but is out of scope here (see here for the most complex part).
There are also some easier (but dirty) ways. For example, he/she can copy the wrapper files (except the .properties file) from any other local/remote repository into his/her repository and then run gradlew there. It will automatically download the suitable distribution.
Old question, fresh answer. If you don't upgrade Gradle often (most of us don't), it's better to commit it to the VCS. The main reason for me is to increase build speed on the CI server. Nowadays, most projects are built and installed by CI servers, on a different server instance every time.
If you don't commit it, the CI server will download the JAR for every build, which significantly increases build time. There are other ways to handle this problem, but I find this one the easiest to maintain.
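One of the "other ways" alluded to here (not from the original answer) is to point Gradle's user home at a directory the CI system persists between builds, so the downloaded distribution is reused instead of fetched every time; the cache path below is hypothetical:

# Reuse the Gradle user home (which holds downloaded distributions) across
# CI builds; /ci-cache/gradle-home stands in for whatever persistent cache
# directory the CI system provides.
export GRADLE_USER_HOME=/ci-cache/gradle-home
./gradlew build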
Is there a script or a tool to generate the config.pri file for a BB10 project? The Momentics IDE does it for you automatically whenever a project is refreshed or the directory is changed. Unfortunately, I am deploying and compiling via command-line tools (it is an automated script which runs tests).
Without the config.pri file, I can't 'make'. An easy solution would be to track the file on GitHub along with the source code for the project, but since the file has a timestamp in it, it causes many annoying merge conflicts.
Any ideas? I'd rather not write a script myself to walk all the directories and accumulate the header and source files. Since the Momentics IDE does it, there must be a script that it calls or uses. Momentics is based on Eclipse; is there any way to see what commands the IDE is calling?
config.pri is just an additional file included from the project's main .pro file. So if you're able to control the .pro file, you may not need config.pri at all, and the include could safely be removed from the .pro file. For example, in earlier BB10 SDK releases config.pri simply didn't exist and was added later.
The .pro file is the main file for generating the platform- and environment-dependent Makefiles that build the project, so I'm not sure there are tools for managing it automatically. It can be generated initially by invoking qmake -project, but I'm not sure there are console tools that let you manage it automatically after that.
Technically, you should only need to modify this file when your project structure or build system options change; everything else is done in a platform-independent way.
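If the .pro file itself needs to be (re)generated from the command line, a rough sketch looks like this; the .pro name is hypothetical, and it assumes the BB10 NDK's qmake and make are on the PATH:

# Generate a .pro file by scanning the directory tree for sources/headers,
# then produce a Makefile from it and build.
qmake -project -o myapp.pro
qmake myapp.pro
make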
I've got several file pairs:
mydata.json, mydata.raw
other.json, other.raw
etc...
A custom tool builds these files into data files to be consumed by my app, like so:
mydata.dat, other.dat, etc...
What is the correct way for me to set up Xcode to build these resource files? Either one could change, so I can't just set up a custom build rule for a .raw file, for example. I've thought about using an "external build" project using make, but I don't know how to register the results with Xcode to be copied as a resource.
You should be able to just add a Run Script build phase to your target in Xcode to run the external build tool and copy the resulting files into your app's bundle. Right-click on your target and choose Add > New Build Phase > New Run Script Build Phase.
You can use whatever scripting environment you prefer, just specify its path in the "Shell" field. For a bash script, you'd specify /bin/bash, for example.
You can then use the predefined Xcode environment variables to get the locations of the various folders in the built package. Have a look at the Xcode Build Setting Reference for details. For example, the Resources folder in the built app bundle is at ${TARGET_BUILD_DIR}/${UNLOCALIZED_RESOURCES_FOLDER_PATH} (UNLOCALIZED_RESOURCES_FOLDER_PATH by itself is relative to the build directory).
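As a concrete illustration, here is a sketch of what the Run Script phase body could look like; mytool and the Data/ source folder are hypothetical names standing in for your custom build tool and the location of the .json/.raw pairs:

#!/bin/bash
# Rebuild each .dat from its .json/.raw pair and copy it into the bundle's
# Resources folder. SRCROOT, TARGET_BUILD_DIR and
# UNLOCALIZED_RESOURCES_FOLDER_PATH are provided by Xcode to Run Script phases.
set -e

SRC_DIR="${SRCROOT}/Data"
DEST_DIR="${TARGET_BUILD_DIR}/${UNLOCALIZED_RESOURCES_FOLDER_PATH}"

for json in "${SRC_DIR}"/*.json; do
    base=$(basename "${json}" .json)
    mytool "${json}" "${SRC_DIR}/${base}.raw" -o "${DEST_DIR}/${base}.dat"
done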