Subversion (SVN) and static libraries (.a files) compatibility? - iphone

I can't add .a files (static libraries) into my repository. Why?
Is there a way to "force" SVN to accept them (at least as static files...)?

The svn:ignore property contains a list of file patterns which certain Subversion operations will ignore.
Your Subversion configuration file also has a global-ignores option: a list of whitespace-delimited globs which describe the names of files and directories to skip.
The svn status, svn add, and svn import commands also ignore files that match the list.
To override for a certain instance, use the --no-ignore command-line flag:
$ svn help add
usage: add PATH...

Valid options:
  --targets ARG            : pass contents of file ARG as additional args
  -N [--non-recursive]     : obsolete; try --depth=files or --depth=immediates
  --depth ARG              : limit operation by depth ARG ('empty', 'files',
                             'immediates', or 'infinity')
  -q [--quiet]             : print nothing, or only summary information
  --force                  : force operation to run
  --no-ignore              : disregard default and svn:ignore property ignores
  --auto-props             : enable automatic properties
  --no-auto-props          : disable automatic properties

Make sure your issue is really caused by the SVN ignore configuration: with 'svn status' your '*.a' file will be missing, while 'svn status --no-ignore' will display it with an 'I' (ignored) in front.
Open the Subversion configuration file in your home directory:
~/.subversion/config
Search for the 'global-ignores' section:
global-ignores = *.o *.lo *.la *.al .libs *.so *.so.[0-9]* *.a *.pyc *.pyo *.rej *~ #*# .#* .*.swp .DS_Store
Remove *.a from the list of ignored files.
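Alternatively, if you prefer not to touch the global configuration, you can override the ignore rules for a single add. A minimal sketch, assuming the library is called libFoo.a (a hypothetical name):

svn status --no-ignore          # ignored items are listed with an 'I' in front
svn add --no-ignore libFoo.a    # add the file despite global-ignores / svn:ignore
svn commit -m "Add static library libFoo.a"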


Ignoring a local file is deleting the depot file

I am using SmartGit with GitHub.
I have a config.json file on my remote GitHub repository, with hidden passwords, at the root of the app.
I need to keep a different config.json file in my local repository, with the real passwords.
When I try to ignore config.json locally, it is sometimes still reported as 'modified'.
Other times, when it finally gets ignored (by right-clicking / Ignore), it says 1 'staged', and config.json ends up deleted from GitHub when I push the commit. I don't understand why.
This is my .gitignore file:
.DS_Store
/config.json
config.json
node_modules
/uploads
/node_module
/dist
# local env files
.env.local
.env.*.local
# Log files
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Editor directories and files
.idea
.vscode
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
My config.json file, with blanks that I need to leave as they are on GitHub, because Heroku needs it:
{
  "localhost_db": "mongodb://localhost:27017/",
  "mongoDb_atlas_db": "mongodb+srv://jose:x@cluster0-6kmcn.azure.mongodb.net/vue-starter-webpack?retryWrites=true&w=majority",
  "dev": false,
  "db_name": "vue-starter-webpack",
  "ftp_config": {
    "host": "ftpupload.net",
    "user": "epiz_26763901",
    "password": "x",
    "secure": false
  },
  "node_file_path": "./tmp/files/",
  "cloudinary_token": {
    "cloud_name": "ddq5asuy2",
    "api_key": "354237299578646",
    "api_secret": "x"
  },
  "logs_path": "tmp/logs/logs.txt"
}
Is there any workaround? I have tried plenty of things already. What does "staged" mean? How can I keep a different version of the file on GitHub and locally?
EDIT: I am trying out this command, and it seems to work:
git update-index --assume-unchanged config/database.yml
Ignore modified (but not committed) files in git?
I have a config.json file on my remote GitHub repository, with hidden passwords, at the root of the app.
That... is not a good practice. If that file (config.json) contains any sensitive information, it should not be added/committed, but explicitly ignored.
What you can commit is config.json.tpl, a template file (which is essentially what your config.json is right now).
From there, you could generate the right config.json file locally, and automatically on git clone/git checkout.
The generation script will:
search the right passwords from an external secure referential (like a vault)
replace the placeholder value in config.json.tpl to generate the right config.json
To wire that in, register a content filter driver through a .gitattributes declaration.
(image from "Customizing Git - Git Attributes", from "Pro Git book")
The smudge script will generate (automatically, on git checkout or git switch) the actual config.json file as mentioned above.
Again, the generated actual config.json file remains ignored (by the .gitignore).
See a complete example at "git smudge/clean filter between branches".
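A minimal sketch of that setup, assuming a filter named "config", a hypothetical generate-config.sh script, and a placeholder __DB_PASSWORD__ in the template that is filled from an environment variable:

# .gitattributes: run the "config" filter on the committed template
echo 'config.json.tpl filter=config' >> .gitattributes

# Register the filter driver: smudge runs on checkout/switch, clean runs on add/commit
git config filter.config.smudge ./generate-config.sh
git config filter.config.clean cat

# generate-config.sh (sketch): pass the template through unchanged, and also
# write the real config.json (which stays listed in .gitignore)
cat > generate-config.sh <<'EOF'
#!/bin/sh
tpl=$(cat)                                  # template content arrives on stdin
printf '%s\n' "$tpl"                        # smudge output: the template, unchanged
printf '%s\n' "$tpl" \
  | sed "s/__DB_PASSWORD__/$DB_PASSWORD/" \
  > config.json                             # generated file with the real secret
EOF
chmod +x generate-config.sh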

Stripping base path off the unpacked source tree with bitbake SRC_URI file:// fetcher

The manual here says that there is a basepath option to SRC_URI that should "strip the specified directories from the source path when unpacking".
I'm trying to fetch the sources from a local directory, say /src/someproject.
For that purpose I configured my recipe as follows:
SRC_URI="file:///src/someproject;subdir=source;basepath=/src/someproject"
The intention was to have the sources taken from the /src/someproject directory and put into build/tmp/work/target/someproject/1.0-r1/source/. Instead, I'm getting the sources under build/tmp/work/target/someproject/1.0-r1/source/src/someproject.
Is there a way to get rid of the /src/someproject subdirectory inside source/?
The documentation you point to is for Yocto 1.6, released in April 2014. The basepath parameter appears to have been removed in later releases, with no fanfare and no apparent replacement.
Instead you can do something like:
SRCDIR = "/path/to/your/files"
SRC_URI = "file://${SRCDIR}/contents/;subdir=src"
S = "${WORKDIR}/src"
Then you can access your files under ${S}/${SRCDIR}.
If you find using the ${SRCDIR} part too cumbersome, you can hook onto the do_patch task; this needs the prefunc mechanism rather than _prepend if you want to use shell scripting, as do_patch itself is written in Python:
relocate_source() {
    mv ${S}/${SRCDIR}/* ${S}/
    rmdir ${S}/${SRCDIR}
}

do_patch[prefunc] += "relocate_source"
This will reorganise your source before applying any patches you add to SRC_URI.
Also note that a file:// URI does not get cached in ${DL_DIR}, so there is no name-conflict to handle (the way we would need to use downloadfilename= in an http:// URI).
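To check that the relocation worked, you can run just the fetch/unpack/patch tasks from the build directory and inspect the work area (the recipe name myrecipe and the paths are hypothetical):

bitbake -c cleanall myrecipe    # start from a clean work directory
bitbake -c patch myrecipe       # fetch, unpack and patch (the prefunc runs here)
ls tmp/work/*/myrecipe/*/src    # the sources should now sit directly under ${S}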

bitbake recipe - doing a simple copy of the image

I am attempting to write a recipe that would simply copy two files (MyfileA, MyfileB) to a specific directory when the overall image is built. This is what my directory structure looks like:
MyDir/MyRecipe.bb
MyDir/files/MyfileA
MyDir/files/MyfileB
I would like the two files to be copied to a folder in the home directory (which would not exist initially, hence the directories should be created). The folder, let's say, is called "Testfolder".
This is what my bitbake file looks like:
DESCRIPTION = "Testing Bitbake file"
PR = "r0"
SRC_URI = "file://MyfileA \
file://MyfileB "
do_install() {
install -d MyfileA ~/TestFolder/
}
Kindly let me know if I am doing something wrong here.
When I run bitbake on this I get the following:
The BBPATH variable is not set and bitbake did not find a conf/bblayers.conf file in the expected location.
Maybe you accidentally invoked bitbake from the wrong directory?
DEBUG: Removed the following variables from the environment: LANG, LS_COLORS, LESSCLOSE, XDG_RUNTIME_DIR, SHLVL, SSH_TTY, OLDPWD, LESSOPEN, SSH_CLIENT, MAIL, SSH_CONNECTION, XDG_SESSION_ID, _, BUILDDIR
Any help in this regard would be appreciated.
First of all, to create your own meta-layer, you should run the command yocto-layer create MyRecipe in your Yocto environment. This makes sure that you have all the necessary elements in your meta-layer. Make sure to add the new meta-layer to conf/bblayers.conf.
A video on creating a HelloWorld recipe can be found here.
Second, to copy files from one directory to another:
DESCRIPTION = "Testing Bitbake file"
SECTION = "TESTING"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://${COMMON_LICENSE_DIR}/MIT;md5=0835ade698e0bcf8506ecda2f7b4f302"
PR = "r0"
SRC_URI = "file://MyfileA \
file://MyfileB "
#specify where to get the files
S = "${WORKDIR}"
inherit allarch
#create the folder in target machine
#${D} is the directory of the target machine
#move the file from working directory to the target machine
do_install() {
install -d ${D}/TestFolder
install -m ${WORKDIR}/MyfileA ${D}/TestFolder
}
To go into more detail, this is my understanding of how the files move around in Yocto.
You have a directory that stores metadata, e.g. /sources/meta-mylayer/recipes-myRecipe/. In that directory, there is a folder with the same name as the recipe, i.e. myRecipe/ for myRecipe_001.bb.
You store the files related to myRecipe.bb (usually patches) in myRecipe/, so that SRC_URI will look into that myRecipe/ directory to grab files, i.e. myFileA and myFileB.
Then you specify S. This is the location in the Build Directory where the unpacked recipe source code resides; in other words, myFileA and myFileB are moved/copied there when myRecipe builds.
Usually S is equal to ${WORKDIR}, which is equivalent to ${TMPDIR}/work/${MULTIMACH_TARGET_SYS}/${PN}/${EXTENDPE}${PV}-${PR}
The actual directory depends on several things:
TMPDIR: The top-level build output directory
MULTIMACH_TARGET_SYS: The target system identifier
PN: The recipe name
EXTENDPE: The epoch - (if PE is not specified, which is usually the case for most recipes, then EXTENDPE is blank)
PV: The recipe version
PR: The recipe revision
After that, we inherit allarch. This class is used for architecture-independent recipes/data files (usually scripts).
Then the last thing we have to do is copy the files.
${D} is the location in the Build Directory where components are installed by the do_install task. This location defaults to ${WORKDIR}/image.
${WORKDIR}/image can also be thought of as the / directory of the target system.
Go to the ${D} directory and create a folder called TestFolder.
Then copy myFileA from ${WORKDIR} to ${D}/TestFolder.
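To check the result under these assumptions (the recipe name myrecipe is hypothetical), build the recipe and look at what do_install put into ${D}:

bitbake myrecipe
ls -R tmp/work/*/myrecipe/*/image    # this is ${D}; TestFolder/MyfileA should appear here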
P.S. Please comment with corrections. There might be mistakes here, since I learned all this by myself.

Copy all files with given extension to output directory using CMake

I've seen that I can use this command in order to copy a directory using cmake:
file(COPY "myDir" DESTINATION "myDestination")
(from this post)
My problem is that I don't want to copy all of myDir, but only the .h files that are in there. I've tried with
file(COPY "myDir/*.h" DESTINATION "myDestination")
but I obtain the following error:
CMake Error at CMakeLists.txt:23 (file):
  file COPY cannot find
  "/full/path/to/myDIR/*.h".
How can I filter the files that I want to copy to a destination folder?
I've found the solution by myself:
file(GLOB MY_PUBLIC_HEADERS
  "myDir/*.h"
)
file(COPY ${MY_PUBLIC_HEADERS} DESTINATION myDestination)
This also works for me:
install(DIRECTORY "myDir/"
        DESTINATION "myDestination"
        FILES_MATCHING PATTERN "*.h")
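Note that these approaches run at different stages. A quick way to see the difference from the command line (the build directory name is hypothetical; the commands assume CMake 3.15 or newer):

cmake -S . -B build       # file(GLOB) and file(COPY) execute now, at configure time
cmake --build build       # custom copy commands attached to targets run here, at build time
cmake --install build     # install(DIRECTORY ... FILES_MATCHING ...) rules run here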
The alternative approach provided by jepessen does not take into account the fact that sometimes the number of files to be copied is too high. I encountered the issue when doing such a thing (more than 110 files).
Due to a limitation on Windows on the number of characters (2047 or 8191) in a single command line, this approach may randomly fail depending on the number of headers in the folder. More info here: https://support.microsoft.com/en-gb/help/830473/command-prompt-cmd-exe-command-line-string-limitation
Here is my solution:
file(GLOB MY_HEADERS myDir/*.h)
foreach(CurrentHeaderFile IN LISTS MY_HEADERS)
  add_custom_command(
    TARGET MyTarget PRE_BUILD
    COMMAND ${CMAKE_COMMAND} -E copy_if_different ${CurrentHeaderFile} ${myDestination}
    COMMENT "Copying header: ${CurrentHeaderFile}")
endforeach()
This works like a charm on macOS. However, if you have another target that depends on MyTarget and needs to use these headers, you may get compile errors on Windows because the includes are not found. Therefore you may want to prefer the following option, which defines an intermediate target.
function(CopyFile ORIGINAL_TARGET FILE_PATH COPY_OUTPUT_DIRECTORY)
  # Copy to disk at build time so that when the header file changes, it is detected by the build system.
  set(input ${FILE_PATH})
  get_filename_component(file_name ${FILE_PATH} NAME)
  set(output ${COPY_OUTPUT_DIRECTORY}/${file_name})
  set(copyTarget ${ORIGINAL_TARGET}-${file_name})
  add_custom_target(${copyTarget} DEPENDS ${output})
  add_dependencies(${ORIGINAL_TARGET} ${copyTarget})
  add_custom_command(
    DEPENDS ${input}
    OUTPUT ${output}
    COMMAND ${CMAKE_COMMAND} -E copy_if_different ${input} ${output}
    COMMENT "Copying file to ${output}."
  )
endfunction()

foreach(HeaderFile IN LISTS MY_HEADERS)
  CopyFile(MyTarget ${HeaderFile} ${myDestination})
endforeach()
The downside indeed is that you end up with multiple targets (one per copied file), but they should all end up together (alphabetically) since they share the same prefix ORIGINAL_TARGET -> "MyTarget".

Find deleted files in Mercurial repository history, quickly?

You can use hg grep, but it searches the contents of all files.
What if I just want to search the file names of deleted files to recover one?
I tried hg grep -I <file-name-pattern> <pattern> but this seems to return no results.
Using templates, this is simple:
$ hg log --template "{rev}: {file_dels}\n"
Update for Mercurial 1.6
You can use revsets for this too:
hg log -r "removes('**')"
(Edit: Note the double * - a single one detects removals from the root of the repository only.)
Edit: As Mathieu Longtin suggests, this can be combined with the template from dfa's answer to show you which files each listed revision removes:
hg log -r "removes('**')" --template "{rev}: {file_dels}\n"
That has the virtue (for machine-readability) of listing one revision per line, but you can make the output prettier for humans by using % to format each item in the list of deletions:
hg log -r "removes('**')" --template "{rev}:\n{file_dels % '{file}\n'}\n"
If you are using the TortoiseHg Workbench, a convenient way is to use the revision filter. Just hit Ctrl+S and then type
removes("**/FileYouWantToFind.txt")
**/ indicates that you want to search recursively in your repository.
You can use the * wildcard in the filename too, and you can combine this query with other revision sets using the and / or operators.
There is also an Advanced Query Editor.
I have taken the other answers and improved on them.
Added --no-merges. On a large project with dev teams, there will be lots of merges; --no-merges filters out the log noise.
Changed removes("**") to sort(removes("**"), -rev). For a large project with over 100K changesets, this gets to the most recently removed files a lot faster, because it reverses the order to start at tip instead of rev 0.
Added {author} and {desc} to the output. This gives context as to why the files were removed, by displaying the log comment and who did it.
So for my use case, it was:
hg log --template "File(s) deleted in rev {rev}: {author} \n {desc}\n {file_dels % '\n {file}'}\n\n" -r 'sort(removes("**"), -rev)' --no-merges
Sample output:
File(s) deleted in rev 52363: Ansariel
STORM-2141: Fix various inventory floater related issues:
* Opening new inventory via Control-Shift-I shortcut uses legacy and potentinally dangerous code path
* Closing new inventory windows don't release memory
* During shutdown legacy and inoperable code for inventory window cleanup is called
* Remove old and unused inventory legacy code
indra/newview/llfloaterinventory.cpp
indra/newview/llfloaterinventory.h
File(s) deleted in rev 51951: Ansariel
Remove readme.md file - again...
README.md
File(s) deleted in rev 51856: Brad Payne (Vir Linden) <vir@lindenlab.com>
SL-276 WIP - removed avatar_skeleton_spine_joints.xml
indra/newview/character/avatar_skeleton_spine_joints.xml
File(s) deleted in rev 51821: Brad Payne (Vir Linden) <vir@lindenlab.com>
SL-276 WIP - removed avatar_XXX_orig.xml files.
indra/newview/character/avatar_lad_orig.xml
indra/newview/character/avatar_skeleton_orig.xml
To search efficiently for a specific file you deleted, and format the result nicely:
hg log --template "File(s) deleted in rev {rev}: {file_dels % '\n {file}'}\n\n" -r 'removes("**/FileYouWantToFind.txt")'
Sample output:
File(s) deleted in rev 33336:
class/WebEngineX/Database/RawSql.php
File(s) deleted in rev 34468:
class/PdoPlus/AccessDeniedException.php
class/PdoPlus/BulkInsert.php
class/PdoPlus/BulkInsertInfo.php
class/PdoPlus/CannotAddForeignKeyException.php
class/PdoPlus/DuplicateEntryException.php
class/PdoPlus/Escaper.php
class/PdoPlus/MsPdo.php
class/PdoPlus/MyPdo.php
class/PdoPlus/MyPdoException.php
class/PdoPlus/NoSuchTableException.php
class/PdoPlus/PdoPlus.php
class/PdoPlus/PdoPlusException.php
class/PdoPlus/PdoPlusStatement.php
class/PdoPlus/RawSql.php
From the project root (a '!' in hg status marks tracked files that are missing from the working directory):
hg status . | grep "\!" >> /tmp/filesmissinginrepo.txt