PclZip: restructure file paths

I'm wondering if it's possible to remove a parent directory using PCLZip while the archive is loaded; without extracting it first and recompiling it.
I can remove the parent directory in the archive using:
$zip->delete(PCLZIP_OPT_BY_INDEX, '0');
And the listContent output seems to show the parent directory removed,
but when I browse the loaded archive the child files and folders are still nested inside the parent directory. I believe this is because their path entries still contain the parent directory.

This is not possible in place: every entry in a ZIP archive stores its full path, including all parent directories. Removing part of that path therefore requires processing every entry and rewriting the archive file, since the data in every entry header changes.
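On the PHP side, PclZip's PCLZIP_OPT_REMOVE_PATH option (on extract or add) is the usual way to drop a path prefix while re-creating the archive. To illustrate the underlying rewrite itself, here is a minimal sketch using Python's standard zipfile module (the function name is my own): it copies every entry into a new archive with the first path component stripped, which is exactly the per-entry rewrite described above.

```python
import zipfile

def strip_top_dir(src, dst):
    """Rewrite archive src as dst, removing the first path component of every entry."""
    with zipfile.ZipFile(src) as zin, zipfile.ZipFile(dst, "w") as zout:
        for info in zin.infolist():
            # Read the data before touching the entry's metadata.
            data = zin.read(info)
            parts = info.filename.split("/", 1)
            if len(parts) < 2 or parts[1] == "":
                continue  # the parent directory entry itself: drop it
            info.filename = parts[1]  # e.g. "parent/a/b.txt" -> "a/b.txt"
            zout.writestr(info, data)
```

Note that a new archive file is written; there is no way to do this without rewriting every affected entry header.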

Related

How to specify buildroot build process variable to be called on make <package>-dirclean

Is there any way I can just call into a define such as LIBFOO_DIRCLEAN, and just do what was implemented in the define?
Inside HOST_LIBFOO_INSTALL_CMDS, I copy files to the target directory, and would like 'make package-dirclean' to delete what was copied into the target directory. 'make clean' would obviously do this (and much more), but that is far more than I want to do.
I see the following Buildroot variables: LIBFOO_EXTRACT_CMDS, LIBFOO_CONFIGURE_CMDS, LIBFOO_BUILD_CMDS, HOST_LIBFOO_INSTALL_CMDS, LIBFOO_INSTALL_TARGET_CMDS, etc.
make foo-dirclean is a simple tool that just deletes the package's build directory. In most cases, when the list of files installed by a package does not change over time (only the files' contents change), you can simply rebuild the package and the target directory will be rebuilt correctly.
If you want, you can implement your own foo-myclean step that implements your own logic. However, you must understand that deleting files in the target directory is not supported by Buildroot, so you are on your own.
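Such a custom step could be sketched as a phony target added to the package's .mk file. This is only an illustration: the package name and the removed paths below are hypothetical placeholders for whatever HOST_LIBFOO_INSTALL_CMDS actually copies, and, as the answer says, this is not a supported Buildroot mechanism.

```make
# Hypothetical addition to package/libfoo/libfoo.mk: a hand-rolled
# clean step that removes only what our install commands copied into
# $(TARGET_DIR). Paths are examples only -- adjust to your package.
.PHONY: libfoo-myclean
libfoo-myclean:
	rm -f  $(TARGET_DIR)/usr/bin/libfoo-tool
	rm -rf $(TARGET_DIR)/usr/share/libfoo
```

You would then run `make libfoo-myclean` by hand; nothing in the standard dirclean flow will call it for you.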

Keep paths to images using Parceljs

Can I keep the folder structure within my images folder when bundling with ParcelJS?
Currently, all files used in the project get random filenames and are all stored in the same directory (i.e. dist or build).

Unity3D 5 packages conflict

I have Unity 5.0.2f1. Firstly, I've successfully added GooglePlayGamesPlugin-0.9.20.unitypackage to my project. Then, I've tried to import GoogleMobileAds.unitypackage, but I got these errors:
Error importing folder (The pathName assets/plugins is already mapped to fce8a713f1e5a4cc4b9973d1ef630f31. But the meta data wants it to be mapped to cbde64d36fd994c458fffca9e931b232)
Error importing folder (The pathName assets/plugins/android is already mapped to b8f0d9a6a7f9240c981894807effddbc. But the meta data wants it to be mapped to 2f5d736f7c4cb4c1e80d0816d0e81625)
Error importing folder (The pathName assets/plugins/ios is already mapped to 6490bb8acab6f4f92b29615e7429b8df. But the meta data wants it to be mapped to da135550add3c4abca622bda5280d204)
How to resolve this?
First, make a backup of your project, then delete the metadata files (files with the .meta extension) in your project's asset folder and its subfolders.
The easiest workaround I could come up with:
1. Within the Unity Editor, create a subfolder of the Assets folder, for instance called TEMP123.
2. Move the entire (other) content of the Assets folder to TEMP123.
3. Import your package.
4. Manually merge the content of TEMP123 back into the Assets folder.
If you see these kinds of errors, reimport all the assets. This reprocesses the metadata and the error goes away.

Can we move files from one folder to another in FTP server using Talend DI open studio?

There is an FTP server. On that server there are two folders (Folder1 and Folder2). Folder1 contains 20 CSV files (total size more than 2 GB). I want to move all the CSVs from Folder1 to Folder2, but I don't want to use tFTPGet and tFTPPut, as downloading and re-uploading would take too much time.
Can anyone help me?
Yes, we can. You can use the tFTPRename component and give fully qualified file paths of the different folders in the Filemask and New name fields.
There are two ways to accomplish this in Talend. If you wish to copy all contents in a directory, then you only need a tFileCopy component and check "Copy a Directory" specifying the source and destination directories.
If you need to copy only certain files in a directory, you can accomplish this in Talend using 2 components that work together. You need a tFileList and a tFileCopy, connecting them together with an Iterate flow.
Use the tFileList to generate your list of files from a specified directory. You can configure wildcards in the Filemask section; for example, to take only .txt files you would enter "*.txt".
Then right-click tFileList in the designer and choose Row --> Iterate. Connect this to the tFileCopy component. In tFileCopy, use this code in the File Name field:
((String)globalMap.get("tFileList_1_CURRENT_FILEPATH"))
You have other options in the tFileCopy component as well, including Remove Source File and Create the Directory if it doesn't exist.
Select whichever of the two approaches best suits your needs.

Dynamically add files to visual studio deployment project

I've been desperately looking for the answer to this and I feel I'm missing something obvious.
I need to copy a folder full of data files into the TARGETDIR of my deployment project at compile time. I can see how I would add individual files (ie. right click in File System and go to Add->File) but I have a folder full of data files which constantly get added to. I'd prefer not to have to add the new files each time I compile.
I have tried using a PreBuildEvent to copy the files:
copy $(ProjectDir)..\Data*.* $(TargetDir)Data\
which fails with error code 1 when I build. I can't help but feel I'm missing the point here though. Any suggestions?
Thanks in advance.
Graeme
I went this route:
1. Created a new project (and deleted the default source file Class1).
2. Added the necessary files/folders to the project.
3. Added the project as a project output in the installer, choosing the "content files" option.
This removes the complexity of having to zip/unzip the files as suggested earlier.
Try
xcopy $(ProjectDir)..\Data\*.* $(TargetDir)Data /e /c /i [/f] [/r] /y
/e to copy the full tree structure, including empty folders (use /s if you want to skip empty folders)
/c to continue on error (letting the build process finish)
/i to create the destination folder if none exists
/y to assume "yes" for overwrite prompts when files already exist
Optional:
/f if you want to see the full paths resulting from the copy
/r if you want to overwrite even previously copied read-only files
The method is simpler on the whole project than on individual files, yes. On the other hand, the file-level approach copies only the modified/missing files on each build, but forces you to maintain the project list on every data pack modification. Which is better depends on the total data size and how often your data pack changes.
Also beware of files left behind in the target folder if you remove some from your data pack and rebuild without emptying it first.
Good luck.
I solved the problem by a workaround:
Add a build action of packaging entire directory (could be filtered) to a ZIP file.
Add a reference to an empty ZIP file to deployment project.
Add a custom action to deployment project to extract the ZIP to destination folder.
It's simple and stable.
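Step 1 of that workaround (packaging the directory into a ZIP as a build action) can be scripted in a few lines. As a hedged sketch, here is one way to do it with Python's standard zipfile module; the directory names and the optional extension filter are illustrative, not part of the original answer:

```python
import os
import zipfile

def zip_directory(src_dir, zip_path, extensions=None):
    """Package src_dir into zip_path, optionally keeping only the given extensions."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as z:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                if extensions and os.path.splitext(name)[1] not in extensions:
                    continue  # filtered out, e.g. keep only {".txt"}
                full = os.path.join(root, name)
                # Store paths relative to src_dir so the tree extracts cleanly.
                z.write(full, os.path.relpath(full, src_dir))
```

Run as a pre-build step, this keeps the ZIP in sync with the data directory, and the deployment project's custom action only ever sees the one referenced ZIP file.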
Your error is probably because your path has spaces in it and you don't have the paths in quotes.
e.g. copy "$(ProjectDir)..\Data*.*" "$(TargetDir)Data\"
I need to do a similar thing. Thinking a custom action...
I found a different workaround for this. I added a web project to my solution that points at the data directory I want included in the deployment project. The web project automatically picks up any new files in the data directory and you can refer to the project content in the deployment project.