How do I delete only files created by the installer during an update with install4j?

I have an install4j project and want to deliver an update installer for it.
There is a libs-folder within my setup, where all my jars are packaged:
<install-dir>/libs/myA.jar
<install-dir>/libs/myB.jar
<install-dir>/libs/alien.jar
The alien.jar is an additional file, not created by my setup.
If I deliver an update with a packaged libs folder, the alien.jar gets deleted, and I don't understand why.
There is a DeleteFilesAction before the InstallFilesAction, with a filter so that alien.jar won't be deleted. If I show a MessageBox after my DeleteFilesAction, the alien.jar still exists.
When the InstallFilesAction starts, the complete libs folder seems to be deleted.
I believe there are configuration flags that I don't know about.
Question:
How do I delete the complete libs folder, except for the alien.jar?

I think it is not possible to delete all (unknown) files within a directory with a DeleteFilesAndDirectory action.
I used an action where <install-dir>/libs is provided as the folder to be deleted.
I provided a filter script for file names (alien.jar returned "false"; the rest should be deleted), but the action deletes the libs directory as well. I think it works as designed, so it is not a bug.
For everyone with the same problem, a workaround/solution:
// Script body of a "Run script" action; "context" and com.install4j.api.Util are available here.
com.install4j.runtime.installer.helper.fileinst.FileInstaller fi =
        com.install4j.runtime.installer.helper.fileinst.FileInstaller.getInstance();
File directory = new File(context.getInstallationDirectory(), "libs");
File[] filesInLib = directory.listFiles();
if (filesInLib != null) {
    for (File file : filesInLib) {
        boolean shouldDelete = true;
        if (file.getName().toLowerCase().contains("alien")) {
            // keep the file that was not installed by this setup
            Util.logInfo(null, "dexchange detected: " + file.getName());
            shouldDelete = false;
        }
        Util.logInfo(null, "Delete " + file.getAbsolutePath() + ": " + shouldDelete);
        if (shouldDelete && file.exists()) {
            Util.logInfo(null, "delete file: " + file);
            fi.deleteFile(file);
        }
    }
}
return false;
@Ingo Kegel:
Thanks for your help. I didn't want to open a support ticket because I was quite sure it is not a bug in your product. But I think another checkbox, "delete if not empty", would help a lot in the DeleteFilesOrDirectory action.

Powershell: FTP download not working despite having permissions [duplicate]

What is the best way to download all files in a remote directory using C# and FTP and save them to a local directory?
Thanks.
Downloading all files in a specific folder seems to be an easy task. However, there are some issues that have to be solved. To name a few:
How to get the list of files (System.Net.FtpWebRequest gives you an unparsed listing, and the directory listing format is not standardized in any RFC)
What if the remote directory has both files and subdirectories? Do we have to dive into the subdirectories and download their content?
What if some of the remote files already exist on the local computer? Should they be overwritten? Skipped? Should we overwrite older files only?
What if a local file is not writable? Should the whole transfer fail? Should we skip the file and continue to the next?
How to handle files on the remote disk which are unreadable because we don't have sufficient access rights?
How are symlinks, hard links and junction points handled? Links can easily be used to create an infinite recursive directory tree. Consider folder A with subfolder B which is in fact not a real folder but a *nix hard link pointing back to folder A. A naive approach will end in an application which never ends (at least until somebody manages to pull the plug).
A decent third-party FTP component should have a method for handling those issues. The following code uses our Rebex FTP for .NET.
using (Ftp client = new Ftp())
{
    // connect and login to the FTP site
    client.Connect("mirror.aarnet.edu.au");
    client.Login("anonymous", "my#password");

    // download all files
    client.GetFiles(
        "/pub/fedora/linux/development/i386/os/EFI/*",
        "c:\\temp\\download",
        FtpBatchTransferOptions.Recursive,
        FtpActionOnExistingFiles.OverwriteAll
    );

    client.Disconnect();
}
The code is taken from my blog post available at blog.rebex.net. The blog post also references a sample which shows how to ask the user how to handle each problem (e.g. Overwrite/Overwrite older/Skip/Skip all).
Using C#'s FtpWebRequest and FtpWebResponse, you can use the following recursion (make sure the folder strings terminate in '\'):
public void GetAllDirectoriesAndFiles(string getFolder, string putFolder)
{
    List<string> dirItems = DirectoryListing(getFolder);
    foreach (var item in dirItems)
    {
        if (item.Contains('.'))
        {
            GetFile(getFolder + item, putFolder + item);
        }
        else
        {
            var subDirPut = new DirectoryInfo(putFolder + "\\" + item);
            subDirPut.Create();
            GetAllDirectoriesAndFiles(getFolder + item + "\\", subDirPut.FullName + "\\");
        }
    }
}
The "item.Contains('.')" is a bit primitive, but has worked for my purposes. Post a comment if you need an example of the methods:
GetFile(string getFileAndPath, string putFileAndPath)
or
DirectoryListing(getFolder)
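In case it is useful, here is a rough sketch of what those two helpers might look like; this is an assumption on my part rather than the answerer's code, the host and credentials are placeholders, and it needs System.Net, System.IO and System.Collections.Generic:
private List<string> DirectoryListing(string getFolder)
{
    // Placeholder host and credentials; adjust to your server.
    var request = (FtpWebRequest)WebRequest.Create("ftp://example.com/" + getFolder);
    request.Method = WebRequestMethods.Ftp.ListDirectory;   // bare names, one per line
    request.Credentials = new NetworkCredential("user", "password");

    var items = new List<string>();
    using (var response = request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            items.Add(line);
        }
    }
    return items;
}

private void GetFile(string getFileAndPath, string putFileAndPath)
{
    var request = (FtpWebRequest)WebRequest.Create("ftp://example.com/" + getFileAndPath);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.UseBinary = true;
    request.Credentials = new NetworkCredential("user", "password");

    using (var response = request.GetResponse())
    using (var ftpStream = response.GetResponseStream())
    using (var fileStream = File.Create(putFileAndPath))
    {
        // Stream the remote file straight to disk.
        ftpStream.CopyTo(fileStream);
    }
}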
For the FTP protocol, you can use the FtpWebRequest class from the .NET Framework, though it does not have any explicit support for recursive file operations (including downloads). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, downloading files and recursing into subdirectories (listing them again, etc.)
The tricky part is to tell files from subdirectories. There's no way to do that in a portable way with FtpWebRequest. FtpWebRequest unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for files and succeed for directories (or vice versa), i.e. you can try to download the "name". If that succeeds, it's a file; if that fails, it's a directory. But that can become a performance problem when you have a large number of entries.
You may be lucky, and in your specific case you can tell a file from a directory by its name (i.e. all your files have an extension, while subdirectories do not).
You can use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse the server-specific listing. Many FTP servers use a *nix-style listing, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format):
void DownloadFtpDirectory(
    string url, NetworkCredential credentials, string localPath)
{
    FtpWebRequest listRequest = (FtpWebRequest)WebRequest.Create(url);
    listRequest.UsePassive = true;
    listRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    listRequest.Credentials = credentials;

    List<string> lines = new List<string>();

    using (WebResponse listResponse = listRequest.GetResponse())
    using (Stream listStream = listResponse.GetResponseStream())
    using (StreamReader listReader = new StreamReader(listStream))
    {
        while (!listReader.EndOfStream)
        {
            lines.Add(listReader.ReadLine());
        }
    }

    foreach (string line in lines)
    {
        string[] tokens =
            line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
        string name = tokens[8];
        string permissions = tokens[0];

        string localFilePath = Path.Combine(localPath, name);
        string fileUrl = url + name;

        if (permissions[0] == 'd')
        {
            Directory.CreateDirectory(localFilePath);
            DownloadFtpDirectory(fileUrl + "/", credentials, localFilePath);
        }
        else
        {
            var downloadRequest = (FtpWebRequest)WebRequest.Create(fileUrl);
            downloadRequest.UsePassive = true;
            downloadRequest.UseBinary = true;
            downloadRequest.Method = WebRequestMethods.Ftp.DownloadFile;
            downloadRequest.Credentials = credentials;

            var response = downloadRequest.GetResponse();
            using (Stream ftpStream = response.GetResponseStream())
            using (Stream fileStream = File.Create(localFilePath))
            {
                ftpStream.CopyTo(fileStream);
            }
        }
    }
}
The url must be like:
ftp://example.com/ or
ftp://example.com/path/
Or use a third-party library that supports recursive downloads.
For example, with the WinSCP .NET assembly you can download a whole directory with a single call to Session.GetFiles:
// Set up session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp,
    HostName = "example.com",
    UserName = "user",
    Password = "mypassword",
};

using (Session session = new Session())
{
    // Connect
    session.Open(sessionOptions);

    // Download files
    session.GetFiles("/home/user/*", @"d:\download\").Check();
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
(I'm the author of WinSCP)
You could use System.Net.WebClient.DownloadFile(), which supports FTP; see the MSDN documentation for details.
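Note that DownloadFile fetches a single file per call, so you still have to enumerate the remote directory yourself; a minimal sketch with a placeholder URL, local path and credentials:
using (var webClient = new System.Net.WebClient())
{
    // FTP URLs are supported; the credentials and paths below are placeholders.
    webClient.Credentials = new System.Net.NetworkCredential("user", "password");
    webClient.DownloadFile("ftp://example.com/pub/file.txt", @"c:\temp\download\file.txt");
}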
You can use FTPClient from laedit.net. It's under the Apache license and easy to use.
It uses FtpWebRequest:
First, you need to use WebRequestMethods.Ftp.ListDirectoryDetails to get the detailed listing of the folder.
Then, for each file, you need to use WebRequestMethods.Ftp.DownloadFile to download it to a local folder.

How to delete generated bitstreams in DSpace 6.x?

I would like to delete all the bitstreams generated by filter-media, but only those with the specific description "IM Thumbnail".
I am aware that I can use the -f flag to force filter-media to regenerate the thumbnails. I am testing some settings in my setup, and I would just like to delete the generated thumbnails with this specific description first.
I've tried tinkering with the database via pgAdmin, but I can only go as far as selecting the bitstreams. I don't even know how to group or order the returned results, and I'm not really sure I've selected the correct tables.
SELECT
*
FROM
public.bitstream,
public.bundle,
public.bundle2bitstream,
public.metadatavalue,
public.item2bundle
WHERE
bitstream.uuid = metadatavalue.dspace_object_id AND
bitstream.uuid = bundle2bitstream.bitstream_id AND
bundle.uuid = item2bundle.bundle_id AND
bundle2bitstream.bundle_id = bundle.uuid AND
metadatavalue.text_value = 'IM Thumbnail';
Any advice on how to do this via database manipulation or any other means would be greatly appreciated. Applying the SQL deletion within a specific community or collection would be a really nice bonus too!
Thanks in advance!
Although the question was tagged with postgresql, I found the answer on the DSpace community mailing list using Jython. Thanks to Joan Caparros for the original code. The message thread can be found here: Removing Thumbnails in DSpace 5. I also posted a similar query on the DSpace technical support mailing list, which can be found here: Batch delete bitstreams in the Bundle: THUMBNAIL. There, Joan posted a modified version of his code for my specific need of deleting only the thumbnails whose description is "IM Thumbnail". Below is the full code that achieved my goal:
from org.dspace.curate import ScriptedTask
from org.dspace.curate import Curator
from org.dspace.content.service import DSpaceObjectService
from org.dspace.content.factory import ContentServiceFactory
#from org.dspace.content.service import BitstreamService

class Main(ScriptedTask):
    def init(self, curator, taskName):
        print "initializing with Jython"

    def performDso(self, dso):
        #print "perform on dso "
        if dso.getType() == 2:
            print "Item '" + dso.getName() + "' (" + dso.getHandle() + ")"
            myBundles = dso.itemService.getBundles(dso, "THUMBNAIL")
            totalbundles = len(myBundles)
            for i in myBundles:
                myBitstreams = i.getBitstreams()
                total = len(myBitstreams)
                if len(myBitstreams) == 0:
                    print "- DELETING EMPTY BUNDLE"
                    dso.itemService.removeBundle(Curator.curationContext(), dso, myBundles[0])
                if len(myBitstreams) > 0:
                    for k in range(0, len(myBitstreams)):
                        if myBitstreams[k].getDescription() == 'IM Thumbnail':
                            print "DELETE " + myBitstreams[0].getDescription()
                            bitstreamService = ContentServiceFactory.getInstance().getBitstreamService()
                            bitstreamService.delete(Curator.curationContext(), myBitstreams[0])
                            print "- DELETING BUNDLE"
                            dso.itemService.removeBundle(Curator.curationContext(), dso, myBundles[0])
        return 0

    def performId(self, context, id):
        print "perform on id %s" % (id)
        return 0
Please note that the code above was written as a curation task in Jython, so the documentation on how to set it up and use it can be found here: Curation tasks in Jython.

Find paths that exceed SharePoint's 400 character limit

We recently migrated to SharePoint Online and have found that a few of our paths are producing corrupt files because they exceed the 400-character limit set by SharePoint. I am only an admin of our specific site, not a global admin of our SharePoint tenant, so trying to use SharePoint's PowerShell integration does not work. I also tried the Explorer view and running PowerShell from there to find anything -gt 400, but there is a Windows limitation of only being able to find paths up to 248 characters before getting an error. Has anyone run into this issue or know of any workarounds?
I have tried using the SharePointPnPPowerShellOnline module with PowerShell, but I get a forbidden error because I am not a global admin. I also tried recursively looking in the Windows Explorer view, but I get an error.
Here is the error when trying to do it in the Windows Explorer view:
Get-ChildItem : The specified path, file name, or both are too long.
The fully qualified file name must be less than 260 characters, and
the directory name must be less than 248 characters
Did you consider using SharePoint Online CSOM? You could use this NuGet package -> SharePointPnP
With this you could implement any kind of application (like a console app), get all the folders, files, and files in all folders, get their name, path, or whatever is needed, and check whether it is longer than 400 characters. (Please remember that additional columns like FileServerRelativeUrl need to be loaded in the context.)
try
{
    string siteUrl = "SiteURL";
    AuthenticationManager authManager = new AuthenticationManager();
    using (ClientContext context = authManager.GetWebLoginClientContext(siteUrl))
    {
        List list = context.Web.Lists.GetByTitle("LibName");
        context.Load(list);
        context.Load(list.RootFolder);
        context.Load(list.RootFolder.Folders);
        context.Load(list.RootFolder.Files);
        context.ExecuteQuery();

        FolderCollection folderCol = list.RootFolder.Folders;
        foreach (Folder f in folderCol)
        {
            context.Load(f.Files);
            context.ExecuteQuery();
            FileCollection innerFileCol = f.Files;
            foreach (File file in innerFileCol)
            {
                //var x = file.Name;
                // ToDo other logic here
            }
        }

        FileCollection fileCol = list.RootFolder.Files;
        foreach (File file in fileCol)
        {
            //var x = file.Name;
            // ToDo other logic here
        }
    }
}
catch (Exception ex)
{
    // log error
    throw;
}
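For the 400-character check itself, here is a rough sketch of what could go where the "ToDo other logic here" comments are; this is my assumption rather than part of the original answer, and which URL prefix counts toward the limit may need adjusting for your tenant:
// Make sure the URL property is available before reading it.
context.Load(file, f => f.ServerRelativeUrl);
context.ExecuteQuery();

if (file.ServerRelativeUrl.Length > 400)
{
    Console.WriteLine("Over the limit ({0} chars): {1}",
        file.ServerRelativeUrl.Length, file.ServerRelativeUrl);
}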
I hope it will be of any help :)

How to add files to the data ($HOME) folder for BlackBerry 10

I am trying to add a few writable files to the data folder of the file system provided by the BlackBerry 10 simulator.
The file system hierarchy is described in the link provided: https://developer.blackberry.com/cascades/documentation/device_platform/filesystem/index.html
In the bar-descriptor.xml file I tried the options below, but did not succeed with either of them:
1. $HOME/jsapp.html
2. ${HOME}/jsapp.html
Any help would be appreciated.
It isn't clear from your question, but it sounds like you are trying to include those files in the BAR file. You can't do this. All assets deployed with the BAR file are covered by the application signature and can't be changed (other than on the simulator, or on a device with a developer token running an unsigned BAR). If you need to modify an asset after installation, you deploy the initial version with the BAR file and copy it to the data directory. One of the sample programs (the quote database sample, if I remember correctly) does this.
As Richard said, this is directly from the Quotes sample app
void CustomSqlDataSource::copyFileToDataFolder(const QString fileName)
{
    // Since we need read and write access to the file, it has
    // to be moved to a folder where we have access to it. First,
    // we check if the file already exists (previously copied).
    QString dataFolder = QDir::homePath();
    QString newFileName = dataFolder + "/" + fileName;
    QFile newFile(newFileName);

    if (!newFile.exists()) {
        // If the file is not already in the data folder, we copy it from the
        // assets folder (read only) to the data folder (read and write).
        QString appFolder(QDir::homePath());
        appFolder.chop(4);
        QString originalFileName = appFolder + "app/native/assets/" + fileName;
        QFile originalFile(originalFileName);

        if (originalFile.exists()) {
            // Create sub folders if any creates the SQL folder for a file path like e.g. sql/quotesdb
            QFileInfo fileInfo(newFileName);
            QDir().mkpath(fileInfo.dir().path());

            if (!originalFile.copy(newFileName)) {
                qDebug() << "Failed to copy file to path: " << newFileName;
            }
        } else {
            qDebug() << "Failed to copy file data base file does not exists.";
        }
    }

    mSourceInDataFolder = newFileName;
}

Commit a whole folder with modified files in it with SharpSvn

I am using SharpSvn in my C# project. I have a folder with some text files in it and other folders with subfolders in it. I add the folder under version control and commit it. So far, so good. This is my code:
using (SvnClient client = new SvnClient())
{
    SvnCommitArgs ca = new SvnCommitArgs();
    SvnStatusArgs sa = new SvnStatusArgs();
    sa.Depth = SvnDepth.Empty;

    Collection<SvnStatusEventArgs> statuses;
    client.GetStatus(pathsConfigurator.FranchisePath, sa, out statuses);

    if (statuses.Count == 1)
    {
        if (!statuses[0].Versioned)
        {
            client.Add(pathsConfigurator.FranchisePath);
            ca.LogMessage = "Added";
            client.Commit(pathsConfigurator.FranchisePath, ca);
        }
        else if (statuses[0].Modified)
        {
            ca.LogMessage = "Modified";
            client.Commit(pathsConfigurator.FranchisePath, ca);
        }
    }
}
I make some changes to one of the text files and then run the code again. The modification is not committed. This condition is false:
if (statuses.Count == 1)
and all the logic in the if block does not execute.
I did not write this logic and cannot quite understand these lines of code:
client.GetStatus(pathsConfigurator.FranchisePath, sa, out statuses);
if (statuses.Count == 1) {}
I went to the official site of the API but couldn't find documentation about this.
Can someone who is more familiar with this API tell me what these two lines do?
What changes need to be made to this code so that, if some of the contents of the pathsConfigurator.FranchisePath folder are changed, the whole folder with the changes is committed? Any suggestion with a working example will be greatly appreciated.
Committing one directory with everything inside is pretty easy: just call Commit on that directory.
The default Depth of a commit is SvnDepth.Infinity, so that would work directly. You can set additional options by providing a SvnCommitArgs object to SvnClient.Commit().
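A minimal sketch along those lines, reusing the path from the question (the log message is just a placeholder):
using (SvnClient client = new SvnClient())
{
    SvnCommitArgs ca = new SvnCommitArgs();
    ca.LogMessage = "Commit whole folder";

    // Depth defaults to SvnDepth.Infinity, so every modified file
    // anywhere below the folder is included in the commit.
    client.Commit(pathsConfigurator.FranchisePath, ca);
}
Keep in mind that Commit only picks up files that are already under version control; brand-new files still need client.Add() first, as in the unversioned branch of your existing code.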