I am using SharpSvn in my C# project. I have a folder with some text files in it and other folders with subfolders in it. I add the folder under version control and commit it. So far, so good. This is my code:
using (SvnClient client = new SvnClient())
{
    SvnCommitArgs ca = new SvnCommitArgs();
    SvnStatusArgs sa = new SvnStatusArgs();
    sa.Depth = SvnDepth.Empty;
    Collection<SvnStatusEventArgs> statuses;
    client.GetStatus(pathsConfigurator.FranchisePath, sa, out statuses);
    if (statuses.Count == 1)
    {
        if (!statuses[0].Versioned)
        {
            client.Add(pathsConfigurator.FranchisePath);
            ca.LogMessage = "Added";
            client.Commit(pathsConfigurator.FranchisePath, ca);
        }
        else if (statuses[0].Modified)
        {
            ca.LogMessage = "Modified";
            client.Commit(pathsConfigurator.FranchisePath, ca);
        }
    }
}
I make some changes to one of the text files and then run the code again. The modification is not committed. This condition is false:
if (statuses.Count == 1)
so none of the logic in the if block executes.
I did not write this logic and cannot quite understand these lines of code:
client.GetStatus(pathsConfigurator.FranchisePath, sa, out statuses);
if (statuses.Count == 1) {}
I went to the official site of the API but couldn't find documentation about this.
Can someone more familiar with this API tell me what these two lines do?
What changes need to be made to this code so that if any of the contents of the pathsConfigurator.FranchisePath folder change, the whole folder, including the changes, gets committed? Any suggestion with a working example will be greatly appreciated.
Committing one directory with everything inside it is pretty easy: just call commit on that directory.
About the two lines you asked about: GetStatus with SvnDepth.Empty retrieves the status of the path itself only, not of its children. By default only "interesting" nodes get a status entry, so for a versioned directory that is itself unchanged the collection stays empty; statuses.Count == 1 is then false even though files inside the folder changed, which is why nothing is committed.
The default Depth of a commit is SvnDepth.Infinity, so committing the folder directly picks up every change below it. You can set additional options by providing a SvnCommitArgs object to SvnClient.Commit().
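A minimal sketch (assuming pathsConfigurator.FranchisePath points at a versioned working copy; the log message is an example):

using (SvnClient client = new SvnClient())
{
    SvnCommitArgs ca = new SvnCommitArgs();
    ca.LogMessage = "Commit everything under the folder";
    // Depth defaults to SvnDepth.Infinity, so all changed files and
    // subfolders below the path are included in the commit.
    client.Commit(pathsConfigurator.FranchisePath, ca);
}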
Given a list of repositories on GitHub with 'repoName' and 'userName', I need to find all the '.java' files and get their content. I tried the REST API first but ran into the rate limit very easily, so now I'm switching to the GraphQL API, but I don't know how to achieve this.
Here is how I would do it:
Algo Identify_java_files
Entry: path of the folder
Out: java files in the folder
    Identify all files in the folder of the repository
    For each file
        if the type of the file is "blob":
            if ".java" is the end of the name of the file:
                get its content
        else if the type of the file is "tree":
            Identify_java_files(path of the tree file)
You can easily implement this pseudocode in Python. It assumes recursion, but it can be done otherwise; it's just for the example. You will need the requests and json libraries.
Here are the queries you might need, and you can use the explorer to test them.
{
  repository(name: "checkout", owner: "actions") {
    defaultBranchRef {
      name
    }
  }
}
This query allows you to check the name of the default branch of the repository. You will need it for the following queries, or you can use a specific branch, but then you have to know its name. I don't know whether you can get all the branch names of a repository in a single query.
{
  repository(name: "checkout", owner: "actions") {
    object(expression: "main:") {
      ... on Tree {
        entries {
          path
          type
        }
      }
    }
  }
}
This query gets the content of the root folder of a specific repository. The expression "main:" refers to the branch of the repository along with a path. Here the branch is main and the path is empty (it comes after the ":"), meaning we are looking at the root folder. Some repositories use "master" as the default branch, so be sure which branch to use.
To check the content of a file, you can use this accepted answer.
I updated the example in order for you to be able to try it.
{
  repository(name: "checkout", owner: "actions") {
    object(expression: "main:.github/workflows/test.yml") {
      ... on Blob {
        text
      }
    }
  }
}
You send your requests to the API using requests or similar, and store the responses as JSON for processing.
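Putting it together, here is a minimal sketch of the recursive pseudocode above (assuming a personal access token in the GITHUB_TOKEN environment variable; the repository name and owner are the same examples as in the queries):

import os
import requests

API_URL = "https://api.github.com/graphql"
HEADERS = {"Authorization": "bearer " + os.environ["GITHUB_TOKEN"]}

TREE_QUERY = """
query($name: String!, $owner: String!, $expression: String!) {
  repository(name: $name, owner: $owner) {
    object(expression: $expression) {
      ... on Tree {
        entries { path type }
      }
    }
  }
}
"""

BLOB_QUERY = """
query($name: String!, $owner: String!, $expression: String!) {
  repository(name: $name, owner: $owner) {
    object(expression: $expression) {
      ... on Blob { text }
    }
  }
}
"""

def run_query(query, variables):
    # POST the GraphQL query and return the repository "object" payload.
    response = requests.post(API_URL, json={"query": query, "variables": variables}, headers=HEADERS)
    response.raise_for_status()
    return response.json()["data"]["repository"]["object"]

def identify_java_files(name, owner, branch, path=""):
    # Recursively yield (path, content) for every .java blob under path.
    tree = run_query(TREE_QUERY, {"name": name, "owner": owner, "expression": branch + ":" + path})
    for entry in tree["entries"]:
        if entry["type"] == "blob" and entry["path"].endswith(".java"):
            blob = run_query(BLOB_QUERY, {"name": name, "owner": owner, "expression": branch + ":" + entry["path"]})
            yield entry["path"], blob["text"]
        elif entry["type"] == "tree":
            yield from identify_java_files(name, owner, branch, entry["path"])

for java_path, content in identify_java_files("checkout", "actions", "main"):
    print(java_path, len(content))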
As a side note, I do not think it is possible to achieve this without issuing multiple queries. I recently had to do something similar, and this is my first SO answer, so let me know if anything is unclear.
Edit:
You can use this answer to list all files in a repository.
I am customizing ICN (IBM Content Navigator) 2.0.3, and my requirement is to prevent users from uploading files over 10 MB and to allow only .pdf and .docx files.
I know I have to extend/customize the AddContentItemDialog, but there is very little detail on exactly how to do it, and no video about it. I'd appreciate it if someone could guide me.
Thanks
I installed the development environment but I am not sure how to extend the AddContentItemDialog.
public void applicationInit(HttpServletRequest request,
        PluginServiceCallbacks callbacks) throws Exception {
}
I want to also know how to roll out the changes to ICN.
This can be easily extended. I would suggest reading the ICN Red Book for the details on how to do it, but it is pretty standard code.
Regarding rolling out the code to ICN, there are two ways:
- If you are using plugin: just replace the Jar file on the server location and restart WAS.
- If you are using EDS: you need to redeploy the web service and restart WAS.
Hope this helps.
thanks
Although there are many ways to do this, one way indeed is to extend, or augment, the AddContentItemDialog as you quoted. After looking at the (rather poor) IBM documentation, I figured you could probably use the onAdd event/method.
dojo/aspect's around advice allows you to do exactly that. Example:
require(["dojo/aspect", "ecm/widget/dialog/AddContentItemDialog"], function(aspect, AddContentItemDialog) {
aspect.around(AddContentItemDialog.prototype, "onAdd", function advisor(original) {
return function around() {
var files = this.addContentItemGeneralPane.getFileInputFiles();
var containsInvalidFiles = dojo.some(files, function isInvalid(file) {
var fileName = file.name.toLowerCase();
var extensionOK = fileName.endsWith(".pdf") || fileName.endsWith(".docx");
var fileSizeOK = file.size <= 10 * 1024 * 1024;
return !(extensionOK && fileSizeOK);
});
if (containsInvalidFiles) {
alert("You can't add that :)");
}else{
original.apply(this, arguments);
}
}
});
});
Just make sure this code gets executed before the actual dialog is opened. The best way to achieve this is by wrapping the code in a new plugin.
Now, on creating/deploying plugins: the easiest way is this wizard for Eclipse (see also a repackaged version for newer Eclipse versions). Just create a new arbitrary plugin and paste this JavaScript code into the generated .js file.
Additionally, it might be good to note that you're only limiting this specific dialog to uploading specific files. It would probably be a good idea to also create a requestFilter to limit all possible uses of the addContent API; a sketch follows.
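A rough sketch only: the class and method names are from memory of the ICN plugin API (com.ibm.ecm.extension.PluginRequestFilter), and the filtered service name is an assumption, so verify both against the javadoc shipped with your ICN version:

import javax.servlet.http.HttpServletRequest;
import com.ibm.ecm.extension.PluginRequestFilter;
import com.ibm.ecm.extension.PluginServiceCallbacks;
import com.ibm.json.java.JSONObject;

public class AddContentRequestFilter extends PluginRequestFilter {

    @Override
    public String[] getFilteredServices() {
        // Service name is an example; check which service your add dialog calls.
        return new String[] { "/p8/addItem" };
    }

    @Override
    public JSONObject filter(PluginServiceCallbacks callbacks,
            HttpServletRequest request, JSONObject jsonRequest) throws Exception {
        // Server-side counterpart of the client-side check: inspect the
        // incoming request here and reject files that are not .pdf/.docx
        // or are larger than 10 MB before the service runs.
        return jsonRequest;
    }
}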
In my application, I store uploaded images in the folder ./assets/uploads. I am using easyimage and imagemagick for storing the images.
After uploading an image, the application should show the newly uploaded image, but it is not displayed even if the page is refreshed. However, when I do sails lift, the image is shown.
How do I show an image immediately after uploading it? Thanks a lot!
It's a totally normal situation because of the way Sails works with assets.
The thing is that upon sails lift the assets are copied (including directory structure and symlinks) from the ./assets folder to ./.tmp/public, which becomes publicly accessible.
So, in order to show your images immediately after upload, you basically need to upload them not to ./assets/uploads but to ./.tmp/public/uploads.
The only problem now is that the ./.tmp folder is rewritten each time your application restarts, so storing uploads in ./.tmp/... would mean they get erased after every sails lift. The solution here would be storing uploads in, for example, ./uploads and having a symlink ./assets/uploads pointing to ../uploads.
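A minimal sketch of setting that up (assuming it runs once from the project root, e.g. in a bootstrap script; the paths are the ones from this answer):

var fs = require('fs');

// Keep the real uploads outside ./assets so restarts don't erase them.
if (!fs.existsSync('./uploads')) {
    fs.mkdirSync('./uploads');
}
// assets/uploads -> ../uploads, so Grunt carries it into .tmp/public.
if (!fs.existsSync('./assets/uploads')) {
    fs.symlinkSync('../uploads', './assets/uploads', 'dir');
}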
Though this question is pretty old, I would like to add a solution I just implemented.
Today I spent almost 4 hours trying all those solutions out there, but none helped. I hope this solution will save someone else's time.
WHY are images not available immediately after uploading to a custom directory?
Because according to the default Sails setup, you cannot access assets directly from the assets directory. Instead you access the assets that were copied to the .tmp/public directory by Grunt at the time of sails lift.
THE problems
(Available but volatile) If you upload a file (say an image) anywhere inside the .tmp/public directory, your file is going to be erased at the next sails lift.
(Unavailable) If you upload a file into any other custom directory, say ./assets/images, the uploaded file will not be available immediately, only after the next sails lift. That doesn't make sense in production, because you can't restart the server every time a file gets uploaded.
MY SOLUTION (say I want my images to end up in the ./assets/images dir)
Upload the file, say image.ext, to ./.tmp/public/images/image.ext (available but volatile).
On upload completion, make a copy of the file to ./assets/images/image.ext (future-proof).
CODE
var fs = require('fs');

var uploadToDir = '../public/images';
req.file("incoming_file").upload({
    saveAs: function(file, cb) {
        cb(null, uploadToDir + '/' + file.filename);
    }
}, function whenDone(err, files) {
    if (err) return res.serverError(err);
    if (files.length > 0) {
        var ImagesDirArr = __dirname.split('/'); // path to this controller
        ImagesDirArr.pop();
        ImagesDirArr.pop();
        var path = ImagesDirArr.join('/'); // path to the root of the project
        var _src = files[0].fd; // path of the uploaded file
        // the destination path
        var _dest = path + '/assets/images/' + files[0].filename;
        // not preferred, but the fastest way of copying a file
        fs.createReadStream(_src).pipe(fs.createWriteStream(_dest));
        return res.json({ msg: "File saved", data: files });
    }
});
I don't like this solution at all, yet it saved me a lot of time and it works perfectly in both dev and prod environments.
Thanks
Sails uses Grunt to handle asset syncing. By default, the grunt-watch task ignores empty folders, but as long as there's at least one file in a folder, it will always sync it. So the quickest solution here, if you're intent on using the default static middleware to serve your uploaded files, is to just make sure there's always at least one file in your assets/uploads folder when you do sails lift. As long as that's the case, the uploads folder will always be synced to your .tmp/public folder, and anything that's uploaded to it subsequently will be automatically copied over and available immediately.
Of course, this will cause all of your uploaded files to be copied into .tmp/public every time you lift Sails, which you probably don't want. To solve this, you can use the symlink trick #bredikhin posted in his answer.
Try to do this:
- npm install grunt-sync --save-dev --save-exact
- uncomment the line // grunt.loadNpmTasks('grunt-sync'); (see the sketch after these steps); usually it is near the end of the file tasks/config/sync.js
- lift the app again
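For reference, after uncommenting, the relevant task config looks roughly like this (a sketch based on a default Sails project; details vary by version):

// tasks/config/sync.js
module.exports = function(grunt) {
    grunt.config.set('sync', {
        dev: {
            files: [{
                cwd: './assets',
                src: ['**/*.!(coffee)'],
                dest: '.tmp/public'
            }]
        }
    });
    grunt.loadNpmTasks('grunt-sync');
};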
Back to the original answer:
I was using node version 10.15.0 and faced the same problem. I solved it by updating to the current version of node (12.4.0) and also updated npm and all the node modules. After this, I fixed the vulnerabilities (just run 'npm audit fix'), and the grunt error that appeared while uploading images to the assets/images folder was gone.
Try out this implementation:
Create a helper to sync the file. Example of the filesync helper:
// import at the top of the file
const fs = require('fs');

module.exports = {
    friendlyName: 'Upload sync',
    description: '',
    inputs: {
        filename: {
            type: 'string'
        }
    },
    exits: {
        success: {
            description: 'All done.',
        },
    },
    fn: async function ({ filename }) {
        var uploadLocation = sails.config.custom.profilePicDirectory + filename;
        var tempLocation = sails.config.custom.tempProfilePicDirectory + filename;
        // Copy the file to the .tmp folder so that it becomes available
        // immediately. copyFile returns a promise, so await actually waits
        // for the copy to finish (awaiting a stream pipe does not).
        await fs.promises.copyFile(uploadLocation, tempLocation);
        return;
    }
};
Now call this helper to sync your files to the .tmp folder:

const fileName = result[0].fd.split("\\").reverse()[0];
// Sync to the .tmp folder
await sails.helpers.uploadSync(fileName);
For reference, the directories saved in config (sails.config.custom):

profilePicDirectory: path.join(path.resolve(), "assets/images/uploads/profilePictures/"),
tempProfilePicDirectory: path.join(path.resolve(), ".tmp/public/images/uploads/profilePictures/"),
You can also try:

process.cwd() + filepath
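A minimal sketch of where those values could live (assuming Sails 1.x, where sails.config.custom is defined in config/custom.js; the directory names are the ones above):

// config/custom.js
const path = require('path');

module.exports.custom = {
    profilePicDirectory: path.join(path.resolve(), "assets/images/uploads/profilePictures/"),
    tempProfilePicDirectory: path.join(path.resolve(), ".tmp/public/images/uploads/profilePictures/"),
};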
I have an install4j project and want to build an update installer.
There is a libs folder within my setup where all my jars are packaged:
<install-dir>/libs/myA.jar
<install-dir>/libs/myB.jar
<install-dir>/libs/alien.jar
The alien.jar is an additional file, not created by my setup.
If I deliver an update with a packaged libs folder, the alien.jar gets deleted, and I don't understand why.
There is a DeleteFilesAction before the InstallFilesAction, which has a filter so that alien.jar won't be deleted. If I show a message box after my DeleteFilesAction, the alien.jar still exists.
When the InstallFilesAction starts, the complete libs folder seems to be deleted.
I believe there are configuration flags which I don't know about.
Question:
How do I delete the complete libs folder, except the alien.jar?
I think it is not possible with a 'Delete files and directories' action to delete all (unknown) files within a directory.
I used an action where <install-dir>/libs is provided as the folder to be deleted. I provided a filter script for file names (alien.jar returned "false", the rest should be deleted). But the action deletes the libs dir as well. I think it works as designed, so it is not a bug.
For all people with the same problem, a workaround or solution:
com.install4j.runtime.installer.helper.fileinst.FileInstaller fi =
        com.install4j.runtime.installer.helper.fileinst.FileInstaller.getInstance();
File directory = new File(context.getInstallationDirectory(), "libs");
File[] filesInLib = directory.listFiles();
if (filesInLib != null) {
    for (File file : filesInLib) {
        boolean shouldDelete = true;
        if (file.getName().toLowerCase().contains("alien")) {
            Util.logInfo(null, "alien jar detected, keeping: " + file.getName());
            shouldDelete = false;
        }
        Util.logInfo(null, "Delete " + file.getAbsolutePath() + ": " + shouldDelete);
        if (shouldDelete && file.exists()) {
            Util.logInfo(null, "delete file: " + file);
            fi.deleteFile(file);
        }
    }
}
return false;
@Ingo Kegel:
Thanks for your help. I didn't want to open a support ticket because I was quite sure it is not a bug in your product. But I think another checkbox, "delete if not empty", would help a lot in the DeleteFilesOrDirectory action.
My question is quite simple, and with the SharpSvn API it should be easy as well. Here is what I did:
path = "c:\project";
using (SvnLookClient client = new SvnLookClient())
{
SvnLookOrigin o = new SvnLookOrigin(path);
Collection<SvnChangedEventArgs> changeList;
client.GetChanged(o, out changeList); // <-- Exception
}
and when I call GetChanged, I get an exception:
Can't open file 'c:\project\format': The system cannot find the file specified.
So maybe there is something I'm missing? Or maybe this is not the right way to find out the list of files and folders that were modified in the local repository?
Thanks in advance.
The SvnLookClient class in SharpSvn is the equivalent of the 'svnlook' console application. It is a low-level tool that enables repository hooks to look into specific transactions of a repository using direct file access.
You probably want to use the SvnClient class to look at a working copy, and most likely its Status(), or in some cases the simpler GetStatus() function, to see what changed.
The path that the SvnLookOrigin constructor wants is actually:
path = @"c:\project\.svn\";
That is, it wants that special ".svn" directory, not just the root of where the source is checked out to.
Although you probably do want to listen to Bert and do something like:

string path = @"c:\project";
using (SvnClient client = new SvnClient())
{
    Collection<SvnStatusEventArgs> changeList;
    // Returns the differences between the working copy and its base revision.
    client.GetStatus(path, out changeList);
}