We are using the SharpSVN 1.5 DLL for our source control functionality.
Our checkout and check-in work as follows.
Suppose I want to check out a folder named TEST that contains 3 files, say file1.txt, file2.txt, and file3.txt:
Step 1:- Checkout file1.txt from SVN repository
Step 2:- Checkout file2.txt from SVN repository
Step 3:- Checkout file3.txt from SVN repository
During the file1.txt checkout operation a .svn folder is created in our working folder. This .svn folder contains a file named entries, which holds the SVN repository and checked-out file info (we can open it in Notepad). When the file2.txt checkout happens (into the same working folder), no new .svn folder is created; SharpSvn uses the existing .svn folder (from the previous file's checkout) and appends the file2.txt info to the entries file. The same thing happens when file3.txt is checked out into the same working folder.
During check-in we first check in file1.txt, then file2.txt, then file3.txt. SVN uses the single .svn folder (in the working folder) for all of these files. During the file1.txt check-in, the entries file contains its info, so the file is under SVN version control and can be checked in successfully; the same applies to file2.txt and file3.txt.
Now we are trying to use the SharpSVN 1.6 DLL, but we are facing some issues in the checkout and check-in operations.
During the file1.txt checkout a .svn folder is created and the entries file contains the file1.txt info. During the file2.txt checkout, the existing .svn folder is deleted and a new .svn folder is created, so entries contains only the file2.txt info, not the file1.txt info. When I try to check in, only the last file I checked out from SVN is checked in, because the entries file in the .svn folder contains only the last checked-out file's info.
I need to get all the file info into the entries file using the SharpSvn 1.6 DLL.
My code snippet is as follows:
public string[] CheckOut(string pSCPath, string pComment, string pLocalPath, int pRevisionNum)// Checks out a file from svn
{
string[] strCheckoutDetails = new string[2];
Uri uriSCPath = new Uri(pSCPath);
SvnCheckOutArgs objChkoutargs = new SvnCheckOutArgs();
objChkoutargs.Revision = pRevisionNum;
SvnInfoEventArgs info;
try
{
objChkoutargs.Depth = SvnDepth.Empty;
string strSingleFiletoCheckout = uriSCPath.ToString();
string strFolderNameofSingleFileSelected = strSingleFiletoCheckout.Remove(strSingleFiletoCheckout.LastIndexOf('/'));
Uri UriSingleFileCheckout = new Uri(strFolderNameofSingleFileSelected);
_objSVNClient.CheckOut(UriSingleFileCheckout, pLocalPath, objChkoutargs); //empty working folder
SvnTarget target = new Uri(strSingleFiletoCheckout);
string strFileNameonlyfromUri = strSingleFiletoCheckout.Substring(strSingleFiletoCheckout.LastIndexOf("/") + 1);
if (!copyFiletoWorkingCopy(pLocalPath, strFileNameonlyfromUri, _objSVNClient)) //make versioned file available to the current working copy - Biju
{
pLocalPath = "";
throw new SharpSvn.SvnException("Checkout Exception");
}
_objSVNClient.GetInfo(uriSCPath, out info);
strCheckoutDetails[0] = info.LastChangeRevision.ToString();
}
catch (Exception ex)
{
pLocalPath = "";
throw; // rethrow without losing the original stack trace
}
strCheckoutDetails[1] = pLocalPath;
return strCheckoutDetails;
}
Thanks
Reji
Checking out individual files directly is not supported in Subversion. What you can do, and what still works in Subversion / SharpSvn 1.6, is:
Check out a directory with depth=empty. This creates a working copy with no files inside. Then pull individual files into it:
svn update --set-depth=files file1.txt
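For reference, the initial empty checkout of the first step can be done from the command line as well; a rough sketch (the repository URL and target directory are placeholders):
svn checkout --depth empty http://server/repos/directory working-dir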
The SharpSvn example code below checks out an empty working copy and fetches file1.txt and file2.txt. If you fetch file3.txt similarly at a later point, all 3 files are in entries, and you're able to execute all Subversion commands on them.
using(SvnClient client = new SvnClient())
{
SvnCheckOutArgs coArgs = new SvnCheckOutArgs();
coArgs.Depth = SvnDepth.Empty;
client.CheckOut(new Uri("http://server/repos/directory"), targetDir, coArgs);
SvnUpdateArgs updateArgs = new SvnUpdateArgs();
updateArgs.Depth = SvnDepth.Files;
client.Update(new string[] { Path.Combine(targetDir, "file1.txt"), Path.Combine(targetDir, "file2.txt") }, updateArgs);
}
I am using SmartGit with GitHub.
I have a config.json file in my remote GitHub repository, with hidden passwords, at the root of the app.
I need to keep a different config.json file in my local repository, with the real passwords.
Even though I try to ignore config.json locally, it is sometimes still recorded as 'modified'.
Other times, when it finally gets ignored via right-click > Ignore, it shows 1 'staged', and config.json ends up being deleted from GitHub when I push the commit. I don't understand why.
This is my .gitignore file:
.DS_Store
/config.json
config.json
node_modules
/uploads
/node_module
/dist
# local env files
.env.local
.env.*.local
# Log files
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Editor directories and files
.idea
.vscode
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
My config.json file, with blanks that I need to leave as they are on GitHub, because Heroku needs them:
{
"localhost_db": "mongodb://localhost:27017/",
"mongoDb_atlas_db": "mongodb+srv://jose:x#cluster0-6kmcn.azure.mongodb.net/vue-starter-webpack?retryWrites=true&w=majority",
"dev": false,
"db_name": "vue-starter-webpack",
"ftp_config": {
"host": "ftpupload.net",
"user": "epiz_26763901",
"password": "x",
"secure": false
},
"node_file_path": "./tmp/files/",
"cloudinary_token": {
"cloud_name": "ddq5asuy2",
"api_key": "354237299578646",
"api_secret": "x"
},
"logs_path": "tmp/logs/logs.txt"
}
Is there any workaround? I have tried plenty of things already. What does 'staged' mean? How can I keep a different version of the file on GitHub and locally?
EDIT: I am trying out this command (from "Ignore modified (but not committed) files in git?"), and it seems to work:
git update-index --assume-unchanged config/database.yml
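(Side note: if Git should ever track changes to that file again, the flag can be cleared with the inverse option; the path is kept as in the example above.)
git update-index --no-assume-unchanged config/database.yml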
I have a config.json file in my remote GitHub repository, with hidden passwords, at the root of the app.
That... is not a good practice. If that file (config.json) contains any sensitive information, it should not be added/committed, but explicitly ignored.
What you can commit is config.json.tpl, a template file (which is essentially what your config.json is right now).
From there, you could generate the right config.json file locally, and automatically on git clone/git checkout.
The generation script will:
search the right passwords from an external secure referential (like a vault)
replace the placeholder value in config.json.tpl to generate the right config.json
For that, register a content filter driver in a .gitattributes declaration.
(image from "Customizing Git - Git Attributes", from "Pro Git book")
The smudge script will generate (automatically, on git checkout or git switch) the actual config.json file as mentioned above.
Again, the generated actual config.json file remains ignored (by the .gitignore).
See a complete example at "git smudge/clean filter between branches".
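As an illustration only (the filter name, script name, and placeholder tokens below are assumptions, not part of the original answer), the setup could look roughly like this:
# .gitattributes (committed): run the "config" filter for the template file
config.json.tpl filter=config
# one-time local setup: register the filter driver
git config filter.config.smudge ./generate-config.sh
git config filter.config.clean cat
And a minimal generate-config.sh sketch:
#!/bin/sh
# Receives the committed config.json.tpl content on stdin.
tpl=$(cat)
# Side effect: write the real, git-ignored config.json, filling placeholders
# from environment variables (a vault lookup would work the same way).
printf '%s\n' "$tpl" \
  | sed -e "s/__FTP_PASSWORD__/${FTP_PASSWORD}/" \
        -e "s/__API_SECRET__/${API_SECRET}/" > config.json
# Pass the template through unchanged so the tracked file keeps its placeholders.
printf '%s\n' "$tpl"
The key point is that the smudge script passes the template content through untouched, so the committed config.json.tpl never contains real secrets, while the generated config.json stays ignored.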
I need to get the path of the file inside the private folder.
On my local machine I was able to get it by using the path "../../../../../"; however, when I deployed to the Meteor server using meteor deploy, it doesn't work anymore. I also tried to log the current directory using process.cwd() and got the following, which is different from the structure I get on my local machine:
/meteor/containers/3906c248-566e-61b7-4637-6fb724a33c16/bundle/programs/server
The directory logged from my local machine gives:
/Users/machineName/Documents/projectName/.meteor/local/build/programs/server
Note: I am using this path to setup https://www.npmjs.com/package/apn
You can use assets/app/ as the relative path. While this may not make sense at first glance, Meteor rearranges your /private directory to map to assets/app from the /programs/server directory. This applies both in development and in production.
Basically assume that private/ maps to assets/app/.
Call Assets.absoluteFilePath(assetPath) on one of the assets in the private folder, then chop off the name of the asset file from the string you get back. For example, assuming you have a file called test.txt in the private folder:
var aFile = 'test.txt'; // test.txt is in the private folder
var aFilePath = Assets.absoluteFilePath(aFile);
var aFolder = aFilePath.substr(0, aFilePath.length - aFile.length);
console.log(aFolder);
https://docs.meteor.com/api/assets.html#Assets-absoluteFilePath
I would like to get the source code of a project at a specific time (changeset), so I need to download a whole folder. I would like to do this for different points in time, and handling a different workspace each time is not very convenient.
I know about "TFS Get Specific Version into separate folder (with workspace)" and "Need command to get a file from TFS without a workspace" (one file).
Is there some solution for whole folder without creating a new workspace?
Edit
I found the accepted answer too ambitious; I needed something simpler.
Assumptions:
I can access TFS from Visual Studio on my computer
I want to get the ChangeSetNumber changeset of the folder DesiredFolder in TFS project tProj
I run the following batch from a destination folder in the Visual Studio Command Prompt:
set workspace_name=TemporaryWorkspace%username%
set changeset=ChangeSetNumber
tf workspace -new %workspace_name% -noprompt
tf workfold -map $/tProj . -workspace:%workspace_name%
tf get $/tProj/DesiredFolder -version:C%changeset% -recursive -noprompt
tf workfold -unmap . -workspace:%workspace_name%
tf workspace -delete %workspace_name% -noprompt
It is necessary to confirm removing the source control association when opening the downloaded solution.
I use this syntax for temporary workspaces:
tf workspace -new %JOB_NAME%;%user% -noprompt -server:http://%host%:8080/tfs/%project% -login:%user%,%password%
tf workfold -map $/Release/MilestoneX.X . -workspace:%JOB_NAME% -server:http://%host%:8080/tfs/%project% -login:%user%,%password%
tf get . -version:L%TFS_LABEL% -recursive -noprompt -login:%user%,%password%
tf workfold -unmap . -workspace:%JOB_NAME% -login:%user%,%password%
tf workspace -delete %JOB_NAME%;%user% -noprompt -server:http://%host%:8080/tfs/%project% -login:%user%,%password%
I've discovered that you can do this through the HTTP api that TFS exposes.
The "signature" for the URL is as follows:
http(s)://{server}:{port}/tfs/{collectionName}/{teamProjectName}/_api/_versioncontrol/itemContentZipped?version={versionSpec}&path={escapedPathToFolder}
So, if you have a project named "MyProject" in the DefaultCollection, and want to get the content of a folder called "MyFeature":
http://MyTfsServer:8080/tfs/DefaultCollection/MyProject/_api/_versioncontrol/itemContentZipped?version=C1001&path=%24%2FMyProject%2FMyFeature
I think "version" can be any version spec, which is documented in the TFS API documentation. My example is requesting the version as of change set 1001. I was using the .NET API to get a specific version, which is pretty straightforward, but slow because it can only get one file at a time. I'm trying to figure out if this same functionality is exposed through the .NET API because downloading the files this way is much much faster than getting a single file at a time.
I implemented this as an extension method on Microsoft.TeamFoundation.VersionControl.Client.Item. This returns a stream that contains a zip file. I had used this as part of a custom MSBuild task which then saves the contents of this stream to a file location.
public static class TfsExtensions
{
const String ItemContentZippedFormat = "/_api/_versioncontrol/itemContentZipped?version={0}&path={1}&__v=3";
public static Stream DownloadVersion(this Item folder, VersionSpec version)
{
if (folder.ItemType != ItemType.Folder)
throw new ArgumentException("Item must be a folder", "folder");
var vcs = folder.VersionControlServer;
var collectionName = vcs.TeamProjectCollection.CatalogNode.Resource.DisplayName;
var baseUri = folder.VersionControlServer.TeamFoundationServer.Uri;
if (!baseUri.LocalPath.EndsWith(collectionName, StringComparison.OrdinalIgnoreCase))
baseUri = new Uri(baseUri, baseUri.LocalPath + "/" + collectionName);
var apiPath = String.Format(ItemContentZippedFormat, version.DisplayString, WebUtility.UrlEncode(folder.ServerItem));
var downloadUri = new Uri(baseUri, baseUri.LocalPath + apiPath);
var req = WebRequest.Create(downloadUri);
req.Credentials = CredentialCache.DefaultCredentials;
var response = req.GetResponse();
return response.GetResponseStream();
}
}
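If you only need the zip from a script rather than from .NET, the same URL can also be fetched directly; a rough sketch using the example URL from above (the credentials are placeholders, and NTLM authentication is assumed):
curl --ntlm -u "DOMAIN\user:password" -o MyFeature_C1001.zip "http://MyTfsServer:8080/tfs/DefaultCollection/MyProject/_api/_versioncontrol/itemContentZipped?version=C1001&path=%24%2FMyProject%2FMyFeature"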
I think you should create a temporary Workspace to retrieve the content you want, then delete the Workspace and keep the local items.
A Workspace in TFS is a local view of what's on the server, for a given Workspace you choose which folder(s) you want to retrieve locally and where you'll store the folders/files.
It's not like SourceSafe: you're not bound to only one workspace; you can have as many as you want on a given computer.
So I suggest you create a dedicated Workspace for the operation you want to perform and get rid of it when you judge it appropriate.
Use the TF.exe workspace command to create/delete a Workspace from the shell, then TF.exe get to retrieve the files.
You can use tf view to get a specific file without creating a workspace.
Retrieves a specific version of a file to a temporary folder on your computer and displays it.
tf vc view [/collection:TeamProjectCollectionUrl]
[/console] [/recursive] [/output:localfile]
[/shelveset:shelvesetname[;owner]] [/noprompt] itemspec
[/version:versionspec] [/login:username,[password]]
Versionspec:
  Date/Time          D"any .NET Framework-supported format"
                     or any of the date formats of the local machine
  Changeset number   Cnnnnnn
  Label              Llabelname
  Latest version     T
  Workspace          Wworkspacename;workspaceowner
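For example, a hypothetical invocation that writes a single file as of a specific changeset to the current directory, without a workspace (the collection URL, server path, and file name are placeholders following the question's naming):
tf vc view /collection:http://MyTfsServer:8080/tfs/DefaultCollection /version:C12345 /output:SomeFile.cs $/tProj/DesiredFolder/SomeFile.cs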
How do I create a new file in the user directory of a NetBeans Platform application? I used:
System.getProperty("netbeans.user", "user.home") + "/myfile"
But NetBeans IDE 7.1.1 told me that this is deprecated and that I should use InstalledFileLocator instead. OK, I tried this:
File file = InstalledFileLocator.getDefault().locate("myfile", null, false);
It works fine if the file already exists, but I cannot see any way to create a new file with InstalledFileLocator. The Javadoc says the method also allows getting a folder, so I tried this:
File file = InstalledFileLocator.getDefault().locate("myfile", null, false);
if (file == null) {
file = new File(InstalledFileLocator.getDefault().locate("", null, false), "myfile");
}
Again without success; the locate method now fails because it can't find anything ("/" is forbidden and does not work either).
So my question is: how do I correctly load an existing file from the user directory in my NetBeans Platform application (it is also written to, so it should not be in the program directory), and create it if it does not exist?
You could use Places.getUserDirectory().
File file = InstalledFileLocator.getDefault().locate("myfile", null, false);
if (file == null)
{
file = new File(Places.getUserDirectory(), "myfile");
}
From the NetBeans Platform docs, InstalledFileLocator should not be used to find resources on the system filesystem. To find data in the system filesystem, use the Filesystems API. For example:
FileObject fo = FileUtil.getConfigFile("myfile"); // path within the config (system) filesystem
if (fo == null) {
fo = FileUtil.getConfigRoot().createData("myfile");
}
Probably the easiest thing you can do is to include a simple empty file (say "here.txt") in your module that will be installed in the user directory automatically. You can see an example of this here (see the section "Lessons learned: bundling files with your NetBeans modules").
Basically you include the file in the "release/modules/ext/here.txt" directory of your module.
When the module is installed the platform will install the 'here.txt' file included in your module in the user directory automatically for you, so you don't have to worry about this.
Once your module is installed and running, you can locate the file like this:
File hereTXT = InstalledFileLocator.getDefault()
.locate("modules/ext/here.txt",
"a.b.c",
false);
(Where "a.b.c" is your module identifier.)
And then from that 'hereTXT' file you can get the directory with 'hereTXT.getParent()', and you're all set.
You can use hg grep, but it searches the contents of all files.
What if I just want to search the file names of deleted files to recover one?
I tried hg grep -I <file-name-pattern> <pattern> but this seems to return no results.
Using templates, this is simple:
$ hg log --template "{rev}: {file_dels}\n"
Update for Mercurial 1.6
You can use revsets for this too:
hg log -r "removes('**')"
(Edit: Note the double * - a single one detects removals from the root of the repository only.)
Edit: As Mathieu Longtin suggests, this can be combined with the template from dfa's answer to show you which files each listed revision removes:
hg log -r "removes('**')" --template "{rev}: {file_dels}\n"
That has the virtue (for machine-readability) of listing one revision per line, but you can make the output prettier for humans by using % to format each item in the list of deletions:
hg log -r "removes('**')" --template "{rev}:\n{file_dels % '{file}\n'}\n"
If you are using the TortoiseHg Workbench, a convenient way is to use the revision filter. Just hit Ctrl+S, and then type
removes("**/FileYouWantToFind.txt")
**/ indicates that you want to search recursively in your repository.
You can use the * wildcard in the filename too, and you can combine this query with other revision sets using the and / or operators.
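For instance, a hypothetical combination that also restricts the search to a single author (the author name is a placeholder):
removes("**/FileYouWantToFind.txt") and user("alice")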
There is also an Advanced Query Editor.
I have taken the other answers and improved on them:
Added --no-merges. On a large project with dev teams there will be lots of merges; --no-merges filters out that log noise.
Changed removes("**") to sort(removes("**"), -rev). For a large project with over 100K changesets this reaches the most recently removed files a lot faster, because it reverses the order to start at tip instead of rev 0.
Added {author} and {desc} to the output. This gives context as to why the files were removed, by showing the log comment and who made the change.
So for my use case, it was hg log --template "File(s) deleted in rev {rev}: {author} \n {desc}\n {file_dels % '\n {file}'}\n\n" -r 'sort(removes("**"), -rev)' --no-merges
Sample output:
File(s) deleted in rev 52363: Ansariel
STORM-2141: Fix various inventory floater related issues:
* Opening new inventory via Control-Shift-I shortcut uses legacy and potentinally dangerous code path
* Closing new inventory windows don't release memory
* During shutdown legacy and inoperable code for inventory window cleanup is called
* Remove old and unused inventory legacy code
indra/newview/llfloaterinventory.cpp
indra/newview/llfloaterinventory.h
File(s) deleted in rev 51951: Ansariel
Remove readme.md file - again...
README.md
File(s) deleted in rev 51856: Brad Payne (Vir Linden) <vir#lindenlab.com>
SL-276 WIP - removed avatar_skeleton_spine_joints.xml
indra/newview/character/avatar_skeleton_spine_joints.xml
File(s) deleted in rev 51821: Brad Payne (Vir Linden) <vir#lindenlab.com>
SL-276 WIP - removed avatar_XXX_orig.xml files.
indra/newview/character/avatar_lad_orig.xml
indra/newview/character/avatar_skeleton_orig.xml
Search for a specific file you deleted efficiently, and format the result nicely:
hg log --template "File(s) deleted in rev {rev}: {file_dels % '\n {file}'}\n\n" -r 'removes("**/FileYouWantToFind.txt")'
Sample output:
File(s) deleted in rev 33336:
class/WebEngineX/Database/RawSql.php
File(s) deleted in rev 34468:
class/PdoPlus/AccessDeniedException.php
class/PdoPlus/BulkInsert.php
class/PdoPlus/BulkInsertInfo.php
class/PdoPlus/CannotAddForeignKeyException.php
class/PdoPlus/DuplicateEntryException.php
class/PdoPlus/Escaper.php
class/PdoPlus/MsPdo.php
class/PdoPlus/MyPdo.php
class/PdoPlus/MyPdoException.php
class/PdoPlus/NoSuchTableException.php
class/PdoPlus/PdoPlus.php
class/PdoPlus/PdoPlusException.php
class/PdoPlus/PdoPlusStatement.php
class/PdoPlus/RawSql.php
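Once the revision that removed a file is known, the file can be brought back from that revision's parent; a sketch using the first entry of the sample output above (the "^" suffix means "parent of" that revision):
hg revert -r "33336^" class/WebEngineX/Database/RawSql.php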
From the project root:
hg status . | grep "\!" >> /tmp/filesmissinginrepo.txt