Visual Source Safe Automated Commit

I am currently attempting to set up an automated add/checkout/checkin script for MS Visual SourceSafe via the command line. Online documentation is... lacking... and I was hoping that someone else had tried doing something like this in the past.
Before going any further, I am well aware that there are other, better alternatives to VSS, so please don't give "use SVN" as an answer.
The files I've got are a scripted version of our database schema, and they look like this in the repo:
$/project_name/DBScripts/servers/databases/object_types, where object_types are Tables, StoredProcedures, etc.
I am attempting to do the following:
1- Script all database objects to files. This part is done and working correctly.
2- Add all new files to repo.
3- Commit all files that have changed. Make sure files do NOT remain checked out or read-only.
EDIT 2:
Removed old code again. Included current code below. Add works correctly, but the checkout command does NOT work on any files that were changed locally.
In this context, if I were to modify stored proc A, script it to file, then try running the batch commands below, all procs BUT A will be checked out.
I've included 2 examples of the checkout command. Neither is working...
set PATH=%path%;C:\Program Files\Microsoft Visual SourceSafe
set SSDIR=repo_path
cd DBScripts/server/database/StoredProcedures
rem Point ss at the matching project folder
ss cp $/project/DBScripts/server/database/StoredProcedures
rem Add all files to the project
for %%F in (*.*) do ss add %%~nF%%~xF -C- -I-N -K- -W
rem Checkout attempt 1: one file at a time
for %%F in (*.*) do ss checkout $/project/DBScripts/server/database/StoredProcedures/%%~nF%%~xF -C- -G- -M- -L+
rem Checkout attempt 2: the whole folder at once
ss checkout $/project/DBScripts/server/database/StoredProcedures *.* -C- -G- -M- -L+ -Vltemp
rem Check everything back in
for %%F in (*.*) do ss checkin %%~nF%%~xF -C- -K- -P $/project/DBScripts/server/database/StoredProcedures -W
cd ../../../..
Note: SourceSafe's "-R" (recursive) flag is inconsistent. I'd rather loop through all subfolders manually and run "for %%F in (*.*)" commands per folder.
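For reference, a minimal sketch of that manual per-folder loop (the echo is just a placeholder for the real ss commands):

for /D %%D in (*) do (
    pushd %%D
    for %%F in (*.*) do echo would process %%F in %%D
    popd
)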

A possible reason is that the checkin command executes before the prior add/checkout commands have finished. Try checking that the add succeeded before running the checkout command, and that the checkout succeeded before running the checkin command.
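For example, a hedged sketch of gating each step on the previous one's exit code (the ss options are copied from the script above; the exact non-zero exit codes ss.exe returns should be verified):

for %%F in (*.*) do (
    ss add %%~nF%%~xF -C- -I-N -K- -W
    if errorlevel 1 echo add reported a problem for %%F
)
for %%F in (*.*) do (
    ss checkout $/project/DBScripts/server/database/StoredProcedures/%%~nF%%~xF -C- -G- -M- -L+
    if not errorlevel 1 ss checkin %%~nF%%~xF -C- -K- -P $/project/DBScripts/server/database/StoredProcedures -W
)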

How to upload a lot of files at once using FileZilla (possibly using a file containing the list of files to publish)?

Is there a way, using FileZilla, to publish many files at once (currently I have to choose them one by one every time, because they can be in different directories and I can't publish the whole directory)?
The ideal solution I am looking for is to use a single .txt file where I can paste the list of paths I want to publish and then somehow tell FileZilla to use it and publish each file to the remote server.
FileZilla lets you export the list of the files you have published with File -> Export in XML format. I am looking for something like this but I need to do the opposite operation.
If someone has some insights on it, please share them with me. Thanks!
P.S.: currently, I also use the NetBeans IDE and publish files with it by right-clicking and selecting Upload. If there's a way to do the same with NetBeans, that would be great (I write PHP code).
Thanks for your attention.
FileZilla does not allow any kind of automation.
See How do I send a file with FileZilla from the command line?
But you can use any other command-line FTP client.
For example, the WinSCP FTP client has an Uploading a list of files example that covers exactly your task:
You may use the following batch file that calls a WinSCP script:
@echo off
set SESSION=ftp://user:password@example.com/
set REMOTE_PATH=/home/user/
echo open %SESSION% >> script.tmp
rem Generate "put" command for each line in list file
for /F %%i in (list.txt) do echo put "%%i" "%REMOTE_PATH%" >> script.tmp
echo exit >> script.tmp
winscp.com /script=script.tmp
set RESULT=%ERRORLEVEL%
del script.tmp
rem Propagating WinSCP exit code
exit /b %RESULT%
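For reference, list.txt simply holds one path per line, e.g. (made-up names):

public_html/index.php
public_html/lib/db.php
images/logo.png

Note that a plain "for /F %%i" splits each line at the first space, so if any of your paths contain spaces, a variant like this (a sketch, untested) keeps the whole line:

for /F "usebackq delims=" %%i in ("list.txt") do echo put "%%i" "%REMOTE_PATH%" >> script.tmp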

CTools do not show up in Pentaho UI

I am using Pentaho CE 5 on Windows. I would like to use CTools, but I can't make them show up in the File -> New menu.
Being behind a proxy, I cannot use the Marketplace plugin, so I have tried a manual installation.
First, I tried to use ctools-installer.sh. I ran the following command line in Cygwin (wget and unzip are installed):
./ctools-installer.sh -s /cygdrive/d/Users/[user]/Mes\ Programmes/pentaho/biserver-ce/pentaho-solutions/ -w /cygdrive/d/Users/[user]/Mes\ programmes/pentaho/biserver-ce/tomcat/webapps/pentaho/
The script starts, asks me which modules I want to install, and begins the downloads.
For each module, I get output like this (set -x added to the script):
echo -n 'Downloading CDF...'
Downloading CDF...
+ wget -q --no-check-certificate 'http://ci.analytical-labs.com/job/Webdetails-CDF-5-Release/lastSuccessfulBuild/artifact/bi-platform-v2-plugin/dist/zip/dist.zip' -O .tmp/cdf/dist.zip
SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
'[' '!' -z '' ']'
rm -f .tmp/dist/marketplace.xml
unzip -o .tmp/cdf/dist.zip -d .tmp
End-of-central-directory signature not found. Either this file is not a zipfile, or it constitutes one disk of a multi-part archive. In the latter case the central directory and zipfile comment will be found on the last disk(s) of this archive.
unzip: cannot find zipfile directory in .tmp/cdf/dist.zip, and cannot find .tmp/cdf/dist.zip.zip, period.
chmod -R u+rwx .tmp
echo Done
Done
Then the script ends. I have seen on this page (pentaho-bi-suite) that this is the normal output. Nevertheless, it seems a bit strange to me, and when I start my Pentaho server (login: admin/password), I cannot see any new tools in the menus.
After a look at a few other tutorials and at the script itself, I downloaded the .zip snapshots for every tool and unzipped them into the system directory of my Pentaho server. Same result.
I would like to make the .sh script work; what can I try or adjust?
Thanks
EDIT 05/06/2014
I checked the dist.zip files downloaded by the script and they are all empty. It seems that wget cannot fetch the zip files, and therefore the installation fails.
When I try to get any webpage through wget, it fails. I think it is because of the proxy.
Here is my .wgetrc file, located in my user's cygwin home folder:
use_proxy=on
http_proxy=http://[url]:[port]
https_proxy=http://[url]:[port]
proxy_user=[user]
proxy_password=[password]
How could I make this work?
EDIT 10/06/2014
In the end, I changed my network connection settings to bypass the proxy. It seems that there is an offline mode for the installer, so one can download all the needed files in a proxy-free environment and then run the script offline.
I guess this is related to the -r option.
I consider this post solved, since it is not a CTools issue anymore.
It is difficult to identify the issue in the above procedure, but you can refer to this blog; its author is a key member of Pentaho itself.
You can manually install the components from http://www.webdetails.pt/ctools/ or, if you have Pentaho 5.1 or above, add the following parameters to the CATALINA_OPTS option (in start-pentaho.bat or start-pentaho.sh):
-Dhttp.proxyHost= -Dhttp.proxyPort= -Dhttp.nonProxyHosts="localhost|127.0.0.1|10.*.*.*"
http://docs.treasuredata.com/articles/pentaho-dataintegration#tips-how-can-i-use-pentaho-through-a-proxy
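For example, in start-pentaho.bat that could look something like the following (the proxy host, port, and memory settings are placeholders for your own values):

set CATALINA_OPTS=-Xms2048m -Xmx6144m -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttp.nonProxyHosts="localhost|127.0.0.1|10.*.*.*"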

Perforce: Prevent keywords from being expanded when syncing files out of the depot?

I have a situation where I'd like to diff two branches in Perforce. Normally I'd use diff2 to do a server-side diff, but in this case the files on the branches are so large that the diff2 call fills up /tmp on my server while diffing them, and the diff fails.
I can't bring down my server to rectify this, so I'm looking at checking out the content to disk and using diff on the command line to inspect and compare the content.
The trouble is: most of the files have RCS keywords in them that are being expanded.
I know I can remove keyword expansion from a file by opening the files for edit and removing the -k attribute from the files in the process, but that seems a bit brute force. I was hoping I could just tell the p4 sync command not to expand the keywords on checkout. I can't seem to find a way to do this. Is it possible?
As a possible alternative solution, does anyone know if you can tell p4 diff2 which directory to use for temporary space when you call it? If I could tell it to use abundant NAS space instead of /tmp on the Perforce server I might be able to make it work.
I'm using 2010.x version of Perforce if that changes the answer in any way.
There's no way I know of to disable keyword expansion on sync. Here's what I would try:
1) Create a branch spec between the two sets of files
2) Run "p4 files //path/to/files/... | cut -d '#' -f 1 > tmp"
The path to the files above should be the right-hand side of the branch spec you created.
3) Run "p4 -x tmp diff2 -b <branchspec>", where <branchspec> is the name of the branch spec from step 1.
This tells p4 to iterate over the lines of text in 'tmp' and treat them as arguments to the command. I think /tmp on your server will get cleared in-between each file this way, preventing it from filling up.
I unfortunately don't have files large enough to test that it works, so this is entirely theoretical.
To change the temp directory that p4d uses, just set TEMP or TMP to a different path and restart p4d. If you're on Windows, make sure to call 'p4 set -S perforce TMP=' to set the variable for the Perforce service; without the -S perforce you'll just set it for the current user.
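For example, on a Windows server that might look like the following (the temp path is a placeholder, and the service name may differ from "perforce" on your installation):

rem Point the Perforce service's temp directory at a roomier volume
p4 set -S perforce TMP=D:\p4tmp
rem Restart the service so p4d picks up the new setting
net stop perforce
net start perforce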

How to change (CQ5) VLT repo url/port?

I have checked out a VLT repo using:
vlt co http://localhost:4502/crx/-/jcr:root path/to/repo --force
But now my CQ instance has changed location (port). Is there a way to point vlt at the new URL (port)?
(without checking out again)
I have tried unzipping path/to/repo/.vlt and changing the repository.url file. Sometimes it works, but in most cases it breaks the local repo, or I'm unable to unzip it.
I understand you're looking for something like the "svn relocate" command. This is not possible with the VLT tool directly.
Options (any one of these should do it):
I recommend checking out a new copy of the repository and reapplying the changes that "vlt status" reports in the old copy (see the sketch after this list).
Set up a new CQ server on the old port, then use "vlt rcp". The process would probably be: copy the whole repository from old to new server, push your local stuff to the new server, copy part of the tree from new to old.
The repository.url setting is nested in .vlt files under all subdirectories of the repository. You could try a global/recursive search & replace for all of these; I've never tried this though. For example, something like this (I get permission denied running it, so it needs more work):
find -name .vlt -type f -print0 | xargs -0 sed -i 's/localhost:4502/localhost:4503/g'
Remove all the .vlt files and use the vlt import/export commands to load. See the "Using import/export instead of .vlt control" section of this document: http://wem.help.adobe.com/enterprise/en_US/10-0/core/how_to/how_to_use_the_vlttool.html
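A rough sketch of the first option (the new port 4503 and the final commit are assumptions; what you copy across depends on what "vlt st" reports):

rem In the old working copy, note the local modifications to carry over
vlt st
rem Fresh checkout against the relocated instance
vlt co http://localhost:4503/crx/-/jcr:root path/to/new-repo --force
rem Copy the files reported as modified into the new working copy, then
vlt st
vlt ci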

How to ask ClearCase to identify and make nice with new files?

I have a whole big bunch of new files I've recently dropped (via robocopy) into a folder tree. How can I ask ClearCase UCM to identify those files and get them placed under version control?
One easy way is to use clearfsimport in order to do a (recursive) import of a full directory content.
See "How can I use ClearCase to “add to source control …” recursively?".
It works for ClearCase UCM views as well as base ClearCase views.
You only have to set an activity on the view you are using for the import before executing the clearfsimport.
That is easier than copying those files directly into the ClearCase destination view and trying to detect the private (i.e. "not yet versioned") files.
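A minimal sketch of that sequence (view path, VOB path, staging folder, and activity name are all placeholders):

rem Set an activity on the UCM view used for the import
cleartool setactivity my_activity@\my_pvob
rem Preview the import first, then drop -preview to run it for real
clearfsimport -preview -recurse -nsetevent C:\staging\newfiles M:\my_view\my_vob\target_dir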
Ah, it turned out someone had already made such a post on an internal blog. Who knew!
I took that and then expanded on it:
for /f %1 in ('cleartool ls -r -s -nxname -view_only ^| grep -e ^\.\\SRC_ ^| grep -e vcproj$') do @cleardlg -addtosrc -nc "%1"
(Yeah, I've got some unix tools installed, grep and the like).
This looks for all vcproj files in directories that start with .\SRC_ and adds them to source control.