Team Explorer Everywhere (command line) lying?

I'm trying to get some code pulled down from a local TFS server onto my Mac. I've been fussing with TEE for quite some time now, and it seems it doesn't keep track of what I'm doing from one command to the next. I set a working folder, then try to perform a get, and I'm met with odd messages:
GA8995AC511228:TEE-CLC-10.0.0 rr154459$ ./tf dir ../all -server:http://10.227.212.202:8080/tfs -login:rr154459@na
There is no working folder mapping for /Users/rr154459/tfs/all.
GA8995AC511228:TEE-CLC-10.0.0 rr154459$ ./tf workfold -map -login:rr154459@na -server:http://10.227.212.202:8080/tfs -workspace:GA8995AC511228 '$\' '../all'
An error occurred: The new working folder mapping of $\ to /Users/rr154459/tfs/all conflicts with the local path in the existing mapping of $/ to /Users/rr154459/tfs/all.

$\ is not a server path. You need to use $/.
Please remove your existing workspace mappings and set them using the correct path formatting.
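For example, reusing the server, workspace, and login from the question, something like this should clear the bad mapping and redo it (a sketch: -unmap and -recursive are standard tf options, but confirm with ./tf help workfold and ./tf help get on your build):
./tf workfold -unmap -login:rr154459@na -server:http://10.227.212.202:8080/tfs -workspace:GA8995AC511228 '../all'
./tf workfold -map -login:rr154459@na -server:http://10.227.212.202:8080/tfs -workspace:GA8995AC511228 '$/' '../all'
./tf get ../all -recursive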

Prevent downtime using lftp mirror

I'm using lftp to deploy a website via Travis CI. There is a build process before the deployment; for that reason, a build directory is present and pushed to the root of the FTP server.
lftp $FTP_URL -e "glob -d mirror build . --reverse --delete-first --parallel=10 && exit"
It works quite well, but I dislike having downtime / temporary PHP parse errors caused by missing files on my website. What is the best way to work around that issue?
My first approach was to look for an option to set a temporary directory, but the lftp man page says there is only an option for temporary files. I still tried that option, but it didn't help.
My second approach was to use "mirror build temp" to upload to a temporary folder and then replace the root with it. The problem is that I cannot exclude the temp folder while deleting the old files and folders with something like rm -rf *.
For small changes that don't involve adding/removing PHP files, set xfer:use-temp-file should be sufficient. Also, don't use --delete-first, as it causes lftp to delete obsolete files before uploading.
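A minimal sketch of that, adapted from the command in the question (assuming your lftp version has the xfer:use-temp-file setting; it uploads each file under a temporary name and renames it into place once the transfer completes):
lftp $FTP_URL -e "set xfer:use-temp-file true; mirror build . --reverse --delete --parallel=10; exit"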
For larger changes I'd create a separate directory for each version of the site and redirect the web server to the directory using .htaccess mod_rewrite or some other configuration file. This technique allows an atomic switch to the new version (and back if needed). Besides, you will be able to do final pre-production testing of the new version if you redirect to it conditionally, based on your IP address or some other rule.
If you don't want to re-upload the whole site for each new version and the FTP server supports FXP with itself, then you can copy the old version to a new directory using mirror old_directory ftp://user@example.com/new_directory, then update the new directory using mirror -eR local_dir new_directory.
This is a zero-downtime pattern; each placeholder should be replaced:
lftp $FTP_URL -e "mirror {SOURCE} {TARGET}-new-{TIMESTAMP} --reverse --delete-first;
mv {TARGET} {TARGET}-old-{TIMESTAMP};
mv {TARGET}-new-{TIMESTAMP} {TARGET};
rm -rf {TARGET}-old-{TIMESTAMP};
exit"

How to use backup files to create regular files in Emacs

I am trying to create a file named caseexp.sml. Emacs created a backup file of this file when I was working on it at some earlier point, and now when I try to open it as caseexp.sml, Emacs opens a #caseexp.sml# file, and every time I try to save it using C-x C-w, Emacs saves it as another backup file with another tilde added to its name. Several attempts later, I have only managed to save it as #caseexp.sml#~~~.
How can I avoid creating these "tilde" backup files and save my file simply as caseexp.sml ?
There are a few unexpected behaviors here, so I can't be sure this is what's going on, but usually a hash-wrapped file like #caseexp.sml# is left around when Emacs crashed while you had unsaved changes. In that case, Emacs should normally prompt you to run "M-x recover-this-file" to restore the changes from the unsaved-changes file (the hash-wrapped name) into the actual file, so it's not clear what's going on there. Try fixing this from the command line.
You probably want to cp all the files to another location first, in order to have a backup (I'm assuming a Unix-like OS):
$ cp *caseexp* /tmp
Then delete the extra files while preserving the one with the most recent changes:
$ cp <most recent file with latest changes> caseexp.sml
$ rm \#caseexp*
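If you also want Emacs to stop creating these files in the first place, two variables control them: make-backup-files (the name~ backups) and auto-save-default (the #name# auto-save files). A minimal init-file snippet, assuming you accept losing that crash protection:
(setq make-backup-files nil)   ; disable name~ backup files
(setq auto-save-default nil)   ; disable #name# auto-save files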

Is it possible to use relative paths in sphinx.conf?

I'm using Sphinx on a Linux production server as well as a Windows dev machine running WampServer.
The index configurations in sphinx.conf each require a path setting for the output file name. Because the filesystems on the production server and dev machine are different, I have to have two lines and then comment one out depending on which server I'm using.
#path = /path/to/folder/name #LIVE
path = C:\wamp\www\site\path\to\folder\name #LOCALHOST
Since I have lots of indexes, it gets really old having to constantly comment and uncomment dozens of lines every time I need to update the file.
Using relative paths would be the ideal solution, but when I tried that I received the following error when running the indexer:
FATAL: failed to open ../folder/name.tmp.spl: Invalid argument, will not index. Try --rotate option.
Is it possible to use relative paths in sphinx.conf?
You can use relative paths, but it's kind of tricky because the various utilities will have different working directories.
E.g. on Windows, the searchd service will, IIRC, start with a working directory of %WINDIR%\System32.
On Linux, via crontab, I think it inherits whatever working directory it happens to be started from, so you would have to change the folder in the actual command line
... i.e. it's not relative to the config file; it's relative to the current working directory.
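So one workaround is to force the working directory in the command itself, e.g. in the crontab entry (a sketch with a hypothetical config location; --config, --all and --rotate are standard indexer switches):
cd /path/to/sphinx && indexer --config sphinx.conf --all --rotate
With that, relative paths in sphinx.conf resolve against /path/to/sphinx on every run.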
Personally, I use a version control system (SVN, actually) to manage it. The version from dev is always the one committed to the repository; the 'working copy' on the LIVE server has had the paths edited to the right location. So when you 'update' to the latest file, only the changes are merged, leaving the local file paths intact.
Other people use a dynamic config file. The config file can be a script (PHP/Python/Perl etc.), but this only works on Linux, so it won't help you.
Or you can just have a 'publish' script. Basically, you edit a 'master' config file, one that can be freely copied to all servers. Then a 'publish' script writes the appropriate local path. It could do it with some pretty simple search and replace:
<?php
// Pick the right path for this machine based on its hostname.
if (trim(`hostname`) == 'live') {
    $path = '/path/to/folder';
} else {
    $path = 'C:\wamp\www\site\path\to\folder';
}
// Substitute the placeholder in the master config and write the real config.
$contents = file_get_contents('sphinx.conf.master');
$contents = str_replace('$path', $path, $contents);
file_put_contents('sphinx.conf', $contents);
Then have path = $path/name in the master config file; the placeholder gets replaced with the proper local path when you run the script on each machine.
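For illustration, a minimal sphinx.conf.master fragment with a hypothetical index name, showing where the placeholder sits:
index myindex
{
    source = mysource
    path   = $path/myindex
}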

How to change (CQ5) VLT repo url/port?

I have checked out a vlt repo using:
vlt co http://localhost:4502/crx/-/jcr:root path/to/repo --force
But now, my CQ instance has changed location (port). Is there a way to set a new URL (port) for vlt?
(without checking out again)
I have tried unzipping path/to/repo/.vlt and changing the repository.url file. Sometimes it works, but in most cases it breaks the local repo, or I'm unable to unzip it.
I understand you're looking for something like the "svn relocate" command. This is not possible with the VLT tool directly.
Options (any one of these should do it):
I recommend checking out a new copy of the repository and reapplying the changes reported by running "vlt status" there.
Set up a new CQ server on the old port, then use "vlt rcp". The process would probably be: copy the whole repository from old to new server, push your local stuff to the new server, copy part of the tree from new to old.
The repository.url setting is nested in .vlt files under all subdirectories of the repository. You could try a global/recursive search & replace for all of these. I've never tried this though. For example, something like this: (I get permission denied running this, needs more work.)
find . -name .vlt -type f -print0 | xargs -0 sed -i 's/localhost:4502/localhost:4503/g'
Remove all the .vlt files and use the vlt import/export commands to load. See the "Using import/export instead of .vlt control" section of this document: http://wem.help.adobe.com/enterprise/en_US/10-0/core/how_to/how_to_use_the_vlttool.html

TFS command line to get list of files checked in yesterday

I'm looking for a simple way to get a list of files that were checked in on a certain day. Is there a command line I can use? I don't want changesets, just the file names.
A bit late, but for others asking the same question.
Open a Visual Studio Command Prompt (2010) and use the command:
tf history "local path" /version:D2011-03-29 /recursive /noprompt
Replace the date with the date you want information about, and "local path" with the local folder you bound to. You WILL get the changeset number, but also all item changes. It's also possible to use a collection and remote path instead of a local path.
Naturally, you can drop /noprompt and /login and enter that information at a prompt instead.
More information about the TFS 2010 command line: http://msdn.microsoft.com/en-us/library/yxtbh4yh.aspx
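For a single day, you can also give a date range and ask for the detailed format, which lists every changed item per changeset (a sketch; "local path" is a placeholder, and /format:detailed is a standard tf history switch):
tf history "local path" /version:D2011-03-29~D2011-03-30 /recursive /noprompt /format:detailed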
Use the tf command line. It may do whatever you want.