Accepting selected folders from a component in RTC using the command line

I have an RTC (IBM Rational Team Concert) stream, call it "ABCStream". Within that stream I have folders 'A', 'B', and 'C', and there may be nested folders within A, B, or C.
ABCStream
|-------A
|       |-- other subfolders
|-------B
|-------C
In workspace "WS-A", only folder "A" is loaded. I am creating a command-line utility in which I want to accept all changes coming in for folder 'A', but I am not sure how to do that. I can accept all incoming changes for a component, but not just for a subfolder within a component. I want to accept incoming changes only for folder "A" and its subfolders, not for folders "B" and "C".
Here is the command-line reference I am using:
scm reference
Any leads would be helpful.
Thanks!


Azure data factory File Content Replace in the Azure blob

Good morning,
We have Azure Data Factory (ADF), and two files that we want to merge into one. The files are currently in Azure Blob Storage. Below are the contents of the files. We are trying to take the contents of File2.txt and use them to replace the '***' in File1.txt. When finished, it should look like File3.txt.
File1.txt
OP01PAMTXXXX01997
***
CL9900161313
File2.txt
ZCBP04178 2017052520220525
NENTA2340 2015033020220330
NFF232174 2015052720220527
File3.txt
OP01PAMTXXXX01997
ZCBP04178 2017052520220525
NENTA2340 2015033020220330
NFF232174 2015052720220527
CL9900161313
Does anyone know how we can do this? I have been working on this for 2 days, and it seems like it should not be a difficult thing to do.
All the best,
George
You can merge two or more files using ADF, but I can't see a way to merge with a condition or to control how the files are merged, so what I can recommend is to use an Azure Function and do the merge programmatically.
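The replacement logic itself is simple; here is a minimal Python sketch of what such an Azure Function could do (the function name is made up, and the blob download/upload calls are omitted, so only the merge step is shown):

```python
# Minimal sketch of the merge logic an Azure Function could run.
# Assumes the two blobs have already been read into strings; downloading
# from and uploading back to Blob Storage is omitted for brevity.

def merge_with_placeholder(template: str, insert: str, marker: str = "***") -> str:
    """Replace the marker line in `template` with the contents of `insert`."""
    merged_lines = []
    for line in template.splitlines():
        if line.strip() == marker:
            # splice the second file's lines in place of the *** line
            merged_lines.extend(insert.splitlines())
        else:
            merged_lines.append(line)
    return "\n".join(merged_lines)

file1 = "OP01PAMTXXXX01997\n***\nCL9900161313"
file2 = ("ZCBP04178 2017052520220525\n"
         "NENTA2340 2015033020220330\n"
         "NFF232174 2015052720220527")
print(merge_with_placeholder(file1, file2))
```

Running this on the question's File1.txt and File2.txt contents produces the File3.txt layout.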
If you want to know how to merge files without preserving line order, use my approach:
Create a pipeline.
Add a "Copy activity".
In the Copy activity, use these basic settings:
In Source, choose "Wildcard file path" and select the folder the files are located in; make sure to write "*" as the filename in the wildcard path, which guarantees that all files under that folder are picked up.
This will merge all the files under the same folder.
In Sink, make sure to select the "Merge files" copy behavior.

.gitignore all except one specific file (via a full/fuller file path)

The Goal:
How can I use .gitignore to exclude all folders & files except the PowerShell $Profile?
The answer should help expand the current exception list to more files in other subfolders. If possible, reduce wildcards to a minimum and be as specific as possible (full/fuller file path). Why? For instance, Book1.xlsx may exist in multiple subfolders, but I want to be able to choose only the specific desired subfolders.
Thanks in advance!
Current status:
On Windows 10 (not Linux Distros):
git init was run in the top-level directory C:\. [Please don't suggest starting from another subfolder. Just stay with C:\, as I will include more files in the exception list.]
C:\.gitignore containing the below:
# Ignore All
/*
# Exception List [eg. PowerShell's $Profile (please use FULL/FULLER FILE PATH, if possible)]
!.gitignore
!Microsoft.PowerShell_profile.ps1
!Users/John/Documents/WindowsPowerShell/Microsoft.PowerShell_profile.ps1
!Users\John\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1
!C:/Users/John/Documents/WindowsPowerShell/Microsoft.PowerShell_profile.ps1
!C:\Users\John\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1
With the above, git status successfully returned only .gitignore as a single untracked file; Microsoft.PowerShell_profile.ps1 remained missing from the untracked listing.
I've tried alternative approaches (wildcards, partial subfolder names with patterns, etc.), but all failed to surface the PowerShell $Profile by its fuller file path.
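For context, Git documents that it is not possible to re-include a file if a parent directory of that file is excluded, and `/*` excludes the `Users` directory itself, so the `!...Microsoft.PowerShell_profile.ps1` lines never take effect. A sketch of the usual workaround, re-opening each directory level on the path and re-closing its contents (adjust the path to your setup):

```gitignore
# Ignore everything at the top level
/*
!.gitignore
# Re-include each directory on the path, then re-exclude its contents
!/Users/
/Users/*
!/Users/John/
/Users/John/*
!/Users/John/Documents/
/Users/John/Documents/*
!/Users/John/Documents/WindowsPowerShell/
/Users/John/Documents/WindowsPowerShell/*
# Finally, re-include the file itself
!/Users/John/Documents/WindowsPowerShell/Microsoft.PowerShell_profile.ps1
```

Each additional file in another subfolder gets the same ladder of `!dir/` / `dir/*` pairs down to it.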

How to cd to a folder with a date in its name

I'm trying to create several subfolders within a folder on the main path I have already created, but I get a message saying the folder I'm trying to cd into isn't valid. I would appreciate it if someone could explain why this is happening, and either help me fix my cd call or give me an alternative way to access this folder.
This is for MATLAB 2019. I'm trying to get the code to auto-generate a folder named with the date and time (which you can see in the first line below), and then create a subfolder "Participant 1" inside it (i.e., when you double-click the date-and-time folder, you open the subfolder "Participant 1"). I then want to add a further subfolder, "EMG_Data", inside "Participant 1". I'm getting stuck at the point where I have to cd into the folder that combines currDate and "Participant 1", because I don't know how I'm supposed to format the date (currDate) within cd or other functions.
currDate = strrep(datestr(datetime), ':', '_');
mkdir('SMC Project Data Collection')
cd('C:/Users/wynkoopp/Documents/MATLAB/SMC Project/SMC Project Data Collection/')
mkdir(currDate,'Participant 1')
cd('C:/Users/wynkoopp/Documents/MATLAB/SMC Project/SMC Project Data Collection/currDate/Participant 1/')
mkdir('EMG_Data')
% Want the 'currDate' above to always be integrated into cd function above
% at the end, since name of folder will vary
I expect the subfolder 'EMG_Data' to be created in the subfolder 'Participant 1', but this is not happening. Instead, I get:
Error using cd
Cannot CD to C:\Users\wynkoopp\Documents\MATLAB\SMC Project\SMC Project Data
Collection\currDate\Participant 1 (Name is nonexistent or not a directory).
Error in Paulcopydirectorygenerator (line 5)
cd('C:/Users/wynkoopp/Documents/MATLAB/SMC Project/SMC Project Data
Collection/currDate/Participant 1/')
The line mkdir(currDate,'Participant 1') creates a 'Participant 1' folder inside the folder named with the current date. Your cd command, however, tries to access a different folder: one literally named 'currDate', because the variable's value is never substituted into the path string.
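A sketch of a fix, assuming the same base path as in the question: build the path from the variable's value with fullfile instead of embedding the literal text 'currDate':

```matlab
currDate = strrep(datestr(datetime), ':', '_');
baseDir  = 'C:/Users/wynkoopp/Documents/MATLAB/SMC Project/SMC Project Data Collection';
% fullfile substitutes the variable's value into the path;
% mkdir creates intermediate folders as needed
targetDir = fullfile(baseDir, currDate, 'Participant 1');
mkdir(targetDir);
cd(targetDir);
mkdir('EMG_Data');
```

The same fullfile call can be reused anywhere the date-named folder appears, since the folder name varies on every run.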

Moving all files while keeping the subfolder structure, excluding a few files and subfolders, in PowerShell

I am really having difficulty writing a PowerShell script to move my application code from one server to another.
Below is the folder and file structure; I need to move all the files, keeping the subfolder structure, while excluding a few files and subfolders.
Server 1
\\networkshardpath\D$\MyProject
Main folder: Folders A, B, C, D, E, F, G
Files: File 1, File 2, File 3, File 4
Subfolder A: File A1, File A2, File A3, etc.
Subfolder B: File B1, File B2, File B3, File B4
Subfolders C & D: lots of log files, which I need to exclude
Subfolder E: File E1, File E2, File E3
Now I want a script which copies all the files and folders, keeping the same structure, to Server 2 (network path \\server2\D$\MyProject2\), excluding the folders and files mentioned below.
Exclude Folders : C,D,G
Exclude Files:
From Main Folder : File4
From Sub Folder A : A3
From Sub Folder B : B4
During this activity, a log file needs to be generated to help me keep track of all the file details: which files were copied and which were excluded, and, if any error occurs, the detailed error with a timestamp.
Please help me with this script or please guide me to get this resolved.
Thanks a lot in advance.
I have written the code mentioned below. The main purpose is to avoid logging in to the RDP server and to do all the deployments from the local system. This code is working fine, but it runs very slowly from my system, while the same code runs very fast from the application server.
One more thing: on my local system we have very limited access, and UAC is disabled by the Administrator. Please suggest how I can run this script via a batch file, or any other way, just by double-clicking it.
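For the copy itself, one low-scripting option is robocopy, which has built-in folder/file exclusions and logging. A sketch using the paths from the question (note one caveat: /XF excludes matching file names anywhere in the tree, so excluding A3 only from Subfolder A, rather than everywhere, would need extra handling):

```batch
:: /E    copy all subfolders, including empty ones
:: /XD   exclude the listed folders
:: /XF   exclude the listed file names (matched anywhere in the tree)
:: /LOG  write a log of copied and skipped files; /TEE also echoes to the
::       console; /NP suppresses per-file progress percentages
robocopy \\networkshardpath\D$\MyProject \\server2\D$\MyProject2 /E /XD C D G /XF File4 A3 B4 /LOG:copylog.txt /TEE /NP
```

Putting this line in a .bat file would also cover the double-click requirement.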

gsutil - is it possible to list only folders?

Is it possible to list only the folders in a bucket using the gsutil tool?
I can't see anything listed here.
For example, I'd like to list only the folders in this bucket.
Folders don't actually exist. gsutil and the Storage Browser do some magic under the covers to give the impression that folders exist.
You could filter your gsutil results to show only entries that end with a forward slash, but this may not show all the "folders": it will only show "folders" that were explicitly created, not those that exist implicitly because an object name contains slashes:
gsutil ls gs://bucket/ | grep -e "/$"
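To illustrate the "folders don't actually exist" point, here is a small Python sketch (the object names are made up) that derives folder-like prefixes from plain object names, which is roughly what gsutil and the Storage Browser do under the covers:

```python
# "Folders" in GCS are an illusion: only object names exist. This derives
# the top-level folder-like prefixes implied by a list of object names.

def top_level_prefixes(object_names):
    """Return the sorted first-level 'folders' implied by the object names."""
    prefixes = set()
    for name in object_names:
        if "/" in name:
            # everything before the first slash acts as a top-level "folder"
            prefixes.add(name.split("/", 1)[0] + "/")
    return sorted(prefixes)

objects = ["blah/foo/bar.txt", "blah/baz.txt", "top.txt", "logs/2022/app.log"]
print(top_level_prefixes(objects))  # ['blah/', 'logs/']
```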
Just to add here: if you drag a folder tree directly into the Google Cloud Storage web GUI, you don't actually get an object for each parent folder; instead, each object's name is a fully qualified path, e.g. "blah/foo/bar.txt", rather than a folder hierarchy blah > foo > bar.txt.
The trick is to first use the GUI to create a folder called blah, then create another folder called foo inside it (using the button in the GUI), and finally drag the files in.
When you now list the files, you will get separate entries for
blah/
foo/
bar.txt
rather than only one
blah/foo/bar.txt