I know how to add or remove a store with PowerShell using Microsoft.Office.Interop.Outlook, but I haven't found any information about changing an existing store's properties.
I read https://learn.microsoft.com/en-us/office/vba/api/outlook.namespace#methods but I don't see a method available for setting properties.
Context: users' PST files have been moved from one path to another. I'm trying to avoid disruption wherever possible, so I'm writing a PowerShell script to move the PST files and then update Outlook with the new path.
Since removing and re-adding the stores would break user-defined things like rules, I'm hoping for a way to change the existing stores' file paths that requires no user action.
Is this possible at all?
As a second option, can I pull the existing rules, and modify them (or recreate them)?
The PST store entry id embeds the PST path (you can see it in OutlookSpy - I am its author: click the IMessage / IMAPIFolder / IMsgStore button, select PR_STORE_ENTRYID, and click "..." next to the Value edit box).
If a rule includes a store id (e.g. copy / move message action), you would need to reset / recreate the rule.
If you don't want to remove / add a store, you can reset the store location using the ProfMan library (I am also its author) directly in the profile section in the registry. See https://www.dimastr.com/redemption/profman_examples.htm#example2 for an example of how to read PST paths; you can modify the script to set the path instead.
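If you go the registry route rather than ProfMan, a rough PowerShell sketch could look like the one below. It assumes the Outlook 2016/2019/365 profile root and that the PST path is stored as the Unicode MAPI property 001f6700 in the PST service's profile section (both are assumptions about your environment, so test against a copy of the profile, with Outlook closed):

# Sketch only: rewrite the PST path stored in the Outlook profile registry.
# Assumes Outlook 2016+ (profile root under ...\Office\16.0\...) and that the
# path lives in the binary value 001f6700 of a profile section.
$oldPath = 'C:\OldPath\archive.pst'   # placeholder
$newPath = 'D:\NewPath\archive.pst'   # placeholder
$profileRoot = 'HKCU:\Software\Microsoft\Office\16.0\Outlook\Profiles'

Get-ChildItem $profileRoot -Recurse | ForEach-Object {
    $section = Get-ItemProperty $_.PSPath
    if ($section.PSObject.Properties.Name -contains '001f6700') {
        # The value is a null-terminated little-endian Unicode string.
        $current = [Text.Encoding]::Unicode.GetString($section.'001f6700').TrimEnd([char]0)
        if ($current -ieq $oldPath) {
            $bytes = [Text.Encoding]::Unicode.GetBytes($newPath + [char]0)
            Set-ItemProperty -Path $_.PSPath -Name '001f6700' -Value $bytes -Type Binary
        }
    }
}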
I am trying to create a program that automatically downloads the files attached to emails we receive from a certain address, then uses SAS to change the delimiter of those attached CSVs and passes them through a flow I have already created.
I have managed to write a program that processes the CSVs the way I want, with the delimiter I want; the problem is that the automated download of the files from Outlook does not work.
What I have done is create an Outlook rule that runs the following VBA code, which I found on the internet:
Public Sub SaveAttachmentsToDisk(MItem As Outlook.MailItem)
    Dim oAttachment As Outlook.Attachment
    Dim sSaveFolder As String
    ' Note the trailing backslash - without it the folder name and the file
    ' name get concatenated into a single path segment.
    sSaveFolder = "C:\Users\ES010246\Desktop\"
    For Each oAttachment In MItem.Attachments
        oAttachment.SaveAsFile sSaveFolder & oAttachment.DisplayName
    Next
End Sub
I have changed the path to the folder where I want the files to be downloaded.
website: https://es.extendoffice.com/documents/outlook/3747-outlook
The problem is that this code does not work for me; it does absolutely nothing, and no matter how much I search the internet, only this same code comes up.
Is there any other way to do with SAS what I want, which is to automatically download the 8 CSV files that are emailed to me in Outlook? Or has anyone run into the same problem as me with VBA?
I have followed all the steps about 7 times, so I don't think the error is in copying the code or selecting the wrong options; in fact, I copied and pasted the code and only modified the path where I wanted the files to be saved, but it doesn't work. Does anyone know why?
I will be tremendously grateful, thank you very much for everything!
First of all, you need to make sure the file name and path don't include forbidden symbols.
The VBA macro used for the rule in Outlook is valid, except that a mail item may contain multiple attachments with the same name, so a file saved to disk may be overwritten by a later attachment with the same name. That's why I'd suggest generating file names with your own unique IDs, making sure that the DisplayName property is not empty and yields a valid file name (excluding forbidden symbols); a rough sketch of such helpers is shown below.
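For example (the names SanitizeFileName and GetUniquePath are mine, not part of the Outlook object model; place them in the same module as the macro):

' Replaces characters Windows does not allow in file names.
Private Function SanitizeFileName(ByVal sName As String) As String
    Dim vBad As Variant, vChar As Variant
    vBad = Array("\", "/", ":", "*", "?", """", "<", ">", "|")
    For Each vChar In vBad
        sName = Replace(sName, vChar, "_")
    Next
    SanitizeFileName = sName
End Function

' Appends (1), (2), ... until the path no longer collides with an existing file.
Private Function GetUniquePath(ByVal sFolder As String, ByVal sFile As String) As String
    Dim sBase As String, sExt As String, sCandidate As String, i As Long
    i = InStrRev(sFile, ".")
    If i > 0 Then
        sBase = Left$(sFile, i - 1)
        sExt = Mid$(sFile, i)
    Else
        sBase = sFile
        sExt = ""
    End If
    sCandidate = sFolder & sFile
    i = 1
    Do While Len(Dir$(sCandidate)) > 0
        sCandidate = sFolder & sBase & " (" & i & ")" & sExt
        i = i + 1
    Loop
    GetUniquePath = sCandidate
End Function

The macro would then save each attachment with oAttachment.SaveAsFile GetUniquePath(sSaveFolder, SanitizeFileName(oAttachment.DisplayName)).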
Also, you may consider handling the NewMailEx event of the Application class, which is fired when a new message arrives in the Inbox, before client rule processing occurs. Use the Entry ID returned in the EntryIDCollection string to call the NameSpace.GetItemFromID method and process the item. This event fires once for every received item that is processed by Microsoft Outlook; the item can be one of several different item types, for example MailItem, MeetingItem, or SharingItem.
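A minimal sketch of that approach, placed in ThisOutlookSession and calling the Public SaveAttachmentsToDisk routine from the question (assumed to live in a standard module):

' Runs for newly received items, before client rules are processed.
Private Sub Application_NewMailEx(ByVal EntryIDCollection As String)
    Dim vIDs As Variant
    Dim i As Long
    Dim oItem As Object
    ' Older Outlook versions could pass a comma-separated list of Entry IDs.
    vIDs = Split(EntryIDCollection, ",")
    For i = LBound(vIDs) To UBound(vIDs)
        Set oItem = Application.Session.GetItemFromID(vIDs(i))
        If TypeOf oItem Is Outlook.MailItem Then
            SaveAttachmentsToDisk oItem
        End If
    Next
End Sub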
The Items.ItemAdd event can be helpful when items are moved to a folder (from Inbox). This event does not run when a large number of items are added to the folder at once.
We are creating a new file from a template as specified in the Office URI Schemes documentation.
As per the documentation, we can supply an optional parameter with |s to specify the default path offered as the save location when the file is first saved. The only restriction is that, if the optional default save location is supplied, it must point to the same host name as the template.
We are using an Office 365 group drive to store the template, and the default save location (for the newly generated file) is also in the same group drive. But it is not picking up the default save location as specified.
Following is the URI that we are testing.
ms-word:nft|u|https://figg.sharepoint.com/sites/1dev/Shared%20Documents/Config/Files/Minutes.dotx|s|https://figg.sharepoint.com/sites/1dev/Shared%20Documents/Dest
When opened, the above URL generates a new file from the template, but it does not offer the default save location specified in the command when we try to save the file.
Any help on how to fix this would be much appreciated.
Are you sure about your save URL? It looks truncated, ending in Dest...
If it's correct, make sure that it ends with a forward slash, like this:
ms-word:nft|u|https://figg.sharepoint.com/sites/1dev/Shared%20Documents/Config/Files/Minutes.dotx|s|https://figg.sharepoint.com/sites/1dev/Shared%20Documents/Dest/
You must also make sure you have the correct permissions to write back to that particular document library.
HTH
Our test server was hacked and ransomware (Cry36) was installed, for which there is no decryption solution to date. We also didn't keep any snapshots up to date (lesson learned).
Since it's only a test server, I am not too worried, but we had a bunch of work stored in our Firebird database (v2.5) which I would like to recover.
Looking at the database in a hex editor, i can see that the data is encrypted up until offset 00006430.
Comparing it against the documented structure of a Firebird database, it looks like all the header pages are encrypted (header page, PIP, ..., data page).
All the data is still there.
I've tried gfix and even copying the headers from an older version of the database, but while that does "fix" the database, the headers are wrong and most of the newer pages are removed.
Does anyone have any idea how to restore the database or extract the tables?
Regards
I have used this method to restore ransomware-encrypted files on hard drives (from any ransomware): rename the file in question back to its original filename and extension. You may be able to apply the same method to revert the database file(s) back to the pre-encrypted version.
From my testing:
The ransomed file is compressed and/or simply renamed; the encryption is either not actually applied (only implied), or a containing/renamed copy is encrypted while the original file is never touched. Simply rename the file back to its original name and you can access it as you could before the attack. Example:
This is the Ransomed file:
Adobe Acrobat XI Pro 11.0.20.zip.id[42AF04FF-2275].[supportcrypt2019#cock.li].Adame
This is the Ransomed file, renamed and fixed:
Adobe Acrobat XI Pro 11.0.20.zip
The removed portion of the FileName is:
.id[42AF04FF-2275].[supportcrypt2019#cock.li].Adame
Upon renaming the file, you will be prompted to confirm the change of file type (back to its original state), which also restores the application that will open it (its original association, determined by the extension after the file name). The reason the file doesn't work when ransomed is the final extension in the renaming scheme: .Adame is not a real file type but a made-up one, so no program can open the file as named.
You would need to do this for each file individually. Could you post more information on the database file and the encryption details? This should work for you as well, since the ransom methodology should be the same. I cannot identify the naming scheme used on your system without more information about the unusual or newly added portions of the file names in your instance.
For renaming multiple files, you could try an application such as "Advanced Renamer" for bulk processing, or script the rename, as sketched below.
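As a sketch of the scripted route (the folder path is a placeholder and the regular expression is built only from the example name above, so adjust it to the actual naming scheme on your system):

# Preview (-WhatIf) stripping the ransomware suffix to restore the original file names.
Get-ChildItem -Path 'D:\recovered' -Recurse -File -Filter '*.Adame' |
    Rename-Item -NewName { $_.Name -replace '\.id\[[^\]]+\]\.\[[^\]]+\]\.Adame$', '' } -WhatIf

Remove -WhatIf once the preview shows the expected names.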
I use filters a lot in Fiddler, to switch between filtering for multiple related sites. However, I'm having problems saving some of these filters.
As an example, if I go to the Filters tab, I can add a single domain like this:
*.example.com
and save it (using the Actions->Save Filterset... button) as example.ffx. In that file, there is a line as follows:
<slHosts>*.example.com;</slHosts>
I then change the filter to also try to include another set of domains which includes a trailing wildcard (e.g. example.co.uk, example.co.au):
*.example.com;*.example.co.*
This filters correctly at run time, but when I try to save the new filterset (Actions->Save Filterset...), it won't save: it offers to overwrite the existing example.ffx, but nothing is saved, and the Filters tab reverts to the original value with only *.example.com. This also happens if I save the new filterset under a different name.
This only seems to happen with trailing asterisks - the filters work, but they can't be saved. Even if I manually edit the .ffx file to include both domains like this:
<slHosts>*.example.com; *.example.co.*;</slHosts>
when I load the filterset file, it only includes the first domain in the Filters tab.
Fiddler's HostList object (used in the Filters tab and elsewhere) supports leading wildcards only; it does not support wildcards elsewhere in the string.
The Filters tab's UI should properly reflect the state (e.g. when you move away, it should delete the tail-wildcard rule); I'll pass this bug along to the development team.
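In the meantime, a workaround that should both filter and persist is to list each domain explicitly, using only a leading wildcard (the domains here are the ones from the question):

<slHosts>*.example.com; *.example.co.uk; *.example.co.au;</slHosts>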
I'm using the visual client for Perforce (P4V) and I want to exclude a directory from my workspace. Before streams, I would just navigate to my workspace, find the folder in the tree, and exclude it (and this is the solution given in a number of other related questions I've found). However, now that I am using a stream, it won't let me do this; apparently I have to edit the stream mapping instead.
So I tried to add this line to the remapped box when editing the stream:
-//NumberPlus/current/Library/... //nplus-mainline/current/Library/
However I just get an error:
Error in stream specification.
Error detected at line 24
Null directory (//) not allowed in '-//NumberPlus/current/Library/...'.
EDIT: I'm in Windows 8.1, for clarification.
If the folder you want to exclude is specific to your machine, setting P4IGNORE locally is the easiest way to exclude it from being added to the depot.
http://www.perforce.com/blog/120214/new-20121-p4ignore
You'd set P4IGNORE to some name like "p4ignore.txt", create a file with that name, and add "Libraries" to it -- subsequent "p4 add" commands will skip over paths found in the P4IGNORE file, so those files will never get added to the depot.
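A minimal sketch on Windows (the ignore file's name and location are arbitrary; the ignored folder name follows the example above):

p4 set P4IGNORE=p4ignore.txt

and then create a p4ignore.txt file in the workspace root containing the single line:

Libraries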
If this is something that's going to be common to all workspaces of this stream (e.g. it's a build artifact that everyone is going to generate and nobody is supposed to check in), what you want to do is add an "exclude" to the stream's Paths (this will exclude it from both branch views and client views generated by that stream). E.g.:
Paths:
share ...
exclude Libraries/...
The "exclude Libraries/..." is basically the same thing as the exclusion line you would add to the client view, except you specify it as a relative path, you don't need to specify both sides of the mapping, and the "-" is implied by the "exclude" type. The "remap" type is if you want to keep those files but in a different depot location, which doesn't sound applicable here.
More information on defining stream views:
http://www.perforce.com/perforce/doc.current/manuals/p4v/streams_views.html
You can't just edit the mappings for your client workspace if it is switched to a particular stream. The whole point of streams is that your workspace mapping is directly generated from the stream definition. So that's a feature.
It's not totally clear whether
you don't want the directory in the stream at all, or
it's valid to have the directory in the stream, but you don't want to sync it to your workstation, or
you want the directory sync'd to your workstation, but you want the directory to have different contents (say, from some other stream which has a different version of the library).
However, for all of these situations, I suspect the best path forward is to define a new child stream of your current stream.
You will want to define the path mappings using the "share", "exclude", "isolate", and "import" mapping types.
For example, if you just didn't want the Library/... directory at all, you'd "exclude" it from your parent.
Then that stream simply won't have that directory, and it (of course) won't be on your workstation when you sync to the stream, either.
If you wanted to have a different copy of the code in the Library/... directory, so that it became a point of intentional divergence from the parent, you'd "isolate" it from your parent to submit your own custom version, or "import" it from another stream to use that stream's Library/... directory instead.
In either case, the directory would be part of the stream, and would be sync'd to your workstation, but the contents of that directory would differ from the contents that are used in the parent stream (the exact way in which they'd differ is under your control, as you define the stream accordingly).
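For the first case, a hypothetical child stream spec (relevant fields only; the stream names and the excluded path are illustrative, based on the paths mentioned in the question) might look like:

Stream: //NumberPlus/no-library
Parent: //NumberPlus/mainline
Type: development
Paths:
    share ...
    exclude current/Library/...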
Documentation and some examples are here: http://www.perforce.com/perforce/doc.current/manuals/p4v/streams_views.html
and here:
http://www.perforce.com/sites/default/files/pdf/Streams-ebook.pdf
I believe I have solved this. To be clear, I wanted the folder to be completely ignored by version control. I'm using p4connect with Unity and it keeps wanting to include unnecessary stuff in my depot.
All I had to do was add this line to my parent stream in the Paths box:
exclude current/Library/...