Recover lost GPG password (certificate)

I found my old .gnupg directory in a backup and want to use it again. Unfortunately I have lost my password, but I have some ideas of what it was. I do not have much understanding of gpg and pgp, but I know the basics of asymmetric cryptography.
My challenge now is to recover the password, which I might be able to guess from some structure that I recall. So I will need some permutation engine that assembles various pieces of that password and checks whether it is correct. I could write a script that does this, but I could also use John the Ripper with gpg2john. Trying to figure out which way to go, I face some obstacles:
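The permutation engine described above can be sketched in a few lines of Python. This is a minimal sketch under my own assumptions: it requires GnuPG 2.1+ (for --pinentry-mode loopback), it treats a failed secret-key export as a wrong-passphrase signal (which is how recent gpg behaves but should be verified on your setup), and the key ID and piece lists are placeholders:

```python
import itertools
import subprocess

def candidates(pieces):
    """Yield every concatenation of one choice from each group of pieces."""
    for combo in itertools.product(*pieces):
        yield "".join(combo)

def passphrase_works(key_id, passphrase):
    """Ask gpg (2.1+) to export the secret key non-interactively;
    a wrong passphrase should make the export fail with a non-zero exit."""
    result = subprocess.run(
        ["gpg", "--batch", "--pinentry-mode", "loopback",
         "--passphrase", passphrase, "--export-secret-keys", key_id],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

# Stand-in fragments; replace these with the pieces you actually recall.
pieces = [["Secret", "secret"], ["2005", "05"], ["!", ""]]
# for pw in candidates(pieces):
#     if passphrase_works("YOUR_KEY_ID", pw):
#         print("found:", pw)
#         break
```

The generator enumerates the full cross product, so keep the piece lists small or the candidate count explodes combinatorially.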
My .gnupg directory is from 2005, created on a Sun system at the time. The directory contains a pubring.gpg and the newer-format pubring.kbx. A subdirectory private-keys-v1.d contains 5 .key files.
Trying john first, I seem to be providing the wrong input.
gpg2john ~/.gnupg/pubring.kbx
File ~/.gnupg/pubring.kbx
can't find PGP armor boundary.
gpg2john ~/.gnupg/pubring.gpg\~
<lots of different messages like>
Hash material(5 bytes):
Sub: image attribute(sub 1) Image encoding - JPEG(enc 1)
Reason - No reason specified
lots of other stuff
Error: No hash was generated for ~/.gnupg/pubring.gpg~, ensure that the input file contains a single private key only
How can I generate a file that gpg2john expects as input?
All my approaches to extracting the private key failed because that process asks for the very passphrase I want to recover ...
For the manual approach I would need a way to test whether my password is correct. What is the easiest approach here? I am a bit confused because I have 5 .key files. Which one is my private key?
gpg --list-keys | grep "My Name" gives me back 3 entries different from the key names in private-keys-v1.d. The keys are labeled [ultimate], [expired], and [revoked].
Whenever I ask gpg to do anything, like gpg --export-secret-keys ID > exportedPrivateKey.asc, I get 2 message boxes asking for a passphrase for 2 keys. These IDs are found in private-keys-v1.d.
How can I make gpg ask me only for the password of the [ultimate] key?
(In this article, by certificate I mean the private/public key triplet that gpg uses. My wording may be imprecise for anyone who really understands these concepts.)
PS: I am not sure whether the password I might reconstruct belongs to the revoked certificate. If so, can I unlock the private key of the revoked certificate? Can I generate a new certificate based on the revoked one? (I guess not, because otherwise revoking would not have any positive security effect.) What do I gain by getting back the password to a revoked certificate?

I personally believe that gpg2john needs an .asc file, and your approach of exporting it with gpg --export-secret-keys ID > exportedPrivateKey.asc is right. The reason you do not succeed is perhaps this change: https://github.com/open-keychain/open-keychain/pull/1182/files
They "disabled" exporting a passphrase-protected private key without entering that passphrase. It is not cryptographically necessary for such an operation, but it was implemented following the discussion in https://github.com/open-keychain/open-keychain/issues/194.
I suggest exporting the key using a custom-compiled version of gpg with those commits reverted.

I'm not sure if I missed something, but have you simply tried making a backup of the keyring (copy the whole .gnupg folder to be safe) and then deleting keys from it until only the desired one is left? I can't promise that this will work; I have always used john with keys exported using --armor.
By the way, the filenames that you see in the private-keys-v1.d subfolder are keygrips and don't match your key IDs.
You can match keys to their keygrip by using the --with-keygrip parameter (e.g., gpg --with-keygrip --list-secret-keys).
PS: You may find this tutorial helpful — https://github.com/drduh/YubiKey-Guide — while it's written for YubiKey users, it has many advanced concepts that are relevant in general.

Related

How to repair Unity registry on Windows 10?

I use PlayerPrefs to save some user data. I know that on Windows the data is actually saved in the registry under the \HKEY_CURRENT_USER\SOFTWARE\Unity\companyname key.
It used to be like this:
Unity
|--BugReporterV2
|--UnityEditor
|--companyname
   |--some other things that I don't remember, maybe the project name then some detailed values
|--WebPlayer
I intended to reset all PlayerPrefs to their defaults, so I deleted the whole companyname key, stupidly without a backup. Then, in testing, I found that when calling
PlayerPrefs.SetString("testKey", "test");
it raised an exception
PlayerPrefsException: Could not store preference value
How can I repair the structure of the registry?
Can someone show me what the registry data saved by PlayerPrefs looks like under the "Unity" key, with each key and value, please? Maybe I could fix it manually.
Thank you!
I tried reinstalling Unity, and the registry was fixed. The original clean registry looks like this:
Unity
|--BugReporterV2
|--UnityEditor
|--WebPlayer
|--companyname
   |--Project Name with Space
      |--unity.cloud_userid_xxxxxx // value is quite complicated
      |--UnityGraphicsQuality_xxxxxx // value is 5
The xxxxxx is something specific to Windows 10 that I'm not familiar with; each key seems to be followed by such a string, which looks random to me. For anyone who encounters a similar issue, I think you can try creating the companyname and project-name keys. If the other keys are not created automatically after you run the project, then you could try reinstalling Unity.

Update Hidden Settings After Initial Upload

I'd like to change my Candy Machine from having hidden settings to no longer being hidden.
Initially, the Candy Machine is created with hidden settings like these:
hiddenSettings: {
  name: "Name",
  uri: "uri...",
  hash: '44kiGWWsSgdqPMvmqYgTS78Mx2BKCWzd',
}
I have attempted updating the candy machine to set the value of hidden settings to null, but this does not change any of the NFTs' metadata or seem to do anything at all.
Is there a way to unhide the assets after initializing them to have hiddenSettings?
Very late, but answering for others who may have the same question...
Unfortunately it's not that simple. The "hidden" settings for a candy machine determine how the NFTs are uploaded. With them set, all NFTs will be uploaded with the same URI - the placeholder image and metadata.
Once an NFT is uploaded and minted, the candy machine does not control its metadata. Even if you could remove the "hidden settings" field, this would not reveal your NFTs. In fact you need to keep the hidden settings (in particular the hash) for a reason listed below. Instead, you need to update the NFTs themselves, setting the new URI to the actual metadata file.
The tool which makes this easier is Metaboss. It can explore the blockchain and make changes for you. In particular, you can find the mint accounts of the NFTs which have been minted and update their URIs. Updating will require the keypair for the wallet with update authority over the collection.
After installing Metaboss, the command
metaboss snapshot mints -c [YourCandyMachineAddress] --v2
will output an array of the mint accounts to ./[YourCandyMachineAddress]_mint_accounts.json
You can change the output destination with the -o flag. Then for a given NFT you can find the metadata using
metaboss decode mint -a [MintAddress]
which will output the metadata to ./[MintAddress]. Again the output destination can be changed. You will see that this metadata has the URI of your placeholder. The name field, like "SomeCollection #1", identifies which NFT this is. By changing the URI to the actual URI for that NFT, you reveal it. Then wallet and marketplace apps will see the real NFT. You can do this with
metaboss update uri -k [/path/to/keypair.json] -a [MintAddress] -u [https://somestorage.com/realurifornft1]
All these commands have good nested documentation with --help. Obviously doing this manually for a large collection is very impractical. I have uploaded a bash script for this here. Read the script for usage info.
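For what it's worth, the batching idea can also be sketched in Python. This is a minimal sketch, not the linked script: the mint-to-URI mapping is something you must build yourself (e.g. from your candy machine cache file), and only the metaboss flags shown above are used:

```python
import json
import subprocess

def update_commands(mints_file, keypair, uri_for):
    """Build one `metaboss update uri` invocation per minted NFT.

    mints_file: the JSON array written by `metaboss snapshot mints`.
    uri_for:    dict mapping each mint address to its real metadata URI.
    """
    with open(mints_file) as f:
        mints = json.load(f)
    return [
        ["metaboss", "update", "uri", "-k", keypair, "-a", mint, "-u", uri_for[mint]]
        for mint in mints
    ]

# Running the commands is a separate, deliberate step:
# for cmd in update_commands("mints.json", "keypair.json", uri_map):
#     subprocess.run(cmd, check=True)
```

Separating command construction from execution lets you print and sanity-check the full list before touching anything on-chain.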
Now you may be thinking "isn't editing the NFT metadata like this shady? Couldn't someone use this to maliciously change my NFT?" You would be correct. To prevent this, the hash field from the hidden settings is very important. This should be the MD5 hash of the cache file created when you launched your candy machine, which contains the "real" metadata URIs. If you were to change the metadata to a different URI, you could totally change the NFT. This hash field exists so that users can confirm after reveal that the real URIs have not been changed, by reconstructing that cache file and comparing the MD5 hashes. Hence you should not remove your hidden settings - without that hash, your collection cannot be trusted.
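As a sketch of that verification step, recomputing the MD5 of a reconstructed cache file in Python might look like this (note this is an assumption on my part about the workflow: how the digest is encoded in the hiddenSettings hash field may differ, so the final comparison needs checking against your own candy machine):

```python
import hashlib

def md5_of_file(path):
    """Return the hex MD5 digest of a file, read in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the value recorded in hiddenSettings:
# print(md5_of_file("cache.json"))
```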
You cannot unhide. The only solution is creating a new candy machine.

Unable to run experiment on Azure ML Studio after copying from different workspace

My simple experiment reads from an Azure Storage Table, selects a few columns, and writes to another Azure Storage Table. This experiment runs fine on one workspace (let's call it Workspace1).
Now I need to move this experiment as-is to another workspace (call it Workspace2) using PowerShell and need to be able to run the experiment.
I am currently using this Library - https://github.com/hning86/azuremlps
Problem :
When I copy the experiment using 'Copy-AmlExperiment' from Workspace1 to Workspace2, the experiment and all its properties get copied except the Azure Table account key.
Now, this experiment runs fine if I manually enter the account key for the Import/Export modules on studio.azureml.net.
But I am unable to do this via PowerShell. If I export (Export-AmlExperimentGraph) the copied experiment from Workspace2 as JSON, insert the account key into the JSON file, and import (Import-AmlExperiment) it back into Workspace2, the experiment fails to run.
On PowerShell I get an "Internal Server Error : 500".
While running on studio.azureml.net, I get the notification as "Your experiment cannot be run because it has been updated in another session. Please re-open this experiment to see the latest version."
Is there any way to move an experiment with external dependencies to another workspace and run it?
Edit: I think the problem is something to do with how the experiment handles the account key. When I enter it manually, it's converted into a JSON array comprising RecordKey and IndexInRecord. But when I upload the JSON experiment with the account key, it remains unchanged and does not get resolved into RecordKey and IndexInRecord.
For me, publishing the experiment as a private experiment in the Cortana gallery is one of the most useful options. Only people with the link can see and add the experiment from the gallery. At the link below I've explained the steps I followed.
https://naadispeaks.wordpress.com/2017/08/14/copying-migrating-azureml-experiments/
When the experiment is copied, the password is wiped for security reasons. If you want to programmatically inject it back, you have to set another metadata field to signal that you are setting a plain-text password, not an encrypted one. If you export the experiment in JSON format, you can easily figure this out.
I think I found why you are unable to inject the credentials back.
Export the JSON graph to your local disk, then update whatever parameter has to be updated.
You will also notice that the credentials are stored as 'Placeholders' instead of 'Literals', so it makes sense to change them to Literals.
You can do this by traversing the JSON to find the relevant parameters you need to update.
Here is a brief illustration.
Changing the Placeholder to a Literal:
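A minimal Python sketch of that traversal, under the assumption (mine, not verified against the real graph schema) that the credential parameter carries a field distinguishing "Placeholder" from "Literal" - the field name 'LinkType' below is illustrative, so inspect your exported JSON for the actual names:

```python
import json

def placeholders_to_literals(node):
    """Recursively walk the exported graph and flip credential
    parameters from Placeholder to Literal.

    'LinkType' is a hypothetical field name used for illustration.
    """
    if isinstance(node, dict):
        if node.get("LinkType") == "Placeholder":
            node["LinkType"] = "Literal"
        for value in node.values():
            placeholders_to_literals(value)
    elif isinstance(node, list):
        for value in node:
            placeholders_to_literals(value)
    return node

# with open("experiment.json") as f:
#     graph = json.load(f)
# with open("experiment.json", "w") as f:
#     json.dump(placeholders_to_literals(graph), f)
```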

GPG Keys getting deleted automatically

I am using the following steps to import keys into GPG:
Open a prompt and run the gpg --import KEY command
Run gpg --edit-key KEY_NAME
Type trust, then 5, then save
Place a gpg.conf file at %APPDATA%\gnupg (i.e. AppData\Roaming\gnupg); it contains only the single word 'batch'
I have two keys
PUBLIC KEY - Used for encrypting files
PUBLIC/PRIVATE KEY PAIR - used for decrypting files. We use its public part for encrypting files for QA.
We have three talend jobs - two of them use second key above for decrypting files and one uses first key for encrypting.
The jobs run at a frequency of 15 minutes.
The problem I am facing is that the keys, along with the gpg.conf file, get deleted after roughly 24 hours, and sometimes at random. I can find neither the public key nor the key pair, and gpg.conf also gets deleted.
I would be really grateful if someone can help me here.
Thanks in advance
It is better to use the tSystem component with the command-line arguments supported by the GPG provider. I have tried it for huge file processing and it works fine; you can even set the temp path and get the decrypted file path using variables.

Setting up replicated repositories in Plastic SCM

So we're trying to set up replicated repositories using PlasticSCM, one in the US, and one in Australia and running into a bit of a snag.
The US configuration is Active Directory, the AU configuration is User/Password. This in itself is not a big deal, I've already set up the SID translation table.
The problem is with plasticscm's replicate command itself. This is the command which should replicate from the US to AU, run ON the AU server.
cm replicate br:/main#rep:default#repserver:US:8084 rep:myrep#repserver:AU:9090 --trmode=name --trtable=trans.txt --authdata=ActiveDirectory:192.168.1.3:389:john.doe#factory.com:fPBea2rPsQaagEW3pKNveA==:dc=factory,dc=com
The part I'm stuck at is the authdata part (the above is an EXAMPLE only). How can I generate the obscured password? I think it's the only thing preventing these two repositories from talking to each other.
Ok, I've solved my own problem.
To get that "authdata" string, you need to configure your client for the authentication method you need.
Then navigate to C:\[users directory]\[username]\Local Settings\Application Data\plastic.
Pick up the client.conf and extract the string from the SecurityConfig element in the XML.
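Extracting that string can also be scripted. A minimal Python sketch, assuming SecurityConfig appears as a nested element somewhere inside the client.conf XML (the surrounding element names may differ between Plastic SCM versions, so check your own file first):

```python
import xml.etree.ElementTree as ET

def security_config(path):
    """Return the text of the first SecurityConfig element in client.conf,
    or None if no such element exists."""
    root = ET.parse(path).getroot()
    element = root.find(".//SecurityConfig")
    return element.text if element is not None else None

# print(security_config(r"client.conf"))
```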
Check the new GUI here. It's a little bit easier.