I need to encrypt some strings using DES (not Triple DES; I know there are many articles describing how to use ENCRYPTBYPASSPHRASE). I have a key and an IV.
How can I do it using T-SQL? Is there any ready-to-use procedure or function?
Ripped from here. I haven't tested this, but that page lists all of the supported encryption algorithms, and DES appears to be one of them.
--If there is no master key, create one now.
IF NOT EXISTS
(SELECT * FROM sys.symmetric_keys WHERE symmetric_key_id = 101)
CREATE MASTER KEY ENCRYPTION BY
PASSWORD = '23987hxJKL969#ghf0%94467GRkjg5k3fd117r$$#1946kcj$n44nhdlj'
GO
CREATE CERTIFICATE HumanResources037
WITH SUBJECT = 'Employee Social Security Numbers';
GO
CREATE SYMMETRIC KEY SSN_Key_01
WITH ALGORITHM = DES
ENCRYPTION BY CERTIFICATE HumanResources037;
GO
-- Open the symmetric key with which to encrypt the data.
OPEN SYMMETRIC KEY SSN_Key_01
DECRYPTION BY CERTIFICATE HumanResources037;
-- Encrypt the value
DECLARE @encrypted varbinary(128);
SET @encrypted = EncryptByKey(Key_GUID('SSN_Key_01'), '555-77-4444');
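As a hedged follow-up sketch (same key and certificate names as above; the @encrypted variable comes from the previous statement), the round trip back to plaintext and the cleanup would look roughly like this:
-- DecryptByKey returns varbinary; convert it back to the original character type
SELECT CONVERT(varchar(11), DecryptByKey(@encrypted)) AS DecryptedSSN;
-- Close the symmetric key when you are done with it
CLOSE SYMMETRIC KEY SSN_Key_01;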
I am generating a data encryption key implicitly as follows (key IDs used are just representational):
# KMSMasterKeyProvider is also exported by the aws_encryption_sdk package (1.x API)
from aws_encryption_sdk import encrypt, KMSMasterKeyProvider

# Key provider with only 2 region master keys to begin with
kms_key_provider = KMSMasterKeyProvider(key_ids=["west-1", "west-2"])

# encrypt something random only to get the encrypted data keys in the header from those 2 regions
# (no algorithm argument: the SDK's default algorithm suite is used)
my_ciphertext, encryptor_header = encrypt(
    source="somerandomplaintextofnorelevance",
    key_provider=kms_key_provider,
    encryption_context={"somekey": "some value"},
)

my_data_keys = []
for dek in encryptor_header.encrypted_data_keys:
    my_data_keys.append(dek.encrypted_data_key)
I get two encrypted data encryption key (DEK) strings in my_data_keys (say, DEK_enc_west_1 and DEK_enc_west_2), both of which decrypt to a single plain data encryption key, say, DEK_Plain. Now I can encrypt/decrypt with DEK_Plain in either of those regions for redundancy.
Then I go on and activate two more master keys, in regions east-1 and east-2. Now I want that same DEK_Plain to also be encrypted under those two new regions' (east-1 and east-2) master keys, giving two new encrypted data keys (say, DEK_enc_east_1 and DEK_enc_east_2).
So, with the new fully formed Key Provider like:
kms_key_provider = KMSMasterKeyProvider(key_ids=["west-1", "west-2", "east-1", "east-2"])
I can get my DEK_Plain from any of these 4 regions using:
my_plain_data_key = kms_key_provider.decrypt_data_key_from_list(…..)
Basically, how can I add additional region master keys so they can be leveraged for the same data encryption key that was generated and encrypted earlier using other regions' master keys that existed before?
Looking around in the AWS crypto docs, I found something like the following sample that helps my case (though it would have been ideal if the KMS key provider implementation had such a region-extender capability for data keys). In the following, assume plain_dek holds the DEK_Plain from the question above.
# MasterKeyInfo / RawDataKey and the algorithm suites live in these modules (1.x API)
from aws_encryption_sdk.identifiers import Algorithm
from aws_encryption_sdk.structures import MasterKeyInfo, RawDataKey

new_region_key_id_1 = "arn:aws:kms:us-east-1:XXXXXXXXXX:alias/xyz/master"
new_region_master_key_1 = kms_key_provider.master_key_for_encrypt(new_region_key_id_1)
key_provider_info = {"provider_id": "aws-kms", "key_info": new_region_key_id_1}
key_provider_info_obj = MasterKeyInfo(**key_provider_info)
plain_dk_raw = RawDataKey(key_provider_info_obj, plain_dek)
encryption_context = {"somekey": "some value"}
new_encrypted_dek_from_region_new_master_key_1 = new_region_master_key_1.encrypt_data_key(
    plain_dk_raw, Algorithm.AES_256_GCM_IV12_TAG16_HKDF_SHA384_ECDSA_P384, encryption_context
)
This can be repeated in a loop for any number of new KMS regions, thereby extending an existing data key so that it can be decrypted in the additional AWS regions where new master keys were, say, recently added.
I'm using something like this:
OPEN SYMMETRIC KEY SSNKey
DECRYPTION BY CERTIFICATE SSNCert;
UPDATE Customers
SET SSNEncrypted = EncryptByKey(Key_GUID('SSNKey'), 'DecryptedSSN')
Where SSNEncrypted is a varbinary column. I noticed the values come out different each time. Why is this? And what can I do to get consistent encrypted values, so I can compare them in different tables?
This is "by design".
The function EncryptByKey is nondeterministic.
But if you decrypt the different values, you always get back the original plaintext.
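A minimal sketch that demonstrates this (assuming the SSNKey symmetric key above is still open; the variable names are mine): two encryptions of the same plaintext produce different ciphertexts, yet both decrypt to the same value.
DECLARE @c1 varbinary(256) = EncryptByKey(Key_GUID('SSNKey'), 'DecryptedSSN');
DECLARE @c2 varbinary(256) = EncryptByKey(Key_GUID('SSNKey'), 'DecryptedSSN');
-- The ciphertexts differ because EncryptByKey adds random initialization data on every call
SELECT CASE WHEN @c1 = @c2 THEN 'same' ELSE 'different' END AS ciphertext_comparison;
-- Both decrypt back to the identical plaintext
SELECT CONVERT(varchar(64), DecryptByKey(@c1)) AS plaintext_1,
       CONVERT(varchar(64), DecryptByKey(@c2)) AS plaintext_2;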
Have a look at this blog on MSDN.
Can anyone suggest the best options I can use in Pentaho for my requirement? We need to convert the first_name and last_name attributes into hashes and load the hashed values for these columns into the user table to support the business reports. The reports do not need the actual values for these columns; the reporting code only checks for NULL values in first_name and last_name and validates the length of these fields.
I tried converting the fields to hashes using the Add a checksum step, but I wasn't sure which checksum type to use (CRC32, ADLER32, MD5, SHA-1). Any suggestions?
The source and target DB is PostgreSQL, in case that matters.
Thanks in advance.
Hashing and encryption are not the same thing.
It seems you want a one-way hash. What hash you choose depends mainly on how much you care about collisions. If you don't care that multiple names could generate the same hash, a short fast hash like CRC32 is fine. If you do care about collisions then I'd use at least MD5.
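If you would rather hash on the database side than in the Pentaho step, a minimal PostgreSQL sketch (the user table and column names here are assumptions) can rely on the built-in md5() function; NULL input stays NULL, so the NULL checks in the reports keep working, and an MD5 hex digest is always 32 characters long.
-- md5() is built in to PostgreSQL; NULL stays NULL, anything else becomes a 32-character hex digest
UPDATE users
SET first_name = md5(first_name),
    last_name  = md5(last_name);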
According to Amazon Redshift docs, the passwords must be at least 8 chars, and contain at least one uppercase letter, one lowercase letter, and one number.
Is there a way to disable this for a database?
We do not need such stringent requirements.
Also, the docs are unclear: if I don't specify VALID UNTIL 'something', then it is valid forever, right? (The docs say you can also use VALID UNTIL 'infinity', but they don't explain what happens if you don't include VALID UNTIL at all.)
You cannot modify the Redshift password criteria.
If you are referring to ALTER USER ... [VALID UNTIL], the validity date is not a required field. The password will remain valid forever.
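For example (the user name and password here are placeholders), omitting the clause and specifying 'infinity' should behave the same with respect to expiry:
-- No VALID UNTIL clause: the password never expires
CREATE USER report_user PASSWORD 'Report123';
-- Equivalent explicit form
ALTER USER report_user VALID UNTIL 'infinity';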
By using the md5 function, you can get around the length/character requirements:
https://docs.aws.amazon.com/redshift/latest/dg/r_CREATE_USER.html
In particular, quoting from the above page:
To specify an MD5 password, follow these steps:
1. Concatenate the password and user name. For example, for password ez and user user1, the concatenated string is ezuser1.
2. Convert the concatenated string into a 32-character MD5 hash string. You can use any MD5 utility to create the hash string. The following example uses the Amazon Redshift MD5 function and the concatenation operator (||) to return a 32-character MD5 hash string.
select md5('ez' || 'user1');
md5
153c434b4b77c89e6b94f12c5393af5b
3. Concatenate 'md5' in front of the MD5 hash string and provide the concatenated string as the md5hash argument.
create user user1 password 'md5153c434b4b77c89e6b94f12c5393af5b';
4. Log on to the database using the user name and password. For this example, log on as user1 with password ez.
Is there any pair of matching encrypt and decrypt functions?
The functions in the PGCRYPTO library use hash algorithms, so they don't have decryption functions.
Also, when I use the pgp_sym_encrypt() and pgp_sym_decrypt() functions, pgp_sym_decrypt() gives the above error for a value encrypted with pgp_sym_encrypt().
I am using Postgres Plus Advanced Server 8.4.
Do I have to put \ before every escape-sequence character, or what?
Please explain how to put an encrypted value into a bytea table column, access that bytea data, and decrypt the same value.
Thanks
Tushar
If you encrypt/decrypt binary data, you should use the pgp_sym_encrypt_bytea and pgp_sym_decrypt_bytea functions.
The functions pgp_sym_encrypt and pgp_sym_decrypt are for textual data, which has to be encoded in the client encoding and must be convertible to the database encoding. So you cannot use them to encrypt, for example, images, PDFs, etc.
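A minimal sketch with assumed table and column names, showing both variants side by side (the pgcrypto module must already be installed; on 8.4 it is loaded from the contrib scripts rather than via CREATE EXTENSION):
CREATE TABLE secrets (
    id         serial PRIMARY KEY,
    enc_text   bytea,   -- pgp_sym_encrypt returns bytea even for text input
    enc_binary bytea
);
-- Text data: pgp_sym_encrypt / pgp_sym_decrypt
INSERT INTO secrets (enc_text)
VALUES (pgp_sym_encrypt('some secret text', 'my passphrase'));
SELECT pgp_sym_decrypt(enc_text, 'my passphrase') AS plain_text
FROM secrets WHERE enc_text IS NOT NULL;
-- Binary data: pgp_sym_encrypt_bytea / pgp_sym_decrypt_bytea
INSERT INTO secrets (enc_binary)
VALUES (pgp_sym_encrypt_bytea(decode('deadbeef', 'hex'), 'my passphrase'));
SELECT pgp_sym_decrypt_bytea(enc_binary, 'my passphrase') AS plain_bytes
FROM secrets WHERE enc_binary IS NOT NULL;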