How to use strong encryption for a password field using Grails with a PostgreSQL database?

I had to migrate a legacy database with clear-text passwords to a PostgreSQL database. I've looked up the best way to encrypt passwords in a database and found the pgcrypto extension with its slow hashing algorithms (see the pgcrypto documentation for 8.4).
The data migration is done and everything is working well.
Now I have to write a CRUD application to handle this data.
I'm wondering what's the best way to use this strong encryption with Grails?
In my model, I've used the afterInsert event to handle this:
def afterInsert() {
    Compte.executeUpdate("update Compte set hashpass=crypt(hashpass, gen_salt('bf', 8)) where id = (:compteId)", [compteId: this.id])
}
I guess that I should also check whether the hashpass field is modified whenever the model is saved. But before that, is there a better way to achieve my goal?
Edit: I cannot use the Spring Security bcrypt plugin here. The CRUD application that I'm writing uses SSO with CAS, so I don't need such a plugin. The CRUD application manages accounts for another application that I don't own. I just need to create a new account and modify or delete an existing one. This is very simple. The tricky part is to hack Grails so that it takes the password field into account and uses specific SQL to store it in the PostgreSQL database.
Edit 2:
I've come up with the following code, but it doesn't work:
import javax.crypto.Cipher
import javax.crypto.KeyGenerator
import javax.crypto.SecretKey
import org.apache.commons.codec.binary.Base64

def beforeInsert() {
    hashpass = encodePassword(hashpass);
}

def encodePassword(cleartextpwd) {
    // create a key generator based upon the Blowfish cipher
    KeyGenerator keygenerator = KeyGenerator.getInstance("Blowfish");
    // create a key
    SecretKey secretkey = keygenerator.generateKey();
    // create a cipher based upon Blowfish
    Cipher cipher = Cipher.getInstance("Blowfish");
    // initialise the cipher with the secret key
    cipher.init(Cipher.ENCRYPT_MODE, secretkey);
    // get the text to encrypt
    String inputText = cleartextpwd;
    // encrypt the message
    byte[] encrypted = cipher.doFinal(inputText.getBytes("UTF-8"));
    return Base64.encodeBase64String(encrypted);
}
I get a value that is not a bcrypt (Blowfish crypt) hash like the ones pgcrypto generates, i.e. one beginning with $2a$08$.
Edit 3:
I've finally come up with a cleaner Grails solution after reading this wiki page: http://grails.org/Simple+Dynamic+Password+Codec and the bug report http://jira.grails.org/browse/GRAILS-3620
Following advice from @lukelazarovic, I've also used the algorithm from the Spring Security plugin.
Here is my password codec, placed in grails-app/utils:
import grails.plugin.springsecurity.authentication.encoding.BCryptPasswordEncoder;

class BlowfishCodec {
    static encode(target) {
        // TODO need to put the logcount = 8 in configuration file
        return new BCryptPasswordEncoder(8).encodePassword(target, null)
    }
}
I've updated my Compte model to call my password codec before saving/updating the data:
def beforeInsert() {
    hashpass = hashpass.encodeAsBlowfish();
}

def beforeUpdate() {
    if (isDirty('hashpass')) {
        hashpass = hashpass.encodeAsBlowfish();
    }
}

"The tricky part is to hack Grails so that it takes the password field into account and uses specific SQL to store it in the PostgreSQL database."
Is there any particular reason to do the hashing in the database?
IMHO it's better to hash the password in Grails and thus have code that is not database-specific and is easier to read.
For hashing passwords with the Blowfish algorithm in Java or Groovy, see Encryption with BlowFish in Java.
The resulting hash begins with the algorithm specification, the iteration count and the salt, separated by dollar signs ('$'). So the hash may look like "$2a$08$saltCharacters", where 2a is the algorithm, 08 is the iteration count, then the salt follows, and after the salt comes the hash itself.
For a broader explanation see http://www.techrepublic.com/blog/australian-technology/securing-passwords-with-blowfish. Don't mind that it concerns Blowfish in PHP; the principles apply to Java or Groovy as well.
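To illustrate the format described above, here is a minimal standalone Java sketch using org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder (assuming the spring-security-crypto library is on the classpath; the work factor 8 mirrors gen_salt('bf', 8) on the pgcrypto side):

import org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder;

public class BcryptDemo {
    public static void main(String[] args) {
        // Work factor 8, matching gen_salt('bf', 8) used during the migration.
        BCryptPasswordEncoder encoder = new BCryptPasswordEncoder(8);
        String hash = encoder.encode("s3cret");
        // Prints something like $2a$08$ followed by the salt and the hash itself.
        System.out.println(hash);
        // Verification re-reads the salt and work factor from the stored hash.
        System.out.println(encoder.matches("s3cret", hash)); // true
    }
}

Because the $2a$ format is standard, a hash produced this way should also be verifiable on the database side with pgcrypto's crypt(entered, hashpass) = hashpass check.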

Is it "secure" to store a password and username in a .env file in a server to validate an admin API endpoint against?

Context
I've built a RESTful API server in Actix-Web with Rust that's hosted on a Heroku paid plan. It has a number of publicly available endpoints to access content, alongside 3 strictly admin-only endpoints (for creating, editing, and deleting public content).
I am the only developer who'd ever need to access the admin-only endpoints - and infrequently at that. Several random users will be using the publicly available endpoints daily.
Normally, I'd implement an authentication/authorization strategy akin to this using JWTs (but obviously in Rust for my case). However, the added complexity that comes with this "more common" solution seems overkill for my simple use-case.
My theorized solution
Could I add a username and password field to the .env file in my project like so in order to match against a username and password passed in the admin-only handler functions?
... OTHER KEYS ...
USERNAME = my_really_long_random_username
PASSWORD = my_really_long_random_password
At first glance I'm storing passwords in plain text... but, there's only 1 and it's in my .env file, which is private by default.
All I'd do for the admin-only routes then is this (pseudo-code):
pub fn router_handler(passed_data) -> HttpResponse {
    if passed_data.username == env.username && passed_data.password == env.password {
        // CONSIDER THEM ADMIN
    } else {
        // BLOCK THEM AS THEY'RE NOT AUTHENTICATED
    }
}
What I've tried
I have yet to try this strategy, but I'm curious about your opinions on it.
Question
Is my theorized solution secure? Does it seem reasonable given my use-case?
Response to jthulhu's answer - is this what I do?
So, my .env file should look something like this:
... OTHER KEYS ...
USERNAME = a98ysnrn938qwyanr9c8yQden
PASSWORD = aosdf83h282huciquhr8291h91
where both of those hashes are the results of running my pre-determined username and password through my to_hash function, which I added below (likely using a lib like this).
Then, my handler should be like this (pseudo-code):
pub fn router_handler(passed_data) -> HttpResponse {
    if to_hash(passed_data.username) == env.username && to_hash(passed_data.password) == env.password {
        // CONSIDER THEM ADMIN
    } else {
        // BLOCK THEM AS THEY'RE NOT AUTHENTICATED
    }
}
You should never store passwords in plain text on a server, because if someone breaks into your server and can read that file, they now have access to everything (whereas they might previously not have). Not only that, but most people tend to reuse passwords, so storing one password in plain text means exposing every other service where that password is used.
Instead, you should hash the passwords and store the hash. To perform a login, check whether the hash of the given password corresponds to the one stored. This mechanism works with files or with databases alike, and is pretty much independent of how you actually store the hashes.
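As a language-agnostic illustration of that check (sketched here in Java rather than Rust, using Spring Security's bcrypt implementation; the environment variable names are made up for the example), the handler only ever compares against stored hashes:

import org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder;

public class AdminAuth {
    // Bcrypt hashes of the admin credentials, kept in the environment
    // instead of the plain values (variable names are illustrative).
    private static final String USERNAME_HASH = System.getenv("ADMIN_USERNAME_HASH");
    private static final String PASSWORD_HASH = System.getenv("ADMIN_PASSWORD_HASH");

    private static final BCryptPasswordEncoder ENCODER = new BCryptPasswordEncoder();

    // True only if both supplied credentials match the stored hashes;
    // matches() re-hashes the input with the salt embedded in the stored hash.
    public static boolean isAdmin(String username, String password) {
        return ENCODER.matches(username, USERNAME_HASH)
            && ENCODER.matches(password, PASSWORD_HASH);
    }
}

The same shape works with any slow hash (bcrypt, scrypt, Argon2); the point is that the plain credentials never appear in the .env file or in the code.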

IdentityServer3 idsrv.partial cookie gets too big

After login, when redirecting the user using context.AuthenticateResult = new AuthenticateResult(<destination>, subject, name, claims), the partial cookie gets so big that it contains up to 4 chunks and ends up causing a "request too big" error.
The number of claims is not outrageous (in the 100 range) and I haven't been able to consistently reproduce this on other environments, even with a larger number of claims. What else might be affecting the size of this cookie payload?
Running IdSrv3 2.6.1
I assume that you are using some .NET Framework clients, because all of these problems are usually connected with the Microsoft.Owin middleware, whose encryption causes the cookie to get this big.
The solution for you is again part of this middleware. All of your clients (using the Identity Server as authority) need to have a custom IAuthenticationSessionStore implementation.
This is an interface, part of Microsoft.Owin.Security.Cookies.
You need to implement it according to whatever store you want to use for it, but basically it has the following structure:
public interface IAuthenticationSessionStore
{
    Task RemoveAsync(string key);
    Task RenewAsync(string key, AuthenticationTicket ticket);
    Task<AuthenticationTicket> RetrieveAsync(string key);
    Task<string> StoreAsync(AuthenticationTicket ticket);
}
We ended up implementing a SQL Server store for the cookies. Here is an example of a Redis implementation, and here is another one with an EF DbContext, but don't feel forced to use either of those.
Let's say that you implement MyAuthenticationSessionStore : IAuthenticationSessionStore with all the values that it needs.
Then wire it up in your Owin Startup.cs when calling:
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    AuthenticationType = "Cookies",
    SessionStore = new MyAuthenticationSessionStore(),
    CookieName = cookieName
});
With this in place, as the documentation for the SessionStore property says:
// An optional container in which to store the identity across requests. When used,
// only a session identifier is sent to the client. This can be used to mitigate
// potential problems with very large identities.
In your header you will have only the session identifier, and the identity itself will be read from the store that you have implemented.

password hashing using md5 in mongoengine

I am using mongoengine (a MongoDB ORM) in Django. I want to authenticate the user, and the password should be stored as a hashed value. Please help me, as mongoengine doesn't provide a PasswordField() to store passwords.
Are there any other options through which I can authenticate the user login?
Django has two very useful password hashing algorithms built in.
See docs.djangoproject.com, which states
By default, Django uses the PBKDF2 algorithm with a SHA256 hash
and
Bcrypt is a popular password storage algorithm that’s specifically designed for long-term password storage. It’s not the default used by Django since it requires the use of third-party libraries, but since many people may want to use it Django supports bcrypt with minimal effort.
Either of these two is excellent if you use a large enough number of iterations/work factor; do not use any of the other options. This is made easy by Django per the link above:
The PBKDF2 and bcrypt algorithms use a number of iterations or rounds of hashing. This deliberately slows down attackers, making attacks against hashed passwords harder. However, as computing power increases, the number of iterations needs to be increased. We’ve chosen a reasonable default (and will increase it with each release of Django), but you may wish to tune it up
So, in your settings file for a new application, you could increase the work factor with a new subclass:
from django.contrib.auth.hashers import PBKDF2PasswordHasher

class MyPBKDF2PasswordHasher(PBKDF2PasswordHasher):
    """
    A subclass of PBKDF2PasswordHasher that uses 100 times more iterations.
    """
    iterations = PBKDF2PasswordHasher.iterations * 100
Then put your variant in the settings file, while still allowing old PBKDF2-HMAC-SHA-256 and BCryptSHA256 hashes to be read:
PASSWORD_HASHERS = [
    'myproject.hashers.MyPBKDF2PasswordHasher',
    'django.contrib.auth.hashers.PBKDF2PasswordHasher',
    'django.contrib.auth.hashers.BCryptSHA256PasswordHasher',
    'django.contrib.auth.hashers.BCryptPasswordHasher',
]
And also set some password validation:
AUTH_PASSWORD_VALIDATORS = [
    {
        'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
        'OPTIONS': {
            'min_length': 9,
        }
    },
    {
        'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
    },
]
To validate an entered password (like on a login page) against a stored password:
check_password(password, encoded)
To generate a new password entry (like from a registration page, where they select or change their password):
make_password(password, salt=None, hasher='default')
Both functions are importable from django.contrib.auth.hashers.
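For a sense of what those hashers do under the hood (independent of Django), here is a minimal PBKDF2 sketch using only the JDK; the iteration count and output format are chosen for illustration and are not Django's:

import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import java.security.SecureRandom;
import java.util.Base64;

public class Pbkdf2Demo {
    public static String hashPassword(char[] password) throws Exception {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);   // fresh random salt per password
        int iterations = 600_000;             // deliberately slow; raise as hardware improves
        PBEKeySpec spec = new PBEKeySpec(password, salt, iterations, 256);
        byte[] derived = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
                .generateSecret(spec)
                .getEncoded();
        // Store iterations, salt and derived key together so the check can be repeated later.
        Base64.Encoder b64 = Base64.getEncoder();
        return iterations + "$" + b64.encodeToString(salt) + "$" + b64.encodeToString(derived);
    }
}

Verification repeats the derivation with the stored salt and iteration count and compares the results, which is exactly what check_password does for you.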

How do I sign a PDF with a Smart Card in a web context using iText?

Read through the following references:
iText Digital signature white paper and C# examples (specifically chapter 4). For those interested, another great and concise summary of the PDF signing process.
CAPICOM documentation.
Online examples / questions here and on iText mailing list archives, such as here and here.
Hashing code:
BouncyCastle.X509Certificate[] chain = Utils.GetSignerCertChain();
reader = Utils.GetReader();
MemoryStream stream = new MemoryStream();
using (var stamper = PdfStamper.CreateSignature(reader, stream, '\0'))
{
    PdfSignatureAppearance sap = stamper.SignatureAppearance;
    sap.SetVisibleSignature(
        new Rectangle(36, 740, 144, 770),
        reader.NumberOfPages,
        "SignatureField"
    );
    sap.Certificate = chain[0];
    sap.SignDate = DateTime.Now;
    sap.Reason = "testing web context signatures";
    PdfSignature pdfSignature = new PdfSignature(
        PdfName.ADOBE_PPKLITE, PdfName.ADBE_PKCS7_DETACHED
    );
    pdfSignature.Date = new PdfDate(sap.SignDate);
    pdfSignature.Reason = sap.Reason;
    sap.CryptoDictionary = pdfSignature;
    Dictionary<PdfName, int> exclusionSizes = new Dictionary<PdfName, int>();
    exclusionSizes.Add(PdfName.CONTENTS, SIG_BUFFER * 2 + 2);
    sap.PreClose(exclusionSizes);
    Stream sapStream = sap.GetRangeStream();
    byte[] hash = DigestAlgorithms.Digest(
        sapStream,
        DigestAlgorithms.SHA256
    );
    // is this needed?
    PdfPKCS7 sgn = new PdfPKCS7(
        null, chain, DigestAlgorithms.SHA256, true
    );
    byte[] preSigned = sgn.getAuthenticatedAttributeBytes(
        hash, sap.SignDate, null, null, CryptoStandard.CMS
    );
    var hashedValue = Convert.ToBase64String(preSigned);
}
Just a simple test: a dummy PDF document is created on the initial page request, the hash is calculated and put in a hidden input field, Base64 encoded (the hashedValue above).
Then CAPICOM is used on the client side to POST the form and get the user's signed response:
PdfSignatureAppearance sap = (PdfSignatureAppearance)TempData[TEMPDATA_SAP];
PdfPKCS7 sgn = (PdfPKCS7)TempData[TEMPDATA_PKCS7];
stream = (MemoryStream)TempData[TEMPDATA_STREAM];
byte[] hash = (byte[])TempData[TEMPDATA_HASH];
byte[] originalText = (Encoding.Unicode.GetBytes(hashValue));
// Oid algorithm verified on client side
ContentInfo content = new ContentInfo(new Oid("RSA"), originalText);
SignedCms cms = new SignedCms(content, true);
cms.Decode(Convert.FromBase64String(signedValue));
// CheckSignature does not throw exception
cms.CheckSignature(true);
var encodedSignature = cms.Encode();
/* tried this too, but no effect on result
sgn.SetExternalDigest(
    Convert.FromBase64String(signedValue),
    null,
    "RSA"
);
byte[] encodedSignature = sgn.GetEncodedPKCS7(
    hash, sap.SignDate, null, null, null, CryptoStandard.CMS
);
*/
byte[] paddedSignature = new byte[SIG_BUFFER];
Array.Copy(encodedSignature, 0, paddedSignature, 0, encodedSignature.Length);
var pdfDictionary = new PdfDictionary();
pdfDictionary.Put(
    PdfName.CONTENTS,
    new PdfString(paddedSignature).SetHexWriting(true)
);
sap.Close(pdfDictionary);
So right now I'm not sure if I'm messing up the hashing part, the signature part, or both. In the signature code snippet above and in client code (not shown), I'm calling what I think is signature verification code, but that may be wrong too, since this is a first for me. I get the infamous "Document has been altered or corrupted since it was signed" invalid-signature message when opening the PDF.
Client-side code (not authored by me) can be found here. The source had a variable naming error, which was corrected. For reference, the CAPICOM documentation says the signed response is in PKCS#7 format.
EDIT 2015-03-12:
After some nice pointers from @mkl and more research, it seems CAPICOM is practically unusable in this scenario. Although not documented clearly (what else is new?), according to here and here, CAPICOM expects a UTF-16 string (Encoding.Unicode in .NET) as input to create a digital signature. From there it either pads or truncates (depending on which source in the previous sentence is correct) whatever data it receives if the length is an odd number. I.e. signature creation will ALWAYS FAIL if the Stream returned by PdfSignatureAppearance.GetRangeStream() has a length that is an odd number. Maybe I should create an "I'm lucky" option: sign if the ranged stream length is even, and throw an InvalidOperationException if odd. (sad attempt at humor)
For reference, here's the test project.
EDIT 2015-03-25:
To close the loop on this, here's a link to a VS 2013 ASP.NET MVC project. It may not be the best way, but it does provide a fully working solution to the problem. Because of CAPICOM's strange and inflexible signing implementation, as described above, I knew a possible solution would potentially require a second pass and a way to inject an extra byte if the return value of PdfSignatureAppearance.GetRangeStream() (again, Stream.Length) is an odd number. I was going to try the long and hard way by padding the PDF content, but luckily a co-worker found it was much easier to pad PdfSignatureAppearance.Reason. Requiring a second pass to do something with iText[Sharp] is not unprecedented - e.g. adding page x of y to a document page header/footer.
Use of PdfPkcs7
The server-side code contains this block after the calculation of the range stream digest and before forwarding data to the web page:
PdfPKCS7 sgn = new PdfPKCS7(
    null, chain, DigestAlgorithms.SHA256, true
);
byte[] preSigned = sgn.getAuthenticatedAttributeBytes(
    hash, sap.SignDate, null, null, CryptoStandard.CMS
);
var hashedValue = Convert.ToBase64String(preSigned);
In the case at hand this is not necessary. It is needed only if the external signing API you use merely returns a signed digest; in that case the PdfPKCS7 instance builds the CMS/PKCS#7 signature container. You, on the other hand, use an API for which you know
CAPICOM documentation says signed response is in PKCS#7 format.
Thus, you don't need and (more to the point) must not use the PdfPKCS7 instance.
What does sign.js sign
The content of the server-side hash variable already is the hash digest value of the data to sign. Thus, the frontend, i.e. the sign.js used there, must not hash it again to get the message digest attribute value to put into the signature.
But sign.js signing methods for IE eventually execute
var signedData = new ActiveXObject("CAPICOM.SignedData");
// Set the data that we want to sign
signedData.Content = src;
SignedData.Content, on the other hand, is documented as
Content (read/write): Data to be signed.
(msdn: "SignedData object")
So the hash from the backend is used as the data to be signed, not as the hash of the data to be signed; you indeed hash twice and so have the wrong hash value there.
Thus, it looks like you have to transmit the whole ranged stream, which is not really practical...
"But there used to be signing samples using CAPICOM..."
Indeed, some old iTextSharp (version 4.x) signing examples used CAPICOM. But that code only worked because it created signatures of the PDF signature type adbe.pkcs7.sha1, for which a SHA1 hash of the ranged stream indeed is the data embedded in and signed by the PKCS#7 signature.
This is no real option anymore because
it requires the use of SHA1 which in serious contexts is invalid, and
its use has been discouraged at least since ISO 32000-1 (2008) and will be officially deprecated in ISO 32000-2 (under development).

Laravel postgresql case insensitive

I'm coding a web app using Laravel 4.1 and PostgreSQL as the database.
The DB is case-sensitive, but I'd like to make it case-insensitive because, for example, when a user is logging in he should be able to use an upper-case or lower-case email address (like on every other website). However, the column for the password hash must stay case-sensitive because the encryption method I use generates case-sensitive strings.
I'm using Laravel's Eloquent ORM, so I don't write queries directly.
How can I solve this problem?
Thanks in advance!
A bit late, but this is pretty simple. Just use the "ILIKE" operator, e.g.
User::where("email", "ILIKE", $email)->get();
I had this exact problem - my solution was to pretend to be case-insensitive:
1) Added one line in the relevant methods to make the entered email value lowercase
2) Did a replacement in the database so that existing emails were lowercase
3) Made sure that new emails come in as lowercase
Details:
1) The relevant methods* are core Laravel code, so you override them with a new version that replaces the email value after validating the request:
$request['email'] = Str::lower($request['email']);
Login flow: the postLogin function from the AuthenticatesUsers trait; I added it to postLogin in AuthController.php
Password reset flow: the postEmail and postReset functions from the ResetsPassword trait; I added them in PasswordController.php
2) update users set email = lower(email);
3) For me this is easy because I create all users myself (my site is just for family) - but you'd do something similar in the auth flow.
Hope this helps!