How do I download a document from EMC Documentum D2 in C# while using Kerberos authentication?

My C# application is deployed in SharePoint, and I want to download a document from EMC Documentum D2 in C# and upload it to a SharePoint document library.
I am using Kerberos authentication for single sign-on. Kerberos works fine when I search for documents; however, when I try to download a document from D2, the code somehow uses the service account for the download (I know it uses the service account because I can see it in Documentum's incoming traffic log).
IObjectService objectService = this.GetRemoteServiceDownload<IObjectService>(moduleName);
The definition of the function:
protected IObjectService GetRemoteServiceDownload<IObjectService>(string serviceModule)
{
    KerberosTokenHandler handler = new KerberosTokenHandler();
    try
    {
        // Acquire a delegation-level Kerberos context for the target SPN.
        using (KerberosClientContext kerberosClientContext = new KerberosClientContext(servicePrincipalName, true, ImpersonationLevel.Delegation))
        {
            try
            {
                // Wrap the AP-REQ ticket as a binary security token for the DFS call.
                KerberosBinarySecurityToken token = new KerberosBinarySecurityToken(kerberosClientContext.InitializeContext(), KerberosValueType.KERBEROSV5_AP_REQ);
                handler.SetBinarySecurityToken(token);

                List<IEndpointBehavior> handlers = new List<IEndpointBehavior>();
                handlers.Add(handler);
                handlers.Add(new DFSBindingBehaviour(0, 10, 0, 0, 10, 0, 40960, 32, 16384, 16384, 20000000));

                var remoteService = ServiceFactory.Instance.GetRemoteService<IObjectService>(serviceContext, serviceModule, address, handlers);
                return remoteService;
            }
            catch (Exception ex)
            {
                Service.LoggerService.SetError(new Exception("In GetRemoteService: " + ex.Message, ex));
                return default(IObjectService);
            }
        }
    }
    catch (Exception ex)
    {
        Service.LoggerService.SetError(new Exception("In GetRemoteService using: " + ex.Message, ex));
        return default(IObjectService);
    }
}

Guys, I was able to solve the problem!
It was not a Kerberos issue; it was actually an issue with the location of the calling function.
The function that initiated the Kerberos authentication was being called inside
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    // the D2 download was initiated in here
});
Because of this, the Kerberos authentication used the elevated service account identity to download the document.
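In other words (a minimal sketch; moduleName is the same variable as in the call above), the fix is to move the Kerberos-authenticated call outside the elevated block:

// Sketch: keep only the SharePoint work that genuinely needs elevation inside.
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    // SharePoint operations requiring elevation go here.
});

// Outside the block, the call runs under the logged-in user's identity,
// so their Kerberos ticket is the one delegated to D2.
IObjectService objectService = GetRemoteServiceDownload<IObjectService>(moduleName);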


SSIS - Redirecting Email to Folder

I am generating emails from an SSIS package using a Script Task. During testing, I do not want to actually send the email, but instead drop the message into a folder. In a web application, I would use the specifiedPickupDirectory option in the web.config, but SSIS packages do not have a web.config.
Is there a way to send the email to a folder?
Thanks
If your Script Task is using C#, then the following should work. It's similar to how you would change the web.config to use specifiedPickupDirectory:
SmtpClient client = new SmtpClient("my_smtp_host");
// Write the message to a local directory instead of sending it.
client.DeliveryMethod = SmtpDeliveryMethod.SpecifiedPickupDirectory;
client.PickupDirectoryLocation = @"C:\save_email_directory";
client.Send(message);
You may also need to add network credentials; see the link for an example.
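For instance (a sketch; the user name and password are placeholders), credentials attach directly to the client:

// Hypothetical values: only needed if the SMTP host requires authentication.
client.Credentials = new System.Net.NetworkCredential("smtp_user", "smtp_password");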
If you use Exchange mail and this library: http://independentsoft.de/, you can create a message and move it into a specific folder.
I do not own this software, but I'm a satisfied user.
Just start here: http://independentsoft.de/exchangewebservices/tutorial/createmessage.html with this example code:
using System;
using System.Net;
using Independentsoft.Exchange;

namespace Sample
{
    class Program
    {
        static void Main(string[] args)
        {
            NetworkCredential credential = new NetworkCredential("username", "password");
            Service service = new Service("https://myserver/ews/Exchange.asmx", credential);

            try
            {
                Message message = new Message();
                message.Subject = "Test";
                message.Body = new Body("Body text");
                message.ToRecipients.Add(new Mailbox("John@mydomain.com"));
                message.CcRecipients.Add(new Mailbox("Mark@mydomain.com"));

                ItemId itemId = service.CreateItem(message);
            }
            catch (ServiceRequestException ex)
            {
                Console.WriteLine("Error: " + ex.Message);
                Console.WriteLine("Error: " + ex.XmlMessage);
                Console.Read();
            }
            catch (WebException ex)
            {
                Console.WriteLine("Error: " + ex.Message);
                Console.Read();
            }
        }
    }
}

The client is not authorized to make this request -- while trying to get a Google Cloud SQL instance from Java

I want to get the details of a Google Cloud SQL instance using a Google Cloud service account. I have created a service account with billing enabled. I have successfully done Google Cloud Storage work (bucket create, bucket delete, and so on) with this service account from Java code. But when I try the Cloud SQL functionality, I get the following error:
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "The client is not authorized to make this request.",
    "reason" : "notAuthorized"
  } ],
  "message" : "The client is not authorized to make this request."
}
Below is my Java code snippet:
private SQLAdmin authorizeSqlAdmin() throws Exception {
    if (cloudSqlAdmin == null) {
        HttpTransport httpTransport = new NetHttpTransport();
        JsonFactory jsonFactory = new JacksonFactory();

        List<String> scopes = new ArrayList<String>();
        scopes.add(SQLAdminScopes.CLOUD_PLATFORM);
        scopes.add(SQLAdminScopes.SQLSERVICE_ADMIN);

        String propertiesFileName = "/cloudstorage.properties";
        Properties cloudStorageProperties = null;
        try {
            cloudStorageProperties = Utilities.getProperties(propertiesFileName);
        } catch (Exception e) {
            logger.error(e.getMessage(), e);
            return null;
        }

        Credential credential = new GoogleCredential.Builder()
                .setTransport(httpTransport)
                .setJsonFactory(jsonFactory)
                .setServiceAccountId(
                        cloudStorageProperties.getProperty(ACCOUNT_ID_PROPERTY))
                .setServiceAccountPrivateKeyFromP12File(
                        new File(cloudStorageProperties.getProperty(PRIVATE_KEY_PATH_PROPERTY)))
                .setServiceAccountScopes(scopes)
                .build();

        cloudSqlAdmin = new SQLAdmin.Builder(httpTransport, jsonFactory, credential)
                .setApplicationName(
                        cloudStorageProperties.getProperty(APPLICATION_NAME_PROPERTY))
                .build();
    }
    return cloudSqlAdmin;
}

public DatabaseInstance getInstanceByInstanceId(String projectId, String instanceId) throws Exception {
    SQLAdmin cloudSql = authorizeSqlAdmin();
    Get get = cloudSql.instances().get(projectId, instanceId);
    DatabaseInstance dbInstance = get.execute();
    return dbInstance;
}
What am I missing here?
Somebody please help me.
N.B.: I have added that service account as a member in the Permissions tab and gave it CAN EDIT permission.
Solved this issue by replacing the instance ID value.
From the Google Cloud console I got the instance ID as project-id:instance-name.
I put the whole project-id:instance-name as the instance ID, and that's why I got the above error.
After some trials I found that I need to give only instance-name as the instanceId here:
Get get = cloudSql.instances().get(projectId, instanceId);
That solved my problem.
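In other words (a minimal sketch; the consoleId value is hypothetical), split the console's identifier and pass only the part after the colon:

// The console displays "my-project:my-instance"; instances().get() wants them split.
String consoleId = "my-project:my-instance"; // hypothetical value from the console
int colon = consoleId.indexOf(':');
String projectId = consoleId.substring(0, colon);
String instanceId = consoleId.substring(colon + 1);
DatabaseInstance dbInstance = cloudSql.instances().get(projectId, instanceId).execute();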
Updated answer
If you are on Terraform and receive this error, it means that your master instance name is set wrongly. A master should refer to an instance name that already exists in Cloud SQL (i.e., whichever is to be the master of the instance you are creating):
master_instance_name = "${google_sql_database_instance.master.name}"
It would be the same for a JSON setup:
"masterInstanceName": "source-instance"

Mapping an Azure File Service CloudFileShare as a virtual directory on each instance of a cloud service

I have an Azure cloud service which I am attempting to upgrade for high availability, and I have subscribed to the Microsoft Azure File Service preview, which has been enabled in the preview portal. I have created a new storage account and can see that the storage account now has a Files endpoint located at:
https://<account-name>.file.core.windows.net/
Within my web role I have the following code, which checks whether a share called scorm exists and creates it if not:
public static void CreateCloudShare()
{
    CloudStorageAccount account = CloudStorageAccount.Parse(System.Configuration.ConfigurationManager.AppSettings["SecondaryStorageConnectionString"].ToString());
    CloudFileClient client = account.CreateCloudFileClient();
    CloudFileShare share = client.GetShareReference("scorm");
    share.CreateIfNotExistsAsync().Wait();
}
This works without issue. My problem is that I am unsure how to map the share that has been created as a virtual directory within my cloud service. On a single instance I was able to do this:
public static void CreateVirtualDirectory(string VDirName, string physicalPath)
{
    try
    {
        if (VDirName[0] != '/')
            VDirName = "/" + VDirName;

        using (var serverManager = new ServerManager())
        {
            string siteName = RoleEnvironment.CurrentRoleInstance.Id + "_" + "Web";
            //Site theSite = serverManager.Sites[siteName];
            Site theSite = serverManager.Sites[0];

            foreach (var app in theSite.Applications)
            {
                if (app.Path == VDirName)
                {
                    // already exists
                    return;
                }
            }

            Microsoft.Web.Administration.VirtualDirectory vDir = theSite.Applications[0].VirtualDirectories.Add(VDirName, physicalPath);
            serverManager.CommitChanges();
        }
    }
    catch (Exception ex)
    {
        System.Diagnostics.EventLog.WriteEntry("Application", ex.Message, System.Diagnostics.EventLogEntryType.Error);
        //System.Diagnostics.EventLog.WriteEntry("Application", ex.InnerException.Message, System.Diagnostics.EventLogEntryType.Error);
    }
}
I have looked and seen that it is possible to map this via PowerShell, but I am unsure how I could call that from within my web role. I have added the following method to run the command:
public static int ExecuteCommand(string exe, string arguments, out string error, int timeout)
{
    Process p = new Process();
    int exitCode;
    p.StartInfo.FileName = exe;
    p.StartInfo.Arguments = arguments;
    p.StartInfo.CreateNoWindow = true;
    p.StartInfo.UseShellExecute = false;
    p.StartInfo.RedirectStandardError = true;
    p.Start();
    error = p.StandardError.ReadToEnd();
    p.WaitForExit(timeout);
    exitCode = p.ExitCode;
    p.Close();
    return exitCode;
}
I know that the command I have to run is:
net use z: \\<account-name>.file.core.windows.net\scorm /u:<account-name> <account-key>
How can I use this from within my web role? My web role code is as follows, but it does not seem to be working:
public override bool OnStart()
{
    try
    {
        CreateCloudShare();
        ExecuteCommand("net.exe", "user " + userName + " " + password + " /add", out error, 10000);
        ExecuteCommand("netsh.exe", "firewall set service type=fileandprint mode=enable scope=all", out error, 10000);
        ExecuteCommand("net.exe", " share " + shareName + "=" + path + " /Grant:" + userName + ",full", out error, 10000);
    }
    catch (Exception ex)
    {
        System.Diagnostics.EventLog.WriteEntry("Application", "CREATE CLOUD SHARE ERROR : " + ex.Message, System.Diagnostics.EventLogEntryType.Error);
    }
    return base.OnStart();
}
Our blog post Persisting connections to Microsoft Azure Files has an example of referencing Azure Files shares from web and worker roles. Please see the "Windows PaaS Roles" section and also take a look at the note under "Web Roles and User Contexts".
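In a role's OnStart, the mapping itself can then be issued with the question's ExecuteCommand helper; a minimal, untested sketch using the same placeholders as the command above:

string error;
// Map the scorm share; <account-name> and <account-key> are the placeholders above.
int exitCode = ExecuteCommand("net.exe",
    @"use z: \\<account-name>.file.core.windows.net\scorm /u:<account-name> <account-key>",
    out error, 20000);
if (exitCode != 0)
{
    System.Diagnostics.EventLog.WriteEntry("Application", "NET USE ERROR : " + error,
        System.Diagnostics.EventLogEntryType.Error);
}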
The library RedDog.Storage makes it really easy to mount a drive in your Cloud Service without having to worry about P/Invoke:
Install-Package RedDog.Storage
After the package is installed, you can simply use the extension method Mount on your CloudFileShare:
public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Mount a drive.
        FilesMappedDrive.Mount("P:", @"\\acc.file.core.windows.net\reports", "sandibox",
            "key");

        // Unmount a drive.
        FilesMappedDrive.Unmount("P:");

        // Mount a drive for a CloudFileShare.
        CloudFileShare share = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"))
            .CreateCloudFileClient()
            .GetShareReference("reports");
        share.Mount("P:");

        // List drives mapped to an Azure Files share.
        foreach (var mappedDrive in FilesMappedDrive.GetMountedShares())
        {
            Trace.WriteLine(String.Format("{0} - {1}", mappedDrive.DriveLetter, mappedDrive.Path));
        }

        return base.OnStart();
    }
}
More information: http://fabriccontroller.net/blog/posts/using-the-azure-file-service-in-your-cloud-services-web-roles-and-worker-role/

Java API to get file content from enterprise GitHub

I tried so hard to find a simple line of code that reads a file's content from enterprise GitHub with an OAuth token, but could not find an example.
I tried https://github.com/jcabi/jcabi-github, but it does not seem to support enterprise GitHub (maybe I am wrong).
Now I am trying egit:
GitHubClient client = new GitHubClient("enterprise url");
GitHubRequest request = new GitHubRequest();
request.setUri("/readme");
GitHubResponse response = client.get(request);
Then what? I only see a getBody; maybe I need to parse it with some kind of JSON library? It has to be simpler. I am expecting something like: repo.get(url).getContent()
Finally figured it out by reading the source code:
GitHubClient client = new GitHubClient(YOURENTERPRISEURL);
client.setOAuth2Token(token);

// first use the token with the repository service
RepositoryService repoService = new RepositoryService(client);
try {
    Repository repo = repoService.getRepository(USER, REPONAME);

    // now the contents service
    ContentsService contentService = new ContentsService(client);
    List<RepositoryContents> test = contentService.getContents(repo, YOURFILENAME);
    List<RepositoryContents> contentList = contentService.getContents(repo);

    for (RepositoryContents content : test) {
        // file content comes back Base64-encoded
        String fileContent = content.getContent();
        String valueDecoded = new String(Base64.decodeBase64(fileContent.getBytes()));
        System.out.println(valueDecoded);
    }
} catch (IOException e) {
    e.printStackTrace();
}

Cloud SQL Admin API

I've been working with sqladmin-appengine-sample and the v1beta3 JSON API.
The Java code is running on App Engine, using OAuth2.
I can get it to work when the currently logged-in user is the app owner, but what I think I need is something like AppIdentityCredential, so that the app can access any of the SQL instances it has access to regardless of the currently logged-in user.
How do I do this?
Do I need to use a service account?
The short answer is that I could not get AppIdentityCredential to work, but setting up a service account credential did work. Here is the code:
Set<String> oAuthScopes = new HashSet<String>();
oAuthScopes.add(SQLAdminScopes.CLOUD_PLATFORM);
oAuthScopes.add(SQLAdminScopes.SQLSERVICE_ADMIN);

// service account credential
GoogleCredential credential;
try {
    File p12File = new File(servletContext.getResource(PK12_FILE_NAME).toURI());
    credential = new GoogleCredential.Builder()
            .setTransport(Utils.HTTP_TRANSPORT)
            .setJsonFactory(Utils.JSON_FACTORY)
            .setServiceAccountId(SERVICE_ACCOUNT_ID)
            .setServiceAccountScopes(oAuthScopes)
            .setServiceAccountPrivateKeyFromP12File(p12File)
            .build();
} catch (Exception e) {
    throw new SecurityException(e);
}

// build the SQLAdmin object using the credentials
this.sqlAdmin = new SQLAdmin.Builder(Utils.HTTP_TRANSPORT, Utils.JSON_FACTORY, credential)
        .setApplicationName(APPLICATION_NAME)
        .build();

String timestamp = new Date().toString().replace(" ", "_").replace(":", "_");

ExportContext exportContent = new ExportContext();
exportContent.setDatabase(Arrays.asList(database_name));
exportContent.setKind("sql#exportContext");
exportContent.setUri("gs://" + GCS_BUCKET_NAME + "/" + database_name + "_" + timestamp + ".mysql");

InstancesExportRequest exportRequest = new InstancesExportRequest();
exportRequest.setExportContext(exportContent);

// execute the exportRequest
this.sqlAdmin.instances().export(APPLICATION_NAME, instance_name, exportRequest).execute();