I need to create files for 45 separate locations (for example: Boston, London, etc.), and the file names have to be based on the date. Can I also specify a maximum file size at which to roll the files, and a maximum number of rolled files to keep?
Basically, a file name must look like: Info_Boston_(2019.02.25).txt
So far I have come up with the code below to roll by date, but I couldn't limit the file size to 1MB: the file grows beyond 1MB and a new rolling file is not created. Please assist.
<appender name="MyAppenderInfo" type="log4net.Appender.RollingFileAppender">
  <param name="File" value="C:\\ProgramData\\Service\\Org\\Info"/>
  <param name="RollingStyle" value="Date"/>
  <param name="DatePattern" value="_(yyyy.MM.dd).\tx\t"/>
  <param name="StaticLogFileName" value="false"/>
  <maxSizeRollBackups value="10" />
  <maximumFileSize value="1MB" />
  <appendToFile value="true" />
  <lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date %message%n" />
  </layout>
  <filter type="log4net.Filter.LevelRangeFilter">
    <levelMin value="DEBUG" />
    <levelMax value="INFO" />
  </filter>
</appender>
To address your specific post, I would not do this with a config-based approach, as I think it would get rather cumbersome to manage. A more programmatic approach would be to generate the logging instances dynamically.
EDIT: I took down the original to post this reworked example, based on this SO post: log4net: different logs on different file appenders at runtime
EDIT-2: I had to rework this again, as I realized I had omitted some required parts and had some things wrong after the rework. This is tested and working. A few things to note: you will need to provide the using statements on the controller for the logging class you make, and you will need to DI your logging directories in as I have done, or come up with another method of providing the list of log file outputs.
This will let you cleanly generate as many logging instances as you need, to as many independent locations as you like. I pulled this example from a project I did and modified it a bit to fit your needs. Let me know if you have questions.
Create a dynamic logger class which inherits from the base logger in the hierarchy:
using log4net;
using log4net.Repository.Hierarchy;

public sealed class DynamicLogger : Logger
{
    private const string REPOSITORY_NAME = "somename";

    internal DynamicLogger(string name) : base(name)
    {
        try
        {
            // try and find an existing repository
            base.Hierarchy = (log4net.Repository.Hierarchy.Hierarchy)LogManager.GetRepository(REPOSITORY_NAME);
        } // try
        catch
        {
            // it doesn't exist, make it.
            base.Hierarchy = (log4net.Repository.Hierarchy.Hierarchy)LogManager.CreateRepository(REPOSITORY_NAME);
        } // catch
    } // ctor(string)
} // DynamicLogger
Then, build out a class to manage the logging instances and build the new loggers:
using log4net;
using log4net.Appender;
using log4net.Config;
using log4net.Core;
using log4net.Filter;
using log4net.Layout;
using log4net.Repository;
using Microsoft.Extensions.Options;
using System.Collections.Generic;
using System.Linq;

public class LogFactory
{
    // must match the repository name used by DynamicLogger
    private const string REPOSITORY_NAME = "somename";

    private static List<ILog> _Loggers = new List<ILog>();
    private static LoggingConfig _Settings;
    private static ILoggerRepository _Repository;

    public LogFactory(IOptions<LoggingConfig> configuration)
    {
        _Settings = configuration.Value;
        ConfigureRepository(REPOSITORY_NAME);
    } // ctor(IOptions<LoggingConfig>)
    /// <summary>
    /// Configures the primary logging repository.
    /// </summary>
    /// <param name="repositoryName">The name of the repository.</param>
    private void ConfigureRepository(string repositoryName)
    {
        if (_Repository == null)
        {
            try
            {
                _Repository = LogManager.CreateRepository(repositoryName);
            }
            catch
            {
                // repository already exists.
                _Repository = LogManager.GetRepository(repositoryName);
            } // catch
        } // if
    } // ConfigureRepository(string)
    /// <summary>
    /// Gets a named logging instance, if it exists, and creates it if it doesn't.
    /// </summary>
    /// <param name="name"></param>
    /// <returns></returns>
    public ILog GetLogger(string name)
    {
        string filePath = string.Empty;
        switch (name)
        {
            case "core":
                filePath = _Settings.CoreLoggingDirectory;
                break;
            case "image":
                filePath = _Settings.ImageProcessorLoggingDirectory;
                break;
        } // switch
        if (_Loggers.SingleOrDefault(a => a.Logger.Name == name) == null)
        {
            BuildLogger(name, filePath);
        } // if
        return _Loggers.SingleOrDefault(a => a.Logger.Name == name);
    } // GetLogger(string)
    /// <summary>
    /// Dynamically build a new logging instance.
    /// </summary>
    /// <param name="name">The name of the logger (not the file name).</param>
    /// <param name="filePath">The file path you want to log to.</param>
    /// <returns></returns>
    private ILog BuildLogger(string name, string filePath)
    {
        // Create a new filter to include all logging levels: debug, info, error, etc.
        var filter = new LevelMatchFilter();
        filter.LevelToMatch = Level.All;
        filter.ActivateOptions();

        // Create a new pattern layout to determine the format of the log entry.
        var pattern = new PatternLayout("%d %-5p %c %m%n");
        pattern.ActivateOptions();

        // DynamicLogger inherits from the hierarchy Logger object, allowing us to
        // create dynamically generated logging instances.
        var logger = new DynamicLogger(name);

        // Create a new rolling file appender.
        var rollingAppender = new RollingFileAppender();
        // ensures it will not create a new file each time it is called.
        rollingAppender.AppendToFile = true;
        rollingAppender.Name = name;
        rollingAppender.File = filePath;
        rollingAppender.Layout = pattern;
        rollingAppender.AddFilter(filter);
        // allows us to dynamically generate the file name, ie C:\temp\log_{date}.log
        rollingAppender.StaticLogFileName = false;
        // ensures that the file extension is not lost in the renaming for the rolling file
        rollingAppender.PreserveLogFileNameExtension = true;
        rollingAppender.DatePattern = "yyyy-MM-dd";
        rollingAppender.RollingStyle = RollingFileAppender.RollingMode.Date;
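        // If you also need the 1MB size cap from the question, RollingFileAppender
        // can roll on both date and size. A sketch, not part of the original answer:
        // rollingAppender.RollingStyle = RollingFileAppender.RollingMode.Composite;
        // rollingAppender.MaximumFileSize = "1MB";
        // rollingAppender.MaxSizeRollBackups = 10;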
        // must be called on all attached objects before the logger can use it.
        rollingAppender.ActivateOptions();
        logger.AddAppender(rollingAppender);

        // ensures the new logger does not inherit the appenders of previous loggers.
        logger.Additivity = false;
        // sets the logger's effective level, determining which log requests it will catch.
        logger.Level = Level.Info;

        // The very last thing that we need to do is tell the repository it is
        // configured, so it can bind the values.
        _Repository.Configured = true;
        // bind the values.
        BasicConfigurator.Configure(_Repository, rollingAppender);

        LogImpl newLog = new LogImpl(logger);
        _Loggers.Add(newLog);
        return newLog;
    } // BuildLogger(string, string)
} // LogFactory
Then you can inject your log factory in your dependency injection setup. You can do that with something like this:
services.AddSingleton<LogFactory>();
Then in your controller, or any constructor really, you can just do something like this:
private LogFactory _LogFactory;

public HomeController(LogFactory logFactory)
{
    _LogFactory = logFactory;
}

public async Task<IActionResult> Index()
{
    ILog logger1 = _LogFactory.GetLogger("core");
    ILog logger2 = _LogFactory.GetLogger("image");
    logger1.Info("SomethingHappened on logger 1");
    logger2.Info("SomethingHappened on logger 2");
    return View();
}
This example will output:
2019-03-07 10:41:21,338 INFO core SomethingHappened on logger 1
in its own file called Core_2019-03-07.log
and also:
2019-03-07 11:06:29,155 INFO image SomethingHappened on logger 2
in its own file called Image_2019-03-07.log
Hope that makes more sense!
I have a Spring Batch job (defined in XML) which generates a CSV export.
Inside the FlatFileItemWriter bean I am setting the resource, where the name of the file is set.
<bean id="customDataFileWriter" class="org.springframework.batch.item.file.FlatFileItemWriter" scope="step">
<property name="resource" value="file:/tmp/export/custom-export.csv"/>
...
Now I need to set this file name taking into account certain logic, so I need to set the file name from a Java class. Any ideas?
Use the different builder classes of Spring Batch (job builder, step builder, and so on). Have a look at https://blog.codecentric.de/en/2013/06/spring-batch-2-2-javaconfig-part-1-a-comparison-to-xml/ to get an idea.
You can implement your own FlatFileItemWriter to override the method setResource and add your own logic to rename the file.
Here's an example implementation:
@Override
public void setResource(Resource resource) {
    if (resource instanceof ClassPathResource) {
        // Convert resource
        ClassPathResource res = (ClassPathResource) resource;
        try {
            String path = res.getPath();
            // Do something to "path" here
            File file = new File(path);
            // Check for permissions to write
            if (file.canWrite() || file.createNewFile()) {
                file.delete();
                // Call parent setter with new resource
                super.setResource(new FileSystemResource(file.getAbsolutePath()));
                return;
            }
        } catch (IOException e) {
            // File could not be read/written
        }
    }
    // If something went wrong or resource was delegated to MultiResourceItemWriter,
    // call parent setter with default resource
    super.setResource(resource);
}
Another possibility exists with the use of jobParameters, if your logic can be applied before job is launched. See 5.4 Late Binding of Spring Batch Documentation.
Example :
<bean id="flatFileItemReader" scope="step" class="org.springframework.batch.item.file.FlatFileItemReader">
<property name="resource" value="#{jobParameters['input.file.name']}" />
</bean>
You can also use a MultiResourceItemWriter with a custom ResourceSuffixCreator. That will let you create 1 to n files with a common filename pattern.
Here's an example of the getSuffix method of a custom ResourceSuffixCreator:
@Override
public String getSuffix(int index) {
    // Your logic
    if (true)
        return "XXX" + index;
    else
        return "";
}
Another MS CRM question from me, I'm afraid. I've got the following code being executed on the update of a contact record but it gives me an error saying the job was cancelled because it includes an infinite loop. Can anyone tell me why this is happening, please?
// <copyright file="PostContactUpdate.cs" company="">
// Copyright (c) 2013 All Rights Reserved
// </copyright>
// <author></author>
// <date>8/7/2013 2:04:26 PM</date>
// <summary>Implements the PostContactUpdate Plugin.</summary>
// <auto-generated>
// This code was generated by a tool.
// Runtime Version:4.0.30319.1
// </auto-generated>
namespace Plugins3Test
{
    using System;
    using System.ServiceModel;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Query;

    /// <summary>
    /// PostContactUpdate Plugin.
    /// Fires when the following attributes are updated:
    /// All Attributes
    /// </summary>
    public class PostContactUpdate : Plugin
    {
        /// <summary>
        /// Initializes a new instance of the <see cref="PostContactUpdate"/> class.
        /// </summary>
        public PostContactUpdate()
            : base(typeof(PostContactUpdate))
        {
            base.RegisteredEvents.Add(new Tuple<int, string, string, Action<LocalPluginContext>>(40, "Update", "contact", new Action<LocalPluginContext>(ExecutePostContactUpdate)));
            // Note: you can register for more events here if this plugin is not specific to an individual entity and message combination.
            // You may also need to update your RegisterFile.crmregister plug-in registration file to reflect any change.
        }

        /// <summary>
        /// Executes the plug-in.
        /// </summary>
        /// <param name="localContext">The <see cref="LocalPluginContext"/> which contains the
        /// <see cref="IPluginExecutionContext"/>,
        /// <see cref="IOrganizationService"/>
        /// and <see cref="ITracingService"/>
        /// </param>
        /// <remarks>
        /// For improved performance, Microsoft Dynamics CRM caches plug-in instances.
        /// The plug-in's Execute method should be written to be stateless as the constructor
        /// is not called for every invocation of the plug-in. Also, multiple system threads
        /// could execute the plug-in at the same time. All per invocation state information
        /// is stored in the context. This means that you should not use global variables in plug-ins.
        /// </remarks>
        protected void ExecutePostContactUpdate(LocalPluginContext localContext)
        {
            if (localContext == null)
            {
                throw new ArgumentNullException("localContext");
            }

            // Obtain the execution context and services from the service provider.
            IPluginExecutionContext context = localContext.PluginExecutionContext;
            IOrganizationService service = localContext.OrganizationService;
            IServiceProvider serviceProvider = localContext.ServiceProvider;
            ITracingService tracingService = localContext.TracingService;

            // Obtain the target entity from the input parameters.
            //Entity contextEntity = (Entity)context.InputParameters["Target"];
            Entity targetEntity = (Entity)context.InputParameters["Target"];
            Guid cid = targetEntity.Id;

            ColumnSet cols = new ColumnSet("jobtitle");
            Entity contact = service.Retrieve("contact", cid, cols);
            contact.Attributes["jobtitle"] = "Sometitle";
            service.Update(contact);
        }
    }
}
It's happening because your plugin is executed when a contact is updated, and the last line of your code updates the contact again, which causes the plugin to fire again.
That's your infinite loop.
You can prevent the loop by using the IExecutionContext.Depth property:
http://msdn.microsoft.com/en-us/library/microsoft.xrm.sdk.iexecutioncontext.depth.aspx
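For example, a minimal guard at the top of the execute method might look like this (a sketch; context is the IPluginExecutionContext from the question's code):
if (context.Depth > 1)
{
    // Depth is 1 for the original message; anything greater means this execution
    // was triggered from within another plugin (here, our own Update call).
    return;
}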
However, if you explain your requirement, I think it's possible to find a solution.
At first, checking if IExecutionContext.Depth <= 1 seems like a great idea, but it can bite you if a different plugin also updates the contact. You should use the SharedVariables of the plugin context instead.
Something like this should work:
Add this declaration to the plugin class as a class level field:
public static readonly Guid HasRunKey = new Guid("{6339dc20-01ce-4f2f-b4a1-0a1285b65bff}");
And add this as the first step of your plugin:
if (context.SharedVariables.Contains(HasRunKey.ToString()))
{
    return;
}
else
{
    // SharedVariables is keyed by string, so store the Guid's string form.
    context.SharedVariables.Add(HasRunKey.ToString(), true);
    // Proceed with plugin execution.
}
I went through a lot of trial and error. I don't know why the plugin context does not work, but the parent context does. This workaround works:
if (this.Context.ParentContext != null && this.Context.ParentContext.ParentContext != null)
{
    var assemblyName = Assembly.GetExecutingAssembly().GetName().Name;
    if (!this.Context.ParentContext.ParentContext.SharedVariables.Contains(assemblyName))
    {
        this.Context.ParentContext.ParentContext.SharedVariables.Add(assemblyName, true.ToString());
    }
    else
    {
        // isRecursive = true;
        return;
    }
}
Your plugin is updating the "jobtitle" field. I'm not sure whether this plugin is triggered by all contact updates or whether you have set some FilteringAttributes in its definition in the Registerfile.crmregister file, but by excluding the "jobtitle" field from the attributes that trigger this plugin you can solve your issue.
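If you would rather keep the registration as-is, an equivalent guard can live in the plugin code itself; a minimal sketch, using the targetEntity variable from the question's plugin:
// Skip the re-update when this invocation was itself triggered by the
// jobtitle change that this plugin makes.
if (targetEntity.Contains("jobtitle"))
{
    return;
}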
I have been using Entity Framework model first since VS 2010. When I build my project, EF generates a Model.Designer.cs file containing all entities. This designer file also contains the documentation added to the entities in the EDMX file.
When I created a new EF model first project in VS 2012, a Model.tt file is added to my EDMX file. This T4 template generates a single file for every entity in my model. Unfortunately, the documentation from the EDMX file is not used in the generated code.
I really like having my model documented so IntelliSense shows up when using it. The only workaround I have found so far is to remove the Model.tt file and the generated class files and turn the code generation on my EDMX file back on. This reverts to the behaviour I was used to from VS 2010. However, I would prefer having a separate file per entity.
Is there any way (preferably using VS tools and without having to modify any files that ship with VS) to include the documentation from the EDMX file in the generated single class files?
Edit: To further illustrate my problem, here is a quick example.
Let's say my model looks like this:
I have highlighted the part where I entered the documentation in the Properties window of the Id property.
This is what the entity looks like in the EDMX file:
<EntityType Name="Entity1">
  <Key>
    <PropertyRef Name="Id" />
  </Key>
  <Property Type="Int32" Name="Id" Nullable="false" annotation:StoreGeneratedPattern="Identity" >
    <Documentation>
      <Summary>This is documentation for the ID property.</Summary>
    </Documentation>
  </Property>
</EntityType>
The generated class (Entity1.cs) by Model.tt looks like this:
public partial class Entity1
{
    public int Id { get; set; }
}
But when I turn on the code generation for my model, this is what the entity looks like in Model.Designer.cs:
/// <summary>
/// No Metadata Documentation available.
/// </summary>
[EdmEntityTypeAttribute(NamespaceName="Model1", Name="Entity1")]
[Serializable()]
[DataContractAttribute(IsReference=true)]
public partial class Entity1 : EntityObject
{
    #region Factory Method

    /// <summary>
    /// Create a new Entity1 object.
    /// </summary>
    /// <param name="id">Initial value of the Id property.</param>
    public static Entity1 CreateEntity1(global::System.Int32 id)
    {
        Entity1 entity1 = new Entity1();
        entity1.Id = id;
        return entity1;
    }

    #endregion

    #region Simple Properties

    /// <summary>
    /// This is documentation for the ID property.
    /// </summary>
    [EdmScalarPropertyAttribute(EntityKeyProperty=true, IsNullable=false)]
    [DataMemberAttribute()]
    public global::System.Int32 Id
    {
        get
        {
            return _Id;
        }
        set
        {
            if (_Id != value)
            {
                OnIdChanging(value);
                ReportPropertyChanging("Id");
                _Id = StructuralObject.SetValidValue(value, "Id");
                ReportPropertyChanged("Id");
                OnIdChanged();
            }
        }
    }

    private global::System.Int32 _Id;
    partial void OnIdChanging(global::System.Int32 value);
    partial void OnIdChanged();

    #endregion
}
So you see: Model.Designer.cs contains my custom documentation string "This is documentation for the ID property." while Entity1.cs does not. However, Model.Designer.cs can get quite big if there are many entities and debugging into this file is somewhat slow. I'd prefer having several small files (one per entity), but still preserve the documentation from the EDMX file in the generated code.
I think you'll have to modify the T4 file. I've got the same problem, read through the T4 file a bit, and tried to follow the instructions here: http://karlz.net/blog/index.php/2010/01/16/xml-comments-for-entity-framework/
However, we're using VS 2012 and the instructions don't seem to work 100%. I ended up changing the property generation code at the end of the T4 file, and it works exactly how I wanted it to. The changes are in CodeStringGenerator.Property() and CodeStringGenerator.NavigationProperty():
public string Property(EdmProperty edmProperty)
{
    string doc = "";
    if (edmProperty.Documentation != null)
    {
        doc = string.Format(
            CultureInfo.InvariantCulture,
            "\n\t\t/// <summary>\n\t\t/// {0} - {1}\n\t\t/// </summary>\n\t\t",
            edmProperty.Documentation.Summary ?? "",
            edmProperty.Documentation.LongDescription ?? "");
    }
    return doc + string.Format(
        CultureInfo.InvariantCulture,
        "{0} {1} {2} {{ {3}get; {4}set; }}",
        Accessibility.ForProperty(edmProperty),
        _typeMapper.GetTypeName(edmProperty.TypeUsage),
        _code.Escape(edmProperty),
        _code.SpaceAfter(Accessibility.ForGetter(edmProperty)),
        _code.SpaceAfter(Accessibility.ForSetter(edmProperty)));
}
public string NavigationProperty(NavigationProperty navigationProperty)
{
    var endType = _typeMapper.GetTypeName(navigationProperty.ToEndMember.GetEntityType());
    string doc = "";
    if (navigationProperty.Documentation != null)
    {
        doc = string.Format(
            CultureInfo.InvariantCulture,
            "\n\t\t/// <summary>\n\t\t/// {0} - {1}\n\t\t/// </summary>\n\t\t",
            navigationProperty.Documentation.Summary ?? "",
            navigationProperty.Documentation.LongDescription ?? "");
    }
    return doc + string.Format(
        CultureInfo.InvariantCulture,
        "{0} {1} {2} {{ {3}get; {4}set; }}",
        AccessibilityAndVirtual(Accessibility.ForProperty(navigationProperty)),
        navigationProperty.ToEndMember.RelationshipMultiplicity == RelationshipMultiplicity.Many ? ("ICollection<" + endType + ">") : endType,
        _code.Escape(navigationProperty),
        _code.SpaceAfter(Accessibility.ForGetter(navigationProperty)),
        _code.SpaceAfter(Accessibility.ForSetter(navigationProperty)));
}
Note that this won't handle class-level documentation, so you have to do something like this for entity and complex types:
<#=codeStringGenerator.UsingDirectives(inHeader: false)#>
<#if (!ReferenceEquals(entity.Documentation, null))
{
#>
/// <summary>
/// <#=entity.Documentation.Summary#> - <#=entity.Documentation.LongDescription#>
/// </summary>
<#}#>
<#=codeStringGenerator.EntityClassOpening(entity)#>
I'm using NLog to write some logging to a text file. Partial nlog.config:
<target name="file" xsi:type="File" fileName="${basedir}/MBWRunner_log.txt"
layout="${date} (${level}): ${message}
Exception: ${exception:format=Method, ToString}"/>
Lines in the logfile look like this:
0001-01-01 00:00:00 (Trace): MBWRunner started
As you can see, the date and time are all 0. I have tested {longdate} and {date:format=yyyyMMddHHmmss} with the same result.
The application is a console app, run from an elevated commandline.
Any clues?
[EDIT] I have tested this on 2 machines within the organisation with the same result. Please help!
Code used:
static Logger _logger = LogManager.GetCurrentClassLogger();

public static void Log(string message, LogLevel priority)
{
    LogEventInfo eventinfo = new LogEventInfo();
    eventinfo.Message = message;
    eventinfo.Level = priority;
    Log(eventinfo);
}

static void Log(LogEventInfo logentry)
{
    _logger.Log(logentry);
}
UPDATE:
@edosoft I think the problem is your use of the default constructor for LogEventInfo. If you look at the source for LogEventInfo here
https://github.com/NLog/NLog/blob/master/src/NLog/LogEventInfo.cs
You will see that using the default constructor does not populate the .TimeStamp field, so the field will probably just default to the default value for DateTime, which I assume is DateTime.MinValue. You should use one of the other constructors or one of the Create methods. Since you are setting only the Message and Level fields, I would suggest either:
var logEvent = new LogEventInfo(priority, "", message); //Second param is logger name.
Or
var logEvent = LogEventInfo.Create(priority, "", message);
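Applied to the helper method from the question, that might look like this (a sketch; the empty string is the logger name, as in the lines above):
public static void Log(string message, LogLevel priority)
{
    // Create populates the TimeStamp field for you.
    LogEventInfo eventinfo = LogEventInfo.Create(priority, "", message);
    Log(eventinfo);
}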
From the NLog source for DateLayoutRenderer (from here) we can see that the date value that gets written as part of the logging stream is calculated like this:
protected override void Append(StringBuilder builder, LogEventInfo logEvent)
{
    var ts = logEvent.TimeStamp;
    if (this.UniversalTime)
    {
        ts = ts.ToUniversalTime();
    }
    builder.Append(ts.ToString(this.Format, this.Culture));
}
What is happening here is that the DateLayoutRenderer is getting the TimeStamp value from the LogEventInfo object (NLog creates one of these each time you use the Logger.Trace, Logger.Debug, Logger.Info, etc. methods; you can also create LogEventInfo objects yourself and log them with the Logger.Log method).
By default, when a LogEventInfo object is created, its TimeStamp field is set like this (from the source for LogEventInfo here) (note the use of CurrentTimeGetter.Now):
public LogEventInfo(LogLevel level, string loggerName, IFormatProvider formatProvider, [Localizable(false)] string message, object[] parameters, Exception exception)
{
    this.TimeStamp = CurrentTimeGetter.Now;
    this.Level = level;
    this.LoggerName = loggerName;
    this.Message = message;
    this.Parameters = parameters;
    this.FormatProvider = formatProvider;
    this.Exception = exception;
    this.SequenceID = Interlocked.Increment(ref globalSequenceId);

    if (NeedToPreformatMessage(parameters))
    {
        this.CalcFormattedMessage();
    }
}
The TimeStamp field is set in the LogEventInfo constructor using the TimeSource.Current.Now property, whose implementation can be seen here.
(UPDATE - At some point NLog changed from using CurrentTimeGetter to a more generic approach of having a TimeSource object that has several flavors (one of which, CachedTimeSource, is essentially the same as CurrentTimeGetter)).
To save the trouble of navigating the link, here is the source for CachedTimeSource:
public abstract class CachedTimeSource : TimeSource
{
    private int lastTicks = -1;
    private DateTime lastTime = DateTime.MinValue;

    /// <summary>
    /// Gets raw uncached time from derived time source.
    /// </summary>
    protected abstract DateTime FreshTime { get; }

    /// <summary>
    /// Gets current time cached for one system tick (15.6 milliseconds).
    /// </summary>
    public override DateTime Time
    {
        get
        {
            int tickCount = Environment.TickCount;
            if (tickCount == lastTicks)
                return lastTime;
            else
            {
                DateTime time = FreshTime;
                lastTicks = tickCount;
                lastTime = time;
                return time;
            }
        }
    }
}
The purpose of this class is to use a relatively cheap operation (Environment.TickCount) to limit access to a relatively expensive operation (DateTime.Now). If the value of TickCount does not change from call to call (from one logged message to the next), then DateTime.Now would return the same value it did on the previous call (to within one tick), so the last retrieved value is simply reused.
With all of this code in play (and with date/time logging apparently working for most other people), one possible explanation of your problem is that you are using the Logger.Log method to log your messages and you are building the LogEventInfo objects yourself. By default, if you just new up a LogEventInfo object, the automatic setting of the TimeStamp property should work fine. It is only dependent on Environment.TickCount, DateTime.Now, and the logic that reuses the last DateTime.Now value, if appropriate.
Is it possible that you are creating a LogEventInfo object and then setting its TimeStamp property to DateTime.MinValue? I ask because the date that is being logged is DateTime.MinValue.
The only other explanation I can think of would be if Environment.TickCount returned -1 for some reason. If it did, then CurrentTimeGetter would always return the initial value of the lastTime private member variable. I can't imagine a scenario where Environment.TickCount would return -1.
Wondering if anybody out there has any success in using the JDEdwards XMLInterop functionality. I've been using it for a while (with a simple PInvoke, will post code later). I'm looking to see if there's a better and/or more robust way.
Thanks.
As promised, here is the code for integrating with JDEdwards using XML. It's a web service, but it could be used as you see fit.
namespace YourNameSpace
{
    /// <summary>
    /// This webservice allows you to submit JDE XML CallObject requests via a c# webservice
    /// </summary>
    [WebService(Namespace = "http://WebSite.com/")]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    public class JdeBFService : System.Web.Services.WebService
    {
        private string _strServerName;
        private UInt16 _intServerPort;
        private Int16 _intServerTimeout;

        public JdeBFService()
        {
            // Load JDE ServerName, Port, & Connection Timeout from the Web.config file.
            _strServerName = ConfigurationManager.AppSettings["JdeServerName"];
            _intServerPort = Convert.ToUInt16(ConfigurationManager.AppSettings["JdePort"], CultureInfo.InvariantCulture);
            _intServerTimeout = Convert.ToInt16(ConfigurationManager.AppSettings["JdeTimeout"], CultureInfo.InvariantCulture);
        }

        /// <summary>
        /// This webmethod allows you to submit an XML formatted jdeRequest document
        /// that will call any Master Business Function referenced in the XML document
        /// and return a response.
        /// </summary>
        /// <param name="xmlInput"> The jdeRequest XML document </param>
        [WebMethod]
        public XmlDocument JdeXmlRequest(XmlDocument xmlInput)
        {
            try
            {
                string outputXml = NativeMethods.JdeXmlRequest(xmlInput, _strServerName, _intServerPort, _intServerTimeout);
                XmlDocument outputXmlDoc = new XmlDocument();
                outputXmlDoc.LoadXml(outputXml);
                return outputXmlDoc;
            }
            catch (Exception ex)
            {
                ErrorReporting.SendEmail(ex);
                throw;
            }
        }
    }

    /// <summary>
    /// This interop class uses pinvoke to call the JDE C++ dll. It only has one static function.
    /// </summary>
    /// <remarks>
    /// This class calls the xmlinterop.dll which can be found in the B9/system/bin32 directory.
    /// Copy the dll to the webservice project's /bin directory before running the project.
    /// </remarks>
    internal static class NativeMethods
    {
        [DllImport("xmlinterop.dll",
            EntryPoint = "_jdeXMLRequest@20",
            CharSet = CharSet.Auto,
            ExactSpelling = false,
            CallingConvention = CallingConvention.StdCall,
            SetLastError = true)]
        private static extern IntPtr jdeXMLRequest([MarshalAs(UnmanagedType.LPWStr)] StringBuilder server, UInt16 port, Int32 timeout, [MarshalAs(UnmanagedType.LPStr)] StringBuilder buf, Int32 length);

        public static string JdeXmlRequest(XmlDocument xmlInput, string strServerName, UInt16 intPort, Int32 intTimeout)
        {
            StringBuilder sbServerName = new StringBuilder(strServerName);
            StringBuilder sbXML = new StringBuilder();
            XmlWriter xWriter = XmlWriter.Create(sbXML);
            xmlInput.WriteTo(xWriter);
            xWriter.Close();
            string result = Marshal.PtrToStringAnsi(jdeXMLRequest(sbServerName, intPort, intTimeout, sbXML, sbXML.Length));
            return result;
        }
    }
}
You have to send it messages like the following one:
<jdeRequest type='callmethod' user='USER' pwd='PWD' environment='ENV'>
  <callMethod name='GetEffectiveAddress' app='JdeWebRequest' runOnError='no'>
    <params>
      <param name='mnAddressNumber'>10000</param>
    </params>
  </callMethod>
</jdeRequest>
To anyone trying to do this: there are some dependencies for xmlinterop.dll. You'll find these files on the fat client in c:\E910\system\bin32; copying them alongside xmlinterop.dll will create a 'thin client':
PSThread.dll
icudt32.dll
icui18n.dll
icuuc.dll
jdel.dll
jdeunicode.dll
libeay32.dll
msvcp71.dll
ssleay32.dll
ustdio.dll
xmlinterop.dll
I changed our JDE web service to use XML Interop after seeing this code, and we haven't had any stability problems since. Previously we were using the COM Connector, which exhibited regular communication failures (possibly a connection pooling issue?) and was a pain to install and configure correctly.
We did have issues when we attempted to use transactions, but if you're doing simple single business function calls this shouldn't be a problem.
Update: To elaborate on the transaction issues: if you're attempting to keep a transaction alive over multiple calls, AND the JDE application server is handling a modest number of concurrent calls, the xmlinterop calls start returning an 'XML response failed' message and the DB transaction is left open with no way to commit or roll back. It's possible tweaking the number of kernels might solve this, but personally, I'd always try to complete the transaction in a single call.