Limit on the data context passed to a plugin - CRM plugins

I am working on a plugin which fires on creation of the "Notes" (annotation) entity. But when the attached file is more than 10 MB in size, the plugin does not execute.
What I want to know is: what is the maximum limit on the data that is passed to the plugin?
The plugin works fine if I attach a document that is less than 10 MB, and it does not work if the file size is more than 10 MB.
I have also tried asynchronous execution of the plugin so that I could check for an error in the system job, but there is no error; the system job is marked as "In Progress"
and there is no detail about any error.
Any suggestions?

Did you check whether the note actually has the file attached? CRM has a configurable limit for attachment size, which defaults to 5 MB, so you are probably being blocked there unless you have increased that limit.

Just adding a bit more to what @Joseph said.
Here is the link which explains how to set up the file size limit for attachments.
CRM must be checking the file size limit before giving your plugin a chance to execute.

Related

How to set max form content size in Jetty or Tomcat server

I was getting the error "FORM too large" when I submitted a form containing a lot of data. The default size for the Jetty server is 2 MB, and my form exceeded it.
I solved the issue locally by using a system argument in Eclipse, i.e. by setting the max form content size to -1, which means no limit on form content. But the problem is in the UAT environment. How can I do that if we cannot use Eclipse?
Does anyone know how to provide this configuration in Jetty or Tomcat directly so that the max form content size is unlimited? I have tried the same configuration in the start.ini file in Jetty, but it is not working in UAT.
Please share your ideas, if any.
Thanks
Unlimited form content size is actually a vulnerability. (Look into how hash key collisions in maps can be used as a harsh DoS attack.)
Are you really sure you need to send 2 MB of form data?
Note: multipart/form-data content that consists of FILES (not form fields) is handled with different max size configurations.
Look into the <multipart-config> entries in WEB-INF/web.xml or the @MultipartConfig annotation for more about those limits.
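As a rough illustration of those per-servlet limits, here is a minimal sketch using the Servlet 3.0 @MultipartConfig annotation. It is only a sketch: the servlet path, the "file" field name, and the byte limits are made-up example values, not recommendations.
import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.annotation.MultipartConfig;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.Part;

// Example upload servlet; the limits below are illustrative values only.
@WebServlet("/upload")
@MultipartConfig(
        maxFileSize = 10 * 1024 * 1024,      // largest single uploaded file accepted
        maxRequestSize = 20 * 1024 * 1024,   // largest whole multipart request accepted
        fileSizeThreshold = 1024 * 1024)     // keep parts in memory up to this size, then spill to disk
public class UploadServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Part filePart = req.getPart("file"); // "file" is the form field name in this example
        resp.getWriter().println("Received " + filePart.getSize() + " bytes");
    }
}
Plain form fields sent as application/x-www-form-urlencoded are still governed by the container itself (Jetty's maxFormContentSize, Tomcat's maxPostSize on the connector), so the annotation only covers the multipart/file case.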

int-file:inbound-channel-adapter queue size clarification with AcceptAllFileListFilter filter

I have a question about the queue-size config in the file inbound channel adapter. Based on my understanding, this config controls the number of files kept in memory, so the higher the number, the more memory it will take.
We process lots of zip files, each ranging from a few hundred KB to a few MB. If I use a lower number like 10 and drop 20 zip files into the directory, it only processes 10 and ignores the other 10. What is happening is that we have a custom filter which processes all 20 zips and makes an entry in the DB, so the next time the poller picks up the remaining 10 zips, my filter rejects them because there is already an entry in the DB.
I am now confused about how to avoid this. Is it by changing the filter, or something else?
Note: we are using a custom filter which extends AcceptAllFileListFilter, as we need to track duplicate zip files that have been processed.
Using queue-size together with any filter was broken - the queue size was ignored.
We recently fixed that in 4.2.
The workaround is to add the logic to your custom filter (see the sketch below).
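To illustrate that workaround, here is a minimal sketch of a filter that accepts (and records) at most as many files per poll as the adapter's queue can hold, leaving the rest untouched for the next poll. The class name, the maxPerPoll value, and the alreadyProcessed/recordAsProcessed helpers are hypothetical stand-ins for your existing database-backed duplicate tracking.
import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.springframework.integration.file.filters.FileListFilter;

// Sketch only: cap the number of files accepted per poll so that files which
// will not fit in the queue are not yet marked as processed in the database.
public class CappedDedupFileListFilter implements FileListFilter<File> {

    private final int maxPerPoll; // should match the adapter's queue-size

    public CappedDedupFileListFilter(int maxPerPoll) {
        this.maxPerPoll = maxPerPoll;
    }

    @Override
    public List<File> filterFiles(File[] files) {
        List<File> accepted = new ArrayList<File>();
        for (File file : files) {
            if (accepted.size() >= maxPerPoll) {
                break; // leave the rest for the next poll; do not record them yet
            }
            if (!alreadyProcessed(file)) { // your existing DB duplicate check
                recordAsProcessed(file);   // only record files that are actually accepted
                accepted.add(file);
            }
        }
        return accepted;
    }

    private boolean alreadyProcessed(File file) {
        return false; // placeholder for the existing database lookup
    }

    private void recordAsProcessed(File file) {
        // placeholder for the existing database insert
    }
}
The key point is that a file should only go into the duplicate-tracking table at the moment it is actually handed to the adapter, not merely because it appeared in the directory listing.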

Crystal Report : "File is too large for attachment" Error

I am new to Crystal Reports Server. Here I'm explaining the error details.
I'm using SAP BusinessObjects CMC for report generation for my application. The version details are in the image below.
When I try to generate a report file larger than 1 MB, it throws the error below.
Error
The viewer could not process an event. 1c84865dce535c5.pdf File is too large for attachment. [] ---- Error code:0 [CRWEB00000119]
So, I went to the following locations to check maximumUploadFileSize.
1. C:\Program Files (x86)\SAP BusinessObjects\tomcat\webapps\dswsbobje\WEB-INF\classes\dsws.properties
2. C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\warfiles\webapps\dswsbobje\WEB-INF\classes\dsws.properties
3. C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\java\pjs\container\work\<ServerName>.WebApplicationContainerServer\businessobjects\dswsbobje\WEB-INF\classes\dsws.properties
#Security measure to limit total upload file size
maximumUploadFileSize = 10485760
It is set to 10485760 (10 MB); as per my understanding this is the default size, according to this Reference Document.
So, if it supports up to 10 MB, why is it throwing an error when the report file exceeds 1 MB?
I also tried to increase the size by a multiple of 10, i.e. 104857600 (100 MB), in all these files and restarted the server, but without success; after restarting the server, the modified value in the third file is reset to the old value (10485760). Please help me with this.
Is there any way to increase maximumUploadFileSize through the administrator console?
Please drop your comments if you have any questions/doubts regarding this.
This issue can be resolved with the following steps.
The issue is actually related to the number of records the report tries to read in Crystal Reports Server. The default record limit is 20,000. By changing the limit to 0 (unlimited) we can resolve this issue.
Follow below steps to do this setting changes:
Log onto the CMC
Go to Servers in the drop-down menu
Expand Service Categories
Select Crystal Reports Services
The currently running services are listed in the right-hand window; find CrystalReports2013ProcessingServer under Description.
Double-click it to open its Properties page.
Set the value 0 for "Database Records Read When Previewing or Refreshing (0 for unlimited)".
Click Save & Close.
Restart CrystalReports2013ProcessingServer.
Now try to generate the large PDF file; it should work fine.

SQL Server "Space Available" Alert?

I am looking for a way to send alerts when a database or log reaches 10% space remaining.
Let me preface this question by saying that I intentionally did not include the word "file" in the question. While I have researched this question, it appears that most people have their databases set up for auto-growth and then struggle to manage their database(s) at the file system level. There are a ton of examples out there dealing with how to send disk space alerts. THIS IS NOT MY QUESTION! My databases are ALL set to fixed-size files. That means files are ALL pre-allocated from the file system when they are created or when a database needs to be expanded. As a matter of policy I do not allow ANY database to grow, uncontrolled, to the point of bringing down a whole server at the hands of one badly behaved application. Each database is managed within its pre-allotted space and grown manually as necessary to meet growing demands.
That said, I am looking for the best way to send an alert when the database "remaining space" drops below 10%, for example - technically I'll probably set up both a warning and an alert threshold. So far I haven't been able to find anything on this subject, since most people seem fixated on disk space, which makes this a bit like looking for a needle in a haystack.
I kind of hoped that SQL Server would have a simple alert mechanism to do such a simple, obvious thing right out of the box, but it looks like alerts are mostly designed to catch error messages, which is a little late in my book - I'm looking to be a little more proactive.
So, again, looking to send alerts when database "remaining space" drops below various thresholds. Has anyone done this or seen it done?
Thanks!
Yes indeed. I have done this.
It is possible to set counters with queries against system tables. One possibility is determining the percentage of free space in a log or data file. Then a SQL Agent alert can be created to e-mail a message to an operator when a particular threshold has been reached on a counter, such as when there is only 5% space remaining in a database file. The solution requires several steps, but it is possible using existing functionality.
To determine file names and space information, the following query may be used.
SELECT name AS 'File Name' ,
physical_name AS 'Physical Name',
size/128 AS 'Total Size in MB',
size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int)/128.0 AS 'Available Space In MB',
round((CAST(FILEPROPERTY(name, 'SpaceUsed') AS float)/size)* 100 ,2) AS 'Percentage Used',
*
FROM sys.database_files;
Below are steps to set up an alert for percentage space free on a given file.
Create a procedure that sets the counter value. This example sets user counter number 10.
CREATE PROCEDURE dbo.usp_SetFreeSpaceCounter   -- example name; create it in the database being monitored
AS
BEGIN
    DECLARE @FreePercent int;
    SELECT @FreePercent = 100 - round((CAST(FILEPROPERTY(name, 'SpaceUsed') AS float) / size) * 100, 2)
    FROM sys.database_files
    WHERE sys.database_files.name = 'NameOfYourLogOrDataFileHere';
    EXEC sp_user_counter10 @FreePercent;   -- publishes the value as "User counter 10"
END
Create a scheduled SQL Server Agent job that runs the above procedure at a suitable interval
Create a SQL Server Agent alert on that counter, so that it fires when the free percentage drops below a certain threshold (e.g. 5%); an example script follows this list
Configure Database Mail, test it, and create at least one operator
Enable SQL Server Agent alert e-mail (in the Agent properties), and restart the Agent
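For the alert step, a minimal T-SQL sketch is below, assuming the procedure above publishes the value through "User counter 10" and that an operator already exists. The alert name, the DBA_Team operator, the 5% threshold, and the response delay are example values only, and the exact performance-condition object name should be checked on your instance (a named instance uses an MSSQL$InstanceName: prefix instead of SQLServer:).
USE msdb;
GO

-- Fire when the user-settable counter (percent free) drops below 5.
-- @performance_condition format: 'Object|Counter|Instance|Comparator|Value'
EXEC dbo.sp_add_alert
    @name                    = N'Database file free space below 5 percent',   -- example name
    @performance_condition   = N'SQLServer:User Settable|Query|User counter 10|<|5',
    @delay_between_responses = 3600;   -- at most one notification per hour

-- E-mail the alert to an existing operator (example operator name).
EXEC dbo.sp_add_notification
    @alert_name          = N'Database file free space below 5 percent',
    @operator_name       = N'DBA_Team',
    @notification_method = 1;   -- 1 = e-mail
GO
The scheduled job from the earlier step then only needs a single T-SQL job step that executes the counter procedure on whatever interval suits your monitoring.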

Service invoked too many times: trigger

We are trying to implement a suite of spreadsheets that will handle budget figures for a set of stores. Everything works fine until we try to implement a spreadsheet that collects data from all store spreadsheets and presents statistics. Due to the limitation of ImportRange (a maximum of 50 uses per spreadsheet doc), we have implemented a Google Apps Script instead to handle the importing of data. But now that we have made a copy of the document for each month, we are getting problems with our time triggers. We have set up a trigger to run the script once every minute, and that results in an error message stating: Service invoked too many times: trigger.
What are the limitations here? And how do we best solve this?
We are also getting some other error messages and would like to know how to solve these:
Document tEHGO48zIBIFYRpb7Xhjwqg is missing (perhaps it was deleted?) (line 191)
Exceeded maximum execution time
Service error: Spreadsheets (line 290)
Where can we find documentation describing the different limitations and error messages?
Quota Limits for many services used with Google Apps Scripts have now been published on the Dashboard at:
https://docs.google.com/macros/dashboard
The same just happened to me. It seems there is an unpublished limit:
Premier accounts usually have larger quotas for every limitation.
The argument is that the account is better verified and less likely to abuse resources.
But neither the regular limitations nor Premier's better quotas are published by Google, and it seems that Googlers can't state them here in the forums either. The only well-defined GAS limitation is the email quota, accessible through:
MailApp.getRemainingDailyQuota()
which is 500 for regular accounts and 1500 for Premier.
Source: Google Support forums
Possible solutions are:
Join several scripts into one big trigger, in case there is a limit on the number of triggers
Optimize the code (merge loops, refresh only the necessary fields, etc.), in case the limit is based on CPU usage
Move per-minute timer triggers to onEdit or onOpen triggers whenever possible
Get a Premier account
As for your other errors, I haven't encountered anything similar. You should post some details on the script or publish some code so we can debug it.