Identifying the last time customers ordered a particular item and having them enter a workflow

I have made a customer search that identifies the customers who haven't purchased an item in six months or more. I've used group summaries and maximum summaries for the company name and the maximum transaction date, which refers to the latest sales order containing a certain item. The idea is to send them an email. However, the workflow only executes on 20 records at a time. I even ran a search that was not summarized, and the workflow still only executed on 20 records. I did use the "Execute Now" button in testing mode to see how many records entered the workflow from the summary search, but each execution only yields 20 workflow instances. The searches yield about 213 and 300 records respectively. I appreciate any insight!

Workflows using searches will only process 20 records when in testing mode.

From SuiteAnswers Article 36738 (NetSuite Login required)
When you execute a workflow on demand, NetSuite only processes the first 20 records returned by the saved search. For example, if the saved search for a scheduled workflow returns 1000 records, the workflow only initiates on the first 20 records returned by the saved search.

How to check if user story was updated?

I'm trying to create a logic app around a query that checks whether user stories have been updated in the last 24 hours and notifies the user otherwise. But if I use the State field as my clause, it won't accurately reflect true updates, since not everything that counts as an update to a story (e.g. comments) changes its state.
How should I approach this? Is "ChangedDate" the best field to check for updated stories and then remind the user to update the story after 24 hours have passed? Or is there a better condition I can use that checks for legitimate user story updates?
ChangedDate is enough.
ChangedDate picks up every change to a work item, including state changes and added comments, so it can meet your requirement.
E.g.:
Changed Date >= @StartOfDay - 1
To remind users to update user stories regularly, we can install an extension like Scheduled Work Item Query, then create a pipeline that uses the extension task to send the query results to team members. We can configure schedules for pipelines to trigger the pipeline as required, for example every 24 hours.
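
For reference, here is a minimal sketch of how such a check could be run directly against the Azure DevOps REST API from Java, in case you need a custom step instead of the extension. The organization, project, and token values are placeholders; the WIQL endpoint and the System.ChangedDate field are standard, but treat the program as an illustration rather than a finished tool:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class StaleStoryCheck {
    public static void main(String[] args) throws Exception {
        // Placeholders: substitute your own organization, project, and PAT.
        String org = "myorg", project = "myproject", pat = "<personal-access-token>";

        // WIQL: user stories whose last change is more than one day old.
        String body = "{\"query\": \"SELECT [System.Id], [System.Title] "
                + "FROM WorkItems "
                + "WHERE [System.WorkItemType] = 'User Story' "
                + "AND [System.ChangedDate] < @Today - 1\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://dev.azure.com/" + org + "/" + project
                        + "/_apis/wit/wiql?api-version=7.0"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Basic "
                        + Base64.getEncoder().encodeToString((":" + pat).getBytes()))
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The JSON response lists the matching work item IDs; anything returned
        // here has not been touched in over a day and is a candidate for a reminder.
        System.out.println(response.body());
    }
}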

{EpicorERP} Running a Report to see what users are doing/printing

I was wondering if anyone had a method of running a report to see what a user is doing in Epicor, or what they are printing. We are having users report that in the middle of the night, when no one is here at the plant, 500-page reports are being printed. We are able to see in the print queue who printed what, but the report doesn't match up with anything in our system. For example, we would have a report called DailySales.rpt, but in the printer queue it would show up as something like hb986a87dthr.rpt. Just wondering if anyone else has seen this, or has a solution that would let me see what a user is printing.
It is not possible to link the print job directly to the SysTask record, because neither the print job number, the temp file name, nor the MAC address is saved in Epicor for cross-referencing. It can be approximated by comparing run times against the SysTask records.
You can create a BAQ and BAQ report to display active and recently completed SysTask information by user. This will give you the report run, start/end times, user, and current status. If you need more detailed information, such as the criteria used for the report, you can also join to the SysTaskParam table. Keep in mind that the SysTaskParam table is fully normalized by field name, so you may want to join multiple copies of the table, each with specific criteria, if you need a lot of information. Unfortunately, for "print all pages" jobs, Epicor doesn't know how many pages a report will be until after the data is instantiated and rendered by the reporting software, so you won't be able to get any estimate of the number of pages or the size.
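
If you want to explore the same data outside a BAQ, here is a rough JDBC sketch of an equivalent query. The Ice.SysTask / Ice.SysTaskParam column names used below (SubmitUser, TaskDescription, StartedOn, EndedOn, TaskStatus, ParamName, ParamCharacter) are assumptions that can vary by Epicor version, so verify them against your data dictionary:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class SysTaskReport {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string; point this at your Epicor app database.
        String url = "jdbc:sqlserver://epicor-db;databaseName=EpicorERP;integratedSecurity=true";

        // Table and column names are assumptions -- check them against the
        // data dictionary for your Epicor version before relying on this.
        String sql =
            "SELECT t.SubmitUser, t.TaskDescription, t.StartedOn, t.EndedOn, " +
            "       t.TaskStatus, p.ParamName, p.ParamCharacter " +
            "FROM Ice.SysTask t " +
            "LEFT JOIN Ice.SysTaskParam p ON p.SysTaskNum = t.SysTaskNum " +
            "WHERE t.StartedOn >= DATEADD(day, -1, GETDATE()) " +  // last 24 hours
            "ORDER BY t.StartedOn DESC";

        try (Connection con = DriverManager.getConnection(url);
             PreparedStatement ps = con.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                System.out.printf("%s | %s | %s - %s | %s = %s%n",
                        rs.getString("SubmitUser"), rs.getString("TaskDescription"),
                        rs.getTimestamp("StartedOn"), rs.getTimestamp("EndedOn"),
                        rs.getString("ParamName"), rs.getString("ParamCharacter"));
            }
        }
    }
}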
There are many strategies for mitigating the issue you described. Here are a couple:
You can use criteria within the BAQ to limit the number of records returned for a specific query.
You can create subquery criteria from BAQ parameters to return no data when abnormally open parameters are used for the report (e.g. a date range greater than 30 days). You could also use this method with time gates based on the current system time.
Retrain users.

Streaming data Complex Event Processing over files and a rather long period

My challenge:
We receive files every day with about 200,000 records. We keep the files for approximately one year, to support re-processing, etc.
For the sake of the discussion, assume it is some sort of long-lasting fulfilment process, with a provisioning-ID that correlates records.
We need to identify flexible patterns in these files and trigger events.
Typical questions are:
if record A is followed by record B, which is followed by record C, and all records occurred within 60 days, then trigger an event
if record D or record E was found, but record F did NOT follow within 30 days, then trigger an event
if both records D and record E were found (irrespective of the order), followed by ... within 24 hours, then trigger an event
Some patterns require lookups in a DB/NoSQL store, or joins, for additional information, either to select the record or to put into the event.
"Selecting a record" can be a simple "field-A equals", but can also be "field-A in []", "field-A match …", or "func identify(field-A, field-B)".
"Days" might also be "hours" or "in the previous month", hence more flexible than just "days". Usually we have some date/timestamp in the record. The maximum is currently "within 6 months" (cancel within the setup phase).
The created events (preferably JSON) need to contain data from all records that were part of the selection process.
We need an approach that allows us to flexibly change (add, modify, delete) the patterns, optionally re-processing the input files.
Any thoughts on how to tackle the problem elegantly? Maybe some Python or Java framework, or do any of the public cloud solutions (AWS, GCP, Azure) address this problem space especially well?
Thanks a lot for your help!
After some discussions and reading, we'll first try Apache Flink with the FlinkCEP library. From the docs and blog entries, it seems able to do the job. It also appears to be AWS's choice, running on their EMR clusters. We didn't find a managed service on GCP or Azure providing this functionality, though of course we can always deploy and manage it ourselves. Unfortunately, we didn't find a Python framework.
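
For anyone evaluating the same route, a minimal FlinkCEP sketch of the first pattern above ("record A followed by record B followed by record C, all within 60 days") might look like the following. The Event POJO, its fields, and the inline sample data are assumptions about the record format, not part of the Flink API:

import org.apache.flink.cep.CEP;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.time.Time;

public class FulfilmentPatterns {

    // Assumed record shape: one event per input line, correlated by provisioning ID.
    public static class Event {
        public String provisioningId;
        public String type;      // "A", "B", "C", ...
        public long timestamp;
    }

    private static Event ev(String id, String type, long ts) {
        Event e = new Event();
        e.provisioningId = id;
        e.type = type;
        e.timestamp = ts;
        return e;
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // In practice this stream would be built from the daily files; inline sample data here.
        DataStream<Event> events = env.fromElements(
                ev("p1", "A", 1L), ev("p1", "B", 2L), ev("p1", "C", 3L));

        // "A followed by B followed by C, all within 60 days",
        // evaluated independently per provisioning ID.
        Pattern<Event, ?> pattern = Pattern.<Event>begin("a")
                .where(new SimpleCondition<Event>() {
                    @Override public boolean filter(Event e) { return "A".equals(e.type); }
                })
                .followedBy("b")
                .where(new SimpleCondition<Event>() {
                    @Override public boolean filter(Event e) { return "B".equals(e.type); }
                })
                .followedBy("c")
                .where(new SimpleCondition<Event>() {
                    @Override public boolean filter(Event e) { return "C".equals(e.type); }
                })
                .within(Time.days(60));

        CEP.pattern(events.keyBy(e -> e.provisioningId), pattern)
                .select(match -> {
                    // 'match' maps pattern names to the matched events; build the
                    // (JSON) trigger payload from all matched records here.
                    Event a = match.get("a").get(0);
                    return "event triggered for " + a.provisioningId;
                })
                .print();

        env.execute("fulfilment-pattern-matching");
    }
}

The DB/NoSQL lookups can be handled before the CEP stage (e.g. an enrichment map function or Flink's async I/O), and since patterns are plain code, changing one means redeploying the job and optionally re-running it over the archived input files.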

How to search a job for multiple days using Monitor Workload in TWS Workload Scheduler?

Please help me understand how to search for a job or job stream across multiple days in Workload Scheduler. I tried to use the "Time Data Filter", but it is not working as expected.
The "List plan" option only allows selecting one archived date at a time. I tried SYMNEW, but no joy; it too only allows searching a particular date.
I would like to search for a job across X days and get the start/end times as output.
For example: I need to search for job "XXXX" from 1st March till 30th March.
You can search day by day using archived plans.
To look at jobs on any date, you can use a "Job Run History Report"; this includes only a subset of the information, but allows you to find jobs over larger date intervals.
History is kept in the database, and you can alternatively run queries against it using SQL.
The history is automatically cleaned up after the number of days specified by the logHistory / lh optman option, which by default is set to 400 days.
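
If you go the SQL route, a rough JDBC sketch is below. Note that the table and column names used here (JOB_HISTORY, JOB_NAME, START_TIME, END_TIME) are deliberate placeholders, not the real schema; look up the actual job-history tables/views in the database reference for your TWS version and substitute them:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Timestamp;

public class JobHistoryQuery {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string; point it at the TWS database.
        String url = "jdbc:db2://tws-db:50000/TWS";

        // JOB_HISTORY and its columns are placeholders -- substitute the real
        // job-history table/view from your TWS database schema reference.
        String sql = "SELECT JOB_NAME, START_TIME, END_TIME " +
                     "FROM JOB_HISTORY " +
                     "WHERE JOB_NAME = ? AND START_TIME BETWEEN ? AND ? " +
                     "ORDER BY START_TIME";

        try (Connection con = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, "XXXX");
            // Date range from the example question; the year is arbitrary here.
            ps.setTimestamp(2, Timestamp.valueOf("2019-03-01 00:00:00"));
            ps.setTimestamp(3, Timestamp.valueOf("2019-03-30 23:59:59"));
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%s | start=%s | end=%s%n",
                            rs.getString("JOB_NAME"),
                            rs.getTimestamp("START_TIME"),
                            rs.getTimestamp("END_TIME"));
                }
            }
        }
    }
}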

Is it possible to schedule refreshes for different parts of a dataset?

I have a report in Power BI that has many pages/tabs, and each one also has a lot of data being displayed. As I didn't design this, I'm going through the report to eliminate as much as I can, and possibly splitting the report, as a lot of the data only requires refreshing once a week.
This is where my query comes in: I have information on one page that requires a refresh every two hours over a 12-hour duration, one field of data that requires a daily refresh, and two more fields that only require refreshing when needed.
Is it possible to segment scheduled refreshes within a single report, or does scheduled refresh only allow the entire report to be refreshed? (I.e. sales status is hourly, outbound status is daily, and sales summary is weekly.)
I'd rather avoid having to split the reports, as it is very handy to have them on one page, rather than having to open two and view them on multiple monitors.
I am just starting out on Power BI reports, having been shown enough to get what I need done, but I plan to delve further in, this being my first port of call if it is possible.
Thanks for any responses in advance.
Brian.
No, it's not possible.
Power BI internally works like a Tabular Model, and in Import mode we cannot do incremental refresh either.
So the other option is to create a reporting layer and define denormalized reporting tables with calculated columns (e.g. Sales, Summary), then either use DirectQuery, or use refresh and do the ETL for these tables.
That way you can schedule the ETL for specific tables, i.e. Sales or Summary.