I'm trying to calculate the overtime of each ResourcePool. What I have so far is time-consuming and almost entirely manual, so I'm looking for the correct way to do this, please.
What I'm doing is as follows. For example, I have the pharmacists' resource pool split into a regular-hours pool and an overtime pool; I divide them to make it easier to get per-shift statistics and for optimization later on:
I added the following traceln calls in the On release action of the pharmacists' pool:
traceln(date());
traceln(agent.OrderID);
So it prints the date and time of every agent that passes through. Then I run the model, copy the traceln output from the console, and process it in Excel:
overtime hours for a day = last recorded time on that date - start of the overtime shift
I repeat the above for each day, which takes too much time, especially since there are other resources whose overtime I also need.
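The Excel step can be automated. Here is a minimal pandas sketch, assuming the traceln output is one timestamp per released agent (inlined below for illustration; normally you would read the console dump from a CSV) and assuming the overtime shift starts at 5 pm:

```python
import pandas as pd

# Hypothetical data: one release timestamp per agent, copied from traceln output
df = pd.DataFrame({'datetime': pd.to_datetime([
    '2022-03-01 17:45:00', '2022-03-01 19:30:00',
    '2022-03-02 18:10:00',
])})

overtime_start = pd.Timedelta(hours=17)  # assumed: overtime shift begins at 5 pm

# last recorded release per calendar day
last_per_day = df.groupby(df['datetime'].dt.date)['datetime'].max()

# overtime = last release time of the day - start of the overtime shift
overtime_hours = ((last_per_day - last_per_day.dt.normalize()) - overtime_start
                  ).dt.total_seconds() / 3600
print(overtime_hours.mean())  # daily mean overtime
```

The same groupby handles any number of days and resource pools, so the per-day manual step disappears.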
Is there any simplified and quick method to get the daily overtime (and the daily mean overtime) of a resourcePool?
Thank you.
I have created a simple model (see first attachment) in AnyLogic. Resource unit W1 is seized in service and resource unit W2 is seized in service 1. The delay times of service and service 1 are both 5 minutes. The interarrival time of source is 10 minutes and that of source 1 is 6 minutes.
Now I would like to analyse the usage state of both resource units, but in dataset resource_unit_states_log there is only the state "usage_busy" logged. Is there any possibility to also log the usage state "idle" in this dataset? Later in my evaluation I want to know the exact date and time when the resource was in state "idle". Currently I can only read the exact date and time for the state "busy" from the data set (see screenshot in first attachment). Theoretically, I could manually calculate the date and time of the "idle" state based on the existing values, but that would take a long time with thousands of dates.
Another attempt was to track the "idle" state using a timeplot. If I use W1.time() as x-axis value, I get the model time (e.g. 0, 1, 2 ...) in the dataset. But I want instead as in the dataset "resource_unit_states_log" the exact date like 27-12-2021 00:06:00.
Does anyone have an idea how I can solve either of these problems?
AnyLogic internal tables/logs are not modifiable; they are as they are. If you want the data in any other format, you need to collect it yourself with your own data-collection functions/code. In your case, the second approach is pretty good: you are collecting information every minute and you can export it. I usually do the post-processing in Python. I work with millions of rows and it takes a few minutes; in your case, thousands of rows should take a few seconds. Here is how I would do it:
Export the data (in your second plot approach) into Excel. The data should look like this:
Open Jupyter notebook (or any IDE).
Read the data into Python. Let's say you have saved your data as data.xlsx.
Input your start_datetime, i.e. starting date and time of your simulation.
Then just add the minutes from your data to the start_datetime.
Write the modified data in a new Excel file called data_modified.xlsx. It will look like this:
Here is the full code:
import pandas as pd

df = pd.read_excel('data.xlsx')

# Input your start date and hour below:
start_datetime = pd.to_datetime('2021-12-31 00:00:00')

# 'x' holds the model time in minutes; add it to the simulation start
df['datetime'] = start_datetime + pd.to_timedelta(df['x'], unit='m')

df.to_excel('data_modified.xlsx', index=False)
Another approach:
You can use the cells On seize unit and On exit inside the Service block and log the time when the resource is seized and released by using the function time() and write this information into a dataset. Then do the calculations.
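Once the seize and release times are logged, "do the calculations" amounts to taking the complement of the busy intervals to get the idle intervals. A hypothetical Python sketch with made-up times:

```python
# Hypothetical (seize_time, release_time) pairs in model minutes
busy = [(6.0, 11.0), (16.0, 21.0)]

def idle_intervals(busy, start=0.0, end=30.0):
    """Return the complement of the busy intervals within [start, end]."""
    idle, cursor = [], start
    for s, r in sorted(busy):
        if s > cursor:
            idle.append((cursor, s))   # gap before this busy stretch
        cursor = max(cursor, r)
    if cursor < end:
        idle.append((cursor, end))     # tail gap after the last release
    return idle

print(idle_intervals(busy))  # [(0.0, 6.0), (11.0, 16.0), (21.0, 30.0)]
```

Each idle interval can then be converted to calendar dates the same way as above, by adding the model minutes to the simulation start time.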
You can also use some of AnyLogic's date/time conversion functions.
I am working with this metadata table REP_WFLOW_RUN currently from Infra DB, to get status about workflows. The column run_status_code shows whether this wf is running, succeeded, stopped, aborted etc..
But for my Business use case I also need to report to Business, the estimated time of completion of this particular work flow.
Example: If suppose the workflow generally started at 6:15, then along with this info that work flow has started I want to convey it is also estimated to complete at so and so time.
Could you please guide me if you have any details on how to get this info from Informatica database.
Many thanks in advance.
This is a very good question, but no one can answer it exactly :)
That said, you can approximate it with the same logic other scheduling tools use.
First, calculate the average time the workflow takes to complete a successful run. The output is a decimal number of hours.
SELECT workflow_name, AVG(end_time - start_time) * 24 AS avg_time_in_hr
FROM REP_WFLOW_RUN
WHERE run_status_code = 'succeeded'
GROUP BY workflow_name
You can use the above value to derive the estimated completion time for that workflow. The output is a datetime.
SELECT SYSDATE + avg_time_in_hr / 24 AS est_time_to_complete FROM dual
Now, this value is an estimate, not an exact figure. On a bad day, when a run takes hours longer than usual, the average will be off, but we can't do much about that.
I assumed your Informatica metadata repository is on Oracle.
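Outside SQL, the estimate is just start time plus the historical average duration. A Python sketch of the same arithmetic (the 1.75-hour average is a made-up stand-in for the query result):

```python
from datetime import datetime, timedelta

def estimate_completion(start_time, avg_duration_hr):
    """Estimated completion = actual start time + historical average duration."""
    return start_time + timedelta(hours=avg_duration_hr)

# Workflow started at 6:15; historical average run time 1.75 h (made-up figure)
eta = estimate_completion(datetime(2023, 5, 4, 6, 15), 1.75)
print(eta)  # 2023-05-04 08:00:00
```

This is the figure you would report to the business alongside the "started" status.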
Is there any way to calculate the utilization of a given resource in a specific time frame? I have a machine that runs 24 hours a day, but during daytime hours its utilization is higher than during nighttime hours.
In the final statistics, using the function "machine.utilization()" I get a low result, which is influenced by the night hours. How can I split the two statistics?
Utilization is calculated as (work time)/(available time excluding maintenance). Which means that the measure described in your question can be achieved using 2 ways:
Make the machine 'unavailable' during the night, this way that time will be excluded in calculations
The ResourcePool object has two properties, On seize and On release, which can be used to record individual stretches of work time; sum them up and divide by (8 hr × number of days) instead of by the total time since model start
For a little more detail and link to AnyLogic help please see the answer to another question here.
Update:
In ResourcePool's On seize and On release, AnyLogic provides a parameter called unit, which is the specific resource unit agent being processed. So getting the actual use time per unit requires the following:
2 collections of type LinkedHashMap that map Agent -> Double. One collection to store start times (let's call it col_start) and one to store total use times (let's call it col_worked)
On seize should contain this code: col_start.put(unit, time())
On release should contain:
double updated = col_worked.getOrDefault(unit, 0.0) + (time() - col_start.get(unit));
col_worked.put(unit, updated);
This way at any given point during model execution, col_worked will contain a mapping of resource unit Agent to the total sum of time it was utilised expressed as a double value in model time units.
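The same bookkeeping can be illustrated outside AnyLogic; a small Python sketch of the seize/release accounting described above (the unit name and times are made up):

```python
from collections import defaultdict

class UtilizationTracker:
    """Mimics the On seize / On release code: col_start holds the last seize
    time per unit, col_worked accumulates total busy time per unit."""
    def __init__(self):
        self.col_start = {}
        self.col_worked = defaultdict(float)

    def on_seize(self, unit, now):
        self.col_start[unit] = now

    def on_release(self, unit, now):
        self.col_worked[unit] += now - self.col_start[unit]

t = UtilizationTracker()
t.on_seize('W1', 10.0); t.on_release('W1', 15.0)   # busy for 5.0
t.on_seize('W1', 20.0); t.on_release('W1', 22.5)   # busy for 2.5
print(t.col_worked['W1'])  # 7.5 model time units
```

Dividing each unit's total by the available window (e.g. 8 hr × number of days) then gives the daytime-only utilization asked about.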
I've currently got this Cron expression that I'm using to trigger a process in UiPath Orchestrator:
0 0 15 21W * ? *
Runs on the closest working day to the 21st of each month at 3pm.
However I need it to run on the next working day at 3pm if the 21st is a non working day.
Tried searching for an answer and nothing quite fit the brief.
I used this website to build my expression (which is a great tool) but it only had an option for 'nearest day' and not next working day given a specific day of month: https://www.freeformatter.com/cron-expression-generator-quartz.html
Since you need the next working day rather than the nearest one, the built-in cron functionality of Orchestrator won't do it. I would recommend creating a wrapper process as follows:
Create a new process, let's call it StartJobByCheckingDate
Now create a trigger that starts StartJobByCheckingDate each day at 3pm
So that process is now your manager of your desired process
Now we need to check if it is the 21st day
Here you have different ways to solve it
You could create a DataTable or even a file in the StartJobByCheckingDate process that contains all the days on which your desired process should be fired (but this is very manual; you probably don't want to update it every year, so this is the easiest solution rather than the smartest)
The other idea is to check if the current day is the 21st. If so, check whether it is a Saturday or Sunday (a non-working day).
If true: create an empty dummy file somewhere to record that the 21st was a non-working day. On each following day, check whether that file exists; if it does and the current day is a working day, delete the file and start your desired process
If false: just start your desired process directly
I think the second idea is the best. Sure, you get 365 job runs per year, but if you keep that helper process lean, each run takes just seconds.
Another idea, instead of using the dummy file, would be to use Entities. Smarter, but it takes some more time to get familiar with.
We have (had) the exact same issue. Since UiPath doesn't offer a feasible solution out of the box, we will work around the restriction using the following strategy: We trigger the actual job daily, considering a custom-built, static NonWorkingDay-list that will just suppress the execution of the robot every day we don't want it to run.
These steps are needed:
Get a list of all known bank holidays, Saturdays and Sundays until 2053 or so...
Build the static exclusion list using a script that does something like this (pseudocode; I will update the answer once we have actually implemented the solution):
1. Get all valid execution dates:
   loop through every 28th of the month until the end of 2053
     if the date is in the bankHolidayList then
       loop until the next bank day is found
       add that day to the list of valid execution dates
     else
       add the date to the validExecutionDate-list
2. Build the exclusion-list:
   loop through every day until the end of 2053
     if the date is not in the validExecutionDate-list
       add it to the exclusionDate-list
Format the CSV accordingly and upload it to the Orchestrator tenant as a NonWorkingDay list
Update your trigger to run daily at your desired time, using the uploaded NonWorkDay-Calendar
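The pseudocode above can be sketched in Python. The 28th and the 2053 horizon come from the pseudocode; the bank-holiday set is supplied by the caller:

```python
from datetime import date, timedelta

def next_working_day(d, bank_holidays):
    """Push d forward until it is neither a weekend nor a bank holiday."""
    while d.weekday() >= 5 or d in bank_holidays:
        d += timedelta(days=1)
    return d

def valid_execution_dates(year_from, year_to, bank_holidays, day_of_month=28):
    """One valid run date per month: the 28th, or the next working day after it."""
    return [next_working_day(date(y, m, day_of_month), bank_holidays)
            for y in range(year_from, year_to + 1)
            for m in range(1, 13)]

def exclusion_dates(year_from, year_to, bank_holidays, day_of_month=28):
    """Every calendar day that is NOT a valid run date (the NonWorkingDay list)."""
    valid = set(valid_execution_dates(year_from, year_to, bank_holidays, day_of_month))
    d, end, out = date(year_from, 1, 1), date(year_to, 12, 31), []
    while d <= end:
        if d not in valid:
            out.append(d)
        d += timedelta(days=1)
    return out

# e.g. 28 May 2022 is a Saturday, so the run moves to Monday 30 May
print(date(2022, 5, 30) in valid_execution_dates(2022, 2022, set()))  # True
```

Writing `exclusion_dates(...)` out as a CSV gives the list to upload as the NonWorkingDay calendar.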
While the accepted answer will surely work as well, we preferred this approach because having a separate robot that does nothing but execute a UiPath trigger just doesn't seem right to me. With this approach we have no additional code to maintain.
In my opinion, not having an out-of-the-box solution for this concern is a missing feature that UiPath will (hopefully) fix by the end of 2053 ;-)
Cheers
You can configure your trigger to launch more often and then handle the dates at the init of your process, but you must set up a list of holidays or check them in some way.
Also you can use the calendar option of orchestrator (+info)
I'm trying to validate the data in the Lead Time and Cycle Time widgets against the report imported from Azure DevOps (https://learn.microsoft.com/en-us/azure/devops/report/powerbi/data-connector-connect?view=azure-devops), but when I compute the average, the data doesn't match. Is there somewhere I can find information about how the calculation is done or which filters to apply? Isn't it a simple average?
Lead Time Example
As far as I know, the lead time and cycle time will also include weekends. So this could affect the average of the data.
From this doc:
Lead time is calculated from work item creation to entering a completed state. Cycle time is calculated from first entering an In Progress state to entering a Completed state.
In Power BI, you could check the CompletedDate and CycleTimeDays columns.
For example:
In chart:
Chart settings:
In Date:
The average Cycle time days: (147.183 + 133.340 + 133.340) / 5 = 82.77 (five completed items, two of which have a cycle time of 0).
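A quick check of that arithmetic (the two zero entries are inferred from the divisor of 5, i.e. the average runs over all completed items, including those with zero cycle time):

```python
# Five completed items; two are assumed to have zero cycle time
cycle_time_days = [147.183, 133.340, 133.340, 0.0, 0.0]
avg = sum(cycle_time_days) / len(cycle_time_days)
print(round(avg, 2))  # 82.77
```

So the widget's average is a simple mean over all completed items, which is why it differs from a mean over only the non-zero rows.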
Here is a doc about creating a Power BI lead time/cycle time chart.