How long does Static / Dynamic analysis take in Intelix? - sophoslabs-intelix

I am currently testing the Intelix API within my service.
Please could you tell me the average time a static and dynamic analysis will take?
Thank you in advance

This is a little hard to say because the time taken for the analysis depends on the sample. This is especially true for dynamic analysis: take a PE file, for example. When it is detonated, some executables do a lot of work and run for a long time, whereas others finish in seconds.
In my experience, static analysis takes 10-20 seconds on average, whereas dynamic analysis takes a few minutes. However, these figures depend heavily on the exact sample that is submitted.
Obviously it is best to build your logic around the advertised maximum time for each analysis. However, when thinking about average times, think static analysis in seconds, dynamic in minutes.
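For illustration only, here is the kind of timeout-bounded polling you might build around that advertised maximum, sketched in MATLAB. The report URL, token, and the field checked for completion are placeholders, not the real Intelix API names, so take those from the Intelix API reference.
    % Poll a report URL until it is ready or the advertised maximum analysis time has passed.
    % The URL, token and field names below are placeholders for the real Intelix values.
    token     = 'YOUR-ACCESS-TOKEN';
    reportUrl = 'https://example.invalid/reports/JOB-ID';    % returned by the job submission call
    opts      = weboptions('HeaderFields', {'Authorization', token}, 'Timeout', 30);
    maxWait   = 300;                                          % seconds; use the advertised maximum for the analysis type
    tStart    = tic;
    report    = webread(reportUrl, opts);
    while ~isfield(report, 'report') && toc(tStart) < maxWait
        pause(5);                                             % poll interval
        report = webread(reportUrl, opts);
    end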

Related

Calculate the travel time from one point to another in AnyLogic

I am developing a logistics simulation of a factory in AnyLogic. It is a pick-up and delivery problem, where AGVs need to pick up parcels and deliver them to target locations. All the AGVs travel along paths, and the paths have different speed limits.
My goal is to reduce the time lost to traffic jams and the waiting time before jobs are picked up.
I have the lead time: job delivered time minus job generated time.
But from here, I want to identify the time spent in traffic jams or waiting.
Is there a way to calculate the travel time from one spot to another, taking the different speed limits of the paths into account but excluding waiting time and traffic jams? Then I could subtract this from the lead time.
Let me know if I need to clarify something.
There is no built-in way to do this; you have to do it yourself. I have 3 ideas:
Compute it mathematically in the model yourself, i.e. write a function that sums up the length of the total path, and since you already know the ideal speed on each segment, you get the ideal travel time (see the sketch after this list).
Run a separate experiment with all speed limits and other traffic turned off: record the time in that ideal case and use it for comparison.
Similarly, you could do this in the same experiment during a warm-up period: drive a fake transporter along the path and record the ideal durations.
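For idea 1, the arithmetic is just each segment's length divided by the effective speed on that segment, summed over the path. A minimal sketch in MATLAB (the segment vectors and the AGV speed are assumed inputs; inside AnyLogic you would write the equivalent as a Java function in the model):
    % Ideal (free-flow) travel time over a sequence of path segments.
    % segLen: segment lengths [m]; segLimit: segment speed limits [m/s]; agvSpeed: AGV max speed [m/s].
    % Example: idealTravelTime([10 25 5], [1.0 2.0 1.5], 1.8) gives the jam-free time in seconds.
    function t = idealTravelTime(segLen, segLimit, agvSpeed)
        v = min(segLimit, agvSpeed);   % effective speed on each segment
        t = sum(segLen ./ v);          % seconds, ignoring waiting and congestion
    end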

How to make my algorithm depend on time?

I used MATLAB to simulate cascading failure in two interdependent networks/layers (I generated the two layers with the small-world Watts-Strogatz algorithm).
My code works fine, but it is not time-dependent.
I want to have time steps: for example, the initial attack on one node happens at t1, then after some delay the next vulnerable nodes fail at a later time t2, and so on for the other failure events. My code currently emulates only telecommunication nodes (all events happen instantaneously); I want it to work for other kinds of networks, say social networks, where the timestamp of each event might be minutes or hours later. Your thoughts and ideas would be appreciated.
Note: I can provide my code if this helps.
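Since the original code is not shown, here is only a generic MATLAB sketch of one way to make the cascade time-dependent: keep a list of pending (time, node) failure events, always process the earliest one, and schedule newly affected nodes with a delay. The adjacency matrix, the seed node, the delay and the "vulnerable neighbour" rule below are stand-ins for whatever the actual model uses.
    % Generic time-stamped cascade sketch; replace the stand-ins with your model's rules.
    nNodes = 100;
    A = rand(nNodes) < 0.05;  A = triu(A,1);  A = A | A';   % stand-in adjacency matrix (use your WS layers)
    propagationDelay = 5;                                    % e.g. minutes between a failure and the next one
    failed = false(1, nNodes);
    events = [0, 1];                                         % rows of [failure time, node id]; node 1 is the initial attack
    while ~isempty(events)
        [~, k] = min(events(:,1));                           % process the earliest pending failure
        t = events(k,1);  node = events(k,2);
        events(k,:) = [];
        if failed(node), continue; end
        failed(node) = true;
        next = find(A(node,:) & ~failed);                    % stand-in rule: every unfailed neighbour becomes vulnerable
        for n = next
            events(end+1,:) = [t + propagationDelay, n];     % later failures get later timestamps
        end
    end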

A way/tool to estimate the execution time of an app/task

I'm trying to run a real-life experiment for an application on a Raspberry Pi, and I need to estimate or predict the execution time of the application. In other words, before running the task I need to know roughly how long this task/app is going to take to return its result. I have identified several techniques and prior works, but most of them are simulation studies that do not work for a real-life experiment. Can anyone help me with an idea or technique (no code needed)? Thank you in advance.
Estimating the execution time of an application or function is going to be difficult in any context. You might want to look up the halting problem for some insight into why: it is impossible to determine in general whether a given program will finish executing, and therefore you can't really tell how long an arbitrary program will take to finish.
For general computing, the varying hardware capabilities of any given system will always affect the execution time of a program. A Raspberry Pi is a more fixed, known platform and therefore more predictable in that sense, but the specifications are not consistent across its various versions, which adds to the complexity of determining a run time.
Practically, the most reliable way to determine how long a process will take would be to just run it and time it. If you absolutely need predicted times for something, you might be able to do a bit of a composite estimate - time the smaller chunks of the application separately, and then use those to determine how long you expect the application as a whole to run. For most situations, though, it would be much faster to just run the program itself rather than trying to predict it.
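As a trivial illustration of that composite estimate (the thread does not say what language the application is written in, so this is just the idea expressed in MATLAB, with pauses standing in for the real stages):
    % Time each stage separately, then add them up as a rough prediction for the whole task.
    t = tic; pause(0.2); tStage1 = toc(t);   % stage 1 (placeholder work)
    t = tic; pause(0.3); tStage2 = toc(t);   % stage 2 (placeholder work)
    estimatedTotal = tStage1 + tStage2       % expected run time of the whole task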
Store the time before and after the execution? Then you would know the execution time.

Faster way to run a Simulink simulation repeatedly a large number of times

I want to run a simulation which includes SimEvents blocks (thus only the Normal mode is available for sim runs) a large number of times, at least 1000. When I use sim it compiles the model every time, and I wonder if there is any other solution that just runs the simulation repeatedly in a faster way. I disabled the Rebuild option in the Configuration Parameters and it does make it faster, but it still takes ages to run around 100 times.
A single simulation run is not long at all.
Thank you!
It's difficult to say why the model compiles every time without actually seeing the model and what's inside it. However, the Parallel Computing Toolbox provides you with the ability to distribute the iterations of your model across several cores, or even several machines (with the MATLAB Distributed Computing Server). See Run Parallel Simulations in the documentation for more details.
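As a minimal sketch of that approach (assuming a newer MATLAB release with parsim; 'myModel' and the variable 'seed' are placeholders for your model name and whatever you vary per run):
    % Build an array of simulation inputs and let parsim distribute them over the pool workers.
    numRuns = 1000;
    in(1:numRuns) = Simulink.SimulationInput('myModel');
    for k = 1:numRuns
        in(k) = in(k).setVariable('seed', k);     % example: change one workspace variable per run
    end
    out = parsim(in, 'ShowProgress', 'on');       % results come back as an array of Simulink.SimulationOutput objects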

Synchronise Real-Time Workshop in MATLAB for the grt target

I am trying to run a real-time simulation in Simulink using Real-Time Workshop. The target is grt (I have tried rtwin, but my simulation refuses to compile for it). I need the simulation to run in real time, so that one second in simulation lasts one second of real time. grt ignores real time and finishes the simulation in the shortest time possible. Is there any way to synchronise it?
I have tried http://www.mathworks.com/matlabcentral/fileexchange/3175 but could not get it to work (it does not compile).
Thank you for any suggestions.
It looks like it is impossible. I was able to slow down the execution by using the Sleep(time in ms) function from the WinAPI and the clock function from time.h, which looked quite good for low sample rates. However, when I increased the sample rate the Sleep function slept for too long, which resulted in errors, with one second in simulation lasting more than one real-world second.
The idea was that one iteration period should last, let's say, 200 ms. You time how long one iteration of the code takes using the clock function, then call Sleep(200 - u), where u is the length of the iteration. The problem is that the Sleep function suspends the process and wakes it up when the scheduler decides to, not exactly after the time given in the argument.
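For reference, the pacing loop described above looks roughly like this (expressed here as a MATLAB sketch rather than the generated C, with a pause standing in for the model step):
    period = 0.2;                    % desired wall-clock duration of one step (200 ms)
    for k = 1:50
        t = tic;
        pause(0.05);                 % stand-in for the work done in one model step
        used = toc(t);
        if used < period
            pause(period - used);    % like Sleep, this only guarantees sleeping *at least* that long
        end
    end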
I know this is not a solution, but I am posting it so that anyone who faces the same problem as me won't try this dead-end approach. I had to rewrite the simulation for rtwin and now it works fine.
Another idea would be to somehow use interrupts, but I guess it would be quite complicated and not worth the trouble.