So I have lectures and time periods, and some lectures need to be taught in a specific time period. How do I do that?
Does scoreHolder.addHardConstraintMatch(kcontext, 10); solve this as a hard constraint? Does the value of positive 10 ensure the constraint of courses being in a specific time period?
I'm aware of the Penalty pattern, but I don't want to create a lot of CoursePeriodPenalty objects. Ideally, I'd like to have only one CoursePeriodReward object to say that CS101 should be in time period 9:00-10:00.
Locking them down as immovable planning entities won't work, as I suspect you still want OptaPlanner to decide the room for you - and currently OptaPlanner only supports a MovableSelectionFilter per entity, not per variable (vote for the open JIRA issue for that).
A positive hard constraint would definitely work. Your score will be harder to interpret for your users, though: for example, a hard score of 0 no longer necessarily means the solution is feasible (either it didn't get those +10 hard points, or it got them but lost 10 hard points somewhere else).
Or you could add a new negative hard constraint type that says: if the assigned period != desiredTimeslot, then lose 10 hard points.
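For instance, here is a minimal sketch of that negative variant in plain Java (Lecture and its period/desiredPeriod fields are made-up stand-ins, not OptaPlanner's API; in a real project this check would live in your score calculator or score rules):

    import java.util.List;
    import java.util.Objects;

    // Stand-in planning entity with only the two fields this check needs.
    class Lecture {
        String period;         // the planning variable's current value (null while unassigned)
        String desiredPeriod;  // e.g. "MON 09:00" for CS101, or null if unconstrained
    }

    class DesiredPeriodConstraint {
        // Negative variant: lose 10 hard points whenever a lecture that has a desired
        // period is not currently scheduled in it (instead of rewarding +10 on a match).
        static int desiredPeriodHardScore(List<Lecture> lectures) {
            int hardScore = 0;
            for (Lecture lecture : lectures) {
                if (lecture.desiredPeriod != null
                        && !Objects.equals(lecture.desiredPeriod, lecture.period)) {
                    hardScore -= 10;
                }
            }
            return hardScore;
        }
    }

With this formulation a fully feasible solution still has a hard score of 0, which keeps the score easy to interpret.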
I have a problem where several events occur in a project. The events happen semi-concurrently: they do not start at the same time, but multiple events can still be in progress at once.
Each event is a team of people working on a linear task, starting at the beginning and working their way to the end. Their progress is measured as a physical distance.
I essentially need to figure out each event's start time so that no two teams are at the same location, nor pass each other, at any point.
I am trying to program this in MATLAB so that the output would be the start and end time for each event. The idea is to optimize the total time taken for the project.
I am not sure where to begin with something like this, so any advice would be greatly appreciated.
If I understand correctly, you just want to optimize the "calendar" of events with limited resources (i.e. space/teams).
This kind of problem is NP-hard, so there is no "easy" way to search for the best solution.
Here you have two options:
Greedy-like algorithm: you will get a solution in a reasonable time, but it won't be the best one.
Brute-force-like algorithm: you will find the best solution, but maybe not in the time you need it.
Usually, if the number of events is low you can go for the second option; otherwise you may need to go for the first one.
No matter which one you choose, the first thing you will need to do is compute whether a solution is valid. What does this mean? It means checking, for every event, whether it collides with the others in time, space and teams.
So let's imagine the problem of making the calendar at a university. There you have to think about:
Students
Teachers
Classrooms
So for each event I have to check whether another event has the same students, teacher or classroom at the same time. First of all, I find the events that overlap in time with the current event. Then I compare the current event with all of those.
Once you have this done, you could write a greedy algorithm that places events in time one by one, checking at each step that the new event does not collide with any already-placed event.
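To make the greedy idea concrete, here is a toy sketch in Java rather than MATLAB (all names and the collision test are simplified assumptions, not a full solution to the teams-passing-each-other problem): each event is given the earliest start time at which it does not collide with anything already placed.

    import java.util.ArrayList;
    import java.util.List;

    class Event {
        double durationHours;
        double start = -1;  // assigned by the greedy loop
        double end() { return start + durationHours; }
    }

    class GreedyScheduler {
        // Toy collision test: "overlaps in time". A real check would also compare the
        // teams involved and the stretch of physical distance they occupy over time.
        static boolean collides(Event a, Event b) {
            return a.start < b.end() && b.start < a.end();
        }

        static void place(List<Event> events, double stepHours) {
            List<Event> placed = new ArrayList<>();
            for (Event e : events) {
                double t = 0;
                while (true) {
                    e.start = t;
                    if (placed.stream().noneMatch(p -> collides(e, p))) {
                        break;  // earliest non-colliding start found
                    }
                    t += stepHours;  // try the next candidate start time
                }
                placed.add(e);
            }
        }
    }

The order in which events are fed to the loop matters; trying a few different orderings and keeping the one with the best overall finish time is a cheap improvement over the plain greedy version.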
I am solving a variation of the vehicle routing problem. The model worked until I implemented a change whereby certain vehicles and/or stops may remain unassigned, because the construction filter does not allow the move due to time window considerations (late arrival is not allowed).
The problem size is 2 trucks/3 stops. truck_1 has 2 stops (Stop_1 and Stop_2) assigned to it, and consequently 1 truck and 1 stop remain unassigned since truck_2 will arrive late to Stop_3.
I have the following error:
INFO o.o.c.i.c.DefaultConstructionHeuristicPhase - Construction Heuristic phase (0) ended: step total (2), time spent (141), best score (-164hard/19387soft).
java.lang.IllegalStateException: Local Search phase started with an uninitialized Solution. First initialize the Solution. For example, run a Construction Heuristic phase first.
at org.optaplanner.core.impl.localsearch.DefaultLocalSearchPhase.phaseStarted(DefaultLocalSearchPhase.java:119)
at org.optaplanner.core.impl.localsearch.DefaultLocalSearchPhase.solve(DefaultLocalSearchPhase.java:60)
at org.optaplanner.core.impl.solver.DefaultSolver.runPhases(DefaultSolver.java:213)
at org.optaplanner.core.impl.solver.DefaultSolver.solve(DefaultSolver.java:176)
I tried to make the planning variable nullable (nullable = true), but it seems that is not allowed for chained variables.
I am using OptaPlanner 6.2.
Please help,
Thank you,
Piyush
Your construction filter may be too restrictive: it can prevent the construction heuristic from producing an initialized solution. Remove the time window constraint from the construction filter and add it as a hard score constraint in your score calculator instead (see the sketch after the quote below).
From the OptaPlanner docs:
Instead of implementing a hard constraint, it can sometimes be built in. For example: If Lecture A should never be assigned to Room X, but it uses ValueRangeProvider on Solution, so the Solver will often try to assign it to Room X too (only to find out that it breaks a hard constraint). Use a ValueRangeProvider on the planning entity or filtered selection to define that Course A should only be assigned a Room different than X.
This can give a good performance gain in some use cases, not just because the score calculation is faster, but mainly because most optimization algorithms will spend less time evaluating unfeasible solutions. However, usually this is not a good idea because there is a real risk of trading short term benefits for long term harm:
Many optimization algorithms rely on the freedom to break hard constraints when changing planning entities, to get out of local optima.
Both implementation approaches have limitations (feature compatibility, disabling automatic performance optimizations, ...), as explained in their documentation.
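As a rough illustration of the score-calculator alternative (Stop here is a stand-in class with made-up fields, not the real VRP example's model): instead of filtering out late arrivals during construction, lateness simply costs hard points, so the construction heuristic can always produce an initialized (possibly infeasible) solution that local search can then repair.

    import java.util.List;

    // Stand-in stop with only the fields the check needs.
    class Stop {
        Long arrivalTime;  // null while the stop is still unassigned
        long dueTime;      // latest allowed arrival, in the same time unit
    }

    class TimeWindowConstraint {
        static long lateArrivalHardScore(List<Stop> stops) {
            long hardScore = 0;
            for (Stop stop : stops) {
                if (stop.arrivalTime != null && stop.arrivalTime > stop.dueTime) {
                    hardScore -= (stop.arrivalTime - stop.dueTime);  // penalty grows with lateness
                }
            }
            return hardScore;
        }
    }

With the filter relaxed, the construction heuristic can assign every stop and truck, so local search no longer starts from an uninitialized solution.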
What I was trying to do is to test whether OptaPlanner is suitable for our requirements.
Thus, I created our own dataset of ~280 courses.
I "believe" the XML I prepared is valid for the example, since it loads and OptaPlanner can start solving it.
However, right during the CH (Construction Heuristic) phase, it finds some hard constraint violations (a hard score of -220), specifically for the rule "conflictingLecturesDifferentCourseInSamePeriod".
And no matter how long it tries, those violations remain.
Then when I check the violations, they are actually not real violations.
They are two different courses, in the same hours, but in different rooms, and the teachers are not the same. So there should be no violation in this scenario.
Also, when I scan the schedule by eye, I don't see any conflict.
So, I am lost right now...
Here is a link to the XML dataset.
Actually I found the problem - well, it is not a problem in the first place :)
Maybe the rule name is a little bit misleading.
Anyway, the problem is actually curriculums that are too crowded. We had 30-40 courses, which makes 80-100 lectures, and for a 45-hour week it is impossible to fit everything.
And I assume the rule "conflictingLecturesDifferentCourseInSamePeriod" checks "different" courses of the same curriculum.
So, when I reduced the course counts by splitting each curriculum into 4, the violations dropped to 0.
I believe this will be valuable info for anyone who couldn't understand the mentioned rule's purpose.
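For reference, a rough Java paraphrase of what I believe the rule checks (the names below are made up, not the Curriculum Course example's real model): two lectures of different courses conflict when they sit in the same period and share at least one curriculum, even if their rooms and teachers differ - which is exactly the situation an overcrowded curriculum forces.

    import java.util.Collections;
    import java.util.Set;

    class CurriculumConflictCheck {
        // True when two lectures of *different* courses share a period and a curriculum.
        static boolean conflicting(String courseA, String periodA, Set<String> curriculaA,
                                   String courseB, String periodB, Set<String> curriculaB) {
            return !courseA.equals(courseB)
                    && periodA != null && periodA.equals(periodB)
                    && !Collections.disjoint(curriculaA, curriculaB);
        }
    }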
Thanks.
I'm using a hash of IP + User Agent as a unique identifier for every user that visits a website. This is a simple scheme with a pretty clear pitfall: identifier collisions. Multiple individuals browse the internet with the same IP + user agent combination. Unique users identified by the same hash will be recognized as a single user. I want to know how frequently this identifier error will be made.
To calculate the frequency, I've created a two-step funnel that should theoretically convert at zero percent: publish.click > signup.complete. (Users have to signup before they publish.) Running this funnel for 1 day gives me a conversion rate of 0.37%. That figure is, I figured, my unique identifier collision probability for that funnel. Looking at the raw data (a table about 10,000 rows long), I confirmed this hypothesis. 37 signups were completed by new users identified by the same hash as old users who completed publish.click during the funnel period (1 day). (I know this because hashes matched up across the funnel, while UIDs, which are assigned at signup, did not.)
I thought I had it all figured out...
But then I ran the funnel for 1 week, and the conversion rate increased to 0.78%. For 5 months, the conversion rate jumped to 1.71%.
What could be at play here? Why is my conversion (collision) rate increasing with widening experiment period?
I think it may have something to do with the fact that unique users typically fire signup.complete only once, while they may fire publish.click multiple times over the course of a period. However, I'm struggling to put this hypothesis into words.
Any help would be appreciated.
Possible explanations starting with the simplest:
The collision rate is relatively stable, but your initial measurement isn't significant because of the low volume of positives that you got. 37 isn't very many. In this case, you've got two decent data points.
The collision rate isn't very stable and changes over time as usage changes (at work, at home, using mobile, etc.). The fact that you got three data points that show an upward trend is just a coincidence. This wouldn't surprise me, as funnel conversion rates change significantly over time, especially on a weekly basis. There are also bots that we haven't caught.
If you really get multiple publishes, and sign-ups are strictly a one-time thing, then your collision rate would increase as users who only signed up and didn't publish eventually publish. That won't increase their own funnel conversion, but it will provide an extra publish for somebody else to convert on. Essentially, every additional publish raises the probability that I, as a new user, will get confused with a previous publish event (the toy simulation after the note below illustrates this).
Note from OP: hypothesis 3 turned out to be the correct one.
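A toy simulation of hypothesis 3 (all the numbers below are arbitrary; only the upward trend matters): publish fingerprints accumulate over the measurement window, so a signup late in a long window has many more hashes it can collide with than a signup in a 1-day window.

    import java.util.HashSet;
    import java.util.Random;
    import java.util.Set;

    class CollisionWindowSim {
        public static void main(String[] args) {
            Random rnd = new Random(42);
            int hashSpace = 1_000_000;   // pretend space of IP + user agent hash buckets
            int publishesPerDay = 20_000;
            int signupsPerDay = 5_000;
            for (int windowDays : new int[] {1, 7, 150}) {
                Set<Integer> publishHashes = new HashSet<>();
                int signups = 0, collisions = 0;
                for (int day = 0; day < windowDays; day++) {
                    for (int i = 0; i < publishesPerDay; i++) {
                        publishHashes.add(rnd.nextInt(hashSpace));  // publishes pile up over the window
                    }
                    for (int i = 0; i < signupsPerDay; i++) {
                        signups++;
                        if (publishHashes.contains(rnd.nextInt(hashSpace))) {
                            collisions++;  // new user lands on a hash that already published
                        }
                    }
                }
                System.out.printf("%d-day window: %.2f%% of signups collide%n",
                        windowDays, 100.0 * collisions / signups);
            }
        }
    }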
I noticed that for problems such as Cloudbalancing, move factories exist to generate moves and swaps. A "move move" transfers a cloud process from one computer to another. A "swap move" swaps any two processes from their respective computers.
I am developing a timetabling application.
A subjectTeacherHour (a combination of subject and teacher) has only a subset of Periods to which it may be assigned. If Jane teaches a class for 6 hours, there are 6 subjectTeacherHours, each of which has to be allocated a Period from the 30 possible Periods of that class; unlike the cloudbalance example, where a process can move to any computer.
Only one subjectTeacherHour may be allocated to a given Period (naturally).
The solver tries to place subjectTeacherHours into eligible Periods until an optimal combination is found.
Pros
The manual seems to recommend it.
...However, as the traveling tournament example proves, if you can remove
a hard constraint by using a certain set of big moves, you can win
performance and scalability...
...The [version with big moves] evaluates a lot less unfeasible
solutions, which enables it to outperform and outscale the simple
version....
...It's generally a good idea to use several selectors, mixing fine
grained moves and coarse grained moves:...
While only one subjectTeacherHour may be allocated to a Period, the solver must temporarily break such a constraint to discover that swapping two particular Period allocations leads to a better solution. A swap move "removes this brick wall" between those two states.
So a swap move can help lead to better solutions much faster.
Cons
A subjectTeacher has only a subset of Periods to which it may be assigned, so finding the intersecting (common) hours between any two subjectTeachers is a bit tough (but doable in an elegant way: "Good algorithm/technique to find overlapping values from objects' properties?"); see the set-intersection sketch after this list.
Will it give me only small gains in time and optimality?
I am also worried about the crazy interactions that having two kinds of moves may cause, leading to getting stuck at a bad solution.
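A small sketch of the "common eligible periods" check mentioned in the cons above (the representation is an assumption, not the question's actual model): the overlap between two subjectTeachers' eligible periods is just a set intersection.

    import java.util.HashSet;
    import java.util.Set;

    class PeriodEligibility {
        static Set<Integer> commonPeriods(Set<Integer> eligibleForA, Set<Integer> eligibleForB) {
            Set<Integer> common = new HashSet<>(eligibleForA);
            common.retainAll(eligibleForB);  // keep only periods both may be assigned to
            return common;
        }
    }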
Swap moves are crucial.
Consider 2 courses assigned to a room which is fully booked. Without swapping, the solver would have to break a hard constraint to move 1 course to a conflicting room and choose that move as the step (which is unlikely).
You can use the built-in generic swap MoveFactory. If you write your own, you can make the swap move's isDoable() return false when it would move either side to an ineligible period.
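A hedged sketch of that doability guard (SubjectTeacherHour here is a stand-in with just the fields the check needs, and this is only the eligibility test you would call from your custom swap move, not a full OptaPlanner Move implementation):

    import java.util.Set;

    class SubjectTeacherHour {
        String period;                // currently assigned period
        Set<String> eligiblePeriods;  // the subset of periods this lecture may take
    }

    class SwapGuard {
        // The swap is doable only if each lecture may legally take the other's period.
        static boolean swapIsDoable(SubjectTeacherHour left, SubjectTeacherHour right) {
            return left.eligiblePeriods.contains(right.period)
                    && right.eligiblePeriods.contains(left.period)
                    && !left.period.equals(right.period);  // swapping identical periods changes nothing
        }
    }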