So, I'm quite new to Zend Framework and Stack Overflow (although I read a lot here), so please be gentle!
What I'm trying to do is a digital version of an IAPS test (rating pictures for their arousal and feeling). I have three sets of pictures, and each set contains 72 pictures. When a person starts the online test, one set should be randomly assigned (e.g. set 1 to person A).
Further, each picture in the set should be presented in random order. After a picture is presented, the person has to rate it twice (rating scale 1 to 3, saved to a database). The whole process of presenting the image and giving the two ratings should be limited to 12 seconds.
It looks like this:
[SET 1] & [SET2] & [SET3]
Take random set (note: only one set should be presented)
[PIC 1] & [PIC 2] & [PIC3] & ...
Take random pictures (note: all 72 pictures should be presented in a random order)
[PIC 1] -> Rating 1 -> Rating 2
|................12 seconds ..............|
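The selection logic sketched above is framework-agnostic. Here is a minimal sketch in Python of picking one set at random and shuffling its 72 pictures (the filenames and the `SETS` structure are placeholder assumptions, not anything from Zend):

```python
import random

# Placeholder picture lists; in practice these would come from a database.
SETS = {
    1: [f"set1_pic{i}.jpg" for i in range(1, 73)],
    2: [f"set2_pic{i}.jpg" for i in range(1, 73)],
    3: [f"set3_pic{i}.jpg" for i in range(1, 73)],
}

def start_session():
    """Pick one of the three sets at random, then shuffle its 72 pictures
    so each picture is presented exactly once, in random order."""
    set_id = random.choice(list(SETS))
    order = SETS[set_id][:]   # copy, so the master list stays intact
    random.shuffle(order)
    return set_id, order
```

In a Zend application the same logic would sit in a controller, with the chosen set id and shuffled order stored once at the start of the test (e.g. in Zend_Session) and consumed picture by picture.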
I'm not sure this is even possible to code with Zend. I was thinking of using Zend_Session to store the ratings in the session and saving them to the database at the end, but I'm not sure if this is the best method (security issues?). The 12-second limit on presenting the picture and collecting the two ratings causes me even more headaches.
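One way to make the 12-second limit robust is to enforce it server-side rather than trusting a JavaScript timer alone: record when each picture was served and reject ratings that arrive too late. A minimal sketch in Python, where the `session` dict stands in for Zend_Session and the function names are hypothetical:

```python
import time

TIME_LIMIT = 12.0  # seconds for the picture plus both ratings

# Stand-in for Zend_Session: per-user state would live there in PHP.
session = {}

def show_picture(pic_id):
    """Record which picture went out and when."""
    session["current_pic"] = pic_id
    session["shown_at"] = time.monotonic()

def accept_rating(pic_id, rating):
    """Accept a rating only while the 12-second window is still open."""
    elapsed = time.monotonic() - session["shown_at"]
    if pic_id != session.get("current_pic") or elapsed > TIME_LIMIT:
        return False  # too late or stale: discard the submission
    session.setdefault("ratings", []).append((pic_id, rating, elapsed))
    return True
```

The client-side countdown then only has to advance the UI; late or replayed submissions are filtered on the server regardless of what the browser does.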
I'd be glad to hear ideas and thoughts.
ps: I'm using Zend Framework Version 1.11.12
edit: I just took out the equally distributed sets. The law of large numbers says they will end up equally distributed given a large enough sample...
I am very much puzzled about using H5P in Moodle.
The idea is great, obviously, yet I cannot make it work as I expected.
My principles/idea:
There are a bunch of activities in each course.
Each activity can build up several of a student's skills, say Creative Thinking or Problem Solving.
After finishing each activity the student, based on the result, can go on to the next activity or re-do it if they failed.
For testing purposes I set up three outcomes (0-30 > no pass, 30-70 > 1 point, 71-100 > 2 points) in the H5P module; this part is working fine.
The outcome should be passed to Moodle, so the course can then decide what to do: pass with 1 or 2 points, or fail and request that the activity be done again.
This outcome is then added to the student's skill set.
Say I have this basic crossword. After finishing it the student can achieve the two skills mentioned above, yet this still depends on the outcome, e.g. a result of 1 means 0 in Creative Thinking and +1 in Problem Solving, and a result of 2 means +1 in Creative Thinking and +2 in Problem Solving.
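The score bands and the crossword example above can be sketched like this in Python. The band edges follow the post's "0-30 / 30-70 / 71-100" description, which overlaps at 30, so the exact boundary handling is an assumption, and both function names are hypothetical:

```python
def outcome_points(score):
    """Map a 0-100 H5P result to the outcome bands from the post."""
    if score < 30:
        return 0        # no pass
    if score <= 70:
        return 1
    return 2

def skill_increments(points):
    """Hypothetical outcome -> skill-gain table from the crossword example."""
    table = {
        0: {"Creative Thinking": 0, "Problem Solving": 0},
        1: {"Creative Thinking": 0, "Problem Solving": 1},
        2: {"Creative Thinking": 1, "Problem Solving": 2},
    }
    return table[points]
```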
The activity itself works as expected, as I mentioned above, see the images (note the ONE point circled), but then nothing happens.
The student is NOT taken to the next activity; all they can do is retry the same activity over and over again.
Is it possible to force Moodle/H5P to act as described above?
For testing purposes I used two 'activities': one being 'h5p' itself and the other being a 'lesson' with the same H5P module added inside, see the image:
I run all of this on WAMP.
I tried to follow the xAPI documentation at https://h5p.org/documentation/x-api, which resulted in a JS error.
Sorry for the long post - tried to cover everything.
If anybody knows the answers, a reply will be much appreciated.
Cheers,
Greg
I'm looking to bulk-input data from Google Forms. This involves two sections:
Initial Conditions
Observations
Everyone who wants to input data will enter one set of initial conditions, followed by somewhere between 5 and 20 (maybe more?) observations of multiple variables:
Name
Date
Color
Quantity
Type
The problem is that I don't want to have to make people re-enter the initial conditions each time they submit a form.
Ideally they would be able to select a response after adding one observation:
Add another observation
Conclude session
The thing I don't know how to do: selecting "Add another observation" should open a new, blank observations page.
Does anyone have ideas about how to build the appropriate form?
First of all, I am brand new to Moodle. We have a SCORM course that we imported into Moodle. The course has 15 questions at the end; answering all 15 correctly means your score is 100%. But when we complete the course, the info tab shows 15% instead of 100%.
What configuration change do I need so that "Grade for attempt" and "Grade reported" show 100% instead of 15%?
Thanks
**********EDIT*************************
Actually we have two courses. One course shows 100% and the other shows 15%. The course showing 100% sends cmi.core.score.raw 100 and cmi.core.score.max 100. Here is the screenshot
But the other course, when all questions are answered correctly, sets cmi.core.score.raw 15 and cmi.core.score.max 15, and Moodle shows 15%. If 4 questions are answered correctly, it sets cmi.core.score.raw 4 and cmi.core.score.max 15.
For 4 answers right
and finally this is the result in Moodle
So I think we have to send the actual score as cmi.core.score.raw and the maximum as cmi.core.score.max; that way Moodle will show the correct percentages, i.e. 27% for 4 answers right and 100% for all right.
I don't know if there is a setting for this in Moodle. These are my findings so far.
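The arithmetic behind the percentages above can be sketched in Python (`moodle_percent` is a hypothetical helper for illustration, not a Moodle API; the assumption is that Moodle grades the reported raw score against whatever maximum it is configured to use):

```python
def moodle_percent(raw, maximum):
    """Percentage derived from a reported raw score against the maximum
    being graded out of (cmi.core.score.max or the activity's configured
    maximum grade)."""
    return round(100 * raw / maximum)

# With the activity's maximum grade left at the default 100, a raw score
# of 15 shows as 15%, which matches the reported symptom; grading out of
# 15 gives the expected percentages.
assert moodle_percent(15, 100) == 15   # the reported bug
assert moodle_percent(15, 15) == 100   # all 15 right, graded out of 15
assert moodle_percent(4, 15) == 27     # 4 right -> ~27%
```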
Thanks
Edit your SCORM activity settings, go to the grade settings, and change the maximum grade from 100 to 15. Since you have 15 questions and each question sends 1 point to the LMS, the maximum grade setting has to match the score points your SCORM package sends to the LMS.
I would like to store a list of all en.wikipedia articles in my database. For each article I want to store the pageid, title, and popularity. I thought about using the view count (over the last month) as a measure of popularity, but if that is not possible I could imagine going with something else (maybe the number of revisions). I'm aware of http://dumps.wikimedia.org/enwiki/latest/ and that I can get a full list of articles from there (current count 36508337). However, I cannot find a clever way to get the view count for each article.
// Updates, Edits, ...
The suggested duplicate does not help me because
a) I was looking for a popularity measure. The answer to the other question just states that it is not possible to get the number of watchers for a page, which is fine with me.
b) There is no answer there that gives me the page views (or any other metric) for every page.
Okay, I'm finally done. Here is what I did:
I found http://dumps.wikimedia.org/other/pagecounts-ez/, which provides page views per month. This seemed promising, but they don't mention the pageid, so what I'm doing is getting a list of all articles from http://dumps.wikimedia.org/enwiki/latest/, creating a name->pageid mapping, and then parsing the pagecount dump. This takes about 30 minutes. Here are some statistics:
68% of the article titles in the page count file do not exist in the latest dump. This is probably due to some users linking, for example, Misfits_(TV_series) while others link to Misfits_(tv_series), and even things like Misfits_%28TV_series%29... I did not bother with those because my program already took long enough to run.
The top 3 pages are:
1. The front page, with 639 million views (in the last month)
2. Malware, with 8.5 million views
3. Falcon 9 v1.1, with 4.7 million views (cool!)
I made a histogram of the number of pages with a given view count, here it is:
I also plotted the number of pages I would have to deal with if I disregard all articles below a certain view count. Here it is:
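The join step described above can be sketched in Python, together with a guess at the title normalisation that might recover some of the 68% of unmatched titles (the percent-decoding and first-letter capitalisation are assumptions about the mismatches, and the pagecounts line format in the comment is an approximation):

```python
import urllib.parse

def normalize_title(raw):
    """Best-effort canonicalisation of a pagecount title before the lookup:
    percent-decode, use underscores, and uppercase the first letter the way
    MediaWiki titles do."""
    title = urllib.parse.unquote(raw).replace(" ", "_")
    return title[:1].upper() + title[1:] if title else title

def build_view_counts(name_to_pageid, pagecount_lines):
    """Join the name->pageid mapping (from the enwiki dump) against the
    per-title view counts from the pagecounts file."""
    views = {}
    for line in pagecount_lines:
        # pagecounts lines look roughly like: "en.z Some_Title 1234 ..."
        parts = line.split(" ")
        if len(parts) < 3:
            continue
        title, count = normalize_title(parts[1]), int(parts[2])
        pageid = name_to_pageid.get(title)
        if pageid is not None:
            views[pageid] = views.get(pageid, 0) + count
    return views
```

Normalising before the lookup folds variants like Misfits_%28TV_series%29 and misfits_(TV_series) onto one key, so their counts accumulate under a single pageid instead of being discarded.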
Say for example I am getting a range of integers from a user:
Generate between nnn and nnn widgets.
Yes, most users will ensure that the first number is equal to or less than the second number:
Generate between 3 and 7 widgets.
But then there's that user who wants to enter this data "back to front":
Generate between 7 and 3 widgets.
Do you quietly switch the fields around when the user clicks OK/Apply, so that in the example above you change the range of 7 to 3 back to 3 to 7 in the GUI? This might not be so practical on a web form, where the user enters some data and then submits the form never to see it again, but I'm thinking more in terms of a desktop application's settings page, where the user's input is saved and subsequently viewed and edited again by the user.
Is it more important to try and educate users to enter a range that "makes sense" via error/alert messages, or quietly cajole their entries into the shape an application is expecting?
I suspect the "cajoling" option is preferable, but could this "hey, the program messed with my data!" approach be a problem for users?
I am currently writing an application that has a handful of these user-configurable ranges, so I'm very interested in the general consensus of the SO experts.
If the user enters data incorrectly, you shouldn't assume a particular pattern of error and automatically correct it. Either report the error to the user and ask them to correct it, or suggest a correction that they can approve. In your example, what if the user intended to type 7 and 13 but simply mistyped? If you changed it to 3 and 7, you would have entered incorrect data without the user's knowledge. I'd probably do the simple thing: show a visual alert when incorrect data is entered (before it's actually submitted) and refuse to accept incorrect data, returning an error if it is submitted anyway.
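The "report and suggest" approach from the answer can be sketched like this (`validate_range` is a hypothetical helper, not from any particular UI toolkit):

```python
def validate_range(lo, hi):
    """Return (ok, message). Instead of silently swapping a back-to-front
    range, report the problem and offer the swap as a suggestion the user
    can approve."""
    if lo > hi:
        return False, (
            f"The lower bound ({lo}) is greater than the upper bound "
            f"({hi}). Did you mean {hi} to {lo}?"
        )
    return True, ""
```

The form would call this on submit, block the save while `ok` is False, and show `message` next to the offending fields, so the user, not the program, decides whether 7 to 3 really meant 3 to 7.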