ACRCloud: return recent results

I'm using ACRCloud broadcast monitoring to show the currently playing song on a radio station website. I read the live result and output it with:
echo $song[0]['metadata']['music'][0]['artists'][0]['name'] . ' - ';
echo " <a href='https://open.spotify.com/track/{$song[0]['metadata']['music'][0]['external_metadata']['spotify']['track']['id']}' target='_blank'>{$song[0]['metadata']['music'][0]['title']}</a> ";
I'd like a dynamic response that displays the songs played that day, without having to manually update the call - I can only see how to return results for a specified date.
Is this possible - for example, displaying the last 20 songs played?
Thanks

You can store all the monitoring results (received from the callback URL) in a MySQL database, then display the last 20 songs (queried from MySQL) on another page, as sketched below.
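For illustration only, here is a minimal sketch of that idea in Scala over plain JDBC (the plays table, its columns, and the connection details are assumptions, and the MySQL JDBC driver is presumed to be on the classpath; the same two queries translate directly to the PHP side of the site):
import java.sql.DriverManager
import scala.collection.mutable.ListBuffer

object RecentPlays {
  // Hypothetical schema: CREATE TABLE plays (artist VARCHAR(255), title VARCHAR(255), played_at DATETIME)
  val url = "jdbc:mysql://localhost/radio"

  // Called from the ACRCloud callback handler: store one monitoring result per row.
  def storePlay(artist: String, title: String): Unit = {
    val conn = DriverManager.getConnection(url, "user", "password")
    try {
      val ps = conn.prepareStatement("INSERT INTO plays (artist, title, played_at) VALUES (?, ?, NOW())")
      ps.setString(1, artist)
      ps.setString(2, title)
      ps.executeUpdate()
    } finally conn.close()
  }

  // Called from the "recently played" page: newest rows first, no date needed in the query.
  def lastPlays(limit: Int = 20): List[(String, String)] = {
    val conn = DriverManager.getConnection(url, "user", "password")
    try {
      val ps = conn.prepareStatement("SELECT artist, title FROM plays ORDER BY played_at DESC LIMIT ?")
      ps.setInt(1, limit)
      val rs = ps.executeQuery()
      val rows = ListBuffer.empty[(String, String)]
      while (rs.next()) rows += ((rs.getString("artist"), rs.getString("title")))
      rows.toList
    } finally conn.close()
  }
}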

Resolved by limiting the number of results returned to a specified number using ?limit=xx.

Related

Scala Gatling: keep inserting users until the end of the feed file is reached

I'm trying to inject users into a scenario in such a way that it keeps inserting users until every single entry of the feed file is used, since the feed file contains log-in information and I would like all the users in the feed file to log in. Right now, all I can think of are two possible approaches.
Here I inject the number of rows in the feed file at once:
scenario("Verified_Login")
  .exec(LoginScenario.scn)
  .inject(atOnceUsers(number_of_entries_in_feedfile))
Here I inject users over a very long duration, for example 100 seconds, and make the feed file circular:
scenario("Verified_Login")
  .exec(LoginScenario.scn)
  .inject(atOnceUsers(1), constantUsersPerSec(1) during (100 seconds))
The problem with the first approach is that I have to find the number of entries in the feed file, which can be tedious as there could be thousands of them. The problem with the second is that entries could, and probably will, be repeated. So is there a way to keep injecting users until the feed file runs out of entries?
According to this source from last year, Stéphane Landelle - the lead contributor of Gatling - says that you must provide enough data for a simulation to complete using this method.
The post I linked from Stéphane does suggest simply reading the length of the file and using that to drive the number of users, as you already mentioned in your question.
I suggest you read the post, as it gives you an alternate method of achieving what you want. It seems to be as close as you will get unless things have changed; a small sketch of the file-length idea follows the quoted code below.
Here is their code.
val systemsIdentifier = jdbcFeeder(databaseUrl, databaseUser, databasePassword, sql_systemsIdentifier)
val count = systemsIdentifier.records.size

val comScn = scenario("My scenario")
  .repeat(systemsIdentifier.records.size / count) {
    feed(systemsIdentifier)
      .exec(performActionsChain)
  }

setUp(comScn.inject(rampUsers(count) over (60 seconds))).protocols(httpConf)
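As a sketch of that file-length idea with a CSV feeder (Gatling 2 syntax; the file name, column names, base URL and login request below are placeholders, not taken from the question):
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class VerifiedLoginSimulation extends Simulation {

  // Hypothetical feed file with one row per account, columns "username,password".
  val loginFeeder = csv("logins.csv")

  // records materialises the feed file once, so its size is the number of accounts to log in.
  val userCount = loginFeeder.records.size

  val httpConf = http.baseURL("https://example.com") // placeholder base URL

  val scn = scenario("Verified_Login")
    .feed(loginFeeder)
    .exec(
      http("login")
        .post("/login")
        .formParam("username", "${username}")
        .formParam("password", "${password}")
    )

  // Exactly one virtual user per feed record: the feeder is fully consumed and nothing repeats.
  setUp(scn.inject(atOnceUsers(userCount))).protocols(httpConf)
}
Because the injection count comes from records.size, the feeder never runs dry and never needs to be circular.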

Is it possible to create an RHQ plugin that collects historic measurements from files?

I'm trying to create an RHQ plugin to gather some measurements. It seems relatively easy to create a plugin that returns a value for the present moment. However, I need to collect these measurements from files. These files are created on a schedule, for example one per hour, but they contain much finer-grained measurements, for example one measurement per minute. A file may look something like this:
18:00 20
18:01 42
18:02 39
...
18:58 12
18:59 15
Is it possible to create an RHQ plugin that can return many values, with timestamps, for a measurement?
I think that within org.rhq.core.pluginapi.measurement.MeasurementFacet#getValues you can return as many values as you want in the MeasurementReport.
So basically open the file, seek to the last known position (if the file is only ever appended to), read from there, and for each line do:
MeasurementData data = new MeasurementDataNumeric(timeInFile, request, valueFromFile);
report.addData(data);
Of course, alerting on this (historical) data is sort of questionable: if you only read the file one hour later, the alert cannot be fired retroactively at the time the bad value happened :->
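To make that concrete, here is a rough sketch (in Scala, against the RHQ measurement classes named above; the file location, the offset bookkeeping and the assumption that an HH:mm time belongs to the current day are all illustrative, and the usual ResourceComponent plumbing is omitted):
import java.io.RandomAccessFile
import java.util.{Calendar, Set => JSet}
import scala.collection.JavaConverters._

import org.rhq.core.domain.measurement.{MeasurementDataNumeric, MeasurementReport, MeasurementScheduleRequest}
import org.rhq.core.pluginapi.measurement.MeasurementFacet

class FileBackedMeasurementComponent extends MeasurementFacet {

  // Hypothetical hourly file and the position we had read up to at the last collection.
  private val measurementFile = "/var/log/myapp/measurements.log"
  private var lastPosition = 0L

  override def getValues(report: MeasurementReport, metrics: JSet[MeasurementScheduleRequest]): Unit = {
    val raf = new RandomAccessFile(measurementFile, "r")
    try {
      raf.seek(lastPosition) // only read what was appended since the last run

      Iterator.continually(raf.readLine()).takeWhile(_ != null).foreach { line =>
        // Each line looks like "18:02 39": an HH:mm timestamp and a numeric value.
        val Array(time, value) = line.trim.split("\\s+", 2)
        val Array(hour, minute) = time.split(":").map(_.toInt)

        // Turn HH:mm into an absolute timestamp (assumes the file covers the current day).
        val cal = Calendar.getInstance()
        cal.set(Calendar.HOUR_OF_DAY, hour)
        cal.set(Calendar.MINUTE, minute)
        cal.set(Calendar.SECOND, 0)
        cal.set(Calendar.MILLISECOND, 0)

        // In a real plugin you would match request.getName against the metric this file provides.
        for (request <- metrics.asScala) {
          report.addData(new MeasurementDataNumeric(cal.getTimeInMillis, request, value.toDouble))
        }
      }
      lastPosition = raf.getFilePointer
    } finally {
      raf.close()
    }
  }
}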
Yes, it is surely possible.
@Override
public void getValues(MeasurementReport report, Set<MeasurementScheduleRequest> metrics) throws Exception {
    for (MeasurementScheduleRequest request : metrics) {
        Double result = SomeReadUtilClass.readValueFromFile();
        MeasurementData data = new MeasurementDataNumeric(request, result);
        report.addData(data);
    }
}
SomeReadUtilClass is a utility class that reads the file, and readValueFromFile is the function where you can write your logic for reading the value from the file.
The result is the important part: it is the Double value you calculate from a database or read from the file, and it is what you pass to the MeasurementDataNumeric constructor: new MeasurementDataNumeric(request, result).

Get details of cells changed from a Google Spreadsheet change notification in a machine-readable format

If I have a Google Spreadsheet e.g.
https://docs.google.com/spreadsheet/ccc?key=0AjAdgux-AqYvdE01Ni1pSTJuZm5YVkJIbl9hZ21PN2c&usp=sharing
And I have set up notifications on it to email me immediately whenever a cell changes.
And I make a change to that spreadsheet via the spreadsheet API - i.e. not by hand.
Then I get an email like this:
Subject: "Notification Test" was edited recently
See the changes in your Google Document "Notification Test": Click
here
other person made changes from 10/01/2014 12:23 to 12:23 (Greenwich
Mean Time)
Values changed
If I open the 'Click here' link then I get this URL which shows me the cell that has changed in the spreadsheet:
https://docs.google.com/a/DOMAINGOESHERE/spreadsheet/ver?key=tn9EJJrk6KnJrAEFaHI8E3w&t=1389356641198000&pt=1389356621198000&diffWidget=true&s=AJVazbUOm5tHikrxX-bQ0oK_XEapjEUb-g
My question is:
Is there a way to get the information about which cell has changed in a format that I can work with programmatically - e.g. JSON?
I have looked through the Google Spreadsheet API:
https://developers.google.com/google-apps/spreadsheets/
and at the Drive API Revisions:
https://developers.google.com/drive/manage-revisions
I have also tried setting up an onEdit() event using Google Apps Script: https://developers.google.com/apps-script/understanding_triggers
I thought this last approach would be the answer.
The problem with this approach is that whilst onEdit can be used to email details of changes, it appears only to be fired if the spreadsheet is edited by hand, whereas mine is being updated programmatically via the spreadsheet API.
Any ideas?
You could build a function that checks for changes. One way to do this is by comparing two instances of the same data: the sheet you change and a saved comparison sheet. If there are differences, you could email yourself. Using a time-driven trigger, you can check every minute, hour, day, or week (depending on your needs).
var sheet = **whatever**; // The spreadsheet where you will be making changes
var range = **whatever**; // The range that you will be checking for changes
var compSheet = **whatever**; // The sheet that you will compare with for changes

function checkMatch(){
  var myCurrent = sheet.getRange(range).getValues();
  var myComparison = compSheet.getRange(range).getValues();
  var changed = false;
  // getValues returns a two-dimensional array, so two for loops are used to compare each element
  for(var i = 0; i < myCurrent.length; ++i){
    for(var j = 0; j < myCurrent[i].length; ++j){
      if(myCurrent[i][j] != myComparison[i][j]){ // Determines if there is a difference
        changed = true;
        // ***Whatever you want to do with the differences, put it here***
      }
    }
  }
  if(changed){
    myEmailer(sheet.getUrl()); // Passes the url of sheet to your emailer function
    compSheet.getRange(range).setValues(myCurrent); // Updates compSheet so that next time it can check for the next series of changes
  }
}
Then from Resources>Current project's triggers you can set checkMatch to run every minute.
Also check out https://developers.google.com/gdata/samples/spreadsheet_sample for pulling data as JSON.
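For completeness, a rough sketch of fetching that JSON feed from Scala (the /feeds/list/{key}/od6/public/values?alt=json URL shape comes from the old gdata list feed and assumes the spreadsheet has been published to the web; treat it as illustrative rather than a guaranteed endpoint):
import scala.io.Source

object SheetAsJson {
  def main(args: Array[String]): Unit = {
    // Key taken from the spreadsheet URL in the question; the sheet must be published for the public feed.
    val key = "0AjAdgux-AqYvdE01Ni1pSTJuZm5YVkJIbl9hZ21PN2c"
    val url = s"https://spreadsheets.google.com/feeds/list/$key/od6/public/values?alt=json"

    // Fetch the feed and print the raw JSON; a JSON library can then pull out individual cell values.
    val json = Source.fromURL(url, "UTF-8").mkString
    println(json)
  }
}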

Get Google search number of results in Scala

I am looking for a way to get the total number of results for a Google search (the "About **** results" field). I searched the Google API, but it seems you can't get that total. Does anyone have information on this? How can I implement it in Scala (or any other language...)?
Thank you
You can use the HTML Agility Pack. Simply navigate to
"https://www.google.com/search?q=" + WhatYouWantToSearch
and get its content. Once you have that urlcontent you can use the code below:
var htmlDoc = new HtmlAgilityPack.HtmlDocument();
htmlDoc.LoadHtml(urlcontent);
HtmlAgilityPack.HtmlNode hnc = htmlDoc.DocumentNode.SelectSingleNode("//div[@id='resultStats']");
string[] text = hnc.InnerHtml.Split(' ');
return Convert.ToInt32(text[1].Replace(",", "").Replace(".", "").Trim());
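If you want to stay in Scala, a roughly equivalent sketch using the jsoup library (assuming jsoup is on the classpath; the resultStats id mirrors the C# snippet above and is just as fragile if Google changes its markup or number formatting):
import java.net.URLEncoder

import org.jsoup.Jsoup

object GoogleResultCount {
  def resultCount(query: String): Long = {
    val url = "https://www.google.com/search?q=" + URLEncoder.encode(query, "UTF-8")

    // A browser-like User-Agent makes Google return its normal HTML page.
    val doc = Jsoup.connect(url).userAgent("Mozilla/5.0").get()

    // The stats line looks like "About 1,230,000 results (0.42 seconds)".
    val stats = doc.select("#resultStats").text()
    stats.split(" ")(1).replaceAll("[.,]", "").toLong
  }

  def main(args: Array[String]): Unit =
    println(resultCount("gatling scala"))
}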

Using changestamp in GoogleDocs (null changestamp)

I am trying to find the max changestamp so I can start using it. I tried the following:
URL url = "https://docs.google.com/feeds/default/private/changes?v=3"
ChangelogFeed foo = service.getFeed(url, ChangelogFeed.class);
LargestChangestamp stamp = foo.getLargestChangestamp();
stamp is always null.
Is this the way to get the largest changestamp, or do I need to set it first in order to use it?
The largest changestamp is also available in the user metadata feed. See the "docs:largestChangestamp" element within the response protocol tab here.
I'm not sure the Java API exposes the largestChangestamp property directly yet - last time I checked it was hidden in the xmlBlob property, and I had to do an XML parse to grab it out.
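For what it's worth, a rough sketch of that XML parse in Scala (assuming the blob contains a docs:largestChangestamp element with a value attribute, as in the protocol docs; the sample feed below is made up for illustration):
import scala.xml.XML

object LargestChangestamp {
  // blob is the raw XML string pulled out of the feed's xmlBlob property.
  def parse(blob: String): Option[Long] = {
    val xml = XML.loadString(blob)
    // \\ matches by local element name, so the docs: prefix does not matter here.
    (xml \\ "largestChangestamp" \ "@value").headOption.map(_.text.toLong)
  }

  def main(args: Array[String]): Unit = {
    val sample =
      """<feed xmlns:docs="http://schemas.google.com/docs/2007">
        |  <docs:largestChangestamp value="1234567"/>
        |</feed>""".stripMargin
    println(parse(sample)) // Some(1234567)
  }
}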
This seems to be a bug in the API. I got the changestamps by getting the ChangelogEntry objects from the ChangelogFeed:
List<ChangelogEntry> entries = foo.getEntries();
for (ChangelogEntry entry : entries) {
    String blob = entry.getXmlBlob().getBlob();
    System.out.println("Blob: " + blob);
}
The changestamp for an entry is contained in its blob.