Automation of clicks on a webpage using Matlab

So I'm looking for a way to systematically access data from a website. The data is updated every 15 minutes or so and is generated through a datamart system that builds custom reports from several input parameters: the desired date interval and the specific dataset.
Setting these parameters requires clicking some specific buttons, and I was wondering whether it would be possible to automate those clicks using Matlab (or something else if need be) to retrieve the data and process it automatically.
Thanks in advance!

I suggest you take a look at http://www.autohotkey.com/. It is a great tool that automates clicks on any window (including a browser page) under Windows. It can even "search" your screen for pixel images and click on them, so you can make a very small .bmp of the link you want to click and have your script find it on the page and click directly on the link.
As far as getting the data into Matlab, I'm not sure of the best way, but you might consider saving the HTML of the page and then parsing it from Matlab.
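For the parsing side, here is a minimal Matlab sketch. It assumes the datamart exposes the report at a plain URL with the parameters in the query string (the URL and parameter names below are hypothetical; check the browser's address bar or its network traffic to see whether that holds), in which case you may be able to skip the clicking entirely:

% Hypothetical report URL; the date interval and dataset are assumed to
% be ordinary query-string parameters.
url  = 'http://example.com/datamart/report?dataset=prices&from=2013-01-01&to=2013-01-02';
html = urlread(url);   % use webread(url) on newer Matlab releases

% Crude parse: grab the contents of every table cell.
cells  = regexp(html, '<td>([^<]*)</td>', 'tokens');
values = str2double([cells{:}]);

Since the data updates every 15 minutes, a Matlab timer object could rerun this periodically.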

Related

How to prevent a picture from being copied

I am working on a photo project. I dynamically add a watermark to the large picture preview, but anyone with a general idea of how the links work can manipulate the URL and get the original picture. Disabling the right mouse button seems useless. What would be an appropriate solution in this case?
Disable all direct access to the original pictures. Write a server script that receives a picture's ID (or name) and applies the watermark before the picture is shown, and structure your application so that this script is the only way users can access images. If manipulating a link gives users unauthorized access, it sounds like you have a problem with the project's architecture.
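Purely as an illustration of the watermarking step (in Matlab, to match the rest of this thread; the file names and opacity are assumptions, and the web layer that checks the user's authorization is omitted):

% Originals live outside the web root; only the watermarked copy is served.
original  = imread(fullfile('originals', 'photo123.jpg'));   % hypothetical path
watermark = imread('watermark.png');                         % small RGB logo

% Blend the watermark into the bottom-right corner at 40% opacity.
[h, w, ~] = size(watermark);
[H, W, ~] = size(original);
corner  = double(original(H-h+1:H, W-w+1:W, :));
blended = uint8(0.6 * corner + 0.4 * double(watermark));
original(H-h+1:H, W-w+1:W, :) = blended;

imwrite(original, fullfile('cache', 'photo123_wm.jpg'));     % serve this file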

How to create a database in MATLAB

I built a GUI to show the results of my program. When I click the buttons, each one reads its input from a folder that I choose for it, but I want a database that stores all the data and loads it just once, when I click a single button, rather than once per button.
To give an example of what I mean: I could add a button just for loading the program and reading the input; after pressing it, the other buttons would not need to read the data from the folder any more.
I'd appreciate any help.
If you want a real local database, I'd use SQLite. It can be used from Matlab via mksqlite.
However, as pointed out by @eitan-t, you might not need a real database. It may be enough to store your data in one of Matlab's own types, like an array of structs.
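A minimal sketch of the mksqlite route (this assumes mksqlite is compiled and on the Matlab path; the table and field names are made up):

% Open (or create) a local SQLite database file.
mksqlite('open', 'results.db');
mksqlite('CREATE TABLE IF NOT EXISTS results (name TEXT, value DOUBLE)');

% Insert with parameter binding, then read everything back in one go.
mksqlite('INSERT INTO results (name, value) VALUES (?, ?)', 'run1', 3.14);
data = mksqlite('SELECT * FROM results');   % returns a struct array
mksqlite('close');

For the struct-array alternative, a single save('results.mat', 'results') after reading the folder once, and one load('results.mat') in an initialization button, gives you the "load once, use everywhere" behavior without any database.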

Adding a hit counter to a Desktop Intelligence/XI 3/Business Objects webpage?

For my company, I am making a report in Xi3/Desktop Intelligence that pulls data via free-hand SQL and builds an HTML file displaying the data, updating every 20 minutes. We want to incorporate a hit counter that shows us the number of times this report is viewed.
I found a couple of basic templates online. I tried copying and pasting them into a cell, but the output HTML page just displayed the raw HTML (unrendered by my browser). I am decent at writing my own HTML, but I just do not understand how to embed my own HTML code in a dynamically updating report in Xi3.
Moreover, I doubt (for legal reasons) my company will be okay with me using a free hit-counter template I find online, especially since they all seem to rely on a third-party website to do the actual "counting." Any ideas on the best way to implement, or learn how to create, a visitor counter?
Thanks.
You can include HTML in a DeskI report. In the cell that contains the HTML, click Format Cell; on the "Number" tab there is a checkbox for "Read as HTML". Make sure it's checked. Note that you won't see the rendered HTML within DeskI, but it will display when viewed in InfoView.

How to export data from Chrome Developer Tools?

[Screenshot: network analysis by Chrome when the page loads]
I would like to export this data to Microsoft Excel so that I have a list of comparable data from loads at different times. Loading a page just once doesn't tell me much, especially if I want to compare pages.
If you right-click on any of the rows, you can export the item or the entire data set as HAR, which appears to be a JSON format.
It shouldn't be terribly difficult to script up something to transform that to a CSV if you really need it in Excel (see the sketch below), but if you're already scripting you might as well just use the script to ask your questions of the data.
If anyone knows how to drive the "load page, export data" part of the process from the command line, I'd be quite interested in hearing how.
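A rough Matlab sketch of that HAR-to-CSV transform (it assumes the export was saved as network.har; jsondecode requires R2016b or later and only returns a struct array when all entries share the same fields):

% Load the exported HAR (a JSON document) and pull out the entries.
har     = jsondecode(fileread('network.har'));
entries = har.log.entries;

n      = numel(entries);
url    = cell(n, 1);
status = zeros(n, 1);
timeMs = zeros(n, 1);
for k = 1:n
    e         = entries(k);   % use entries{k} if jsondecode produced a cell array
    url{k}    = e.request.url;
    status(k) = e.response.status;
    timeMs(k) = e.time;       % total request time in milliseconds
end

T = table(url, status, timeMs, 'VariableNames', {'URL', 'Status', 'Time_ms'});
writetable(T, 'network.csv');   % opens cleanly in Excel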
As of Chrome 76, the Network panel has Import/Export buttons.
I was trying to copy the size data measured in Chrome's Network panel and stumbled on this post. I just found an easier way to "export" the data to Excel, which is to copy the table and paste it into Excel.
The trick is to press Ctrl+A (select all) and, once the entire table is highlighted, paste it into Microsoft Excel. The only issue is that if there are too many fields, not all rows are copied, and you might have to copy and paste several times.
UPDATED: I found that copying the data only works when I turn off the filter options (the funnel-looking button above the table). – bendur
Right-click and export as HAR, then view it using Jan Odvarko's HAR Viewer.
This helps in visualising already-captured HAR logs.
I came across the same problem and found an easier way: undock the Developer Tools view into a separate window (using the toolbar button at the top-right corner of the Developer Tools window), and in the new window simply select all, copy, and paste into Excel.
In Chrome's Developer Tools, under Network, right-click in the Name column and select "Save as HAR with content". Then open a new tab, go to https://toolbox.googleapps.com/apps/har_analyzer/ and open the saved HAR file.
Note that "Copy all as HAR" does not contain response bodies.
You can get response bodies via "Save as HAR with Content", but it breaks with anything more than a trivial amount of logs (I tried once with only 8k requests and it didn't work). To work around this, you can script the output yourself using _request.contentData().
When there are too many logs, even _request.contentData() and "Copy response" fail; hopefully this will be fixed. Until then, inspecting more than a trivial number of network logs can't be done properly with the Chrome network inspector, and it's best to use another tool.
You can use the Fiddler web debugger to import the HAR, and from there on it is very easy: Ctrl+A (select all), then Ctrl+C (copy summary), then paste into Excel and have fun.
I don't see an export or save as option.
I filtered out all the unwanted requests using -.css -.js -.woff, then right-clicked one of the requests and chose Copy > Copy all as HAR.
Then I pasted the content into a text editor and saved it.
I had the same issue, which is what brought me here. After some trial and error I figured out how to copy multiple pages of Chrome data, as in the question: zoom out until all the data fits on one page (no scrolling, very small font size), then copy and paste it into Excel, which picks up all the records at a normal font size.
This is fine for a few pages of data, I think.
In more modern versions of Chrome you can just drag a .har file into the network tab of Chrome Dev Tools to load it.
To get this into Excel or CSV format, right-click the request and select "Copy response", paste into Excel, and use Text to Columns.
You can try Haiphen, a Chrome extension that lets you analyze network traffic and see which API calls a web application makes.

How can I return a text file and an error log from a webpage separately

I have a Perl script which, when run from the command line, generates a text file of data in a specific format for use by another application. The script also prints informational warning messages on stderr. I'm writing a web front end for this. In an ideal world, when the user clicks 'submit' on the associated form, a page would be displayed in the browser containing the informational messages, and simultaneously a pop-up would appear allowing the user to save the text file of data to disk. I would like this to work on browsers without JavaScript enabled, so I suspect exactly what I want is not possible.
Some sites deal with this kind of thing by displaying the page with the informational messages plus a link to the file to be downloaded. This would seem to mean storing the files and sorting out some sort of security so that another user cannot download your file (not that this is a big deal for the application in question).
I'm wondering if there is a more elegant way of dealing with this. For example, is it possible to use multipart messages to return both pieces of information in one go? Is it possible to pop up a second window with the informational messages without using JavaScript? Apologies if these seem like basic questions; my programming knowledge is in the domain of DNA sequence manipulation algorithms rather than web page generation.
If (and only if) the data is quick and easy to generate, generate it once for the error-message page and a second time for the download. The link or button on the error-message page would regenerate the results and prompt for the download.
This is a bit of a hack, since you need to consider what to do if the underlying data changes before the user hits the download link. Be careful to set the headers correctly for a file download vs. a normal webpage, e.g.:
use CGI qw(header param);

if (param('submit')) {
    # Send the output as a file download rather than a rendered page.
    print header(-type                => 'application/octet-stream',
                 -content_disposition => 'attachment; filename=foobar.dat');
    Gen_Results();    # prints the generated data to stdout
}
To be honest, I'd just use a little JavaScript anyway, since it's a pretty safe assumption these days. Otherwise, use a <noscript> tag to provide an alternative.