Show an image on the client before uploading it to the server in GWT

I would like to show an image and its dimensions on the client before uploading it to the server. Whenever I try to create a GWT-Ext Image widget, it doesn't accept a local file from the file system; it only accepts HTTP requests. I also tried String img = GWT.getHostPageBaseURL() + "image_name" and the replace function, but with the same result. Finally I moved to ImagePreloader, but it also needs a URL.
ImagePreloader.load("URL of image", new ImageLoadHandler() {
    public void imageLoaded(ImageLoadEvent event) {
        if (event.isLoadFailed())
            Window.alert("Image " + event.getImageUrl() + " failed to load.");
        else
            Window.alert("Image dimensions: " + event.getDimensions().getWidth()
                    + " x " + event.getDimensions().getHeight());
    }
});
Can someone suggest a solution that doesn't include uploading the image first?

Have a look at this related question and answer:
Preview an image before it is uploaded
You can include that approach using JSNI; I'm not aware of a pure-GWT solution.
I just found gwtupload, which claims to be able to preview images before uploading them.

Please take a look at the sample JS code below:
function readURL(input) {
    if (input.files && input.files[0]) {
        var reader = new FileReader();
        reader.onload = function (e) {
            $('#blah').attr('src', e.target.result);
        };
        reader.readAsDataURL(input.files[0]);
    }
}

$("#imgInp").change(function () {
    readURL(this);
});
and the associated HTML:
<form id="form1" runat="server">
    <input type='file' id="imgInp" />
    <img id="blah" src="#" alt="your image" />
</form>
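Since the original question also asks for the image's dimensions, the same FileReader preview can be extended by reading the loaded element's natural size. A minimal sketch, assuming the #blah element from the HTML above; describeImage is a hypothetical helper name:

```javascript
// Given an image element (or any object exposing naturalWidth/naturalHeight),
// build the dimension string to show the user.
function describeImage(img) {
    return "Image dimensions: " + img.naturalWidth + " x " + img.naturalHeight;
}

// In the browser you would wire it to the preview image's load event:
//   var img = document.getElementById("blah");
//   img.onload = function () { alert(describeImage(img)); };
```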

Related

How to get an image that was dragged and dropped from the browser into the DropArea QML component

I'm using the DropArea component to transfer files to my application via the drag-and-drop mechanism. The following test code works fine with files located in the file system:
import QtQuick 2.15
import QtQuick.Window 2.15

Window {
    id: root
    width: 640
    height: 480
    visible: true

    DropArea {
        id: dropArea
        anchors.fill: parent
        onEntered: {
            drag.accepted = drag.hasUrls
        }
        onDropped: {
            // Use files through drop.urls
            drop.accept()
        }
    }
}
In onEntered I accept the DragEvent if it contains URLs, and in onDropped I can use the URLs to work with the dropped files.
But I also need to accept images dragged from the browser through the DropArea. Since images dragged from the browser do not exist in the file system, I expect to receive raw image data from which I can create the image file myself.
The problem is that the DragEvent drop passed to the dropped signal does not carry such data. This can be verified with the following logging:
onDropped: {
    console.log("drop.urls: " + drop.urls)
    console.log("drop.html: " + drop.html)
    console.log("drop.text: " + drop.text)
    console.log("-----------------------")
    for (var i = 0; i < drop.formats.length; ++i) {
        console.log(drop.formats[i] + ": " + drop.getDataAsString(drop.formats[i]))
    }
}
which gives the following output (after dragging and dropping the Qt logo image from the documentation):
qml: drop.urls: https://doc.qt.io/
qml: drop.html: <html>
<body>
<!--StartFragment--><img src="https://doc.qt.io/style/qt-logo-documentation.svg" alt="Qt documentation"><!--EndFragment-->
</body>
</html>
qml: drop.text: https://doc.qt.io/
qml: -----------------------
qml: application/x-qt-windows-mime;value="DragContext":
qml: application/x-qt-windows-mime;value="DragImageBits": ?
qml: application/x-qt-windows-mime;value="chromium/x-renderer-taint":
qml: application/x-qt-windows-mime;value="FileGroupDescriptorW":
qml: application/x-qt-windows-mime;value="FileContents":
qml: text/x-moz-url: h
qml: text/uri-list: https://doc.qt.io/
qml: application/x-qt-windows-mime;value="UniformResourceLocatorW": h
qml: text/plain: https://doc.qt.io/
qml: text/html: <html>
<body>
<!--StartFragment--><img src="https://doc.qt.io/style/qt-logo-documentation.svg" alt="Qt documentation"><!--EndFragment-->
</body>
</html>
Among the available properties (including those obtained through formats), there is no raw image data. drop.html provides useful information about the address of the dropped image, but is downloading it through the received link really the only way to get the image?
I also wondered whether it is possible to somehow get the QMimeData so that I could call imageData() and obtain the image that way. I found a similar transfer of MIME data to QML by the Krita developers (see DeclarativeMimeData* const m_data, which they define as a Q_PROPERTY), but I'm not sure that this is the easiest, and most importantly a working, way.
So to sum up, is there a way to get the raw data of an image dragged and dropped from a browser, or the image itself as a QImage, using the standard QML DropArea component?
Before resorting to C++ and QNetworkAccessManager, there are extra things you can do in QML to validate whether you indeed have an image.
The following demonstrates that we can either:
check the contents of drop.formats
run an XMLHttpRequest "HEAD" request
The latter is particularly useful because we can infer the MIME type from the Content-Type header and, if available, determine how large the resource is by reading the Content-Length header.
Also, if you wish to have an in-memory copy of the image, you can create a second XMLHttpRequest. This time set the method to "GET" and the responseType to "arraybuffer".
DropArea {
    id: dropArea
    anchors.fill: parent
    property url dropUrl
    property string contentType
    property bool isImage
    property int contentLength

    onDropped: function (drop) {
        console.log("formats: ", JSON.stringify(drop.formats));
        if (!drop.hasUrls)
            return;

        // Synchronous HEAD request: inspect the headers without downloading.
        let xhr = new XMLHttpRequest();
        dropUrl = drop.urls[0];
        console.log("dropUrl: ", dropUrl);
        xhr.open("HEAD", dropUrl, false);
        xhr.send();
        contentType = xhr.getResponseHeader("Content-Type");
        console.log("contentType: ", contentType);
        isImage = contentType.startsWith("image/");
        contentLength = xhr.getResponseHeader("Content-Length") ?? 0;
        console.log("contentLength: ", contentLength);

        // Second synchronous request: fetch the raw bytes as an ArrayBuffer.
        let xhr2 = new XMLHttpRequest();
        xhr2.open("GET", dropUrl, false);
        xhr2.responseType = "arraybuffer";
        xhr2.send();
        let data = xhr2.response;
        console.log(data.byteLength);
    }
}
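Once xhr2.response delivers the ArrayBuffer, you can sanity-check the bytes before writing them to disk. A sketch (the function name is mine; the 0x89 'P' 'N' 'G' prefix is the standard PNG file signature, so this only covers PNG drops):

```javascript
// Compare the first bytes of an ArrayBuffer against the PNG file signature.
function looksLikePng(buffer) {
    var bytes = new Uint8Array(buffer);
    var signature = [0x89, 0x50, 0x4e, 0x47]; // 0x89 'P' 'N' 'G'
    for (var i = 0; i < signature.length; ++i) {
        if (bytes[i] !== signature[i])
            return false;
    }
    return true;
}
```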

PowerShell web invoke with a Google Apps Script web app

Is there a way to invoke a web request from PowerShell to a web app created in Google Apps Script?
When I run the request against normal sites, I receive back information containing Forms[], Images[], InputFields[], etc. However, when I try to run the same request against a https://script.google.com/a/macros/ web app, all those fields are blank and I can only see a variable called el linking to a field called sandboxFrame.
The app is a simple upload site to one of my Google Drive folders; everything works when I'm in the browser. I'm trying to automate the process through a PowerShell script.
HTML file
<!DOCTYPE html>
<html>
<body>
    <input name="file" id="files" type="file" multiple>
    <input type='button' value='Upload' onclick='getFiles()'>
</body>
<script>
    function getFiles() {
        const f = document.getElementById('files');
        [...f.files].forEach((file, i) => {
            const fr = new FileReader();
            fr.onload = (e) => {
                const data = e.target.result.split(",");
                const obj = {fileName: f.files[i].name, mimeType: data[0].match(/:(\w.+);/)[1], data: data[1]};
                google.script.run.withSuccessHandler((id) => {
                    console.log(id);
                }).saveFile(obj);
            };
            fr.readAsDataURL(file);
        });
    }
</script>
</html>
GS script
function saveFile(obj) {
  var folder = DriveApp.getFolderById('1w586veZcOZN_NnB90jaTZ12DF-jP005u');
  var blob = Utilities.newBlob(Utilities.base64Decode(obj.data), obj.mimeType, obj.fileName);
  return folder.createFile(blob).getId();
}
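As an aside, the data-URL split inside getFiles can be illustrated in isolation. A sketch (parseDataUrl is a hypothetical helper using the same split and regex as the handler):

```javascript
// Split a data URL such as "data:image/png;base64,iVBORw0..." into its
// MIME type and base64 payload, mirroring the logic in getFiles.
function parseDataUrl(dataUrl) {
    var parts = dataUrl.split(",");
    return {
        mimeType: parts[0].match(/:(\w.+);/)[1],
        data: parts[1]
    };
}
```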
You would need to take advantage of the doPost function.
I assume you already know about the doGet function, but there is another function you can use as part of a web app called doPost. It allows you to post data using something like the following from PowerShell:
Invoke-WebRequest https://script.google.com/a/macros/[SCRIPTID]?[QUERYSTRING] -Method POST
Where the [QUERYSTRING] is something like:
name=bartosz&stack=BartoszWolas&reputation=1000
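A query string like that can also be assembled programmatically. A sketch in JavaScript (the field names are just the ones from the example above):

```javascript
// Build "name=bartosz&stack=BartoszWolas&reputation=1000" from a plain
// object, URL-encoding every key and value.
function toQueryString(params) {
    return Object.keys(params)
        .map(function (key) {
            return encodeURIComponent(key) + "=" + encodeURIComponent(params[key]);
        })
        .join("&");
}
```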
Then within the doPost on the web app side you would write a function like this:
function doPost(e) {
  const name = e.parameter.name;             // bartosz
  const stackAlias = e.parameter.stack;      // BartoszWolas
  const reputation = e.parameter.reputation; // 1000
}
Reference
web app

How do I embed a Facebook Feed in the new Google Sites (using Apps Script)?

I've read that you can make a Google Apps Script that shows a Facebook Feed, and then embed this in a Google Site, but I can't find any more information on how to do it and I can't figure it out myself.
When I try to make an Apps Script web app with a Facebook feed I get errors like:
Uncaught DOMException: Failed to set the 'domain' property on 'Document': Assignment is forbidden for sandboxed iframes.
This is from copying the "Facebook JavaScript SDK" and "Page Feed" examples from Facebook Developers into an HTML file and deploying it as a web app. I gather it has something to do with how Apps Script sandboxes your code, but I don't know what I have to do here.
For that matter, even if I try to make a simpler Apps Script with some static HTML, when I try to embed it from Drive into the site I get an error "Some of the selected items could not be embedded".
The New Google Sites doesn't support Google Apps Script.
Related question: Google App Scripts For New Google Sites Release
The new Google Sites does now support embedding Apps Script (make sure to deploy the script as a web app, set the right permissions, and use the /exec URL, not your /dev one, to embed).
I found I couldn't use the Facebook SDK for videos because of the sandboxing. I used an iframe solution instead for videos, but maybe you could try something like the following for the feed (I'm assuming you've registered your app with Facebook so you can generate tokens):
In Apps Script, create a .gs file and an HTML file, roughly along the lines below (I haven't actually worked with returning feeds, so check the returned data structure and adjust accordingly):
//**feed.gs**
function doGet(e) {
  return HtmlService
      .createTemplateFromFile('my-html-file')
      .evaluate();
}

function getToken() { // use your fb app info here (and make sure this script is protected / runs as you)
  var url = 'https://graph.facebook.com'
      + '/oauth/access_token'
      + '?client_id=0000000000000000'
      + '&client_secret=0x0x0x0x0x0x0x0x0x0x0x0x'
      + '&grant_type=client_credentials';
  var response = UrlFetchApp.fetch(url, {'muteHttpExceptions': true});
  var json = response.getContentText();
  var jsondata = JSON.parse(json);
  return jsondata.access_token;
}

function getFeed() {
  var url = 'https://graph.facebook.com'
      + '/your-page/feed'
      + '?access_token=' + encodeURIComponent(getToken());
  var response = UrlFetchApp.fetch(url, {'muteHttpExceptions': true});
  var json = response.getContentText();
  var jsondata = JSON.parse(json);
  //Logger.log(jsondata); // check this and adjust the following loop and the html showFeed function accordingly
  var posts = {};
  for (var i in jsondata) {
    posts[i] = {"post": jsondata[i].message};
  }
  return posts;
}
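The for (var i in jsondata) loop above assumes the feed items sit at the top level of the response; in practice the Graph API typically wraps them in a data array, so the reshaping might look more like this sketch (extractPosts is a hypothetical name; verify the actual response shape with Logger.log as the comment suggests):

```javascript
// Reshape a Graph-API-style response ({ data: [ { message: "..." }, ... ] })
// into the { index: { post: message } } object that showFeed expects.
function extractPosts(jsondata) {
    var posts = {};
    (jsondata.data || []).forEach(function (item, i) {
        posts[i] = {post: item.message};
    });
    return posts;
}
```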
<!--**my-html-file.html**-->
<!DOCTYPE html>
<html>
<head>
  <base target="_top">
  <script src="//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
  <script>
    // The code in this function runs when the page is loaded (asynchronous).
    $(function() {
      google.script.run
          .withSuccessHandler(showFeed)
          .withFailureHandler(onFailure)
          .getFeed(); // this function is back in the .gs file and must return an array or object which gets auto-passed to the showFeed function below
    });

    function showFeed(posts) { // parameter name must match the array or object returned by getFeed in the .gs file
      var html = '';
      for (var p in posts) {
        html += '<p>' + posts[p].post + '</p>'; // instead of a string, you can build an array for speed
      }
      $('#feed').empty().append(html); // if you used an array for the html, you'd join it here
    }

    function onFailure(error) {
      $('#feed').empty().append("Unable to retrieve feed: " + error.message);
    }
  </script>
</head>
<body>
  <div id="feed">
    Loading...
  </div>
</body>
</html>

Cordova/Ionic framework, send/share audio file through WhatsApp

I'm new to Cordova/Ionic. I've managed to play mp3 files in my app using the NativeAudio plugin wrapped with ngCordova.
I have the SocialSharing plugin installed, but I can't figure out how to send an audio file to WhatsApp.
The mp3 is in www/audio/bass.mp3.
With these lines I can play the mp3 file:
$scope.play = function(audioFile) {
    $cordovaNativeAudio.preloadSimple(audioFile, 'audio/' + audioFile + '.mp3');
    $cordovaNativeAudio.play(audioFile);
};
Here is my code.
In the html file, something like:
<div class="buttons" ng-click="shareViaWhatsApp(null, null, 'audio/bass.mp3')">
    <button class="button">Whatsapp</button>
</div>
and in the js file:
$scope.shareViaWhatsApp = function(message, image, link) {
    $cordovaSocialSharing.canShareVia("whatsapp", message, image, link).then(function(result) {
        $cordovaSocialSharing.shareViaWhatsApp(message, image, link);
        //$cordovaSocialSharing.shareViaWhatsApp('test message', null, null); //this works
    }, function(error) {
        alert("Cannot share on WhatsApp " + error);
    });
};
I get URL_NOT_SUPPORTED when trying to send the mp3 file through WhatsApp.

Uploading file using Google Apps Script using HtmlService

How can I upload files to Google Drive?
I want to create a web app using Google Apps Script's HtmlService.
I don't know how to point a form in HTML to an existing Apps Script function.
I'm having a hard time finding the right example in the Google documentation.
I found hundreds of examples using the UI Service, but according to https://developers.google.com/apps-script/sunset it will be deprecated soon.
Thank you in advance!
Janusz
<html>
  <body>
    <form>
      <input type="file"/>
      <input type="button">
    </form>
  </body>
</html>
Script
function doGet() {
  return HtmlService.createHtmlOutputFromFile('myPage');
}

function fileUploadTest(e) {
  var fileBlob = e.parameter.upload;
  var adoc = DocsList.createFile(fileBlob);
  return adoc.getUrl();
}
Have the button run the server side function using google.script.run, passing in the entire form as the only parameter. (Inside the button's onClick, 'this' is the button, so 'this.parentNode' is the form.) Make sure to give the file input a name.
<html>
  <body>
    <form>
      <input type="file" name="theFile">
      <input type="hidden" name="anExample">
      <input type="button" onclick="google.script.run.serverFunc(this.parentNode)">
    </form>
  </body>
</html>
On the server, have your form handling function take one parameter - the form itself. The HTML form from the client code will be transformed into an equivalent JavaScript object where all named fields are string properties, except for files which will be blobs.
function doGet() {
  return HtmlService.createHtmlOutputFromFile('myPage');
}

function serverFunc(theForm) {
  var anExampleText = theForm.anExample; // This is a string
  var fileBlob = theForm.theFile;        // This is a Blob
  var adoc = DocsList.createFile(fileBlob);
  return adoc.getUrl();
}
If you actually want to use that URL you are generating and returning, be sure to add a success handler to the google.script call. You can modify it like this:
// Defined somewhere before the form
function handler(url) {
  // Do something with the url.
}

<input type="button" onclick=
    "google.script.run.withSuccessHandler(handler).serverFunc(this.parentNode)">
try: return HtmlService.createTemplateFromFile('myPage').evaluate();
More: html service reference
I found an answer to my question:
Submit a Form using Google App Script's HtmlService
The code in the Google Apps Script link below is:
function doGet(e) {
  var template = HtmlService.createTemplateFromFile('Form.html');
  template.action = ScriptApp.getService().getUrl();
  return template.evaluate();
}

function doPost(e) {
  var template = HtmlService.createTemplateFromFile('Thanks.html');
  template.name = e.parameter.name;
  template.comment = e.parameter.comment;
  template.screenshot = e.parameter.screenshot;
  return template.evaluate();
}
https://script.google.com/d/1i65oG_ymE1lreHtB6WBGaPHi3oLD_-wPd5Ter1nsN7maFAWgUA9DbE4C/edit
Thanks!