Can anyone help me find out if/how you can get image data off of the 'camera roll' on an Android device, using (Appcelerator) Titanium? I have found a 3rd-party module for iOS that does this, but I am desperate to find one for Android; otherwise I'll have to scrap Titanium and go fully native.
What I need is a function that returns an array of data about the images on the device. Although I would love 'geolocation' data (if it exists), all I really need is a 'create date' and a path to the image, or the actual TiBlob.
It seems simple, but I get no responses on the Appcelerator forums, which worries me. Surely there is at least an Android 'module' that achieves this?
Ti.Media.openPhotoGallery({
    allowEditing : true,
    success : function(event) {
        // event.media is a TiBlob containing the selected image
        var image = require('/modules/parts/squarecropper').crop(event.media);
        setImage(image);
        Ti.Media.hideCamera();
    },
    cancel : function() {
    },
    saveToPhotoGallery : false,
    mediaTypes : [Ti.Media.MEDIA_TYPE_PHOTO]
});
The above method will do the job: it opens the gallery picker and hands you the selected image as a TiBlob in event.media. You can either use that blob directly, or write it to the filesystem and encode/decode the data:
var f = Titanium.Filesystem.getFile(currImage); // currImage holds the image's native path
var temp = f.read();                            // read the file into a TiBlob
var encodeData = Ti.Utils.base64encode(temp);   // base64-encode the image data
alert("encodeData = " + encodeData);
I'm making a Flutter Web App which has to access the microphone and stream the audio data as an array of integers for further processing.
I already succeeded doing this in plain JavaScript.
Things I've tried:
The flutter_sound library, but I couldn't get it to work. I also can't find any working examples for that library.
dart:web_audio seems to be a thing, but apparently you can't even import it yet in normal Flutter Apps.
dart:js is what I'm trying right now. I was able to create an AudioContext with var audioContext = JsObject(context['AudioContext']);. However, after that I don't know what syntax to use to translate the JavaScript code below into Dart. Here is what I'm doing in JavaScript:
function initAudio() {
    try {
        audioCtx = new AudioContext();
        const GotAudioStream = function(stream) {
            const audioSource = audioCtx.createMediaStreamSource(stream);
            const audioProcessor = audioCtx.createScriptProcessor(bufSize, 1, 1);
            audioSource.connect(audioProcessor);
            audioProcessor.connect(audioCtx.destination);
            audioStarted = true;
            audioProcessor.onaudioprocess = function(e) {
                checkAudioBuffer(e.inputBuffer);
            };
        };
        navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then(GotAudioStream);
    }
    catch (err) {
        console.log(err);
    }
}
Does anyone have experience with the dart:js library, or another idea on how to implement a simple (live!) audio stream in Flutter Web?
Regards,
Kaisky
After adding a button to my Jitsi install (via this thread), I am now trying to use html2canvas to take a screenshot of the video conference.
However, when I run the function, it returns the video as black, even though it is showing on the display.
Screenshot: the feed on the left should show video, but it is black.
And as you can see, the icons are also all messed up.
Is there a fix for this? Or an alternative?
This is because you are probably trying to capture the screenshot from outside code while Jitsi runs the video in an iframe, and browser security features do not allow reading iframe content. You need to implement custom logic inside Jitsi to handle your scenario.
I looked around and found the logic in ScreenshotCaptureEffect.js. It works now…
The video you want to screenshot must have focus, or you can change the script to send all video streams.
const storedCanvas = document.createElement('canvas');
const storedCanvasContext = storedCanvas.getContext('2d');
var vids = $('video#largeVideo');
vids[0].play();

// Size the canvas to the video's intrinsic resolution and draw the current frame
storedCanvas.height = parseInt(vids[0].videoHeight, 10);
storedCanvas.width = parseInt(vids[0].videoWidth, 10);
storedCanvasContext.drawImage(vids[0], 0, 0, vids[0].videoWidth, vids[0].videoHeight);

// Export the frame as a PNG blob and upload it
storedCanvas.toBlob(
    blob => {
        console.debug(blob);
        var data = new FormData();
        data.append('file', blob);
        $.ajax({
            url: S3_API_URL,
            cache: false,
            contentType: false,
            processData: false,
            method: 'POST',
            data: data
        });
    },
    'image/png',  // toBlob expects a MIME type, not just 'png'
    1.0
);
I'm trying to get the user's location inside a Facebook Messenger Chat Extension.
I open the webview and ask as usual:
var options = {
    enableHighAccuracy: true,
    timeout: 5000,
    maximumAge: 0
};

function success(pos) {
    var crd = pos.coords;
    console.log('Your current position is:');
    console.log(`Latitude : ${crd.latitude}`);
    console.log(`Longitude: ${crd.longitude}`);
    console.log(`More or less ${crd.accuracy} meters.`);
}

function error(err) {
    console.warn(`ERROR(${err.code}): ${err.message}`);
}

navigator.geolocation.getCurrentPosition(success, error, options);
I'm getting this error: ERROR(1): User denied Geolocation. Is there a way to get the user's location through the Chat Extension webview? Thank you.
Geolocation should work fine on most iOS devices, but will fail on most (if not all) Android devices. That said, there is no cross-platform solution for reliably retrieving a user's geolocation within a webview or Chat Extension.
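That won't unlock the webview, but for what it's worth, here is a minimal sketch of failing gracefully when access is blocked (askUserForLocationManually is a hypothetical fallback, e.g. a manual address field, and success/options are the functions defined in the question):
function requestLocation() {
    if (!navigator.geolocation) {
        askUserForLocationManually(); // hypothetical fallback UI
        return;
    }
    navigator.geolocation.getCurrentPosition(success, function(err) {
        if (err.code === err.PERMISSION_DENIED) {
            // In the Messenger webview this can fire immediately, without any prompt,
            // so degrade to a manual flow instead of retrying
            askUserForLocationManually();
        } else {
            console.warn('ERROR(' + err.code + '): ' + err.message);
        }
    }, options);
}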
I am using PhoneGap 2.0.0 on iOS and am using basically the stock image-capture code. After the image is captured it is saved to the phone, the URI is saved to the database along with the rest of the item's info, and the image is displayed on the page.
All of this works on both iOS and Android. The issue is that when the iOS phone is turned off and allowed to sit for a period of time (overnight), the images no longer display. The rest of the data is retrieved from the database and displayed, but the images just show a black square where the image should be (indicating the info is still in the database).
Does iOS rename images after the phone is turned off and allowed to sit for some time? Any suggestions? If the phone is turned off and back on, this does not happen; only after the phone sits for some time...
This seems very relevant...
Capturing and storing a picture taken with the Camera into a local database / PhoneGap / Cordova / iOS
For anybody else having this issue, the following code worked for me:
function capturePhoto() {
    navigator.camera.getPicture(movePic, onFail, { quality: 70 });
}

// Move the captured image out of the temp folder so iOS doesn't purge it later
function movePic(file) {
    window.resolveLocalFileSystemURI(file, resolveOnSuccess, resOnError);
}

function resolveOnSuccess(entry) {
    var d = new Date();
    var n = d.getTime();
    var newFileName = n + ".jpg";  // unique name based on the current time
    var myFolderApp = "myPhotos";  // folder inside persistent storage

    window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, function(fileSys) {
        fileSys.root.getDirectory(myFolderApp,
            { create: true, exclusive: false },
            function(directory) {
                entry.moveTo(directory, newFileName, successMove, resOnError);
            },
            resOnError);
    },
    resOnError);
}

function successMove(imageUri) {
    document.getElementById('smallImage').src = imageUri.fullPath;
    // hidden input used to save the file path to the database
    document.getElementById('site_front_pic').value = "file://" + imageUri.fullPath;
    document.getElementById('smallImage').style.display = 'block';
}

function onFail(message) {
    alert('Failed to load picture because: ' + message);
}

function resOnError(error) {
    alert(error.code);
}
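A usage note (not part of the original answer): when the page is reloaded later, you can re-resolve the stored "file://" value before displaying it, so a missing file is detected instead of silently rendering a black square.
// Sketch: storedUri is the "file://..." value previously saved to the database
function showStoredImage(storedUri) {
    window.resolveLocalFileSystemURI(storedUri, function(entry) {
        var img = document.getElementById('smallImage');
        img.src = entry.fullPath;
        img.style.display = 'block';
    }, function(error) {
        // The file no longer exists, e.g. it was left in the temp folder and purged
        alert('Stored image is missing, error code: ' + error.code);
    });
}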
I'm making a mobile app using PhoneGap and HTML, and I'm using the Google Maps/Places autocomplete feature. The problem: if I run it in the browser on my computer, everything works fine and I can choose a suggestion from the autocomplete list. If I deploy it to my mobile, I still get suggestions, but I'm not able to tap one. It seems the suggestion overlay is just ignored and I tap through to the page underneath. Is there a way to put focus on the list of suggestions, or something along those lines?
Hope someone can help me. Thanks in advance.
There is indeed a conflict between FastClick and the PAC (Places Autocomplete) container. I found that I needed to add the needsclick class to both the pac-item and all its children:
$(document).on({
    'DOMNodeInserted': function() {
        $('.pac-item, .pac-item span', this).addClass('needsclick');
    }
}, '.pac-container');
There is currently a pull request on GitHub, but it hasn't been merged yet.
However, you can simply use this patched version of FastClick.
The patch adds an excludeNode option which lets you exclude DOM nodes handled by FastClick via a regex. This is how I used it to make the Google autocomplete work with FastClick:
FastClick.attach(document.body, {
    excludeNode: '^pac-'
});
This reply may be too late, but it might be helpful for others.
I had the same issue, and after debugging for hours I found out the issue was caused by adding the FastClick library. After removing it, everything worked as usual.
So, to keep FastClick and still get the Google suggestions, I added this code to my geo-autocomplete:
jQuery.fn.addGeoComplete = function(e) {
    var input = this;
    $(input).attr("autocomplete", "off");
    var id = input.attr("id");

    $(input).on("keypress", function(e) {
        var input = this;
        var defaultBounds = new google.maps.LatLngBounds(
            new google.maps.LatLng(37.2555, -121.9245),
            new google.maps.LatLng(37.2555, -121.9245));
        var options = {
            bounds: defaultBounds,
            mapkey: "xxx"
        };

        // Fix for FastClick issue: mark suggestion items so FastClick leaves them alone
        var g_autocomplete = $("body > .pac-container").filter(":visible");
        g_autocomplete.bind('DOMNodeInserted DOMNodeRemoved', function(event) {
            $(".pac-item", this).addClass("needsclick");
        });
        // End of fix

        autocomplete = new google.maps.places.Autocomplete(document.getElementById(id), options);
        google.maps.event.addListener(autocomplete, 'place_changed', function() {
            // Handle place selection
        });
    });
};
If you are using Framework7, it has a custom implementation of FastClick. Instead of the needsclick class, F7 uses no-fastclick. The function below is how it is implemented in F7:
function targetNeedsFastClick(el) {
    var $el = $(el);
    if (el.nodeName.toLowerCase() === 'input' && el.type === 'file') return false;
    if ($el.hasClass('no-fastclick') || $el.parents('.no-fastclick').length > 0) return false;
    return true;
}
So, as suggested in other comments, you only have to add the no-fastclick class to .pac-item and all its children, as in the sketch below.
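A minimal sketch of that, mirroring the needsclick snippet earlier in this thread, just with F7's class name (DOMNodeInserted is deprecated, but it matches the approach already used here):
$(document).on({
    'DOMNodeInserted': function() {
        // Mark the suggestion items so Framework7's fast-click handling skips them
        $('.pac-item, .pac-item span', this).addClass('no-fastclick');
    }
}, '.pac-container');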
I was having the same problem.
I realized the issue was probably that the focusout event of the pac-container happens before the tap event of the pac-item (only in the PhoneGap built-in browser).
The only way I could solve this was to add padding-bottom to the input when it is focused and to change the top attribute of the pac-container, so that the pac-container sits within the borders of the input.
That way, when the user taps an item in the list, the focusout event is not fired.
It's dirty, but it works; a rough sketch follows.
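Something along these lines (untested; #autocomplete-input and the 120px reservation are placeholders, and the exact top offset may need tuning):
var EXTRA_PADDING = 120; // px reserved below the text for the suggestion list
var $input = $('#autocomplete-input'); // placeholder id of the field with autocomplete attached

// Grow the input downward while it is focused so the suggestions can sit inside it
$input.on('focus', function() {
    $input.css('padding-bottom', EXTRA_PADDING + 'px');
});
$input.on('blur', function() {
    $input.css('padding-bottom', '');
});

// Google repositions .pac-container on every keystroke, so pull it back up each time
// the suggestion list changes, placing it inside the input's (now padded) border box
$(document).on('DOMNodeInserted', '.pac-container', function() {
    var top = $input.offset().top + $input.innerHeight() - EXTRA_PADDING;
    $(this).css('top', top + 'px');
});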
This worked perfectly for me:
$(document).on({
    'DOMNodeInserted': function() {
        $('.pac-item, .pac-item span', this).addClass('needsclick');
    }
}, '.pac-container');
Configuration: Cordova / iOS, iPhone 5.