Service Worker caching/playback of videos: what is the most compatible approach? - progressive-web-apps

I'm currently working on a small PWA which should also include several local video files for offline use (using a precache scenario).
I've already noticed that caching such files is not straightforward, especially on iOS devices, because of their limited support for range requests.
I've already tried the solution proposed at a similar question:
PWA - cached video will not play in Mobile Safari (11.4)
But that didn't work for me either.
The only working solutions I found online used some form of blob handling in combination with either the File API or IndexedDB storage:
https://simpl.info/video/offline/
Is it possible to cache an entire HTML5 video using the Service Worker API for offline use?
As these are rather old posts, I was wondering which strategy would be appropriate with current APIs (while also targeting older iOS devices).
Thank you in advance.

If you're willing to go all-in on Workbox, the information at "Serve cached audio and video" should be accurate across multiple browsers and platforms.
If you'd rather handle caching yourself but just want some help creating an HTTP 206 partial response, given an incoming Request and a Response object that you've read from a cache, you can just use the createPartialResponse() method from workbox-range-requests, without using the rest of Workbox:
import {createPartialResponse} from 'workbox-range-requests';

self.addEventListener('fetch', (event) => {
  const {request} = event;
  if (request.headers.has('range')) {
    event.respondWith((async () => {
      const cache = await caches.open('media');
      const fullResponse = await cache.match(request);
      if (fullResponse) {
        return createPartialResponse(request, fullResponse);
      }
      // If there's a cache miss, fall back to the network.
      return fetch(request);
    })());
  }
});
You can also take a look at the createPartialResponse() method's source if you want to implement that yourself.
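If you do implement it yourself, a minimal sketch of that logic might look like the following (an assumption-laden simplification on my part: it handles only a single `bytes=start-end` range and buffers the whole body in memory; Workbox's real implementation covers more edge cases):

```javascript
// Sketch: build an HTTP 206 response from a cached full response.
// Assumes a single-range "bytes=start-end" header; the function name
// is my own, to avoid confusion with Workbox's createPartialResponse().
async function createPartialResponseSketch(request, fullResponse) {
  const rangeHeader = request.headers.get('range') || '';
  const match = /^bytes=(\d*)-(\d*)$/.exec(rangeHeader);
  if (!match) {
    return new Response(null, {status: 416}); // Range Not Satisfiable
  }
  const buffer = await fullResponse.clone().arrayBuffer();
  const total = buffer.byteLength;
  // "bytes=-500" means the final 500 bytes; "bytes=500-" means from 500 on.
  const start = match[1] === '' ? total - Number(match[2]) : Number(match[1]);
  const end = match[1] === '' || match[2] === ''
      ? total - 1
      : Math.min(Number(match[2]), total - 1);
  const sliced = buffer.slice(start, end + 1);
  return new Response(sliced, {
    status: 206,
    statusText: 'Partial Content',
    headers: {
      'Content-Range': `bytes ${start}-${end}/${total}`,
      'Content-Length': String(sliced.byteLength),
    },
  });
}
```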

Related

Bandwidth overflow with video

I'm testing my app, and analytics say I used 2 MB/day during the first month of tests. Yesterday I added a function that allows uploading a video, and to test it I uploaded a video (maybe two, but surely not more) of less than 10 MB. Now the problem is that I've exceeded the allowed bandwidth, using 1.5 GB, and I don't know why. Should I look at the method where I upload to storage, at the ones where I download, or both?
This is the only method used to upload a video:
Future<String> uploadVideo(imageFile) async {
  UploadTask uploadTask =
      storageRef.child("post_$postId.mp4").putFile(imageFile);
  TaskSnapshot storageSnap = await uploadTask;
  String downloadUrl = await storageSnap.ref.getDownloadURL();
  return downloadUrl;
}
While for downloading, since I use Chewie as the video player, I have:
Chewie(
  controller: ChewieController(
    ...
    videoPlayerController: VideoPlayerController.network(widget.post.mediaUrl)
  ),
)
I don't know if it could be something else that causes this problem and how to fix it, so if you have any suggestion please tell me.
I understand that you are looking for a way to determine what used the bandwidth in your Firebase Storage, causing it to go over the quota.
Firebase logs all requests to Storage, and you can monitor the Cloud Storage activity for your Firebase project in the Firebase console.
Log events provide insight into what is happening in your app, such as user actions, system events, or errors.
So, after you have configured the FirebaseApp instance, you need to add logging to your application, following a process similar to the documented example. This would allow you to record when a request is made, the size of the data transfer, and how often it occurred.
Apart from this, you can also open a support request or billing query with Google, where you can ask for a more detailed breakdown of the usage. You may also refer to the pricing documentation for details.
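As a back-of-envelope check (my own assumption, not something the question confirms): 1.5 GB of egress against a file of at most 10 MB means the file was effectively downloaded about 150 times, which points at the download/playback side rather than at the upload method:

```javascript
// Rough estimate: how many full downloads of a ~10 MB video
// does it take to consume 1.5 GB of bandwidth?
const videoSizeMB = 10;              // upper bound from the question
const bandwidthUsedMB = 1.5 * 1024;  // 1.5 GB expressed in MB
const approxDownloads = bandwidthUsedMB / videoSizeMB;
console.log(approxDownloads); // 153.6, i.e. roughly 150 full downloads
```

If the ChewieController/VideoPlayerController.network pair is recreated on every widget rebuild (for example inside a scrolling list), each rebuild can trigger a fresh network fetch of the file, which would account for numbers in this range.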

How to achieve realtime synchronization between my own back-end and my mobile app?

I'm a bit jealous of services like Google Cloud Firestore, which achieve realtime sync between a mobile app (or web app) and the back-end. It makes plain-old HTTP GET back-ends look prehistoric.
I'm wondering if, in 2019, there are reasonably simple and scalable solutions/frameworks to achieve this on my own back-end.
I've heard about sockets, but they look costly and quite difficult to set up (maybe I'm wrong). Silent notifications maybe? (But again, that adds a layer of complexity to manage, and we can't be 100% confident that every notification will reach its target.)
I understand conflict resolution is the most sensitive topic, so even a read-only solution (only the back-end can modify entities) would be great.
There are solutions for real-time synchronization of data. Most of them use WebSockets or SSE (Server-Sent Events) for transport, which also goes for Google Cloud Firestore. Many of the existing solutions are databases with synchronization capabilities, just like Firestore, but there are other options too.
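To make the transport side concrete: an SSE stream is just a long-lived `text/event-stream` response whose events are framed as plain text. A tiny, hypothetical framing helper (the name is mine, not from any of the products below):

```javascript
// Frame one server-sent event per the text/event-stream wire format:
// an optional "event:" line, a "data:" line, and a blank-line terminator.
function sseFrame(eventName, payload) {
  return `event: ${eventName}\ndata: ${JSON.stringify(payload)}\n\n`;
}

console.log(sseFrame('change', {message: 'Hello, StackOverflow!'}));
// event: change
// data: {"message":"Hello, StackOverflow!"}
```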
Existing solutions
Assuming you are not looking for push solutions, such as PubNub or Pusher, but rather seamless synchronization of data, then I can mention a few to get you started:
Resgate
CouchDB
RethinkDB
DeepStream
Example using Resgate
Realtime API gateways such as Resgate (which I am clearly biased towards :) ) let you write (micro)services in the language of your choice, in a similar fashion to how you would write an ordinary HTTP web service. The gateway then exposes the resources in a REST and realtime API, and keeps the clients synchronized.
C# service using ResgateIO.Service
ResService service = new ResService("example");
service.AddHandler("mymodel", new DynamicHandler()
    .SetGet(r => r.Model(new {
        message = "Hello, World!",
    }))
    .SetAccess(r => r.AccessGranted()));
service.Serve("nats://127.0.0.1:4222");
But with the addition that you can send events to update and synchronize all clients:
service.With("example.mymodel", r => r.ChangeEvent(new Dictionary<string, object> {
    { "message", "Hello, StackOverflow!" }
}));
The client can then fetch the data, and listen for updates:
JavaScript client using ResClient
let client = new ResClient('ws://localhost:8080');
client.get('example.mymodel').then(model => {
  console.log(model.message); // Hello, World!
  model.on('change', () => {
    console.log(model.message); // Updated to: Hello, StackOverflow!
  });
});
Considerations
All of the solutions mentioned above (and there are more for those who seek) come with their strengths and weaknesses in areas such as:
Resilience - handling of lost messages and connections
Access control - granting access and withdrawing access to a subscription
Live queries - fetching partial or filtered data that is updated live
Offline support - working with the data while offline
Scaling
Database requirements
Simplicity of usage
Just look around and see which solution suits your needs best.
There are many solutions, the more I search the more I find.
I use the Couchbase Lite stack, which consists of:
A front-end, cross-platform Couchbase Lite (CBL) database, which runs with an invisible synchronizer
A back-end Couchbase database cluster
The Couchbase Sync Gateway service, which synchronizes data between front-end and back-end over WebSockets
More details: Couchbase Mobile

Angular PWA Offline Storage

I’m building a new web application which needs to work seamlessly even when there is no internet connection. I’ve selected Angular and am building a PWA, as it comes with built-in functionality to make the application work offline. So far, I have the service worker working perfectly; driven by the manifest file, it very nicely caches the static content, and I’ve set it to cache a bunch of API requests which I want to use whilst the application is offline.
In addition to this, I’ve used localStorage to store attempts to invoke put, post and delete API requests when the user is offline. Once the internet connection is re-established, the requests stored in localStorage are sent to the server.
So far in my proof of concept, the user can access content whilst offline, edit data, and the data gets synced with the server once the user’s internet connection is re-established. This is where my quandary begins, though. There is API request data cached automatically by the service worker as defined in the manifest file, and there is a separate store of data for edits made whilst offline. This leads to a situation where the user edits some data, saves the data, refreshes the page, and stale data is served from the service worker’s cached API response.
Is there a built in mechanism to update API data cached automatically by the service worker? I don’t fancy trying to unpick this manually as it seems hacky and I can’t imagine it’ll be future proof as service workers evolve.
If there isn’t a standard way to achieve what I need to do, is it common for developers to take full control of offline data by storing it all in IndexedDB/localStorage manually? i.e. I could invoke API requests and write some code which caches the results in a structured format in IndexedDB to form an offline database, then writes back to the offline database whenever the user edits some data, and uploads any data edits when the user is back online. I don’t envisage any technical problems with doing this, it just seems like a lot of effort to achieve something which I was hoping to be standard functionality.
I’m fairly new to developing with Angular, but have many years of other development experience. So please forgive me if I’m asking obvious questions, I just can’t seem to find a good article on best practices for working with data storage with service workers.
Thanks
I have a project where my users can edit local data while they are offline, and I use Cloud Firestore to have a cached local database available. If I understood you correctly, this is exactly your requirement.
The benefit of this solution is that, with just one line of code, you get not only a local DB, but also automatic synchronization of all offline changes with the server once the client gets online again.
firebase.firestore().enablePersistence()
  .catch(function(err) {
    // log the error
  });
// Subsequent queries will use persistence, if it was enabled successfully
If using this NoSQL database is an option for you, I would go with it; otherwise you need to implement the local updates yourself, as there is no built-in solution for that.
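If you do end up rolling it yourself, the core of the manual approach described in the question is small. Here is a minimal in-memory sketch of the queue-and-replay pattern (all names are hypothetical; in a real app the pending array would live in IndexedDB or localStorage so it survives a page refresh):

```javascript
// Sketch of an offline mutation queue: buffer writes while offline,
// replay them in order once the connection returns. In-memory only;
// persist the pending array (IndexedDB/localStorage) in a real app.
class OfflineQueue {
  constructor(send) {
    this.send = send;   // async function that performs the real API call
    this.pending = [];
  }
  enqueue(mutation) {
    this.pending.push(mutation);
  }
  async flush() {
    while (this.pending.length > 0) {
      await this.send(this.pending[0]); // replay against the server...
      this.pending.shift();             // ...and drop only after success
    }
  }
}
```

Replaying in order and removing an item only after its request succeeds keeps the queue consistent if the connection drops again mid-flush.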

Flex mobile project for IOS, server side proxy

I am trying to write an iPhone app that loads a video from an inbuilt web server running on a camera (connected to the iPhone via WiFi).
I am using Flash Builder / a Flex mobile project - not particularly familiar, but I'm finding it easier to understand than Xcode!
The files from the camera have the wrong file extension, so they will not play in the iOS video app. Can I set up a server-side proxy in Flex mobile and use it to alter the file extension, then pass this link to the iOS video app?
If so, any help anybody could give me (examples etc.) would be gratefully received; I have been trying to get around this problem for a couple of weeks.
Cheers
Toby
I can explain, conceptually, what a server side proxy would do in this case. Let's say you are retrieving a URL, like this:
http://myserver.com/somethingSomething/DarkSide/
to retrieve a video stream from the server. You say it won't be played because there is no file extension; so you have to, in essence, use a different URL with the extension. Set up 'search engine friendly' URLs on the server. And do something like this:
http://myserver.com/myProxy.cfm/streamURL/somethingSomething%5CDarkSide/Name/myProxyVid.mp4
Here is some information on how to deal with Search Engine Friendly URLs in ColdFusion. Here is some information on how to deal with Search Engine Friendly URLs in PHP. I'm sure other technologies will come up in a Google search.
In the URL above; this is what you have:
http://myserver.com/: This is your server
myProxy.cfm: This is your server side file; that is a proxy
streamURL/somethingSomething%5CDarkSide/Name/myProxyVid.mp4: This is the query string. It consists of two name value pairs. The first is the streamURL. This is the URL you want to retrieve with your proxy. The second is just random; but as long as it ends with the file extension .mp4 the URL should be seen as an 'mp4 file'
The code behind your myProxy.cfm should be something like this, in pseudo-code:
Parse URL Query String
Retrieve Stream.
Set mimeType on return value.
Return stream data
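Sketched in JavaScript purely for illustration (the same idea ports directly to ColdFusion or PHP), the "parse URL query string" step for a search-engine-friendly path like the one above could look like this; the function name is hypothetical:

```javascript
// Split a 'search engine friendly' path into name/value pairs.
// Everything after the script name is treated as alternating
// name and value segments, with values URL-decoded.
function parseSefPath(path) {
  const segments = path.split('/').filter(Boolean);
  const start = segments.findIndex((s) => s.includes('.cfm')) + 1;
  const params = {};
  for (let i = start; i + 1 < segments.length; i += 2) {
    params[segments[i]] = decodeURIComponent(segments[i + 1]);
  }
  return params;
}
```

For the example URL, this yields streamURL (the decoded URL to retrieve) and Name (the dummy segment ending in .mp4); the proxy would then fetch streamURL and return the stream with a video MIME type.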
I used a similar approach on TheFlexShow.com to track the number of people who watch our screencast on-line vs downloading it first. I also used the same approach to keep track of impressions of advertiser's banner ads. For example, the browser can't tell that this is not a JPG image:
http://www.theflexshow.com/blog/mediaDisplay.cfm?mediaid=51
Based on this, and one of your previous questions; I am not convinced this is the best solution, though. I make a lot of assumptions here. I assume that the problem with playing the file does relate to the extension and not the file data. I assume that you are not actually streaming video with an open connection on both client and server to send data back and forth.

Can Google App Engine be used for "check for updated" and download binary file web service?

I'm a Google App Engine newbie and would be grateful for any help. I have an iPhone app which sources data from an SQLite DB stored locally on the device.
I'd like to set up a Google App Engine web service which my iPhone client will talk to and check if there is a newer version of the sqlite database it needs to download.
So iPhone client hits the web service with some kind of version number/timestamp and if there is a newer file, the App Engine will notify the client and the client will then request the new database to download which the App Engine will serve.
Is it possible to set up a web service in Google App Engine to do this? Could anyone point me to any sample code / tutorials please?
Many Thanks
What I would do is keep the SQLite DB as a gzipped blob in the datastore. Use the SHA1 hash as an etag. The client does a GET request with the etag header, and the server either responds with a 304 Not Modified or a 200 and the blob contents in the response body.
There is an API specifically for blobs, called the Blobstore API, but to use it you need to have billing enabled. Without billing enabled you can still easily serve blobs, but you'll be limited to 10MB per request, response, and entity size. If your zipped database is larger than that, you could conceivably break up the download into multiple requests, since you control both the client and server code. A custom blob handler that just uses the datastore might look like this:
class MyModel(db.Model):
    body = db.BlobProperty()

class MyBlobHandler(webapp.RequestHandler):
    def get(self):
        entity_key = self.request.get("entity_key")
        entity = MyModel.get(entity_key)
        self.response.headers['Content-type'] = 'what/ever'
        self.response.out.write(entity.body)

    def put(self):
        entity = MyModel(body=db.Blob(self.request.body))
        entity.put()
        self.response.out.write(entity.key())
This is entirely possible with App Engine, given that you're making HTTP requests.
The best code and tutorials, in my opinion, is the official Google App Engine docs. Everything you'll need is there.