I'm making a mobile site with jQuery.
The splash screen works great, except on an iPhone 5 with iOS 6, where it keeps showing the old screen from day one, even though that image is no longer on the server, I've cleared the cache endless times, and I've certainly deleted the web app saved to the phone's home screen.
... so where is that thing stored on the phone, if not in the cache?
Try this; it worked for me on my iPhone.
Here are the steps:
Go to Settings
Select 'Safari'
Hit 'Clear Cache'
EDIT: I just saw the comment that you tried to clear the cache. Did you do it this way?
Are you using PHP?
Add the following code to your site; it sends headers that prevent the browser from caching your site.
header("Expires: Mon, 26 Jul 12012 05:00:00 GMT");
header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");
header("Cache-Control: no-store, no-cache, must-revalidate");
header("Cache-Control: post-check=0, pre-check=0", false);
header("Pragma: no-cache");
Are you using ASP.NET?
Check out this stackoverflow question to learn about preventing the browser from caching your site.
Related
I have an application that logs in with the API and saves the cookies (authentication). Everything works fine, but once you close the application and launch it again, you get the 'NotLoggedIn' message and find that the cookies have been deleted. Any help?
Thank you...
Use localStorage instead of cookies.
localStorage has the following plus points:
It persists even when the browser is closed and reopened.
It has no expiration date.
It is cleared only through JavaScript, or by clearing the browser cache / locally stored data.
Its storage limit is the largest of the three (compared with cookies and sessionStorage).
Set localStorage:
localStorage.setItem('userdetails', UserData); // note: values are stored as strings, so serialize objects first
Get localStorage:
var userDetails = localStorage.getItem('userdetails');
Thank you, I found a solution. My method is to save the cookie values in localStorage; on launch I check whether the saved value is not null, and if so I set the cookies for my website again, and it all works.
To get and set the cookies I use this plugin: cordova-plugin-cookie-polyfill
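For anyone else doing this, the pattern looks roughly like the following (a minimal sketch; the 'authCookie' key and the cookie name are placeholders, and document.cookie stands in for whatever API the plugin exposes):
// On successful login: mirror the authentication cookie into localStorage,
// which survives the app being closed (WebView cookies may not).
function saveAuthCookie(cookieValue) {
    localStorage.setItem('authCookie', cookieValue);
}

// On app launch: if a value was saved, restore it as a cookie so the API
// still sees the session as authenticated.
function restoreAuthCookie() {
    var saved = localStorage.getItem('authCookie'); // null if never saved
    if (saved !== null) {
        document.cookie = 'auth=' + saved + '; path=/';
    }
    return saved !== null;
}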
I have a page that, among other things, has "Add to Calendar" links. These download iCalendar (.ics) events. It's for a travel situation, so there can be two events (outbound and return journeys), and each is offered as a separate download.
If I click one (e.g. outbound), it downloads and offers to open. I do so, tap save to calendar, and it's added to my calendar. So far, so good. Then I click the other one (return); it downloads, but when I open it, it opens the already-saved outbound event instead of a new event for the return. So I get the wrong data, and I have no option to save it (since the event already exists).
This happens on both an iPhone (Safari) and an Android phone (Chrome). There are no problems on the desktop, and closing the calendar app doesn't help.
It only happens if I add the first event to the calendar; if I just go back without saving, there is no problem. It doesn't matter whether I try the outbound or the return first: the first one added to the calendar takes over! If I delete it from the calendar, I can then add the other one.
Each file has a name that includes the route (so the outbound and return have different file names), given in a Content-Disposition header. I also ensure a fresh copy is always fetched. The full headers (sent before echoing the contents of the .ics file and exiting) are:
header( 'Cache-Control: no-cache, must-revalidate', true ); // HTTP/1.1
header( 'Expires: Sat, 26 Jul 1997 05:00:00 GMT', true ); // Date in the past
header( 'Content-Type: text/calendar; charset=utf-8', true );
header( 'Content-Disposition: attachment; filename="' . $filename . '"', true );
header( 'HTTP/1.0 200 OK', true, 200 );
I'm kind of out of ideas at this point.
OK. Turns out the phones were correct and Outlook was wrong. The back end was re-using the email address as the UID, so all the events had the same UID...
I have fixed the back-end code to use a more sensible UID now, and it works.
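For anyone hitting the same thing: the UID just has to be globally unique per event. Something along these lines is enough (a sketch, not my exact code; $bookingRef and $leg are placeholders):
// RFC 5545 only requires that the UID be globally unique, so combining a
// booking reference, the journey leg and the host gives each event its own.
$uid = $bookingRef . '-' . $leg . '@' . $_SERVER['SERVER_NAME'];
echo 'UID:' . $uid . "\r\n";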
OK, I have a technical question here. We've developed an integration component in XStudio so that we can pick up VersionOne's "Stories" (as "Requirements" in XStudio) and "Defects" (as "Bugs" in XStudio). This way you can execute your tests and manage the results, metrics, etc. from XStudio, but also manage the complete traceability matrix (Products -> Requirements -> Tests -> Test campaigns -> Bugs) in XStudio using VersionOne's items. We handle the links on our side.
To do this, we implemented the connector using VersionOne's REST API.
Everything works great, and it's very fast.
We tested it using a free server from VersionOne with no problem. Our Java code manages cookies so that it authenticates using the "Basic Authentication" protocol; we retrieve the cookie from VersionOne, store it in the local CookieStore, and provide that cookie on subsequent requests so that we do not have to authenticate again and again. All this worked fine on our side.
The response headers look like this:
{X-Instart-Request-ID=[7405870175418545839:SEN01-NPPRY09:1396448658:44],
null=[HTTP/1.1 200 OK],
Date=[Wed, 02 Apr 2014 14:24:17 GMT],
Content-Length=[16063],
Expires=[-1],
VersionOne=[Ultimate/14.0.7.6706; Scrum],
Set-Cookie=[.V1.Ticket.ncnuaaa=HFZlcnNpb25PbmUuV2ViLkF1dGhlbnRpY2F0b3LqgwAAB1hTdHVkaW+CjqLWdBzRCP8/N/R1KMorEByFu31RuGY+eqVCi1FHvTE=; path=/; HttpOnly],
Connection=[keep-alive],
Content-Type=[text/xml; charset=utf-8],
Server=[Microsoft-IIS/8.0],
Pragma=[no-cache],
Cache-Control=[no-cache]}
BUT... when we run our code in our client's environment, we don't get the original cookie, for some reason:
{cache-control=[no-cache],
content-type=[text/xml; charset=utf-8],
null=[HTTP/1.1 200 OK], expires=[-1],
content-length=[16063],
server=[Microsoft-IIS/8.0],
date=[Wed, 02 Apr 2014 12:34:08 GMT],
pragma=[no-cache]}
When our code gets the header fields from the connection and tries to read the "Set-Cookie" field, it can't find it, and a popup is automatically displayed.
Map<String, List<String>> headerFields = connection.getHeaderFields();
List<String> cookiesHeader = headerFields.get("Set-Cookie");
The popup asks us to authenticate (by the way, against "www6.v1host.com/192.33.31.50", whereas we would have expected "www6.v1host.com/abcded"; maybe there's a clue here?).
If we authenticate in that popup, everything continues normally and works OK.
But we shouldn't have to authenticate again, as we already do it on the connection beforehand:
String plainAuth = username + ":" + password;
// Strip the line breaks that some Base64 encoders insert when wrapping long output.
String encodedAuth = ("Basic " + new String(Base64.encode(plainAuth.getBytes()))).replaceAll("\n", "");
connection.setRequestMethod("GET");
connection.setRequestProperty("Authorization", encodedAuth);
connection.setRequestProperty("Connection", "keep-alive");
So we're not sure whether this is because the authentication is not working (which would explain why the cookie is not returned and the popup asks the user to authenticate explicitly), or whether there is something specific about the cookie management...
Do you have any idea what could be going on here?
This code works well against many other REST APIs that use Basic Auth and cookies.
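In case it matters, this is roughly how we register the cookie store on our side (a minimal sketch; we use the standard java.net CookieManager, and the ACCEPT_ALL policy is our own choice):
import java.net.CookieHandler;
import java.net.CookieManager;
import java.net.CookiePolicy;

public class CookieSetup {
    public static void install() {
        // Install a process-wide cookie store once, before the first request.
        // HttpURLConnection then records any Set-Cookie header it receives and
        // replays the cookie on later connections to the same host, so the
        // Basic Authentication round trip only has to happen once.
        CookieHandler.setDefault(new CookieManager(null, CookiePolicy.ACCEPT_ALL));
    }
}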
Thanks in advance,
It sounds like you work for XQual, the developers of XStudio. If so, please reach out to me. We are always happy to list another integration.
Assuming this is intended to work for more than just one customer, I have a couple of pieces of advice:
Provide a meaningful User-Agent header. This helps us be proactive with your integration. The header is useful even for custom, one-off products, but it is even more important when other vendors are involved.
Use OAuth2 for authentication. There are plenty of good libraries for OAuth2 in Java. We have an upcoming blog post where we show how it can be done with Apache Oltu.
To your specific question, I have some hunches:
You might be assuming too much about how VersionOne is deployed. VersionOne is offered both on-premise and on-demand. Your customer may be putting "192.33.31.50" into the configuration to represent an on-premise install, while you are expecting an instance name for on-demand. Also beware that not all on-demand instances are on "www6".
If on-premise, VersionOne also offers an installation option for Windows Integrated Authentication, in which case you may not be getting the headers you expect. This is one reason I recommended OAuth2 above: OAuth2 is always available for API calls, regardless of the user authentication mechanism.
To better diagnose, could you share some code that shows how you construct the request URL?
I am aware that Facebook caches the Like data for specific pages on your site once they're visited for the first time, and that entering the url into the debugger page clears the cache. However, we've now improved our Facebook descriptions/images/etc and we need to flush the cache for the entire site (about 300 pages).
Is there a simple way to do this, or if we need to write a routine to correct them one by one, what would be the best way to achieve this?
Is there a simple way to do this,
Not as simple as a button that clears the cache for a whole domain, no.
or if we need to write a routine to correct them one by one, what would be the best way to achieve this?
You can get an Open Graph URL re-scraped by making a POST request to:
https://graph.facebook.com/?id=<URL>&scrape=true&access_token=<app_access_token>
So you'll have to do that in a loop for your 300 objects. But don't do it too fast, or you might hit your app's rate limit; try to leave a few seconds between requests (according to a recent discussion in the FB developers group, that should work fine). And don't forget to URL-encode the <URL> value properly before inserting it into the API request URL.
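A minimal sketch of that request with PHP's cURL, assuming $pageUrl and $appAccessToken hold your values (http_build_query takes care of the URL-encoding):
// Ask Facebook to re-scrape one Open Graph URL; note this is a POST.
$ch = curl_init('https://graph.facebook.com/');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query([
    'id'           => $pageUrl,        // the URL to re-scrape
    'scrape'       => 'true',
    'access_token' => $appAccessToken, // your app access token
]));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
sleep(3); // pause between requests to stay clear of the rate limit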
A simple solution in WordPress: go to Permalinks, switch to a custom permalink structure, and change it. In my case I just added an underscore, so I used this...
/_%postname%/
Facebook then has no info on the (now) new URLs, so it scrapes them all fresh.
I was looking for this same answer, and all the existing answers were super complicated for me as a non-coder.
It turned out there is a very simple answer, and I came up with it all by myself :) .
I have a WordPress website where, using a variety of plugins, I bulk-uploaded over 4,000 images, which created 4,000 posts.
The problem was that I uploaded them and then tried setting up the Facebook share plugins before sorting out the og:meta tag issue, so all 4,000 posts were scraped by FB with no og:meta tags, and when I then added the tags it made no difference. The FB debugger couldn't be used, as I had over 4,000 posts.
I must admit I'm a bit excited: for many years I have gotten helpful answers from Google searches sending me to this forum. Often the suggestions I found were well over my head, as I'm not a coder; I'm a "copy-paster".
I'm so happy to be able to give back to this great forum and help someone else out :)
Well, I ran into the same scenario and used a hack, and it works. But as @Cbroe mentioned in his answer, the API call is subject to rate limiting, so you should take care of that; in my case I only had 100 URLs to re-scrape.
So here is the solution:
$xml = file_get_contents('http://example.com/post-sitemap.xml'); // <-- I have a WordPress site, which provides a sitemap.
$xml = simplexml_load_string($xml); // Load it as XML
$applicationAccessToken = 'YourToken'; // App access token; you can get it from https://developers.facebook.com/tools/explorer/
$urls = [];
foreach ($xml->url as $url) {
    $urls[] = (string) $url->loc; // Collect the URLs from the sitemap into our new array
}
$file = fopen("response.data", "a+"); // Write the API responses to a file so we can debug them later.
foreach ($urls as $url) {
    echo "Sending URL for scrape: $url\n";
    $data = file_get_contents('https://graph.facebook.com/?id=' . urlencode($url) . '&scrape=true&access_token=' . $applicationAccessToken); // URL-encode the page URL, as @Cbroe advised
    fwrite($file, $data . "\n"); // Put the response in the file
    sleep(5); // Sleep for 5 seconds!
}
fclose($file); // Close the file once all the URLs are scraped.
echo "Bingo, it's completed!";
I have an enterprise account and am distributing a private application, so I have a web page that requires a login/password and then has a link to the .plist manifest file, which in turn has the link to the .ipa file. The problem is that both files are on the same server, and that server requires a login.
That is why, whenever I click the link, I get the "unable to connect" popup even though I am already logged in. I thought the credentials were passed along since I'm already logged in, but clearly they are not.
How can I secure the download of the .ipa file without requiring the password (or can the password be supplied somehow by the user), while preventing someone from creating their own manifest file that links to my .ipa file?
I tried looking online for a solution but found nothing.
Any help will be greatly appreciated.
Thank you.
You could create your own login system using PHP and MySQL.
Put the .ipa file into a folder above public_html, htdocs, www_root, or whatever you use as the root of your website. That way people can't link to it directly.
Then use some code like this:
$file = '/absolute/path/to/application.ipa';
if (file_exists($file)) {
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.basename($file).'"'); // quote the filename in case it contains spaces
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($file));
ob_clean();
flush();
readfile($file);
exit;
}
This will force the download. If the user isn't logged in to your session, you can serve an error message instead.
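The login check itself can be as simple as guarding the script with whatever flag your login code sets in the session (a sketch; 'logged_in' is an assumed key):
session_start();

// Refuse the download unless the login script has marked this session.
if (empty($_SESSION['logged_in'])) {
    header('HTTP/1.0 403 Forbidden');
    exit('Please log in first.');
}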
Have you looked at TestFlight (https://testflightapp.com/) or HockeyApp (http://www.hockeyapp.net/)?
For iOS only, I'd recommend TestFlight, but if you distribute Android apps as well, HockeyApp might be better. (TestFlight doesn't support Android at the moment.)