Google Cloud Storage: How can I reset the edge cache?

I updated an image (from PHP), but the old version of the image is still being served.
If I download the image from the GCS console, I get the new version. However, the URL below still returns the old version:
https://storage.googleapis.com/[bucket name]/sample-image.png
It seems that the old image is sitting in Google's edge cache.
Some articles say that I should delete the image object and then insert the new image object so that the edge cache is cleared.
Does anyone know about this?
Update 1
This is my PHP code, which runs on GCE.
$obj = new \Google_Service_Storage_StorageObject();
$obj->setName($path . "/" . $name);

$client = new \Google_Client();
$client->useApplicationDefaultCredentials();
$client->addScope(\Google_Service_Storage::DEVSTORAGE_FULL_CONTROL);
$storage = new \Google_Service_Storage($client);

$bucket = 'sample.com';
$binary = file_get_contents($_FILES['files']['tmp_name']);
$fileInfo = new finfo(FILEINFO_MIME_TYPE);
$mimeType = $fileInfo->buffer($binary);

$storage->objects->insert($bucket, $obj, [
    'name' => $path . "/" . $name,
    'data' => $binary,
    'uploadType' => 'media',
    'mimeType' => $mimeType,
]);
It seems that only the parameters below are valid, so I don't think I can set any cache settings through the SDK.
// Valid query parameters that work, but don't appear in discovery.
private $stackParameters = array(
    'alt' => array('type' => 'string', 'location' => 'query'),
    'fields' => array('type' => 'string', 'location' => 'query'),
    'trace' => array('type' => 'string', 'location' => 'query'),
    'userIp' => array('type' => 'string', 'location' => 'query'),
    'quotaUser' => array('type' => 'string', 'location' => 'query'),
    'data' => array('type' => 'string', 'location' => 'body'),
    'mimeType' => array('type' => 'string', 'location' => 'header'),
    'uploadType' => array('type' => 'string', 'location' => 'query'),
    'mediaUpload' => array('type' => 'complex', 'location' => 'query'),
    'prettyPrint' => array('type' => 'string', 'location' => 'query'),
);
https://github.com/google/google-api-php-client/blob/master/src/Google/Service/Resource.php
I tried the approach below, but it hasn't worked so far. Is this only for GAE? (Or mounting the bucket may be necessary.)
$image = file_get_contents($gs_name);
$options = ["gs" => ["Content-Type" => "image/jpeg"]];
$ctx = stream_context_create($options);
// Write the image contents (not the source path) to the bucket.
file_put_contents("gs://<bucketname>/" . $fileName, $image, 0, $ctx);
How do I upload images to Google Cloud Storage from a PHP form?
Update 2
The API doc shows a cacheControl property in the request body. I guess that using the API directly (not via the SDK) is one way; I will try it.
https://cloud.google.com/storage/docs/json_api/v1/objects/insert
cacheControl (string, writable): Cache-Control directive for the object data.
I think I finally found it!
$obj->setCacheControl('no-cache');
Update 3
$bucket_name = 'my-bucket';
$file = "xxx.html";
$infotowrite = "999";

$service = new Google_Service_Storage($client);
$obj = new Google_Service_Storage_StorageObject();
$obj->setName($file);
$obj->setCacheControl('public, max-age=6000');

$results = $service->objects->insert(
    $bucket_name,
    $obj,
    ['name' => $file, 'mimeType' => 'text/html', 'data' => $infotowrite, 'uploadType' => 'media']
);
Set Cache-Control php client on Google Cloud Storage Object
We can check the result with:
gsutil ls -L gs://...
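If the object already exists with the wrong cache settings, the metadata can also be updated in place without re-uploading; a quick sketch using gsutil's setmeta command (bucket and object names are placeholders):

gsutil setmeta -h "Cache-Control:no-cache" gs://my-bucket/sample-image.png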

By default, if an object is publicly accessible to all anonymous users and you do not otherwise specify a cacheControl setting, GCS serves a Cache-Control header of 3600 seconds, or 1 hour. If you're getting stale object data and haven't been changing cache-control settings, I assume you're serving publicly accessible objects. I'm not sure whether Google itself is caching your object data or whether there's some other cache between you and Google, though.
In the future, you can fix this by explicitly setting a shorter Cache-Control header, which can be controlled on a per-object basis with the cacheControl setting.
Right now, you can probably get around this by tacking on a made-up extra URL query parameter, like ?ignoreCache=1.
More: https://cloud.google.com/storage/docs/xml-api/reference-headers#cachecontrol
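For illustration, a minimal PHP sketch of that cache-busting workaround; the parameter name is arbitrary and the bucket/object names are placeholders:

// Hypothetical cache-buster: append a throwaway query parameter so
// intermediate caches treat the URL as a brand-new resource.
$url = 'https://storage.googleapis.com/my-bucket/sample-image.png?ignoreCache=' . time();
echo '<img src="' . htmlspecialchars($url) . '">';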

Related

Upload files to Owncloud

I wonder, can I upload files to ownCloud by some POST or PUT request?
My goal: a user uploads files to server1; after the form is submitted, the data
is processed and sent to another server2 with ownCloud installed on it, which then returns
the path to the file in ownCloud. So the record on server1 will have a filename
property pointing to the ownCloud storage.
(Note: I am not talking about WebDAV.)
Are there any other capabilities?
The ownCloud API exposes an endpoint which makes this possible (both for POST and PUT):
post(string $uri, array $options = array()) : \OCP\Http\Client\IResponse
and
put(string $uri, array $options = array()) : \OCP\Http\Client\IResponse
Parameters:
string $uri
array $options — an array such as:

'body' => [
    'field' => 'abc',
    'other_field' => '123',
    'file_name' => fopen('/path/to/file', 'r'),
],
'headers' => [
    'foo' => 'bar',
],
'cookies' => [
    'foo' => 'bar',
],
'allow_redirects' => [
    'max' => 10,       // allow at most 10 redirects.
    'strict' => true,  // use "strict" RFC compliant redirects.
    'referer' => true, // add a Referer header
    'protocols' => ['https'], // only allow https URLs
],
'save_to' => '/path/to/file', // save to a file or a stream
'verify' => true, // bool or string to CA file
'debug' => true,
See https://doc.owncloud.org/api/classes/OCP.Http.Client.IClient.html for the relevant section in the API docs.
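To make this concrete, here is a minimal, hypothetical sketch of a PUT upload from server1 to server2. The endpoint URL and credentials are placeholders, and it assumes you can obtain an \OCP\Http\Client\IClient instance (e.g. via \OCP\Http\Client\IClientService::newClient()):

// Hypothetical sketch: $clientService is an injected \OCP\Http\Client\IClientService.
$client = $clientService->newClient();

$response = $client->put('https://server2.example.com/owncloud/upload', [
    'body' => [
        // Stream the file instead of loading it into memory.
        'file_name' => fopen('/path/to/file', 'r'),
    ],
    'headers' => [
        // Placeholder credentials for server2.
        'Authorization' => 'Basic ' . base64_encode('user:password'),
    ],
]);

// server2 is assumed to respond with the stored path, which server1 records.
$ownCloudPath = $response->getBody();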

Symfony2 - Functional Testing File uploads with dynamically created fields

I'm fighting with functional testing of file uploads.
I will try to simplify my situation. Let's say I have
a company entity, which has 3 fields.
class Company
{
    protected $name;
    protected $tags;
    protected $images;
}
images is an array of CompanyImage entities, which store the
image files, and tags is an array of Tag entities,
which have an m:n relation to companies.
In the form I use jQuery to add tags and images dynamically
(you can create images and add them to the company similar to the
Symfony collection type tutorial).
Because the images and tags arrays are created with jQuery, I cannot
simply use something like the tutorial line below in the functional test of the company form.
$form['images'][0]->upload('/path/to/image.jpg');
To set the form values I use a little trick described by sstok here
(https://github.com/symfony/symfony/issues/4124):
public function testCompanyCreation() {
    ...
    // option 1
    $image = new UploadedFile(
        '/path/to/image.jpg',
        'image.jpg',
        'image/jpeg',
        123
    );
    // or option 2
    //$image = array('tmp_name' => '/path/to/image.jpg', 'name' => 'image.jpg', 'type' => 'image/jpeg', 'size' => 300, 'error' => UPLOAD_ERR_OK);

    $companyFormNode = $companyCrawler->selectButton('Create');
    $companyForm = $companyFormNode->form();
    $values = array(
        'company' => array(
            '_token' => $companyForm['company[_token]']->getValue(),
            'name' => 'test company',
            'tags' => array('1'),
            'images' => array('0' => array('file' => $image)),
        ),
    );
    $companySubmitCrawler = $client->request($companyForm->getMethod(), $companyForm->getUri(), $values, $companyForm->getPhpFiles());
}
This works perfectly until I try to upload the image file.
With option 1 I get the following exception:
Exception: Serialization of 'Symfony\Component\HttpFoundation\File\UploadedFile' is not allowed
When I use option 2 I get this:
Argument 1 passed to Acme\myBundle\Entity\CompanyImage::setFile() must be an instance of Symfony\Component\HttpFoundation\File\UploadedFile, array given, called in ...\PropertyAccess\PropertyAccessor.php on line 347 and defined (500 Internal Server Error)
I would also like to point out that the whole form and file upload work without any problems in the browser. I also tried making the entities serializable, and it didn't help. Do I have a bug somewhere?
I have figured it out (it took a couple of hours). Files have to be uploaded in a separate array:
$companyForm = $companyFormNode->form();
$values = array(
    'company' => array(
        '_token' => $companyForm['company[_token]']->getValue(),
        'name' => 'test company',
        'tags' => array('1')
    ),
);
$files = array(
    'company' => array('images' => array('0' => array('file' => $image)))
);
$companySubmitCrawler = $client->request(
    $companyForm->getMethod(),
    $companyForm->getUri(),
    $values,
    $files
);
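To confirm the separated submission actually went through, a simple follow-up check (assuming a standard Symfony WebTestCase) might look like:

// isSuccessful() checks for a 2xx response status.
$this->assertTrue($client->getResponse()->isSuccessful());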

Zend Framework 2 invalidate translator cache

I have this code:
'translator' => array(
    ...
    'cache' => array(
        'adapter' => array(
            'name' => 'Filesystem',
            'options' => array(
                'cache_dir' => __DIR__ . '/../../../data/cache',
                'ttl' => '3600'
            )
        ),
        'plugins' => array(
            array(
                'name' => 'serializer',
                'options' => array()
            ),
            'exception_handler' => array(
                'throw_exceptions' => true
            )
        )
    )
)
The question is: how do I invalidate it other than by TTL?
For example, I KNOW when a translation has changed, so I want to invalidate the cache on demand, but I have not found a way to do it.
The translator component does not utilize the TaggableInterface, so you have to know the cacheId that the translator generates in order to clear the item from your storage adapter. You can use the following code to simply generate the same id and remove the item. Call this from your service or some event listener.
$translator = $sm->get('McvTranslator');
$textDomain = 'default';
$locale = 'en';
$cacheId = 'Zend_I18n_Translator_Messages_' . md5($textDomain . $locale);
$translator->getCache()->removeItem($cacheId);
Alternatively, you could set ttl = 0 (so items never expire) and delete the cache file yourself once it is no longer valid.
Another way to do it:
Find a point in your code where you call addTranslation.
For example:
$translate = Zend_Registry::get('Zend_Translate');
$translate->addTranslation(array(
    'content' => "$dir/$locale.mo",
    'locale' => $locale
));
Change the addTranslation call to add 'reload' => true, like this:
$translate->addTranslation(array(
    'content' => "$dir/$locale.mo",
    'locale' => $locale,
    'reload' => true
));
Refresh your page.
Voila.
Remember to remove 'reload' after that; otherwise you will have no cache.

CurlUrlInvalidException: Facebook real-time update

I was trying to implement Facebook real-time updates. It looks like I was able to subscribe.
$param = array(
    'access_token' => $user_access_token,
    'object' => 'user',
    'fields' => 'name',
    'callback_url' => 'http://127.0.0.1/storm/callback.php',
    'verify_token' => 'XYZ',
    'active' => true
);
$subs = $facebook->api('/' . $app_id . '/subscriptions', 'POST', $param);
I get this error:
{"message":"http:\/\/127.0.0.1\/storm\/callback.php?hub.mode=subscribe&hub.challenge=1229793076&hub.verify_token=XYZ is an internal url, but this is an external request.","type":"CurlUrlInvalidException"}}
Does this have anything to do with me testing it locally?
How can I fix this? Please let me know.
You need to use your real, publicly reachable IP address (or domain) in 'callback_url'; Facebook's servers cannot reach 127.0.0.1.
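For instance, with a placeholder documentation IP (substitute your server's public address or domain):

// 203.0.113.5 is a placeholder; use your server's public IP or domain.
$param['callback_url'] = 'http://203.0.113.5/storm/callback.php';
$subs = $facebook->api('/' . $app_id . '/subscriptions', 'POST', $param);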

Zend_Feed_Reader Not supported Schema

I'm using Zend Framework and wanted to make a feed reader. I did the following:
$feed = Zend_Feed_Reader::import('feed://blog.lookup.cl/?feed=rss2');
$data = array(
    'title' => $feed->getTitle(),
    'link' => $feed->getLink(),
    'dateModified' => $feed->getDateModified(),
    'description' => $feed->getDescription(),
    'language' => $feed->getLanguage(),
    'entries' => array(),
);
foreach ($feed as $entry) {
    $edata = array(
        'title' => $entry->getTitle(),
        'description' => $entry->getDescription(),
        'dateModified' => $entry->getDateModified(),
        'authors' => $entry->getAuthors(),
        'link' => $entry->getLink(),
        'content' => $entry->getContent()
    );
    $data['entries'][] = $edata;
}
It throws the following exception: Scheme "feed" is not supported.
The blog was made using WordPress.
What's wrong? If the "feed" scheme is not supported, how can I change the kind of feed URL that WordPress produces?
Solved.
I had to use http instead of feed.
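In other words, the corrected import call is simply:

$feed = Zend_Feed_Reader::import('http://blog.lookup.cl/?feed=rss2');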