Sitefinity site on local machine without any changes, but images are not loading

I hosted a Sitefinity site on my local machine using the existing code and a DB backup, but images are not loading on the site front end or in the admin area. The live site shows the images fine; when I try to replicate it locally, no images appear. Need some help on this.

From your screenshot it looks like the images are stored in the file system.
Make sure you manually copy all files from the App_Data/Storage folder.
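If it helps, here is a minimal sketch of copying that storage folder over (the backup and local paths below are placeholders, not from the original answer; adjust them to your own layout):

```python
import shutil
from pathlib import Path

# Hypothetical paths: point these at your live-site backup and your local site.
SOURCE = Path(r"D:\backups\LiveSite\App_Data")
TARGET = Path(r"C:\Projects\LocalSite\App_Data")

def copy_storage(source: Path, target: Path) -> int:
    """Copy the file-system image storage into the local site; return the file count."""
    shutil.copytree(source, target, dirs_exist_ok=True)
    return sum(1 for p in target.rglob("*") if p.is_file())
```

`dirs_exist_ok=True` (Python 3.8+) lets the copy merge into an `App_Data` folder that already exists locally instead of failing.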

Related

Upload image error -> blob:http://localhost:3000/48c7da66-42c0-4ed3-8691-2dedd5ce4984:1 Failed to load resource: net::ERR_FILE_NOT_FOUND [duplicate]

I built a MERN app and hosted it on Heroku.
I save users' images on the server with multer, and it works fine for a while, i.e. an uploaded image is fetched successfully.
But after the application has been idle for a long time, that image is no longer available on the server.
On searching I found that each dyno on Heroku boots with a clean copy of the filesystem from the most recent deploy.
But then how and where should I save images?
The dyno filesystem is ephemeral, so you need to store files on external storage (e.g. S3, Dropbox) or use a Heroku add-on (e.g. for FTP).
Check Files on Heroku to understand the (free) options for storing/managing files (the examples are in Python, but the concept is valid for other stacks too).
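As a sketch of the external-storage approach (boto3 and the bucket name are assumptions, not part of the original answer): give each upload a collision-free object key, then push the bytes to S3 instead of writing to the dyno's disk.

```python
import uuid
from pathlib import PurePosixPath

def make_object_key(filename: str, prefix: str = "uploads") -> str:
    """Build a unique S3 object key so repeated uploads of 'avatar.png' never collide."""
    safe_name = PurePosixPath(filename).name  # strip any client-supplied directories
    return f"{prefix}/{uuid.uuid4().hex}-{safe_name}"

def upload_to_s3(data: bytes, filename: str, bucket: str = "my-app-uploads") -> str:
    """Store the upload in S3 instead of the ephemeral dyno filesystem.

    The bucket name is hypothetical; boto3 reads AWS credentials from the environment.
    """
    import boto3
    key = make_object_key(filename)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=data)
    return key
```

The returned key is what you would persist in MongoDB alongside the user record, so the image survives dyno restarts.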

Tableau Server Client - Embedded Dashboard Images Not Loading To Server

While using server.workbooks.publish() from the TableauServerClient in Python, the images that I have embedded in a dashboard do not load into Tableau Server. They load fine when I publish to Server manually from Desktop. I've tried images saved in the Tableau default images location, in Dropbox, and in OneDrive, with no success. It's obviously not a critical element of the dashboard, but it is something the client wants to see. Has anyone done this successfully?
[Screenshots comparing the Desktop and Server renderings omitted.]
For anyone running into the same issue: you must publish a .twbx (packaged workbook) file for the images to be included. This was answered by a developer on GitHub:
https://github.com/tableau/server-client-python/issues/400
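A small guard along these lines catches the problem before publishing; the commented publish call is sketched from the server.workbooks.publish() pattern in the question and is not verified here:

```python
from pathlib import Path

def ensure_packaged(workbook_path: str) -> str:
    """Refuse to publish anything but a .twbx, since only packaged workbooks bundle images."""
    if Path(workbook_path).suffix.lower() != ".twbx":
        raise ValueError(
            f"{workbook_path}: save as a packaged workbook (.twbx) so embedded images are included"
        )
    return workbook_path

# Hypothetical publish call, following the pattern from the question:
# server.workbooks.publish(new_workbook, ensure_packaged("dashboard.twbx"), mode="Overwrite")
```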

How do I exclude a sub folder/directory from the Azure Backup for an App Service?

Our backup fails because one subfolder pushes the site past the backup size limit, so I'd like to exclude only that folder.
The website + database size exceeds the 10 GB limit for backups. Your content size is 10 GB.
#AjayKumar-MSFT's link to Partial Backups works; here are the details:
Create a file called _backup.filter.
List one directory or file per line.
Upload the _backup.filter file to the D:\home\site\wwwroot\ directory of your site.
Example:
\site\wwwroot\Images\brand.png
\site\wwwroot\Images\2014
\site\wwwroot\Images\2013
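To find candidates for the filter, a sketch like this walks wwwroot and prints the folders above a size threshold in the \site\wwwroot\... form the filter expects (the threshold and root path are assumptions; run it against a local copy of your content):

```python
from pathlib import Path

def folder_size(path: Path) -> int:
    """Total bytes of all files under path."""
    return sum(p.stat().st_size for p in path.rglob("*") if p.is_file())

def filter_lines(wwwroot: Path, threshold_bytes: int) -> list[str]:
    """Return _backup.filter lines for top-level folders larger than the threshold."""
    lines = []
    for child in sorted(wwwroot.iterdir()):
        if child.is_dir() and folder_size(child) > threshold_bytes:
            lines.append(f"\\site\\wwwroot\\{child.name}")
    return lines
```

Writing the returned lines to _backup.filter gives you a starting point; review it so you only exclude static content you can regenerate.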
You may filter unnecessary files out of the backup by configuring Partial Backups. You can exclude static content that does not change often, such as videos/images, or unneeded log files (directories).
Partial backups let you choose exactly which files you want to back up.
First, check your app's actual usage (confirm the web app really exceeds 10 GB). You can determine folder content sizes using the Azure Web Apps Disk Usage site extension.
Follow these steps to see the disk usage:
Browse the Kudu site for your web app: https://sitename.scm.azurewebsites.net. Click Site extensions and then click Gallery.
Search for the Azure Web Apps Disk Usage site extension. Click the + icon to install it.
Click Run to start the Disk Usage site extension.
If your website does not actually exceed 10 GB, try creating a new App Service plan, moving your web app to it, and testing again; something may be wrong with the web app server.
If that still doesn't solve your issue (you don't exceed 10 GB but still see this error), create an Azure support request.
As Ajay says, you can use Partial Backups to exclude a subfolder from the App Service backup.

Magento New Installation - Backend is working however frontend points to the old URL

I tried the recommendations in several threads. Here is the issue: the admin works and updated with the new DB fine, but when you view the site from the frontend, the product data, images, and links all point to the original site.
I began by creating a new DB and importing a backup of the other Magento DB. The DB has a different name, and the login and password are different as well.
I modified the secure and unsecure URLs in the core_config_data table. I then installed a fresh copy of Magento (the same version as my old site, 1.7.0.2).
I also emptied the log tables, which contained the old URL.
I looked at the local.xml file, which had the correct settings.
Please keep in mind I have not added any source files from the old site. The only thing from the old site is the DB.
Any other advice would be huge! Thanks in advance.
I never could get to the bottom of it. I simply uploaded a fresh install of Magento, imported the data, and it worked.
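For reference, the base-URL update the question describes targets the standard Magento 1.x config paths web/unsecure/base_url and web/secure/base_url. The sketch below uses sqlite3 in place of MySQL purely so it is self-contained; the local URL is a placeholder:

```python
import sqlite3

NEW_URL = "http://local.example.test/"  # hypothetical local base URL

# Stand-in for the real core_config_data table (sqlite3 instead of MySQL).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE core_config_data (path TEXT, value TEXT)")
conn.executemany("INSERT INTO core_config_data VALUES (?, ?)", [
    ("web/unsecure/base_url", "http://www.live-site.com/"),
    ("web/secure/base_url", "https://www.live-site.com/"),
])

# The fix from the question: point both base URLs at the local host.
conn.execute(
    "UPDATE core_config_data SET value = ? "
    "WHERE path IN ('web/unsecure/base_url', 'web/secure/base_url')",
    (NEW_URL,),
)
rows = conn.execute("SELECT path, value FROM core_config_data").fetchall()
```

After changing these rows on a real install, it is also worth clearing var/cache, since Magento caches core_config_data and can keep serving the old URLs.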

Importing live website to Localhost

I am using an Amazon EC2 instance for my website, which is live right now.
I am struggling to import the latest version of the live website onto my local server. I archived and downloaded the website, downloaded the DB, extracted everything to 'www/' on localhost, then created and imported the DB, but localhost does not load the website. The localhost page changes the path to a 'restricted page', yet doesn't even show (load) that restricted page.
What is the best way to import a live website from EC2 to localhost?
Apologies, the DB was corrupt. We had to extract the data and recreate the whole site by installing everything one by one.
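Since a corrupt dump was the culprit here, a quick sanity check before importing can save a lot of time. mysqldump normally ends its output with a "-- Dump completed" comment, so a truncated file can often be detected; this is a heuristic sketch, not a guarantee of integrity:

```python
def dump_looks_complete(dump_path: str) -> bool:
    """Heuristic: a non-truncated mysqldump file ends with a '-- Dump completed' comment."""
    with open(dump_path, "r", encoding="utf-8", errors="replace") as f:
        tail = f.readlines()[-5:]  # the marker sits in the last few lines
    return any(line.startswith("-- Dump completed") for line in tail)
```

Running this on the dump right after downloading it from EC2 would have flagged the corruption before hours were spent rebuilding.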