How to make a German web site - encoding

What type of encoding do I need, or what do I have to do, to make my web site properly display text with German characters like this: Käse, and not like this: K�se?
Here is what I use for doctype:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
and here is what I use for encoding:
<meta http-equiv="content-type" content="text/html; charset=utf-8" />
The collation I use in MySQL is utf8_general_ci. I have never built web sites in languages other than English (from scratch). I don't know what I am missing!
Thank you for your time!

Your encoding choice looks fine.
There are just two steps left: you have to make sure that the content type in the HTTP header says the same, and you have to make sure that what you actually send is encoded as UTF-8.
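If the page is generated by PHP (an assumption here, since the question mentions MySQL but not the server-side language), a minimal sketch of the header step might look like this:
<?php
// Send the Content-Type header, including the charset, before any output.
header('Content-Type: text/html; charset=utf-8');
// Everything echoed after this point must actually be UTF-8-encoded bytes.
echo "Käse";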

UTF-8 should be used for sites that cater to many languages, so it is suitable for your needs.
The meta tag is correct too, though you may want to ensure that the server is sending the right Content-Type header.
Ensure that the HTML file is also encoded with UTF-8 and not ASCII or another codepage.
In general, you need to ensure that all steps from the DB to the browser use UTF-8 (so, DB columns are UTF-8, transferred to the server as UTF-8, rendered as UTF-8, transferred to the browser as UTF-8 with the right headers and meta tags).

From my experience, for UTF-8 to work right:
The MySQL data needs to be in one of the "utf8" collations
The meta tag needs to define the charset as "utf-8"
The MySQL connector needs to be set to "utf-8" (for PHP, it's mysql_set_charset; see the sketch after this list)
The server-side file (*.php or the like) needs to be saved as UTF-8 (not strictly necessary, but it saves some pain)
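A minimal sketch of the connector and output steps in PHP, using mysqli (the credentials, database and table names are placeholders):
<?php
// Placeholder credentials; set_charset() is the mysqli equivalent of mysql_set_charset().
$db = new mysqli('localhost', 'user', 'password', 'mydb');
$db->set_charset('utf8');
$result = $db->query('SELECT name FROM products');
while ($row = $result->fetch_assoc()) {
    // The rows arrive as UTF-8 and are echoed as UTF-8.
    echo htmlspecialchars($row['name'], ENT_QUOTES, 'UTF-8'), "<br>\n";
}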

Related

How to add Unicode emoji to the Internet Archive?

When visiting a website that contains Unicode emoji through the Wayback Machine, the emoji appear to be broken, for example:
https://web.archive.org/web/20210524131521/https://tmh.conlangs.de/emoji-language/
The emoji "😀" is rendered as "ðŸ˜€", and so forth.
This effect happens if a page is mistakenly rendered as if it was ISO-8859-1 encoded, even though it is actually UTF-8.
So it seems that the Wayback Machine is somehow confused about the character encoding of the page.
The original page source has an HTML5 <!doctype html> declaration and is valid HTML according to W3C's validator. The encoding is specified as utf-8 using a meta charset tag.
The original page renders correctly on all major platforms and browsers, for example Chrome on Linux, Safari on Mac OS, and Edge on Windows.
Does the Internet Archive crawler require a special way of specifying the encoding, or are emoji through UTF-8 simply not supported yet?
tl;dr The original page must be served with a charset in the HTTP content-type header.
As @JosefZ pointed out in the comments, the Wayback Machine mistakenly serves the page as windows-1252 (which has a similar effect to ISO-8859-1).
This is apparently the default encoding that the Internet Archive assumes if no charset can be detected.
The meta charset tag in the original page's source never takes effect when the archived page is rendered by the browser, because with all the extra JavaScript and CSS included by the Wayback Machine, the tag comes after the first 1024 bytes, which is too late according to the HTML5 specification: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset
So it seems that the Internet Archive does not take into account meta charset tags when crawling a page.
However, there are other archived pages such as https://web.archive.org/web/20210501053710/https://unicode.org/emoji/charts-13.0/full-emoji-list.html where Unicode emoji are displayed correctly.
It turns out that this correctly rendered page was originally served with an HTTP content-type header that includes a charset: text/html; charset=UTF-8
So, if the web server of the original page is configured to send a content-type HTTP header that includes the UTF-8 charset, the Wayback Machine should display the page correctly after reindexing.
How the webserver can be configured to send the encoding with the content-type header depends on the exact webserver that is being used.
For Apache, for example, adding
AddDefaultCharset UTF-8
to the site's configuration or .htaccess file should work.
Note that for the Internet Archive to actually reindex the page, you may have to make a change to the original page's HTML content, not just change the HTTP headers.

Jetty 9, character encoding UTF-8

I have a problem with encoding and Jetty.
All my files are encoded in UTF-8 and include the correct HTML meta tag to specify UTF-8.
Until now all my UTF-8 files had a BOM and I had no problem. But now I am using a different text editor, and I noticed that my UTF-8 files are generated without a BOM, which from what I read is rather a good thing, so I decided to go without a BOM from now on.
The problem is that Jetty seems to convert all my JSP files to ISO-8859-1 before sending them to the browser if they don't have a BOM. This causes problems because, since the files have a meta tag declaring UTF-8, the browser interprets them as UTF-8, and accents and other special characters do not work.
I found one workaround so far which is to start all my JSP files with :
<%@ page contentType="text/html;charset=UTF-8" language="java" %>
This works, but it is kind of annoying because I have to add it at the start of every file. I would rather have some server-wide parameter to avoid that, if possible, but after spending hours browsing the web for a solution I am beginning to think there is none.
I tried to add
JAVA_OPTIONS+=("-Dfile.encoding=UTF-8")
to my JAVA_OPTIONS when starting Jetty, as suggested in another thread, but it doesn't seem to do anything.
Any help would be greatly appreciated.
Looks like you are just missing the pageEncoding attribute.
<%@ page language="java" contentType="text/html; charset=UTF-8" pageEncoding="UTF-8"%>
Another option that worked for me for handling UTF-8 encoded files on Jetty was to change the webdefault.xml content to support UTF-8 encoding instead of the default ISO-8859-1.
You can find this file at {JETTY_HOME}/etc/webdefault.xml:
<locale-encoding-mapping-list>
  <locale-encoding-mapping>
    <locale>en</locale>
    <encoding>UTF-8</encoding>
  </locale-encoding-mapping>
</locale-encoding-mapping-list>
Hope this helps.

Encoding UTF-8 for Czech chars

I want to ask, as a beginner, what basic settings you make for UTF-8 document encoding.
An example of how I do it is below; please correct me if something is wrong. I want to be able to rely on all devices and browsers, with different user settings, rendering the text as it should be, so I do the following:
I use Notepad++; first, in the Format tab, I choose "Convert to UTF-8" (if the file is not already UTF-8);
Because I mostly use <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> or <!DOCTYPE html>, I then select the matching meta tag for the head: either <meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/> or <meta charset="UTF-8" />, respectively.
I'm mainly concerned about Czech characters.
Am I right, or is it not that simple if I expect HTML, PHP or JS, and maybe MySQL, to cooperate?
Thank you for your answers, and sorry for my incomplete English.
If you read text from a database, make sure that it is set to utf8 and that the columns are as well. Then you can use SET NAMES utf8 to make sure the connection encoding is utf8 as well. Just make it your first query to the database.
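A minimal PDO sketch of that step (the host, database name and credentials are placeholders); passing the charset in the DSN has the same effect as issuing SET NAMES right after connecting:
<?php
// Placeholder credentials; charset=utf8mb4 in the DSN sets the connection encoding.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'password');
// Or, explicitly, as the first query on an existing connection:
$pdo->exec('SET NAMES utf8mb4');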

Why does "»" show as a question mark ("?") on my page?

Are there any restrictions for it to show normally?
Sounds like an encoding problem. For special characters like that, I prefer to use HTML entities. In this case, try &raquo;
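A minimal PHP sketch of producing such entities automatically (the input string is just an example):
<?php
// htmlentities() converts » into its named entity, so it displays correctly
// regardless of the charset the page ends up being decoded with.
echo htmlentities('more »', ENT_QUOTES, 'UTF-8'); // prints: more &raquo;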
In my experience, a question mark usually replaces undecodable special characters when you encode your special characters as UTF-8, because web browsers by default decode the web page as ISO Latin-1. You can and should explicitly declare the encoding of your web page using the following directive:
<?xml version="1.0" encoding="UTF-8" ?>
for xhtml, or
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
(inside the <head> element), for HTML.
Regard this post as a supplement, because I guess that using the XML/HTML entities like &raquo; or &#187; mentioned above is the better way to go.
You can also use &#187;
If your Apache server is configured with...
AddDefaultCharset UTF-8
...in the httpd.conf file (which, strangely, was the default on my server), then Content-Type specs in the .html files (e.g., <meta http-equiv=Content-Type content="text/html; charset=windows-1252">) will be ignored, causing character codes above 127 to be interpreted incorrectly.
Comment out the AddDefaultCharset line and restart Apache.

iPhone's Mobile Safari: Special Characters

The iPhone app I'm working on uses HTML help files, and special characters such as
ü and ê
are being mangled by iPhone's Mobile Safari. Is there anything I can do to correct this?
If you're using XHTML, ensure that the content of your files really is in the encoding specified in the doctype. If you're using just plain HTML, consider using XHTML instead, or:
Use HTML entities (e.g. &eacute;)
Use the META tag to specify an encoding
Have you tried using numerical character references? Alternatively, perhaps you can use a <meta http-equiv="content-type" ...> element. Also, maybe there's a better way to tell Mobile Safari the character encoding of HTML files (equivalent to the server's HTTP Content-Type header).
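A minimal PHP sketch of converting everything above ASCII into numerical character references (the input string is just an example):
<?php
// Map every character above U+007F to a numeric character reference, so the
// text survives even if the file is served with the wrong charset.
$convmap = [0x80, 0x10FFFF, 0, 0x1FFFFF];
echo mb_encode_numericentity('ü and ê', $convmap, 'UTF-8'); // prints: &#252; and &#234;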