Web hosting used to be a lot more lenient, allowing a different language encoding on a per-page basis via a META tag in the <HEAD> section like:

<meta http-equiv="Content-Type" content="text/html; charset=big5"> 

But recently discovered that my web hosting server (and perhaps many others) has a server-defined character set, sent in the HTTP Content-Type header and taking precedence over the markup, so the META tag is ignored, so to speak.
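To see which charset actually wins for a given page, check the Content-Type response header the server sends. A minimal Python sketch (the URL is a placeholder, substitute one of your own pages):

# Show the charset the server actually sends, which overrides
# any <meta> tag inside the page itself.
from urllib.request import urlopen

with urlopen("https://example.com/chinese-big5.htm") as resp:
    # Prints e.g. "text/html; charset=UTF-8" if the server forces UTF-8
    print(resp.headers.get("Content-Type"))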

When Did UTF-8 Take Over as the Web Standard?

Don’t know exactly when things changed (cf. the W3C article Declaring character encodings in HTML), but the web standard is now mainly UTF-8, and the shift happened around 10 years ago:

Any barriers to using Unicode are very low these days. In fact, in January 2012 Google reported that over 60% of the Web in their sample of several billion pages was now using UTF-8. Add to that the figure for ASCII-only web pages (since ASCII is a subset of UTF-8), and the figure rises to around 80%.

Also learned that legacy character sets like Big5 and GB2312 should be avoided because they lack interoperability: each covers only one script's repertoire, and the same characters map to completely different bytes in each one.
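That incompatibility is easy to see in a couple of lines of Python, where the same two characters come out as unrelated byte sequences under each encoding:

# The same two characters in three encodings: the byte sequences
# have nothing in common, so a page labeled with the wrong charset
# garbles every non-ASCII character (mojibake).
text = "中文"
for codec in ("big5", "gb2312", "utf-8"):
    print(codec, text.encode(codec).hex(" "))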

For documents with older character encodings

To accommodate some old .HTM files that use legacy character sets, this .HTACCESS file, placed on a per-subdirectory basis, keeps the content displaying properly. (Note that FilesMatch and ForceType are Apache directives; NGINX itself never reads .htaccess files, so this only works where Apache, or an Apache-compatible server, is actually serving the files.) The syntax is pretty much self-explanatory:

# Default: serve every .htm file in this directory as GB2312
<FilesMatch "\.htm$">
 ForceType text/html;charset=gb2312
</FilesMatch>

# Pin specific GB2312-encoded files explicitly
<FilesMatch "^(chinese-gb2312|filename1|etc1)\.htm$">
 ForceType text/html;charset=gb2312
</FilesMatch>

# These files are Big5-encoded; a later section overrides earlier ones
<FilesMatch "^(chinese-big5|filename2|etc2)\.htm$">
 ForceType text/html;charset=big5
</FilesMatch>
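If it's unclear which legacy encoding a given file uses (i.e., which FilesMatch group it belongs in), one rough heuristic is to try decoding it with each candidate codec. A Python sketch, with the caveat that many byte sequences decode cleanly under more than one legacy codec, so spot-check the results in a browser:

# Report which candidate codecs can decode each .htm file without
# error. Treat a multi-codec match as a hint, not an answer.
from pathlib import Path

CANDIDATES = ("utf-8", "big5", "gb2312")

def decodes(raw: bytes, codec: str) -> bool:
    try:
        raw.decode(codec)
        return True
    except UnicodeDecodeError:
        return False

for path in sorted(Path(".").glob("*.htm")):
    raw = path.read_bytes()
    hits = [c for c in CANDIDATES if decodes(raw, c)]
    print(path.name, "->", ", ".join(hits) or "unknown")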

Aside

Good utility and reference sites: https://validator.w3.org/i18n-checker/ and https://httpheadercheck.com/ and https://htaccesscheatsheet.com/

What about just converting text and content files to UTF-8? Try one of these: https://www.njstar.com/cms/cjk-code-to-unicode-conversion or https://github.com/yookoala/big5_folder_to_utf8 or https://subtitletools.com/convert-text-files-to-utf8-online
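Or do the conversion locally. A minimal Python sketch for one Big5 file (the filenames and source codec here are assumptions, adjust to suit):

# Re-encode one legacy-charset file as UTF-8.
src, dst, codec = "chinese-big5.htm", "chinese-utf8.htm", "big5"

with open(src, encoding=codec) as f:
    html = f.read()

# Update the charset declaration inside the file too, so the
# META tag no longer claims the old encoding.
html = html.replace("charset=big5", "charset=utf-8")

with open(dst, "w", encoding="utf-8") as f:
    f.write(html)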