emacs-devel

Re: can not decode 0x93 and 0x94 to correct char


From: Eli Zaretskii
Subject: Re: can not decode 0x93 and 0x94 to correct char
Date: Fri, 28 Sep 2007 16:45:29 +0200

> From: Stefan Monnier <address@hidden>
> Date: Fri, 28 Sep 2007 09:50:47 -0400
> Cc: address@hidden
> 
> > 1. emacs -q
> > 2. open char_err_clip.c
> > 3. \223GPL License\224
> 
> > please check screen shots for detail.
> 
> The problem here seems to be the default coding system used by Emacs.
> Apparently it uses something like latin-1 rather than something
> like cp1252.

Yes.  However, I don't think this is a problem, see below.

> I don't know enough about how such things are specified in
> general (outside of Emacs) under w32 to be able to help any further, but all
> I know is that maybe Emacs should try and figure out that your default coding
> system should be cp1252.  Maybe the problem is that Emacs doesn't try to do
> it, or maybe it doesn't know how to do it, or maybe it does it wrong, or
> maybe it doesn't want to do it (e.g. because cp1252 covers the whole 256
> possible bytes so the auto-detection can't work well).

Emacs on Windows looks up the UI language of the current user, and
then sets up the language environment for that language.  Most
language environments do not specify cpNNNN as their preferred
encodings, so neither does Emacs.
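The decoding difference behind the report can be seen outside Emacs as well. The octal escapes \223 and \224 shown in the recipe are bytes 0x93 and 0x94: cp1252 maps them to the curly double quotes the file intended, while latin-1 maps every byte in 0x80-0x9F to an (unprintable) C1 control character. A minimal Python sketch of the same bytes, just to illustrate the two mappings:

```python
# The bytes from the reported file: \223 ... \224 in octal.
raw = b"\x93GPL License\x94"

# cp1252 (Windows-1252) decodes 0x93/0x94 as curly double quotes.
as_cp1252 = raw.decode("cp1252")
print(as_cp1252)  # “GPL License”

# latin-1 (ISO 8859-1) decodes them as C1 control characters
# U+0093 and U+0094, which display as garbage.
as_latin1 = raw.decode("latin-1")
print(as_latin1[0])   # U+0093, not a printable quote
```

This also illustrates Stefan's auto-detection point: since latin-1 and cp1252 both assign a meaning to all 256 byte values, a decoder cannot distinguish them from the byte stream alone.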



