emacs-devel

Re: can not decode 0x93 and 0x94 to correct char


From: Stefan Monnier
Subject: Re: can not decode 0x93 and 0x94 to correct char
Date: Fri, 28 Sep 2007 09:50:47 -0400
User-agent: Gnus/5.11 (Gnus v5.11) Emacs/23.0.50 (gnu/linux)

> Could you confirm the issue?

> version: GNU Emacs 23.0.0.1
> platform: winxp + sp2

> Steps:

> 1. emacs -q
> 2. open char_err_clip.c
> 3. \223GPL License\224

> Please check the screenshots for details.

The problem here seems to be the default coding system used by Emacs.
Apparently it uses something like latin-1 rather than something like
cp1252.  In cp1252, bytes 0x93 and 0x94 are the curly double quotes
(“ and ”), whereas latin-1 assigns control characters to those code
points, which is why Emacs falls back to displaying them as the raw
octal escapes \223 and \224.  I don't know enough about how such
things are specified in general (outside of Emacs) under w32 to be
able to help any further.  All I know is that Emacs should perhaps
try to figure out that your default coding system should be cp1252.
Maybe the problem is that Emacs doesn't try to do it, or maybe it
doesn't know how to do it, or maybe it does it wrong, or maybe it
deliberately doesn't do it (e.g. because cp1252 covers all 256
possible byte values, so auto-detection can't work reliably).


        Stefan



