From: i-bugzilla-sourceware-org-kasujfzh at rf dot risimo.net
Subject: [Bug binutils/23242] aarch64: objdump requires ignores bits to be set in ldar
Date: Mon, 28 May 2018 10:52:05 +0000
https://sourceware.org/bugzilla/show_bug.cgi?id=23242

Raimar Falke <i-bugzilla-sourceware-org-kasujfzh at rf dot risimo.net> changed:

           What       |Removed      |Added
----------------------------------------------------------------------------
           Status     |UNCONFIRMED  |RESOLVED
           Resolution |---          |INVALID

--- Comment #2 from Raimar Falke <i-bugzilla-sourceware-org-kasujfzh at rf dot risimo.net> ---
Usually these constraints are expressed in the decoding or operation section of
an instruction; for example, see LDAXP for t == t2. So I am a bit confused that
the pseudocode for LDARB states:

  integer n = UInt(Rn);
  integer t = UInt(Rt);
  integer t2 = UInt(Rt2); // ignored by load/store single register
  integer s = UInt(Rs);   // ignored by all loads and store-release

So there are two contradictory statements in the ARM documentation. It looks
like I should complain to ARM here. But I now understand why objdump behaves
the way it does, so I will resolve the issue.

-- 
You are receiving this mail because:
You are on the CC list for the bug.
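The contradiction discussed above can be made concrete: in the manual's encoding diagram for LDAR, the Rs and Rt2 fields are drawn as "(1)" bits, i.e. fixed to all-ones, even though the decode pseudocode calls them ignored. The sketch below (an illustration, not part of the bug report; the field layout is assumed from the standard A64 load-acquire encoding) builds the 64-bit LDAR word and shows that clearing those fields yields a different word, which objdump will not decode as LDAR:

```python
# Assumed A64 field layout for LDAR Xt, [Xn]:
# size=11 [31:30], fixed opcode bits, Rs [20:16], o0 [15],
# Rt2 [14:10], Rn [9:5], Rt [4:0].
def encode_ldar_x(rt, rn, rs=0b11111, rt2=0b11111):
    """Build an LDAR Xt, [Xn] instruction word. The encoding diagram
    marks Rs and Rt2 as (1) (all-ones), yet the decode pseudocode
    says they are ignored."""
    return ((0b11001000110 << 21) | (rs << 16) | (1 << 15)
            | (rt2 << 10) | (rn << 5) | rt)

canonical = encode_ldar_x(rt=0, rn=1)               # what gas emits
variant = encode_ldar_x(rt=0, rn=1, rs=0, rt2=0)    # same per pseudocode,
                                                    # rejected by objdump
print(hex(canonical))  # 0xc8dffc20
print(hex(variant))    # 0xc8c08020
```

The canonical word 0xC8DFFC20 is `ldar x0, [x1]`; the variant differs only in the "ignored" fields, which is exactly the pattern objdump refuses to disassemble as LDAR.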