
octave64 - SGI Itanium - LU function error


From: Clinton Chee
Subject: octave64 - SGI Itanium - LU function error
Date: Thu, 28 Apr 2005 16:00:00 +1000
User-agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7.2) Gecko/20040805 Netscape/7.2

I've managed to compile Octave64 on an SGI Altix Itanium with the Intel compilers.

However, a segmentation fault occurs when running the following:

a=[1,2;3,4]
lu(a)
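
In case it helps narrow this down, this is roughly how I intend to grab a
backtrace from the crash (only a sketch; it assumes gdb is installed on the
Altix and that the installed octave is the real binary rather than a wrapper
script):

  $ gdb octave
  (gdb) run -q
  octave:1> a = [1,2;3,4];
  octave:2> lu (a)
  ...                      <-- the segfault should drop back to the gdb prompt
  (gdb) bt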

John, or anybody else who has octave64 compiled on an SGI Intel Itanium:
can you verify whether you get the same seg fault, or whether it runs OK
on your platform?

The funny thing is that this problem was not there on the SGI Origin using
the MIPSpro compilers.



Clinton.




John W. Eaton wrote:
> On 27-Apr-2005, Clinton Chee <address@hidden> wrote:
> 
> | What version of ifort and icc/icpc do you use in Itanium, for your
> | successful build?
> 
> The following is installed on the system I have access to:
> 
>   $ ifort -V
>   Intel(R) Fortran Itanium(R) Compiler for Itanium(R)-based applications
>   Version 8.0   Build 20040716 Package ID: l_fc_pc_8.0.046_pl050.1
>   Copyright (C) 1985-2004 Intel Corporation.  All rights reserved.
> 
>   $ uname -a
>   Linux XXX 2.4.21-sgi304r1 #2 SMP Wed Mar 30 16:12:04 EST 2005 ia64 ia64 ia64 GNU/Linux
> 
> I built from the current CVS files earlier today with the commands
> 
>   configure --enable-64 F77=ifort FFLAGS="-i8 -O"
>   make
> 
> This is on a system without ATLAS or other fast blas/lapack library
> installed.  The resulting binary is linked against the following
> libraries:
> 
>   liboctinterp.so => /usr/lib/liboctinterp.so (0x2000000000058000)
>   liboctave.so => /usr/lib/liboctave.so (0x2000000000564000)
>   libcruft.so => /usr/lib/libcruft.so (0x200000000082c000)
>   libreadline.so.4 => /usr/lib/libreadline.so.4 (0x2000000000900000)
>   libncurses.so.5 => /usr/lib/libncurses.so.5 (0x2000000000974000)
>   libdl.so.2 => /lib/libdl.so.2 (0x2000000000a14000)
>   libz.so.1 => /usr/lib/libz.so.1 (0x2000000000a2c000)
>   libm.so.6.1 => /lib/libm.so.6.1 (0x2000000000a58000)
>   libifport.so.6 => /usr/local/intel-8.0-20040412/lib/libifport.so.6 (0x2000000000aec000)
>   libifcoremt.so.6 => /usr/local/intel-8.0-20040412/lib/libifcoremt.so.6 (0x2000000000b34000)
>   libimf.so.6 => /usr/local/intel-8.0-20040412/lib/libimf.so.6 (0x2000000000d14000)
>   libcxa.so.6 => /usr/local/intel-8.0-20040412/lib/libcxa.so.6 (0x2000000000ea8000)
>   libunwind.so.6 => /usr/local/intel-8.0-20040412/lib/libunwind.so.6 (0x2000000000f14000)
>   libpthread.so.0 => /lib/libpthread.so.0 (0x2000000000f48000)
>   libstdc++.so.5 => /usr/lib/libstdc++.so.5 (0x2000000000ffc000)
>   libgcc_s.so.1 => /lib/libgcc_s.so.1 (0x200000000115c000)
>   libc.so.6.1 => /lib/libc.so.6.1 (0x2000000001190000)
>   libgpm.so.1 => /usr/lib/libgpm.so.1 (0x2000000001418000)
>   /lib/ld-linux-ia64.so.2 => /lib/ld-linux-ia64.so.2 (0x2000000000000000)
> 
> It failed 4 of the 1207 tests run by make check:
> 
>   FAIL: octave.test/io/load-save.m
>   FAIL: octave.test/linalg/qr-7.m
>   FAIL: octave.test/matrix/rand-1.m
>   FAIL: octave.test/matrix/randn-1.m
> 
> It's not surprising that it failed the load-save test since load and
> save are not completely working for 64-bit systems.  Probably rand
> failed because ints in the Fortran random number generator are
> expected to be 32 bits wide, or because of some other initialization
> problem (I see that it works when I run the test interactively).  The
> qr test is failing the accuracy test at the end of the qr-7.m file.
> 
> jwe
> 

-- 


----------------------------------------------------------------------------
Clinton Chee
Computational Scientist
High Performance Computing Unit
Room 2075, Red Centre
University of New South Wales
Australia 2035
chee at parallel stop hpc stop unsw stop edu stop au
Tel: 61 2 9385 6915
----------------------------------------------------------------------------


