From: vince
Subject: [open-cobol-list] To correct OpenCOBOL 1.1 Programmers Guide page 5-2 ?
Date: Wed, 7 Mar 2012 22:51:16 +0000
User-agent: KMail/1.13.5 (Linux/2.6.33.7-server-2mnb; KDE/4.4.5; i686; ; )

Hi;

On Wednesday 07 Mar 2012 21:59:19 john Culleton wrote:

> > > > and the program without the FROM ... TO ... entry results in an
> > > > error message at runtime:
> > > > libcob: Record overflow (STATUS = 44) File : 'outfile'
> > > >
> > > > N.B. The same program without the entry FROM integer-5 [ TO
> > > > integer-6 ] CHARACTERS results in the same runtime error message
> > > > cited above.
> > >
> > > It is beginning to feel that the record you are reading in is larger
> > > than the declared size according to the 01 record description, which
> > > is exactly what error 44 means.
> >
> > Thank you, I'm afraid I'm fairly new to OpenCOBOL (not new to COBOL
> > however, since 1970).
> > The sources / input file and a README are still available at
> > http://data.mobach.nl/opencobol/fdclause/with-or-without-fromto.tgz
> > The records in the input file are 52 bytes (including the EOL) and in

Hold on; OC is no different from any other mainstream compiler, e.g. IBM, ICL, MF, etc. However, all compilers have their own funny ways of doing things that sit outside the specifications in use at the time, and as the compilers have been upgraded, one way or another the funnies have been carried forward in order to maintain compatibility! This has always been a major issue when migrating across platforms and compilers, even on the same platform (hey, IBM?).

Right, now back to your problem:

> >
> > the record area of the source they are specified as:
> >
> > fd outfile
> >     record is varying in size
> >     from 4 to 256 characters
> >     depending on rec-length-out.
> >
> > 01 rec-out-record.
> >     03 rec-out-byte pic x(0001) occurs 256.

First off, the VARYING clause, like ODO (OCCURS DEPENDING ON), is also a frig: the record area is always allocated at the maximum size, in your case always 256 bytes. This is because, for whatever reason, the compiler writer does not put in the considerable extra code needed to cope, the design of the CPU often being unable to place such items on secondary stacks, as most have a limit on these even if they can handle more than one!
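To make that concrete, here is a minimal sketch (the SELECT, the literal and the MOVEs are my own assumptions, only the FD follows your quoted one). The one thing that matters is that rec-length-out must be set before the WRITE, so the run-time knows how many of the 256 bytes to put out:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. varwrite.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT outfile ASSIGN TO "outfile"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  outfile
           RECORD IS VARYING IN SIZE
           FROM 4 TO 256 CHARACTERS
           DEPENDING ON rec-length-out.
       01  rec-out-record.
           03 rec-out-byte PIC X(0001) OCCURS 256.
       WORKING-STORAGE SECTION.
       01  rec-length-out      PIC 9(4) COMP.
       PROCEDURE DIVISION.
           OPEN OUTPUT outfile
      *    set the length BEFORE the write
           MOVE 5 TO rec-length-out
           MOVE "HELLO" TO rec-out-record (1:5)
           WRITE rec-out-record
           CLOSE outfile
           STOP RUN.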

One small point: this size is only the size of the data the programmer has declared and NOT the real size on disk. There are one or more bytes added on at the end, e.g. a null (x'00') on LS files and some other types, so always check the last few bytes of a record to see what your compiler does!
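A quick, compiler-independent way to look at those trailing bytes, assuming a Unix-like box (standard od, nothing OC-specific):

    od -c outfile | tail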

> > Corresponding specifications without the from 4 to 256 characters
> > results in the errors indicated. However, Gary informed me that the
> > presence of the OCCURS...DEPENDING ON can also result in this error.

OK, a possible problem could be the version of the compiler you are using. If you are using v1.0, upgrade it ASAP; if you are using v1.1, check which build it is.
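The compiler will tell you which you have:

    cobc --version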

> I listed the program lenseqx.cob and don't understand one piece
> of it. You read a record from infile into rec-out-record and later move
> rec-in-record to rec-out-record. Isn't the move redundant? Read
> into has the same effect as a read followed by a move.

Not having seen this code I have to guess what it is doing, but basically you can READ file-a INTO WS (working storage) and WRITE the record FROM WS if needed. You do not re-MOVE after the READ or before the WRITE, as the move has already been done!

If you are worried about the behaviour of such a process, then do the MOVE yourself after a basic READ or before the WRITE, etc.
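In other words, a fragment like this (infile, ws-record and the ws-eof condition name are my placeholders; the declarations and OPENs are assumed) needs no MOVE of its own:

      *    READ ... INTO does the implied MOVE for you
           READ infile INTO ws-record
               AT END SET ws-eof TO TRUE
           END-READ
      *    WRITE ... FROM moves back to the record area
           WRITE rec-out-record FROM ws-record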

Again, as I have not seen the code: assuming the organisation type works, set the file type to LINE SEQUENTIAL and retest, as LS will produce the right EOR (end of record) byte sequence. If that works, then you must check how the original file was defined.
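That is a one-line change in the FILE-CONTROL paragraph, along these lines (the file name and status field are placeholders); adding a FILE STATUS field also lets you test for the 44 yourself rather than letting libcob abort the run:

           SELECT outfile ASSIGN TO "outfile"
               ORGANIZATION IS LINE SEQUENTIAL
               FILE STATUS IS ws-outfile-status.
      * and in WORKING-STORAGE:
       01  ws-outfile-status   PIC XX.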

Variable records can be a right pain, and sometimes it is easier to use basic file processing to handle heavily variable-length records (not IS), which works for me on LS format files. (Done as block processing, where the programmer is responsible for all deblocking, record processing, etc.) There is an example near the end of the FAQ, which I posted a few years ago, that might help, but it was done for a special purpose, i.e. for LS files, e.g. COBOL source code.

If the above does not fit your/this situation, please provide more info on the program and processing steps, etc.

A small point: unless you are handling a lot of records, it is easier just to keep them as fixed length, with a byte-count field at the front or some other methodology that does a similar job (there is always more than one way to 'skin the cat'). (Sorry, too many years in COBOL (49)!)
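By that I mean a layout along these lines (names made up; every record is written as the same 256 bytes, the first four of which say how much of rec-data is actually meaningful):

       01  rec-out-record.
      *    count of bytes of rec-data in use
           03 rec-used-length  PIC 9(4).
           03 rec-data         PIC X(252).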

Vince

