[plug] Can this be done?
Russell Steicke
russellsteicke at gmail.com
Mon Dec 14 21:17:05 WST 2009
On Sun, Dec 13, 2009 at 12:26 PM, Richard Meyer <meyerri at westnet.com.au> wrote:
>
> Hi there,
>
> Someone working with MVS on a system/Z mainframe wants to know about
> transferring a workload from MVS to z/linux.
>
> His question concerns a batch job he runs that concatenates a number of
> datasets on tape into one input file for the job.
>
> On MVS he achieves this by coding the following in his JCL (Job Control
> Language):
>
> //INPUT DD DSN=input1,UNIT=TAPE, ......
> // DD DSN=input2,UNIT=TAPE, ......
> // DD DSN=input3,UNIT=TAPE, ......
> // DD DSN=input4,UNIT=TAPE, ......
>
> What this does is: when the program opens a logical file called INPUT, it
> actually opens a physical file called input1; at end of file it closes
> input1 and opens input2, and so on through input3 and input4, returning
> End Of File only at the end of the last dataset.
>
> Without catting them into one file, how can we do this under Linux?
In the general case, I don't think this can be done, at least not
without kernel support. But in specific circumstances it can be
emulated using a fifo.
In one terminal
$ for z in 0 1 2 ; do echo This is file $z > file$z ; done
$ mkfifo allfiles
$ while : ; do cat file? > allfiles ; done
In another terminal
$ cat allfiles
This is file 0
This is file 1
This is file 2
$
This doesn't work with multiple readers (each reader will see only
parts of the data), and it requires a writer process hanging around:
the loop re-runs cat so the concatenated stream is available again
each time a new reader opens the fifo. But depending on what the job
is doing, something like this may be enough.
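If the batch program can read its input from standard input, or will
accept whatever pathname it's given, a plain pipe or bash process
substitution may be simpler than a standing fifo. A rough sketch,
assuming a hypothetical program called batchjob; in both cases the
program gets a non-seekable stream, the same limitation as the fifo:
$ cat file0 file1 file2 | batchjob       # batchjob reads the concatenation on stdin
$ batchjob <(cat file0 file1 file2)      # bash hands batchjob a /dev/fd/N path that reads as one file
Note that process substitution needs bash (or ksh/zsh); it won't work
in plain /bin/sh.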