In this post I will explain how to use a DataStage job as a service via remote SSH invocation.
Task :
Decompress the files, read the data, transform it, and write the result to an output file.
The file itself should not be transmitted over the network, but the data can be streamed.
Here is a simple implementation:
On Remote DataStage Server:
1) Write a job that takes an input file name and an output file name as parameters.
2) Each record in the input file is just a single string.
3) Write the transformation rules and send the result to the output file.
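Once that job is compiled, it can be smoke-tested from the DataStage server's shell with ordinary files, before any FIFOs are involved. The project and job names below (dstage1, ds_sort) match the script later in this post; the /tmp paths are placeholders:

```
# Run the job against plain files first; -jobstatus makes dsjob wait
# for the job to finish and return its status as the exit code.
. `cat /.dshome`/dsenv
dsjob -run \
      -param inputFile=/tmp/in.txt \
      -param outputFile=/tmp/out.txt \
      -jobstatus \
      dstage1 ds_sort
echo "dsjob exit status: $?"
```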
On Local Unix Server:
Write the below script:
piped_ds_job.sh
======================
#!/bin/bash
# Load the DataStage environment so dsjob is on the PATH
dshome=`cat /.dshome`
. $dshome/dsenv
export PATH=$PATH:$DSHOME/bin

# Per-invocation named pipes for input and output
pid=$$
fifodir=/data/datastage/tmp
infname=$fifodir/infname.$pid
outfname=$fifodir/outfname.$pid
mkfifo $infname
mkfifo $outfname
# dsjob -run submits the run request and returns without waiting for the
# job to finish, so its exit status can be checked directly; the original
# "&" made the $? test meaningless, as it only reported the fork
dsjob -run -param inputFile=$infname \
    -param outputFile=$outfname dstage1 ds_sort.$pid 2> /dev/null

if [ $? -ne 0 ]; then
    echo "error calling DataStage job." >&2
    rm $infname
    rm $outfname
    exit 1
fi
# Drain the output FIFO to stdout and remove it once the job is done
(cat $outfname; rm $outfname) &
                
# Feed the input FIFO from the named file, or from stdin when no argument is given
if [ -z "$1" ]; then
    cat > $infname
else
    cat "$1" > $infname
fi
                
rm $infname
======================
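The plumbing above can be tried standalone, without DataStage. The sketch below substitutes a hypothetical transform (tr to uppercase) for the dsjob call, but the FIFO wiring is the same: the "job" reads one pipe and writes the other, and the data never lands in an intermediate file.

```shell
#!/bin/bash
set -e
tmpdir=$(mktemp -d)
in=$tmpdir/in.$$
out=$tmpdir/out.$$
mkfifo "$in" "$out"

# The "job": reads the input pipe, transforms, writes the output pipe
tr '[:lower:]' '[:upper:]' < "$in" > "$out" &

# Reader drains the output pipe in the background
cat "$out" > "$tmpdir/result" &

# Writer feeds the input pipe; closing it signals EOF downstream
printf 'hello\nworld\n' > "$in"

wait
cat "$tmpdir/result"    # prints HELLO then WORLD
rm -rf "$tmpdir"
```

Opening a FIFO blocks until the other end is opened, which is why both the transform and the reader run in the background before the writer starts.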
Decompressing the file locally and streaming it through the script on the DataStage server:
zcat compressedfile.gz | ssh dsadm@victory.ibm.com piped_ds_job.sh >> outputfilename
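Since the data is streamed in both directions, it can also stay compressed on the wire. A variant (user and host names are the same placeholders as above) that gzips the job's output on the remote side and unpacks it locally:

```
# Compress the result before it crosses the network, decompress locally
zcat compressedfile.gz \
  | ssh dsadm@victory.ibm.com 'piped_ds_job.sh | gzip -c' \
  | gzip -dc >> outputfilename
```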
 
 