One input file read by two programs in a script

Hi, I have a script that launches two programs:

#Script file 
./prog1
./prog2


prog1 is a C program:

#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv){
  printf("prog1 running\n");
    int tmp;
    scanf("%d", &tmp);
    printf("%d\n", tmp+10);
    printf("prog1 ended\n");
    return 0;
}


prog2 is also a C program:

#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv){
    printf("prog2 running\n");
    int tmp;
    scanf("%d\n", &tmp);
    printf("%d\n", tmp+10);
    printf("prog2 ended\n");
    return 0;
}


I run the command:

./script <file

where the file contains:

123
456


The output is:

prog1 running
133
prog1 ended
prog2 running
10
prog2 ended


It seems like prog2 hasn't received any input from the file. What's going on under the hood?

Is it possible that prog2 read the newline "\n" instead?





2 answers


Your script should be like this:

#!/bin/bash
exec 3<&1
tee >(./prog2 >&3) | ./prog1




Use the tee command to duplicate stdin, and bash's process substitution >() to open a temporary file descriptor. (File descriptor 3 is a saved copy of the original stdout, so that prog2's output goes to the terminal instead of into the pipe feeding prog1.)

See this answer to read the full story.





scanf reads buffered input. So when your first program reads from stdin, the C library reads ahead, pulling in all of the available input so that future reads from stdin are faster (avoiding the need for so many system calls). When the second program starts, there is no input left, and (since you did not check the return value of scanf()) tmp keeps its uninitialized value, which here happened to be 0.

You should be able to change the buffering strategy in your program (at the expense of speed) using the standard function setvbuf().









