Bash: wait for a specific command to exit before continuing

I know there are several posts that ask similar things, but none of them addresses the problem I'm running into.

I am working on a script that handles connections to various Bluetooth Low Energy devices, reads values from their descriptors with gatttool, and dynamically creates a .json file with those values.

The problem I am facing is that the gatttool commands take a while to execute (and do not always connect successfully, failing with "device is busy" or similar messages). These "errors" not only feed incorrect data into the .json file, but also let later script lines keep writing to it (e.g. adding extra } braces or similar). An example of the commands used would be the following:

sudo gatttool -l high -b <MAC_ADDRESS> --char-read -a <#handle>

How can I approach this so that the script waits for a specific output before continuing? In this case, the ideal output when using gatttool with --char-read is:

Characteristic value/description: some_hexadecimal_data

That way I can make sure the script proceeds line by line instead of making these "jumps".



1 answer


grep allows you to filter the gatttool output for the data you are looking for.

If you really need to wait until a certain output appears, expect might be what you're looking for.



From the man page:

expect [[-opts] pat1 body1] ... [-opts] patn [bodyn]
         waits  until  one of the patterns matches the output of a spawned
         process, a specified time period has passed, or an end-of-file is
         seen.  If the final body is empty, it may be omitted.
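A minimal sketch of that approach: the shell script below writes an expect script that spawns gatttool and blocks until the expected line, a timeout, or end-of-file is seen. The MAC address, handle, and 10-second timeout are placeholder assumptions, not values from the question:

```shell
#!/bin/sh
# Generate an expect script that waits for the gatttool output we want.
cat > read_char.expect <<'EOF'
#!/usr/bin/expect -f
set timeout 10
spawn sudo gatttool -l high -b AA:BB:CC:DD:EE:FF --char-read -a 0x000f
expect {
    -re {Characteristic value/description: (.*)\r?\n} {
        puts "got value: $expect_out(1,string)"
        exit 0
    }
    timeout { puts "no value within 10s"; exit 1 }
    eof     { puts "gatttool exited early"; exit 1 }
}
EOF
# Run it with:  expect -f read_char.expect
# The exit status (0 = value seen) tells the caller whether to continue.
```

Because expect reports success through its exit status, the rest of the script can simply test `$?` before writing anything to the .json file.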

      
