There are a bunch of uses of the bash "${!var}" syntax in the Hadoop scripts.  Can anyone explain to me what that syntax was supposed to achieve?  According to the bash manual:

    ${!name[@]}
    ${!name[*]}

    If name is an array variable, expands to the list of array indices
    (keys) assigned in name. If name is not an array, expands to 0 if
    name is set and null otherwise. When ‘@’ is used and the expansion
    appears within double quotes, each key expands to a separate word.
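
For what it's worth, here's a toy illustration of that documented behavior (my own example, not from the Hadoop scripts):

    declare -a dirs=("/tmp" "/var/log")
    echo "${!dirs[@]}"    # array indices: 0 1
    scalar="x"
    echo "${!scalar[@]}"  # not an array, but set, so: 0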


That makes sense, but the usage is odd.  In many cases it's used in a conditional, e.g. [[ -z ${!1} ]], which just seems like an archaic alternative to a plain empty-string check on "$1".  In other cases, I'm not convinced it's being used as intended, e.g.:

    hadoop-common-project/hadoop-common/src/main/bin/hadoop-functions.sh: declare array=("${!arrref}")
    hadoop-common-project/hadoop-common/src/main/bin/hadoop-functions.sh: eval "$1"='$(cygpath -p -w "${!1}" 2>/dev/null)'
    hadoop-common-project/hadoop-common/src/main/bin/hadoop-functions.sh: if [[ ${!uvar} != "${USER}" ]]; then
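
For context, without the [@] subscript, ${!var} is bash's indirect expansion: it expands to the value of the variable whose name is stored in var.  I assume that's what those lines are going for.  A toy sketch of my own (not the actual Hadoop code):

    # Indirect expansion: uvar holds the *name* of another variable.
    HADOOP_LOG_DIR="/var/log/hadoop"
    uvar="HADOOP_LOG_DIR"
    echo "${!uvar}"    # prints /var/log/hadoop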

I'm asking because bash 4.4 on Ubuntu 18 appears to reject that syntax with a bad substitution warning message.  Bash 4.3 on Ubuntu 16 is OK with it.  I filed a support request with the bash project about the regression, but I'm wondering if we shouldn't just replace the syntax in the Hadoop scripts.

I'd love any input, especially from Allen.

Thanks!
Daniel
