[ https://issues.apache.org/jira/browse/SPARK-21573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16107683#comment-16107683 ]
shane knapp commented on SPARK-21573:
-------------------------------------

(sorry, was off the grid for the past week)

we've been off of py2.6 for a long time. our current python installation is managed by anaconda.

this behavior popped up with the ADAM builds a couple of weeks ago. the fix there was to put the anaconda bin dir into the PATH (/home/anaconda/bin/) in the builds, and to be sure to source activate the right environment (it's 'py3k') before the tests run.

i really don't know why this is happening randomly... if it failed consistently it'd be easier to track down.

anyways, i'll make a PR for [~joshrosen] for the jenkins build configs and add anaconda to the PATH.

> Tests failing with run-tests.py SyntaxError occasionally in Jenkins
> -------------------------------------------------------------------
>
>                 Key: SPARK-21573
>                 URL: https://issues.apache.org/jira/browse/SPARK-21573
>             Project: Spark
>          Issue Type: Test
>          Components: Tests
>    Affects Versions: 2.3.0
>            Reporter: Hyukjin Kwon
>            Priority: Minor
>
> It looks like the default {{python}} on the path in a few places, such as {{./dev/run-tests}}, is Python 2.6 in Jenkins, and it fails to execute {{run-tests.py}}:
> {code}
> python2.6 run-tests.py
>   File "run-tests.py", line 124
>     {m: set(m.dependencies).intersection(modules_to_test) for m in modules_to_test}, sort=True)
>     ^
> SyntaxError: invalid syntax
> {code}
> It looks like there are quite a few places to fix in {{run-tests.py}} and the related Python scripts if we wanted to support Python 2.6.
> We might instead just set Python 2.7 in the few other scripts that run this, if it is available.
> Please also see
> http://apache-spark-developers-list.1001551.n3.nabble.com/Tests-failing-with-run-tests-py-SyntaxError-td22030.html
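For reference, here is a minimal, self-contained sketch of why Python 2.6 chokes on that line. The {{Module}} class and the module names below are hypothetical stand-ins, not Spark's actual run-tests.py objects: dict comprehensions were only added in Python 2.7, so 2.6 fails at parse time with exactly the SyntaxError shown above, before any code runs.

{code}
# Minimal sketch (hypothetical Module class and module names, not Spark's actual
# code) showing why Python 2.6 rejects run-tests.py: dict comprehension syntax
# was added in Python 2.7, so a 2.6 interpreter raises SyntaxError while parsing
# the file, before anything executes.

class Module(object):
    def __init__(self, name, dependencies=()):
        self.name = name
        self.dependencies = list(dependencies)

core = Module("core")
sql = Module("sql", dependencies=[core])
modules_to_test = [core, sql]

# Python 2.7+/3.x form (the style used on line 124 of run-tests.py):
graph = {m: set(m.dependencies).intersection(modules_to_test)
         for m in modules_to_test}

# Python 2.6-compatible equivalent, if supporting 2.6 were ever desired:
graph_26 = dict((m, set(m.dependencies).intersection(modules_to_test))
                for m in modules_to_test)

assert graph == graph_26
{code}

That said, rewriting the scripts for 2.6 seems like the harder path; the simpler route is the one already discussed, i.e. making sure the Jenkins jobs pick up a Python 2.7+ interpreter (e.g. the anaconda one) on the PATH.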