> To build the docs, you need to install these dependencies:
> sudo apt-get install python-sphinx
> sudo gem install pygments.rb
>
>
> Hope that helps!
> If not, I can try putting together a doc change, but I'd rather you could
> make progress :)
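> (A sketch of the doc build itself, assuming a Spark source checkout and that
> the jekyll gem is installed as well; SKIP_API=1 just skips the API docs:)
>
>   cd docs                  # from the root of the Spark source checkout
>   SKIP_API=1 jekyll build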
>
>
>
>
>
> On Mon, Jan 18, 2016 at 6:36 AM -0800, "Andrew Weiner" wrote:
>
>
> On Fri, Jan 15, 2016 at 5:01 PM, Bryan Cutler wrote:
>
> Glad you got it going! It wasn't very obvious what needed to be set;
> maybe it is worth explicitly stating this in the docs, since it seems to
> have come up a couple of times before too.
>
> Bryan
>
> On Fri, Jan 15, 2016 at 12:33 PM, Andrew Weiner <
> andrewweiner2...@u.northwestern.edu> wrote:
>
>> Actually, I just found this [
>> https://issues.apache.org/jira/browse/SPARK-1680], which after a bit of
>> googling and reading
/path/to/python
While both this solution and the solution from my prior email work, I
believe this is the preferred solution.
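(If I'm reading that JIRA discussion right, the setting in question is the
spark.yarn.appMasterEnv.* form described in the running-on-yarn docs; a sketch
with a placeholder python path:)

  # in conf/spark-defaults.conf
  spark.yarn.appMasterEnv.PYSPARK_PYTHON  /path/to/python

  # or, equivalently, at submit time
  spark-submit --master yarn --deploy-mode cluster \
    --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=/path/to/python \
    examples/src/main/python/pi.py 10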
Sorry for the flurry of emails. Again, thanks for all the help!
Andrew
On Fri, Jan 15, 2016 at 1:47 PM, Andrew Weiner <
andrewweiner2...@u.northwestern.edu> wrote:
I was able to get the PYSPARK_PYTHON environment variable to be used in my
yarn environment in cluster mode.
Thank you for all your help!
Best,
Andrew
On Fri, Jan 15, 2016 at 12:57 PM, Andrew Weiner <
andrewweiner2...@u.northwestern.edu> wrote:
> I tried playing around with my environment variables, and here is an
observation: it seems that my environment variables are being used when I
first submit the job, but at some point during the job, my environment
variables are thrown out and someone else's (YARN's?) environment variables
are being used instead.
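(For what it's worth, the container logs where that kind of output ends up can
be pulled afterwards with the YARN CLI; a sketch with a placeholder
application id:)

  yarn logs -applicationId application_1452000000000_0001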
Andrew
On Fri, Jan 15, 2016 at 11:03 AM, Andrew Weiner <
andrewweiner2...@u.northwestern.edu> wrote:
> import os, sys
> raise RuntimeError("\n" + str(sys.version_info) + "\n" +
>                    str([(k, os.environ[k]) for k in os.environ if "PY" in k]))
>
> On Thu, Jan 14, 2016 at 8:37 AM, Andrew Weiner <
> andrewweiner2...@u.northwestern.edu> wrote:
>
>> Hi Bryan,
>>
>> ...examples/src/main/python/pi.py 10
>
> That is a good sign that local jobs and Java examples work, probably just
> a small configuration issue :)
>
> Bryan
>
> On Wed, Jan 13, 2016 at 3:51 PM, Andrew Weiner <
> andrewweiner2...@u.northwestern.edu> wrote:
>
>> [...]
> ...that the spark assembly jar is reachable, then I would check to see if you can
> submit a local job to just run on one node.
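> (A minimal form of that local sanity check, run from the top of the Spark
> install, might be:)
>
>   ./bin/spark-submit --master local[2] examples/src/main/python/pi.py 10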
>
> On Fri, Jan 8, 2016 at 5:22 PM, Andrew Weiner <
> andrewweiner2...@u.northwestern.edu> wrote:
>
Now for simplicity I'm testing with wordcount.py from the provided
examples, and using Spark 1.6.0
The first error I get is:
16/01/08 19:14:46 ERROR lzo.GPLNativeCodeLoader: Could not load native gpl
library
java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
at java.la
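(That particular error is about the native LZO/GPL compression library not
being found on java.library.path; if it actually matters here, the usual knobs
are the extraLibraryPath settings, sketched below with placeholder paths:)

  spark-submit --master yarn --deploy-mode cluster \
    --conf spark.driver.extraLibraryPath=/path/to/hadoop/native/lib \
    --conf spark.executor.extraLibraryPath=/path/to/hadoop/native/lib \
    examples/src/main/python/wordcount.py /path/to/input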