Hi,
I believe you can simply use the nslookup nnn.nnn.n.n command for this purpose:
C:\Users\Admin>nslookup 192.168.8.1
Server:   hi.link
Address:  192.168.8.1

Name:     hi.link
Address:  192.168.8.1
On a UNIX host it would be pretty easy.
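For example, a minimal reverse (PTR) lookup sketch on a UNIX host, using the 192.168.8.1 address from the example above (the dig and host invocations assume the BIND utilities are installed, so they are shown commented out):

```shell
ip=192.168.8.1

# A PTR lookup queries the in-addr.arpa zone; this builds the same
# reversed name that "dig -x" constructs for you:
rev=$(echo "$ip" | awk -F. '{print $4"."$3"."$2"."$1".in-addr.arpa"}')
echo "$rev"    # 1.8.168.192.in-addr.arpa

# Equivalent lookups (network-dependent):
# dig -x "$ip" +short
# dig "$rev" PTR +short
# host "$ip"
```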
HTH,
Mich
From: Mark Sunderlin
Gopal, can you confirm the doc change that Jone Zhang suggests? The second
sentence confuses me: "You can choose Spark1.5.0+ which build include the
Hive jars."
Thanks.
-- Lefty
On Thu, Nov 19, 2015 at 8:33 PM, Jone Zhang wrote:
> I should add that Spark 1.5.0+ uses hive1.2.1 by default whe
HoS is supposed to work with any version of Hive (1.1+) and a version of
Spark built w/o Hive. Thus, to make HoS work reliably and also simplify
matters, I think it still makes sense to require that the spark-assembly jar
shouldn't contain Hive jars. Otherwise, you have to make sure that your
Hive version match
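A sketch of what such a build looks like (the flags follow the Hive-on-Spark getting-started instructions for Spark 1.x; the exact Hadoop profile and version are assumptions for your environment):

```shell
# In the Spark 1.x source tree: build an assembly WITHOUT the -Phive
# profile, so no Hive jars end up inside spark-assembly-*.jar.
./make-distribution.sh --name "hadoop2-without-hive" --tgz \
  "-Pyarn,hadoop-provided,hadoop-2.6"
```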
I'm using Hive 1.2.1. I want to run Hive on Spark, but there are some
issues.
I have set spark.master=yarn-client.
The Spark version is 1.4.1; running spark-shell --master yarn-client works
with no problem.
The log:
2015-11-23 13:54:56,068 ERROR [main]: spark.SparkTask
(SessionState.java:printError
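For reference, the basic Hive-side settings for running on Spark look like this (a config sketch; the values are illustrative, and spark.home must point at your own Spark build):

```
set hive.execution.engine=spark;
set spark.master=yarn-client;
set spark.home=/path/to/spark-without-hive;
set spark.executor.memory=2g;
set spark.eventLog.enabled=true;
```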
Thanks Xuefu!
-- Lefty
On Mon, Nov 23, 2015 at 1:09 AM, Xuefu Zhang wrote:
> HoS is supposed to work with any version of Hive (1.1+) and a version of
> Spark built w/o Hive. Thus, to make HoS work reliably and also simplify
> matters, I think it still makes sense to require that the spark-assembly jar
> sho
Anyone?
On Sat, Nov 21, 2015 at 1:32 PM, Dasun Hegoda wrote:
> Thank you very much, but I would like to do the integration of these
> components myself rather than using a packaged distribution. I think I have
> come to the right place. Can you please kindly tell me the configuration
> steps to run H