hive.server2.authentication.ldap.url
ldap://example.com
hive.server2.authentication.ldap.Domain
example.com
Do I need any additional configuration in HiveServer2 / EMR Hadoop to pull in
my group information? Any response will be appreciated.
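For reference, group lookup is usually driven by extra LDAP properties in hive-site.xml; whether these are available depends on the Hive version, and the baseDN/group values below are only illustrative:

```xml
<!-- Sketch only: property names per the HiveServer2 LDAP docs;
     values depend on your directory layout. -->
<property>
  <name>hive.server2.authentication.ldap.baseDN</name>
  <value>ou=Users,dc=example,dc=com</value>
</property>
<property>
  <name>hive.server2.authentication.ldap.groupFilter</name>
  <value>HiveUsers</value>
</property>
```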
Thanks,
Praveen.
I have `export
SPARK_HOME=/home/analysis/Installations/spark-1.4.1-bin-hadoop1` in my
.bashrc. Once I comment it out, it works fine.
Thanks,
Praveen
On Fri, Aug 14, 2015 at 10:19 AM, Jörn Franke wrote:
> Maybe there is another older log4j library in the classpath?
>
> Le ven. 14 août
a HotSpot(TM) 64-Bit Server VM (build 25.51-b03, mixed mode)
analysis@analysis-vm:~/Installations/apache-hive-1.2.1-bin$ uname -a
Linux analysis-vm 3.19.0-25-generic #26~14.04.1-Ubuntu SMP Fri Jul 24
21:16:20 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
Thanks,
Praveen
Hi Joshua,
MSCK REPAIR TABLE source_system; will work.
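For context, a minimal sketch (the partition layout below is hypothetical): MSCK REPAIR TABLE registers partition directories that exist on the filesystem but are missing from the metastore.

```sql
-- Suppose data landed at .../source_system/dt=2015-02-01/ outside Hive.
MSCK REPAIR TABLE source_system;
-- The new partition should now show up:
SHOW PARTITIONS source_system;
```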
Thanks and Regards,
Praveen Akinapally.
On Mon, Feb 2, 2015 at 8:42 AM, Joshua Eldridge
wrote:
> I'm hoping someone else has had this problem. I tried searching, but
> couldn't find anything ...
>
> I'm running
Hi Hulbert,
Select id, last_value(address,true) over (partition by id order by
file_date) as address from address_table; works in Hive 0.13.1. I'm not sure
about Hive 0.11. Try it and let me know.
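For what it's worth, the second argument (true) makes last_value skip NULLs; a tiny illustration with made-up data:

```sql
-- For id=1, ordered by file_date, the address values are: 'A', NULL.
-- With last_value(address, true), the NULL row still yields 'A',
-- because NULLs are ignored within the window.
SELECT id,
       last_value(address, true)
         OVER (PARTITION BY id ORDER BY file_date) AS address
FROM address_table;
```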
Regards,
Praveen Akinapally
On Fri, Dec 19, 2014 at 5:43 AM, Hulbert, Leland wrote:
>
ct col1, col2 tuple
hive> SELECT DISTINCT(col1), col2 FROM t1;
1 (3 or 4)
2 5
Does anyone else think along these lines? Would you suggest that Hive support this feature?
Thanks,
Praveen
On Fri, Nov 9, 2012 at 10:33 PM, Praveen Kumar K J V S <
praveenkjvs.develo...@gmail.com> wrote:
> Thank you very mu
out like semi joins that might
> come in handy for this query or queries in the future.
>
> https://cwiki.apache.org/Hive/languagemanual-joins.html
>
> Mark
>
>
>
> On Fri, Nov 9, 2012 at 8:00 AM, Praveen Kumar K J V S <
> praveenkjvs.develo...@gmail.com> wrote:
>
>
Thanks Mark, I do understand how Hive works with the DISTINCT keyword.
What I was looking for is a solution for my requirement in Hive; I am not
an expert in SQL, hence I'm looking for suggestions.
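One possible workaround, sketched here in case it fits the requirement (keeping one row per col1 via GROUP BY):

```sql
-- One row per col1, with an arbitrary (here: smallest) col2:
SELECT col1, min(col2) AS col2
FROM t1
GROUP BY col1;

-- Or keep every distinct col2 for each col1 as an array:
SELECT col1, collect_set(col2) AS col2_values
FROM t1
GROUP BY col1;
```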
On Fri, Nov 9, 2012 at 9:54 AM, Mark Grover wrote:
> Hi Praveen,
> Let's take an examp
Is there a better way to use Hive to sessionize my log data? I'm not
sure that my approach, below, is optimal:
The log data is stored in sequence files; a single log entry is a JSON
string; eg:
{"source": {"api_key": "app_key_1", "user_id": "user0"}, "events":
[{"timestamp": 1330988326,
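A sketch of one approach, assuming the raw JSON line sits in a single string column (the table and column names below are made up): scalar fields can be pulled out with get_json_object, though exploding the events array generally needs a JSON SerDe or a UDF.

```sql
-- Extract scalar fields from the raw JSON line:
SELECT get_json_object(line, '$.source.api_key') AS api_key,
       get_json_object(line, '$.source.user_id') AS user_id
FROM raw_logs;
```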
nning below query,
$ ./hive_tri.sh temp_table real_table 2011-12-10 2011-12-23
But after running the above command, I get this error:
FAILED: Parse Error: line 2:31 cannot recognize input ''TABLE_NAME1'' in
join source
Please help me figure out where it is going wrong.
Thanks
Praveen
On
Hello,
I want to run Hive queries from an .hql file at the command prompt, where I
need to pass start_date, end_date, and table_name as command-line parameters.
How can I achieve this?
Please help
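For reference, one common approach is Hive's variable substitution (file and variable names below are illustrative); note that quoting a value both in the script and in the substitution can produce parse errors like the ''TABLE_NAME1'' one above.

```sql
-- run_report.hql (hypothetical file name):
SELECT *
FROM ${hivevar:table_name}
WHERE dt BETWEEN '${hivevar:start_date}' AND '${hivevar:end_date}';

-- Invoked from the shell as:
--   hive --hivevar table_name=real_table \
--        --hivevar start_date=2011-12-10 \
--        --hivevar end_date=2011-12-23 \
--        -f run_report.hql
```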
--
Regards,
Praveen
Hi All,
I am loading data into Hive from a script and I get the following warning:
WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use
org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
Hive history file=/tmp/praveen
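The warning itself points at the fix: in hive-log4j.properties (and any other log4j.properties files in play), the EventCounter appender line would be updated to the new class, e.g.:

```properties
# Replace the deprecated org.apache.hadoop.metrics.jvm.EventCounter:
log4j.appender.EventCounter=org.apache.hadoop.log.metrics.EventCounter
```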
Gotcha, thanks !
pk
Can anyone point me to an example (including queries and code) of using Hive to
sessionize data by time (e.g. a web app clickstream log, where sessions are
split where time gaps occur)?
Thanks,
pk
Sent from my iPhone
internals ?
Thanks,
pk
Sent from my iPhone
On Jun 10, 2011, at 11:18 PM, Tim Spence wrote:
> Praveen,
> This would be best accomplished with a UDF because Hive does not support
> cursors.
> Best of luck,
> Tim
>
>
>
>
> On Fri, Jun 10, 2011 at 10:29 PM,
If I have table timestamps:
hive> desc timestamps;
OK
ts bigint
hive> select ts from timestamps order by ts
OK
1
2
3
4
5
6
7
8
9
10
30
32
34
36
38
40
42
44
46
48
50
70
74
78
100
105
110
115
and I want to make groups of the values where splits between groups
occur where two time-consecu
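On Hive versions with windowing support (0.11+), one sketch of gap-based grouping: flag a row as a group start when the gap from the previous ts exceeds a threshold (10 below, chosen to match the sample data), then take a running sum of the flags as the group id.

```sql
SELECT ts,
       SUM(is_start) OVER (ORDER BY ts
                           ROWS UNBOUNDED PRECEDING) AS group_id
FROM (
  SELECT ts,
         CASE WHEN LAG(ts) OVER (ORDER BY ts) IS NULL
                OR ts - LAG(ts) OVER (ORDER BY ts) > 10
              THEN 1 ELSE 0 END AS is_start
  FROM timestamps
) gaps;
```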
I have a log table with a single column, where each row contains a JSON
string in the following format; here are two log entries:
{
'foo0': { 'bar0': 'A',
'bar1': 'B'}
'foo1': [ { 'params': { 'key0': 'valX', 'key1' : 'val1'},
'time': 'time0'},