Hi,
Your code looks OK to me. The only difference from what I do is that I
explicitly pass an HDFS path to bulkSave; I am not sure how "/bulk" is resolved.
I am still very much a beginner with Spark, HBase, Phoenix, etc., but if you'd
like to use this code I could try to investigate your problem. I would need the
full stack trace, though.
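For example (reusing loader and tableName from the snippet quoted below), a call with a fully qualified HDFS URI could look roughly like this; the namenode host, port, and output directory are placeholders, not a verified setup:

// Hypothetical: pass an explicit HDFS URI instead of the bare "/bulk" path,
// so the HFile output location does not depend on the default filesystem.
loader.bulkSave(tableName, "hdfs://namenode-host:8020/tmp/phoenix-bulk", None)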
On 11.06.2015 00:53, Yiannis Gkoufas wrote:
Hi Dawid,
yes, I have been using your code. Probably I am invoking the classes in
the wrong way.
val data = readings.map(e => e.split(","))
  .map(e => (e(0), e(1).toLong, e(2).toDouble, e(3).toDouble))
val tableName = "TABLE"
val columns = Seq("SMID", "DT", "US", "GEN")
val zkUrl = Some("localhost:2181")
val functions = new ExtendedProductRDDFunctions(data)
val hfiles = functions.toHFile(tableName, columns, new Configuration, zkUrl)
val loader = new BulkPhoenixLoader(hfiles)
loader.bulkSave(tableName, "/bulk", None)
Does the above seem the correct way to you?
Thanks a lot!
On 10 June 2015 at 19:13, Dawid <wysakowicz.da...@gmail.com> wrote:
Thanks a lot, James. That was the case.
On 10.06.2015 19:50, James Taylor wrote:
Dawid,
It might be timestamp related. Check the timestamp of the rows/cells you
imported from the HBase shell. Are the timestamps later than the server
timestamp? In that case, you wouldn't see that data. If this is the case,
you can try specifying the CURRENT_SCN property at connection time with a
timestamp later than the timestamp of the rows/cells to verify.
Thanks,
James
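A minimal sketch of doing that from Scala over JDBC; the ZooKeeper quorum in the URL and the chosen timestamp are placeholders, and the only thread-specific part is the idea of setting CURRENT_SCN (exposed as the CurrentSCN connection property):

import java.sql.DriverManager
import java.util.Properties

// Open a Phoenix connection with CURRENT_SCN set to a point in time later than
// the timestamps of the bulk-loaded cells, so those cells become visible to queries.
val props = new Properties()
props.setProperty("CurrentSCN", (System.currentTimeMillis() + 60000L).toString)
val conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181", props)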
On Wed, Jun 10, 2015 at 10:14 AM, Dawid <wysakowicz.da...@gmail.com> wrote:
Yes, that's right. I have generated HFiles that I managed to load so that
they are visible in HBase, but I can't make them 'visible' to Phoenix.
What I noticed today: when I have both rows loaded from the generated HFiles
and rows upserted through sqlline, and I run 'DELETE FROM TABLE', only the
upserted ones disappear. The rows loaded from HFiles still persist in HBase.
Yiannis, how do you generate the HFiles? You can see my code here:
https://gist.github.com/dawidwys/3aba8ba618140756da7c
On 10.06.2015 17:57, Yiannis Gkoufas wrote:
Hi Dawid,
I am trying to do the same thing, but I hit a wall while writing the HFiles,
getting the following error:
java.io.IOException: Added a key not lexically larger than previous
key=\x00\x168675230967GMP\x00\x00\x00\x01=\xF4h)\xE0\x010GEN\x00\x00\x01M\xDE.\xB4T\x04,
lastkey=\x00\x168675230967GMP\x00\x00\x00\x01=\xF5\x0C\xF5`\x010_0\x00\x00\x01M\xDE.\xB4T\x04
You have reached the point where you are generating the HFiles and loading
them, but you don't see any rows in the table? Is that correct?
Thanks
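For context on that error: the HFile writer requires cells in total key order, so it usually means the KeyValues are not sorted before being written. A rough sketch of sorting them first, assuming an RDD of (ImmutableBytesWritable, KeyValue) pairs; the helper name and the byte-level ordering are illustrative, not taken from the code in this thread:

import org.apache.hadoop.hbase.KeyValue
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.rdd.RDD

// Sort cells by their serialized key (row, family, qualifier, timestamp, type)
// so the HFile writer never sees a key smaller than the previous one.
// A production fix would follow KeyValue.COMPARATOR semantics exactly
// (e.g. timestamps sort descending there), but the idea is the same.
def sortCells(cells: RDD[(ImmutableBytesWritable, KeyValue)]): RDD[(ImmutableBytesWritable, KeyValue)] = {
  implicit val kvOrdering: Ordering[KeyValue] = new Ordering[KeyValue] {
    def compare(a: KeyValue, b: KeyValue): Int = Bytes.compareTo(a.getKey, b.getKey)
  }
  cells.sortBy(_._2)
}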
On 8 June 2015 at 18:09, Dawid <wysakowicz.da...@gmail.com> wrote:
Yes, I did. I also tried to execute some upserts using sqlline after
importing the HFiles; the rows from the upserts are visible both in sqlline
and the HBase shell, but the rows imported from the HFiles show up only in
the HBase shell.
On 08.06.2015 19:06, James Taylor wrote:
Dawid,
Perhaps a dumb question, but did you execute a CREATE TABLE statement in
sqlline for the tables you're importing into? Phoenix needs to be told the
schema of the table (i.e. it's not enough to just create the table in HBase).
Thanks,
James
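A minimal sketch of such a statement, issued over JDBC to stay in Scala; the column names match the Seq("SMID", "DT", "US", "GEN") from the snippet at the top of the thread, while the column types and the composite primary key are assumptions inferred from the toLong/toDouble mapping there:

import java.sql.DriverManager

// Tell Phoenix the schema of the table that the HFiles are bulk-loaded into.
// Types and the primary key are guesses based on the RDD mapping, not confirmed by the thread.
val conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181")
conn.createStatement().execute(
  """CREATE TABLE IF NOT EXISTS "TABLE" (
    |  SMID VARCHAR NOT NULL,
    |  DT   BIGINT  NOT NULL,
    |  US   DOUBLE,
    |  GEN  DOUBLE,
    |  CONSTRAINT PK PRIMARY KEY (SMID, DT)
    |)""".stripMargin)
conn.close()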
On Mon, Jun 8, 2015 at 10:02 AM, Dawid <wysakowicz.da...@gmail.com> wrote:
Any suggestions? Any clues as to what to check?
On 05.06.2015 23:21, Dawid wrote:
Yes, I can see it in the HBase shell.
Sorry for the bad links; I hadn't used private repositories on GitHub before.
So I moved the files to a gist:
https://gist.github.com/dawidwys/3aba8ba618140756da7c
Hope it works this time.
On 05.06.2015 23:09, Ravi Kiran wrote:
Hi Dawid,
Do you see the data when you run a simple scan or count of the table in the
HBase shell?
FYI, the links lead me to a 404: File not found.
Regards
Ravi
On Fri, Jun 5, 2015 at 1:17 PM, Dawid <wysakowicz.da...@gmail.com> wrote:
Hi,
I was trying to code some utilities to bulk load data through HFiles from
Spark RDDs, following the pattern of CSVBulkLoadTool. I managed to generate
some HFiles and load them into HBase, but I can't see the rows using
sqlline. I would be more than grateful for any suggestions.
The classes can be accessed at:
https://github.com/dawidwys/gate/blob/master/src/main/scala/pl/edu/pw/elka/phoenix/BulkPhoenixLoader.scala
https://github.com/dawidwys/gate/blob/master/src/main/scala/pl/edu/pw/elka/phoenix/ExtendedProductRDDFunctions.scala
Thanks in advance
Dawid Wysakowicz
--
Regards,
Dawid