Yes, I could not see the data in HDFS. I have appended new data, but I'm
still unable to see it in the HDFS /user directory.
Thanks,
Prabhu
On Thu, Sep 20, 2012 at 8:18 PM, Brock Noland wrote:
It actually looks like it's working. Are you sure no data is showing
up in hdfs and that new data is being appended to the file you are
tailing?
On Thu, Sep 20, 2012 at 3:48 AM, prabhu k wrote:
Hi Brock,
I have changed it as per your suggestion, but the issue is still the same:
the script seems stuck. I have pasted the flume.log file below.
flume.log
==
12/09/20 14:07:40 INFO lifecycle.LifecycleSupervisor: Starting lifecycle
supervisor 1
12/09/20 14:07:40 INFO node.FlumeNode: Flume node starting - agent1
12/0
This line
agent1.sinks.HDFS.hdfs.channel = MemoryChannel-2
should be
agent1.sinks.HDFS.channel = MemoryChannel-2
Brock
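Putting the two corrections from this thread together (the sink type key and the sink channel key), a minimal working agent config might look like the sketch below, assuming Flume NG 1.2.0. The source file path, NameNode address, and HDFS path are hypothetical placeholders, not values confirmed in the thread:

```properties
agent1.sources = tail
agent1.channels = MemoryChannel-2
agent1.sinks = HDFS

# Exec source tailing a local file (placeholder path)
agent1.sources.tail.type = exec
agent1.sources.tail.command = tail -F /path/to/flume_test.txt
agent1.sources.tail.channels = MemoryChannel-2

agent1.channels.MemoryChannel-2.type = memory

# Sink: "type" and "channel" attach directly to the sink name,
# not under the "hdfs." prefix
agent1.sinks.HDFS.type = hdfs
agent1.sinks.HDFS.channel = MemoryChannel-2
# HDFS-specific settings do take the "hdfs." prefix (placeholder path)
agent1.sinks.HDFS.hdfs.path = hdfs://namenode:9000/user/flume/
agent1.sinks.HDFS.hdfs.fileType = DataStream
```

Note that sources bind with the plural `channels` key while sinks bind with the singular `channel`, which is easy to mix up.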
On Tue, Sep 18, 2012 at 6:27 AM, prabhu k wrote:
Hi,
Please find the following flume.conf & flume.log files.
I have marked the lines in question in red colour below. Is there any issue
with them?
flume.conf
=
agent1.sources = tail
agent1.channels = MemoryChannel-2
agent1.sinks = HDFS
agent1.sources.tail.type = exec
agent1.sources.tail.command = tail /usr/loc
Yes they should work together. Please send the updated conf and log file.
--
Brock Noland
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
On Tuesday, September 18, 2012 at 5:49 AM, prabhu k wrote:
I have tried both ways, but it is still not working.
Can you please confirm that Flume 1.2.0 supports Hadoop 1.0.3?
Thanks,
Prabhu.
On Tue, Sep 18, 2012 at 3:32 PM, Nitin Pawar wrote:
Can you write something to the file continuously after you start flume-ng?
If you do tail -f, it will start getting only the new entries.
Or you can just change the command in the config file from tail -f to
tail, so that each time it brings the default last 10 lines from the file.
~nitin
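Nitin's distinction between the two commands can be checked locally with a throwaway file; the path below is an assumption for illustration, not the one from the thread:

```shell
# Write 20 numbered lines to a scratch file (hypothetical path).
seq 1 20 > /tmp/flume_tail_demo.txt

# Plain "tail" prints the last 10 lines once and exits, so an exec
# source re-emits those same 10 lines every time it restarts.
tail /tmp/flume_tail_demo.txt

# "tail -f" also prints the last 10 lines, but then keeps running and
# emits only lines appended afterwards (commented out: it never exits).
# tail -f /tmp/flume_tail_demo.txt
```

This is why a file that never changes produces nothing new for the sink once the initial lines have been consumed.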
On Tue, Sep 18, 2012 at
Hi Nitin,
While executing flume-ng, I updated the flume_test.txt file, but data is
still not reaching the HDFS sink.
Thanks,
Prabhu.
On Tue, Sep 18, 2012 at 2:35 PM, Nitin Pawar wrote:
Hi Prabhu,
Are you sure there is continuous text being written to your file
flume_test.txt?
If nothing is written to that file, Flume will not write anything into HDFS.
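One way to guarantee the tailed file keeps changing while the agent runs is a small append loop. The file path here is a placeholder, not necessarily the one used in the thread; for a real soak test you would make the loop infinite:

```shell
# Append five timestamped lines, one per second, to the tailed file
# (hypothetical path), so the exec source has fresh events to pick up.
for i in 1 2 3 4 5; do
  echo "test event $i at $(date '+%H:%M:%S')" >> /tmp/flume_test.txt
  sleep 1
done
```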
On Tue, Sep 18, 2012 at 2:31 PM, prabhu k wrote:
Hi Brock,
Thanks for the reply.
As per your suggestion, I have modified it, but it is still the same issue.
My Hadoop version is 1.0.3 and my Flume version is 1.2.0. Please let us
know whether these versions are incompatible.
On Mon, Sep 17, 2012 at 8:01 PM, Brock Noland wrote:
Hi,
I believe this line:
agent1.sinks.HDFS.hdfs.type = hdfs
should be:
agent1.sinks.HDFS.type = hdfs
Brock
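The pattern behind this fix, assuming Flume NG 1.x, is that component-level keys such as `type` and `channel` attach directly to the sink name, while only the HDFS sink's own settings take the `hdfs.` prefix. A sketch (the path is a hypothetical placeholder):

```properties
# Component-level keys: no "hdfs." prefix
agent1.sinks.HDFS.type = hdfs
agent1.sinks.HDFS.channel = MemoryChannel-2

# HDFS-sink-specific keys: "hdfs." prefix (placeholder path)
agent1.sinks.HDFS.hdfs.path = hdfs://namenode:9000/user/flume/
```

Misplacing a component-level key under the `hdfs.` prefix tends to fail silently, since the unrecognized key is simply ignored.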
On Mon, Sep 17, 2012 at 5:17 AM, prabhu k wrote:
> Hi Users,
>
> I have followed the link below to sink a sample text file into HDFS using
> the tail source.
>
> http://cloudfront.blogspot.in/2