Hey, I think I found the problem. It turns out the file I saved is too
large.
On 21 May 2015 at 16:44, Akhil Das wrote:
> Can you share the code? Maybe I or someone else can help you out.
>
> Thanks
> Best Regards
>
> On Thu, May 21, 2015 at 1:45 PM, Allan Jie wrote:
>
>> Hi,
>>
>> Just checked the logs of the datanode, it looks like this: […]
Sure, the code is very simple. I think you can understand it from the main
function.
public class Test1 {
    public static double[][] createBroadcastPoints(String localPointPath, int row, int col) throws IOException {
        BufferedReader br = RAWF.reader(localPointPath);
        String line = null;
        int rowIndex = 0;
        // the message was cut off after "int rowIndex ="; one plausible completion:
        double[][] points = new double[row][col];
        while ((line = br.readLine()) != null && rowIndex < row) {
            String[] tokens = line.trim().split("\\s+");
            for (int j = 0; j < col; j++)
                points[rowIndex][j] = Double.parseDouble(tokens[j]);
            rowIndex++;
        }
        br.close();
        return points;
    }
}
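For anyone trying to run that snippet standalone: the RAWF helper is not shown anywhere in the thread, so here is a minimal, self-contained sketch of the same row-parsing step using plain JDK I/O. ParseSketch and parseRow are hypothetical names introduced only for this illustration.

```java
import java.util.Arrays;

public class ParseSketch {
    // Hypothetical helper: split one whitespace-separated line of the
    // points file into a row of doubles.
    static double[] parseRow(String line, int col) {
        String[] tokens = line.trim().split("\\s+");
        double[] row = new double[col];
        for (int j = 0; j < col; j++)
            row[j] = Double.parseDouble(tokens[j]);
        return row;
    }

    public static void main(String[] args) {
        double[] row = parseRow("1.0 2.5 3.0", 3);
        System.out.println(Arrays.toString(row)); // prints [1.0, 2.5, 3.0]
    }
}
```

The same loop applied line by line, with a `BufferedReader` over the local file, reproduces the matrix-loading logic in `createBroadcastPoints`.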
Hi,
Just checked the logs of the datanode, it looks like this:
2015-05-20 11:42:14,605 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /
10.9.0.48:50676, dest: /10.9.0.17:50010, bytes: 134217728, op: HDFS_WRITE,
cliID: DFSClient_NONMAPREDUCE_804680172_54, offset: 0, srvID:
39fb78
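A side note on that log line: 134217728 bytes is exactly 128 MB, which is likely the default HDFS block size here, so each HDFS_WRITE entry corresponds to one full block. A quick sketch of the arithmetic for the roughly 468 MB variable mentioned earlier in the thread (the class name is made up for the example):

```java
public class BlockMath {
    public static void main(String[] args) {
        long blockSize = 134217728L;          // bytes per HDFS_WRITE in the log = 128 MB
        long fileSize = 468L * 1024 * 1024;   // the ~468 MB broadcast variable
        long fullBlocks = fileSize / blockSize;                     // 3 full blocks
        long totalBlocks = (fileSize + blockSize - 1) / blockSize;  // plus a partial 4th
        System.out.println(fullBlocks + " full, " + totalBlocks + " total");
        // prints 3 full, 4 total
    }
}
```

So a clean write of that file should show several of these clienttrace lines per replica, which is consistent with the log above rather than pointing at corruption by itself.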
This looks more like an issue with your HDFS setup; can you check the
datanode logs? Also, try putting a new file into HDFS and see whether that works.
Thanks
Best Regards
On Wed, May 20, 2015 at 11:47 AM, allanjie wrote:
> Hi All,
> The variable I need to broadcast is just 468 MB.
>
>
> When broadcas