I upgraded to Mahout 0.9, but the same error persists. Here is the full dump.
Incidentally, I am using the local file system, not Hadoop.
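
To double-check what my input actually contains, I can print the key and
value classes of the sequence file with a small utility along these lines
(a rough sketch against the plain Hadoop SequenceFile API; the part file
is the one named in the input split in the log below):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;

// Prints the key/value classes of the --input sequence file, to see
// whether the keys really are Text rather than IntWritable.
public class SeqFileTypeCheck {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Path part = new Path("/user/ashokharnal/seqfiles/part-m-00000");
    SequenceFile.Reader reader =
        new SequenceFile.Reader(FileSystem.get(conf), part, conf);
    try {
      System.out.println("key class:   " + reader.getKeyClassName());
      System.out.println("value class: " + reader.getValueClassName());
    } finally {
      reader.close();
    }
  }
}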


[ashokharnal@master ~]$ mahout recommendfactorized \
    --input /user/ashokharnal/seqfiles \
    --userFeatures $res_out_file/U/ \
    --itemFeatures $res_out_file/M/ \
    --numRecommendations 1 \
    --output /tmp/reommendation \
    --maxRating 1

MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
Running on hadoop, using /opt/cloudera/parcels/CDH/lib/hadoop-0.20-mapreduce/bin/hadoop and HADOOP_CONF_DIR=/etc/hadoop/conf
MAHOUT-JOB: /opt/cloudera/parcels/CDH-5.2.0-1.cdh5.2.0.p0.36/lib/mahout/mahout-examples-0.9-cdh5.2.0-job.jar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.2.0-1.cdh5.2.0.p0.36/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.0.0-1.cdh5.0.0.p0.47/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
14/11/23 17:51:35 WARN driver.MahoutDriver: No recommendfactorized.props found on classpath, will use command-line arguments only
14/11/23 17:51:35 INFO common.AbstractJob: Command line arguments: {--endPhase=[2147483647], --input=[/user/ashokharnal/seqfiles], --itemFeatures=[/user/ashokharnal/uexp.out/M/], --maxRating=[1], --numRecommendations=[1], --numThreads=[1], --output=[/tmp/reommendation], --startPhase=[0], --tempDir=[temp], --userFeatures=[/user/ashokharnal/uexp.out/U/]}
14/11/23 17:51:36 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
14/11/23 17:51:36 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
14/11/23 17:51:36 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14/11/23 17:51:36 INFO input.FileInputFormat: Total input paths to process : 1
14/11/23 17:51:37 INFO mapred.LocalJobRunner: OutputCommitter set in config null
14/11/23 17:51:37 INFO mapred.JobClient: Running job: job_local1520101691_0001
14/11/23 17:51:37 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
14/11/23 17:51:37 INFO mapred.LocalJobRunner: Waiting for map tasks
14/11/23 17:51:37 INFO mapred.LocalJobRunner: Starting task: attempt_local1520101691_0001_m_000000_0
14/11/23 17:51:37 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
14/11/23 17:51:37 INFO util.ProcessTree: setsid exited with exit code 0
14/11/23 17:51:37 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@3f7b4c84
14/11/23 17:51:37 INFO mapred.MapTask: Processing split: hdfs://master:8020/user/ashokharnal/seqfiles/part-m-00000:0+194
14/11/23 17:51:37 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
14/11/23 17:51:37 INFO compress.CodecPool: Got brand-new decompressor [.deflate]
14/11/23 17:51:37 INFO mapred.LocalJobRunner: Map task executor complete.
14/11/23 17:51:37 WARN mapred.LocalJobRunner: job_local1520101691_0001
java.lang.Exception: java.lang.RuntimeException: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:406)
Caused by: java.lang.RuntimeException: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable
    at org.apache.hadoop.mapreduce.lib.map.MultithreadedMapper.run(MultithreadedMapper.java:151)
    at org.apache.mahout.cf.taste.hadoop.als.MultithreadedSharingMapper.run(MultithreadedSharingMapper.java:60)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:268)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable
    at org.apache.mahout.cf.taste.hadoop.als.PredictionMapper.map(PredictionMapper.java:44)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
    at org.apache.hadoop.mapreduce.lib.map.MultithreadedMapper$MapRunner.run(MultithreadedMapper.java:268)
14/11/23 17:51:38 INFO mapred.JobClient:  map 0% reduce 0%
14/11/23 17:51:38 INFO mapred.JobClient: Job complete: job_local1520101691_0001
14/11/23 17:51:38 INFO mapred.JobClient: Counters: 0
14/11/23 17:51:38 INFO driver.MahoutDriver: Program took 2563 ms (Minutes: 0.04271666666666667)
14/11/23 17:51:38 ERROR hdfs.DFSClient: Failed to close inode 24596
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException): No lease on /tmp/reommendation/_temporary/_attempt_local1520101691_0001_m_000000_0/part-m-00000 (inode 24596): File does not exist. Holder DFSClient_NONMAPREDUCE_253037938_1 does not have any open files.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:3319)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFileInternal(FSNamesystem.java:3407)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.completeFile(FSNamesystem.java:3377)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.complete(NameNodeRpcServer.java:673)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.complete(AuthorizationProviderProxyClientProtocol.java:219)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.complete(ClientNamenodeProtocolServerSideTranslatorPB.java:520)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:587)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)

    at org.apache.hadoop.ipc.Client.call(Client.java:1411)
    at org.apache.hadoop.ipc.Client.call(Client.java:1364)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy16.complete(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.complete(ClientNamenodeProtocolTranslatorPB.java:435)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy17.complete(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:2180)
    at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:2164)
    at org.apache.hadoop.hdfs.DFSClient.closeAllFilesBeingWritten(DFSClient.java:908)
    at org.apache.hadoop.hdfs.DFSClient.close(DFSClient.java:926)
    at org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:861)
    at org.apache.hadoop.fs.FileSystem$Cache.closeAll(FileSystem.java:2687)
    at org.apache.hadoop.fs.FileSystem$Cache$ClientFinalizer.run(FileSystem.java:2704)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
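
From the stack trace, my reading is that PredictionMapper (PredictionMapper.java:44) expects the --input sequence file to be keyed by IntWritable user IDs, while my seqfiles apparently carry Text keys. If that is the problem, the input would have to be rewritten with typed keys, roughly like this (a hypothetical sketch, not my actual conversion code; the user ID, item ID and rating are made up, and the output path is a placeholder):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.mahout.math.RandomAccessSparseVector;
import org.apache.mahout.math.Vector;
import org.apache.mahout.math.VectorWritable;

// Hypothetical sketch: write one user row keyed by IntWritable (instead of
// Text), with the user's ratings as a VectorWritable indexed by item ID.
public class WriteTypedSeqFile {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    Path out = new Path("/user/ashokharnal/seqfiles-int/part-m-00000"); // placeholder
    SequenceFile.Writer writer = SequenceFile.createWriter(
        fs, conf, out, IntWritable.class, VectorWritable.class);
    try {
      Vector ratings = new RandomAccessSparseVector(Integer.MAX_VALUE);
      ratings.set(10, 1.0);  // made-up: item 10 rated 1.0
      writer.append(new IntWritable(1), new VectorWritable(ratings)); // made-up user 1
    } finally {
      writer.close();
    }
  }
}

Does recommendfactorized indeed require IntWritable keys on --input, or am I misreading the trace?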

On 23 November 2014 at 09:22, Andrew Musselman <[email protected]>
wrote:

> Please upgrade to Mahout version 0.9, as many things have been fixed since.
>
> > On Nov 22, 2014, at 7:00 PM, Ashok Harnal <[email protected]> wrote:
> >
> > I use mahout 0.7 installed in Cloudera. After creating user-feature and
> > item-feature matrix in hdfs, I run the following command:
> >
> > mahout recommendfactorized --input /user/ashokharnal/seqfiles
> > --userFeatures $res_out_file/U/ --itemFeatures $res_out_file/M/
> > --numRecommendations 1 --output $reommendation --maxRating 1
> >
> > After some time, I get the following error:
> >
> > :
> > :
> > 14/11/23 08:28:20 INFO mapred.LocalJobRunner: Map task executor complete.
> > 14/11/23 08:28:20 WARN mapred.LocalJobRunner: job_local954305987_0001
> > java.lang.Exception: java.lang.RuntimeException: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable
> >    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:406)
> > Caused by: java.lang.RuntimeException: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable
> >    at org.apache.hadoop.mapreduce.lib.map.MultithreadedMapper.run(MultithreadedMapper.java:151)
> >    at org.apache.mahout.cf.taste.hadoop.als.MultithreadedSharingMapper.run(MultithreadedSharingMapper.java:60)
> >    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
> >    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
> >    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:268)
> >    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> >    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> >    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >    at java.lang.Thread.run(Thread.java:744)
> > Caused by: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable
> >    at org.apache.mahout.cf.taste.hadoop.als.PredictionMapper.map(PredictionMapper.java:44)
> >    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
> >    at org.apache.hadoop.mapreduce.lib.map.MultithreadedMapper$MapRunner.run(MultithreadedMapper.java:268)
> >
> >
> > Not sure what is wrong.
> > Request help.
> >
> > Ashok Kumar Harnal
> >
> >
> >
> >
> > --
> > Visit my blog at: http://ashokharnal.wordpress.com/
>



-- 
Visit my blog at: http://ashokharnal.wordpress.com/
