Hi,

Can you try building the system with the ALPHA_MOESI_hammer
configuration? It may fix the problem.
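
If it helps, here is a minimal sketch of the build step, assuming the usual
scons target naming (<ISA>_<PROTOCOL>) and a parallel build; the rest of your
command line can stay the same, just pointed at the new binary:

scons build/ALPHA_MOESI_hammer/gem5.opt -j4
build/ALPHA_MOESI_hammer/gem5.opt configs/example/ruby_fs.py --topology=Mesh --garnet-network=fixed ... (same options as in your run below)

The build directory name selects which Ruby protocol gets compiled into gem5,
so ruby_fs.py itself does not need to change.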

Best,


On Thu, Jun 27, 2013 at 1:07 PM, amina belhaj messaoud <
amina.belhajmassa...@gmail.com> wrote:

> Hello,
> I am running this command:
> build/ALPHA/gem5.opt  configs/example/ruby_fs.py   --topology=Mesh
> --garnet-network=fixed
> --kernel=/home/amina/Gem5/FSfILES/m5_system_2.0b3/binaries/vmlinux
> --disk-image=/home/amina/Gem5/FSfILES/m5_system_2.0b3/disks/linux-latest.img
> --script=configs/boot/hello.rcS  --num-cpu=2 --num-dirs=2 --cpu-type=timing
> --mesh-rows=2 --caches --l2cache --num-l2cache=2
> --checkpoint-dir=/home/amina/gemo/checks
>
> hello.rcS contains:
>
> /sbin/m5 checkpoint
> echo hello
> ls
>
> echo hello again
>
> ls
> /sbin/m5 exit
> But after the checkpoint is written, the system aborts with the message:
> panic: possible deadlock detected
>
> When I try to restore from the checkpoint, I run:
>  build/ALPHA/gem5.opt configs/example/ruby_fs.py --topology=Mesh
> --garnet-network=fixed
> --kernel=/home/amina/Gem5/FSfILES/m5_system_2.0b3/binaries/vmlinux
> --disk-image=/home/amina/Gem5/FSfILES/m5_system_2.0b3/disks/linux-latest.img
> --script=configs/boot/hello.rcS --num-cpu=2 --num-dirs=2 --mesh-rows=2
> --caches --l2cache --num-l2cache=2 --checkpoint-dir=/home/amina/gemo/checks
> -r 1
>
> I get this:
>  panic: RubyPort::M5Port::recvAtomic() not implemented!
>  @ cycle 2438047479500
> [recvAtomic:build/ALPHA/mem/ruby/system/RubyPort.cc, line 139]
> Memory Usage: 703072 KBytes
> Program aborted at cycle 2438047479500
> Aborted
>
> PLEASE HELP!!
>
> amina
>
>
>



-- 
Erfan Azarkhish
Micrel Lab - Viale Carlo Pepoli 3/2 - 40123, Bologna
DEIS - University of Bologna, Italy
_______________________________________________
gem5-users mailing list
gem5-users@gem5.org
http://m5sim.org/cgi-bin/mailman/listinfo/gem5-users
