Using V8 v13.8.258.18, I'm consistently hitting the following DCHECK failure
during heap garbage collection:
```
#
# Fatal error in ../../../v8/src/heap/memory-reducer.cc, line 229
# Debug check failed: 0 < delay_ms (0 vs. 0).
#
```
This is the stack trace at the time of the failure:
```
* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BREAKPOINT
(code=1, subcode=0x1076866cc)
* frame #0: 0x00000001076866cc libjs.dylib`v8::base::OS::Abort()
[inlined] v8::base::OS::Abort()::$_0::operator()(this=<unavailable>) const
at platform-posix.cc:729:7 [opt]
frame #1: 0x00000001076866cc libjs.dylib`v8::base::OS::Abort() at
platform-posix.cc:729:7 [opt]
frame #2: 0x00000001076713c0 libjs.dylib`V8_Fatal(file=<unavailable>,
line=<unavailable>, format=<unavailable>) at logging.cc:215:3 [opt]
frame #3: 0x0000000107670cb8 libjs.dylib`v8::base::(anonymous
namespace)::DefaultDcheckHandler(file=<unavailable>, line=<unavailable>,
message=<unavailable>) at logging.cc:59:3 [opt]
frame #4: 0x00000001063feff0
libjs.dylib`v8::internal::MemoryReducer::ScheduleTimer(this=0x00006070000014b0,
delay_ms=0) at memory-reducer.cc:229:3 [opt]
frame #5: 0x00000001063ff1a4
libjs.dylib`v8::internal::MemoryReducer::NotifyMarkCompact(this=0x00006070000014b0,
committed_memory_before=<unavailable>) at memory-reducer.cc:123:5 [opt]
frame #6: 0x0000000106337084
libjs.dylib`v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace,
v8::internal::GarbageCollectionReason,
v8::GCCallbackFlags)::$_0::operator()(this=0x000000016fdfe318) const at
heap.cc:1714:26 [opt]
frame #7: 0x0000000106336ab0 libjs.dylib`void
heap::base::Stack::SetMarkerAndCallbackImpl<v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace,
v8::internal::GarbageCollectionReason,
v8::GCCallbackFlags)::$_0>(stack=0x000061d000001ba0,
argument=<unavailable>, stack_end=<unavailable>) at stack.h:185:5 [opt]
frame #8: 0x000000010781febc
libjs.dylib`PushAllRegistersAndIterateStack + 40
frame #9: 0x0000000106311e88
libjs.dylib`v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace,
v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [inlined] void
heap::base::Stack::SetMarkerIfNeededAndCallback<v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace,
v8::internal::GarbageCollectionReason,
v8::GCCallbackFlags)::$_0>(this=<unavailable>, callback=(unnamed class) @
0x000000016fdfe318) at stack.h:81:7 [opt]
frame #10: 0x0000000106311e80
libjs.dylib`v8::internal::Heap::CollectGarbage(this=0x0000633000010938,
space=<unavailable>, gc_reason=kFinalizeMarkingViaTask,
gc_callback_flags=<unavailable>) at heap.cc:1655:11 [opt]
frame #11: 0x000000010631f51c
libjs.dylib`v8::internal::Heap::FinalizeIncrementalMarkingAtomically(v8::internal::GarbageCollectionReason)
[inlined] v8::internal::Heap::CollectAllGarbage(this=0x0000633000010938,
gc_flags=<unavailable>, gc_reason=kFinalizeMarkingViaTask,
gc_callback_flags=<unavailable>) at heap.cc:1266:3 [opt]
frame #12: 0x000000010631f50c
libjs.dylib`v8::internal::Heap::FinalizeIncrementalMarkingAtomically(this=0x0000633000010938,
gc_reason=kFinalizeMarkingViaTask) at heap.cc:3995:3 [opt]
frame #13: 0x000000010633bf64
libjs.dylib`v8::internal::IncrementalMarkingJob::Task::RunInternal(this=0x0000606000016760)
at incremental-marking-job.cc:135:34 [opt]
```
I have a test case
at https://github.com/holepunchto/libjs/blob/main/test/threads-platform-loop.c
that reproduces the failure reliably. It runs an isolate on the main thread
that performs a series of large allocations to trigger garbage collection,
alongside a separate thread that runs the background tasks posted to the
embedder's platform implementation.
The failure doesn't seem to cause any observable misbehavior in production
builds, where the corresponding DCHECK is disabled, but I still can't help
worrying that the embedder platform implementation is violating an invariant
somewhere.
--
v8-users mailing list
[email protected]
http://groups.google.com/group/v8-users