On Wed, May 13, 2020 at 5:06 PM Patrick Baldwin <pbald...@myersinfosys.com>
wrote:

> On Wed, May 13, 2020 at 1:31 PM Coty Sutherland <csuth...@redhat.com>
> wrote:
>
> > Hi,
> >
> > Please see responses in line below. I'm top posting a bit because the
> > thread got off in the weeds about permissions it seems, which are
> important
> > but not exactly relevant to your problem IMO.
> >
> >
> Indeed, thank you.
>
>
> > On Tue, May 12, 2020 at 11:28 AM Patrick Baldwin <
> > pbald...@myersinfosys.com>
> > wrote:
> >
> > > I've been passed an odd (to me, anyway) issue with one of our client's
> > > CentOS systems.
> > >
> > > When our webapp starts running, tomcat dies shortly thereafter with an
> > > OutOfMemoryError. This apparently just started a few days ago.
> > >
> >
> > The issue isn't really odd. The JVM is telling you that something is
> > preventing the garbage collector from being effective, which is
> > exhausting your heap space. See
> >
> https://docs.oracle.com/javase/8/docs/technotes/guides/vm/gctuning/parallel.html#parallel_collector_excessive_gc
> > for more information about the particular OOME you're experiencing.
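> >
> > If it helps, here's a minimal sketch of the Java 8 GC logging flags I'd
> > start with (the log path is just an example; adjust it to your layout):
> >
> > JAVA_OPTS="$JAVA_OPTS -verbose:gc -XX:+PrintGCDetails \
> >   -XX:+PrintGCTimeStamps -Xloggc:/var/log/tomcat/gc.log"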
> >
> >
> Reading that, I'm not quite sure whether this error would happen if tomcat
> were honoring the memory restrictions that have (hopefully?) been set in
> config. One of our devs thinks the issue is tomcat not honoring memory
> restrictions, so I'm trying to see if there's any way I can verify that it
> is.
>
> I'm also trying to figure out if this could be an issue with the Java code,
> and not tomcat config per se.
>
>
> >
> > > System info:
> > >
> > > Tomcat Version: Apache Tomcat/7.0.76
> > >
> > > JVM version: 1.8.0_191-b12
> > >
> > > OS: CentOS Linux release 7.6.1810 (Core)
> > >
> > >
> > > This seemed to indicate that catalina.sh isn't the place for
> > > environment variables on Tomcat 7 for Linux:
> > >
> > > https://forums.centos.org/viewtopic.php?t=54207
> > >
> > >
> > > Since there isn't a setenv.sh in /usr/local/tomcat/bin, we created one:
> > >
> > >
> >
> https://stackoverflow.com/questions/9480210/tomcat-7-setenv-sh-is-not-found
> > >
> > > 195$ ls -l /usr/local/tomcat/bin/setenv.sh
> > >
> > > -rwxrwxrwx. 1 root tomcat 110 May 11 12:56
> > /usr/local/tomcat/bin/setenv.sh
> > >
> > > 45$ cat /usr/local/tomcat/bin/setenv.sh
> > >
> >
> > Assuming you've installed tomcat using yum, the startup doesn't use
> > startup.sh at all, so the setenv.sh script is ignored. Instead you want
> > to put your settings into /etc/tomcat/tomcat.conf, which is sourced by
> > the systemd service unit. If you want to learn more about how that
> > works, check out the unit file to see which scripts it calls
> > (/usr/libexec/tomcat/server -> /usr/libexec/tomcat/preamble ->
> > /usr/libexec/tomcat/functions).
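> >
> > For example, to trace that yourself (assuming the service unit is named
> > "tomcat"):
> >
> > systemctl cat tomcat              # show the unit file and any drop-ins
> > less /usr/libexec/tomcat/server   # the script the unit actually runs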
> >
> >
> >
> To /etc/tomcat/tomcat.conf I added:
>
> # You can pass some parameters to java here if you wish to
> #JAVA_OPTS="-Xminf0.1 -Xmaxf0.3"
>
> JAVA_OPTS="-Xmx2048m -XX:MaxPermSize=2048m"
>
> And now sudo journalctl -u tomcat -f shows:
>
> May 13 15:50:01 protrack server[24306]: OpenJDK 64-Bit Server VM warning:
> ignoring option MaxPermSize=2048m; support was removed in 8.0
>
> ...
>
> May 13 15:50:01 protrack server[24306]: INFO: Command line argument:
> -Xmx2048m
>
> May 13 15:50:01 protrack server[24306]: May 13, 2020 3:50:01 PM
> org.apache.catalina.startup.VersionLoggerListener log
>
> May 13 15:50:01 protrack server[24306]: INFO: Command line argument:
> -XX:MaxPermSize=2048m
>
> May 13 15:50:01 protrack server[24306]: May 13, 2020 3:50:01 PM
> org.apache.catalina.startup.VersionLoggerListener log
>
> May 13 15:50:01 protrack server[24306]: INFO: Command line argument:
> -Xms2048m
>
> May 13 15:50:01 protrack server[24306]: May 13, 2020 3:50:01 PM
> org.apache.catalina.startup.VersionLoggerListener log
>
> May 13 15:50:01 protrack server[24306]: INFO: Command line argument:
> -Xmx2048m
>
> ...
>
> May 13 15:51:23 protrack server[24306]: SEVERE: Unexpected death of
> background thread ContainerBackgroundProcessor[StandardEngine[Catalina]]
>
> May 13 15:51:23 protrack server[24306]: java.lang.OutOfMemoryError: GC
> overhead limit exceeded
>
> May 13 15:51:23 protrack server[24306]: Exception in thread
> "ContainerBackgroundProcessor[StandardEngine[Catalina]]"
> java.lang.OutOfMemoryError: GC overhead limit exceeded
>

From this you can see that the options are being added (Xmx is there twice,
but that's no problem since the value is the same), and then you hit an OOME
within 1.5 minutes. Have you tried increasing the heap size to see if it
helps at all?
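
For instance, a minimal sketch for /etc/tomcat/tomcat.conf (the sizes are
just a guess; tune them to your app and available RAM, and drop MaxPermSize
entirely since it's gone in Java 8):

JAVA_OPTS="-Xms4096m -Xmx4096m"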


> So, it is now definitely picking up that memory setting, but it seems to
> be ignoring MaxPermSize because support for it was removed in Java 8.
>
> I’ve also found this:
>
>
> https://stackoverflow.com/questions/22634644/java-hotspottm-64-bit-server-vm-warning-ignoring-option-maxpermsize
>
> Specifically, “I think this was downvoted because it implies that you
> should switch previous uses of MaxPermGen with MaxMetaspaceSize, which is
> misleading, since their roles have practically reversed. Before Java 8,
> class metadata space resided in PermGen, which was limited by 32/64MB, and
> MaxPermGen was used to increase it. Starting from Java 8 however, PermGen
> is no more and class metadata space is unlimited, so MaxMetaspaceSize is
> actually used to decrease it.”
>
> So that seems to suggest Java is not running out of heap memory and making
> tomcat die, but that it really is just spending a lot of time on garbage
> collection?
>

You get an OOME because you're effectively out of memory. You don't have
enough for the application running in the JVM to function because the
collector is constantly running. Try a heap increase to see if that helps.
Does the OOME occur without any traffic to the instance (so it just fails
after you start it with no user interaction)?
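
If you want to dig in manually, this is a quick way to grab a heap dump
from the running JVM (assuming the JDK tools are installed; replace <pid>
with Tomcat's process id):

jmap -dump:live,format=b,file=/tmp/tomcat-heap.hprof <pid>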


>
>
>
> > > export CATALINA_OPTS="-server -Xms2048m -Xmx2048m"
> > >
> > > export JAVA_OPTS="-XX:PermSize=256m -XX:MaxPermSize=2048m"
> > >
> > > 46$
> > >
> > >
> > > System memory before starting tomcat:
> > >
> > > 188$ free -h
> > >
> > >               total        used        free      shared  buff/cache   available
> > > Mem:            11G        2.3G        2.2G        2.0G        7.1G        6.7G
> > > Swap:          8.0G        1.0G        7.0G
> > >
> > >
> > > Started tomcat with sudo service tomcat start
> > >
> > > Tomcat journal error:
> > >
> > >
> > > May 11 17:48:59 protrack server[7298]: SEVERE: Unexpected death of
> > > background thread ContainerBackgroundProcessor[StandardEngine[Catalina]]
> > >
> > > May 11 17:48:59 protrack server[7298]: java.lang.OutOfMemoryError: GC
> > > overhead limit exceeded
> > >
> > > May 11 17:48:59 protrack server[7298]: Exception in thread
> > > "ContainerBackgroundProcessor[StandardEngine[Catalina]]"
> > > java.lang.OutOfMemoryError: GC overhead limit exceeded
> > >
> > > May 11 17:49:38 protrack server[7298]: Exception:
> > > java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in
> > > thread "http-bio-8080-AsyncTimeout"
> > >
> > > May 11 17:49:39 protrack server[7298]: Exception:
> > > java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in
> > > thread "ajp-bio-8009-AsyncTimeout"
> > >
> > > May 11 17:49:42 protrack server[7298]: Exception in thread
> > > "org.springframework.scheduling.quartz.SchedulerFactoryBean#0_QuartzSchedulerThread"
> > >
> > >
> > > Application log error:
> > >
> > > Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
> > >
> > > 2020-05-11 17:49:50
> > > [org.springframework.scheduling.quartz.SchedulerFactoryBean#0_Worker-2]
> > > ERROR o.s.t.i.TransactionInterceptor - Application exception overridden
> > by
> > > rollback exception
> > >
> > > java.lang.OutOfMemoryError: GC overhead limit exceeded
> > >
> > >
> > > System memory while tomcat is up, after the OutOfMemoryError pops:
> > >
> > > ksmq_tv 191$ free -h
> > >
> > >               total        used        free      shared  buff/cache   available
> > > Mem:            11G        3.5G        1.0G        2.0G        7.1G        5.5G
> > > Swap:          8.0G        1.0G        7.0G
> > >
> > >
> > > Stopped with  sudo service tomcat stop
> > >
> > >
> > >
> > > System memory after tomcat stopped:
> > >
> > > ksmq_tv 194$ free -h
> > >
> > >               total        used        free      shared  buff/cache   available
> > > Mem:            11G        795M        3.7G        2.0G        7.1G        8.2G
> > > Swap:          8.0G        1.0G        7.0G
> > >
> > >
> > >
> > > It sure doesn't look like something is actually running the system out
> > > of memory at a system level; usage is definitely impacted by starting
> > > our app, but that's expected.
> > >
> >
> > The system isn't running out of memory, Tomcat's JVM is. This could be
> > due to numerous things, so you'll have to do some digging to find out
> > why. I'd start by enabling/collecting/reviewing GC logging and a heap
> > dump from the time of the OOME, which you may have to take manually (I
> > don't recall if the HeapDumpOnOutOfMemoryError argument triggers on a
> > GC overhead error). As a simple first step, increase the amount of heap
> > you give the instance and see whether the problem goes away or just
> > takes longer to occur. If it takes longer to reproduce, a heap dump may
> > show you more obviously what the problem is.
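> >
> > Something like this in tomcat.conf should capture a dump automatically,
> > assuming the overhead-limit OOME does trigger it (the dump path is just
> > an example):
> >
> > JAVA_OPTS="$JAVA_OPTS -XX:+HeapDumpOnOutOfMemoryError \
> >   -XX:HeapDumpPath=/var/log/tomcat"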
> >
> >
> > > Assuming no one finds any obvious errors with how we implemented
> > > setenv.sh, is there some way to verify what memory limitations tomcat
> > > is actually running under?
> > >
> >
> > I don't recall if Tomcat 7's logger includes it, but the
> > org.apache.catalina.startup.VersionLoggerListener class in Tomcat 9
> > dumps all the arguments passed to the JVM, so you should be able to see
> > the Xms and Xmx settings in catalina.log. Alternatively, hit the
> > instance with jconsole and check the VM Summary tab.
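> >
> > You can also ask the running JVM directly from a shell (JDK tools
> > required; replace <pid> with Tomcat's process id):
> >
> > jcmd <pid> VM.command_line   # full command line, including -Xms/-Xmx
> > jcmd <pid> VM.flags          # the -XX flags in effect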
> >
> > HTH,
> >
> >
> > > I was also wondering if anyone knew of an open source webapp that
> > > would be good to deploy, to see if this problem is tomcat specific or
> > > an issue with our webapp?  I figure if I deploy something else that
> > > doesn't promptly throw an OutOfMemoryError, then it might be more of a
> > > dev issue and less of a tomcat config issue.  Trying to at least
> > > figure out what direction I need to be looking in; any help much
> > > appreciated.
> > >
> >
>
