[ovs-discuss] ovs-vswitchd memory consumption behavior

Fernando Casas Schössow casasfernando at outlook.com
Sat Feb 16 22:00:46 UTC 2019


Hi Ben,

Indeed, OVS is logging memory usage. As you can see below, after running for around four months, OVS memory usage grew as high as 15 GB:

2018-09-07T22:02:54+02:00 vmsvr01 ovs-vswitchd: ovs|00030|memory|INFO|10316 kB peak resident set size after 10.0 seconds
2018-09-07T22:02:54+02:00 vmsvr01 ovs-vswitchd: ovs|00031|memory|INFO|handlers:5 ports:2 revalidators:3 rules:5 udpif keys:23
2018-09-07T22:12:05+02:00 vmsvr01 ovs-vswitchd: ovs|00030|memory|INFO|10316 kB peak resident set size after 10.0 seconds
2018-09-07T22:12:05+02:00 vmsvr01 ovs-vswitchd: ovs|00031|memory|INFO|handlers:5 ports:2 revalidators:3 rules:5 udpif keys:21
2018-09-08T10:48:18+02:00 vmsvr01 ovs-vswitchd: ovs|00702|memory|INFO|peak resident set size grew 51% in last 3472.0 seconds, from 10300 kB to 15572 kB
2018-09-08T10:48:18+02:00 vmsvr01 ovs-vswitchd: ovs|00703|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:18
2018-09-08T12:21:11+02:00 vmsvr01 ovs-vswitchd: ovs|00704|memory|INFO|peak resident set size grew 51% in last 5573.2 seconds, from 15572 kB to 23492 kB
2018-09-08T12:21:11+02:00 vmsvr01 ovs-vswitchd: ovs|00705|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:32
2018-09-08T14:46:16+02:00 vmsvr01 ovs-vswitchd: ovs|00706|memory|INFO|peak resident set size grew 51% in last 8705.3 seconds, from 23492 kB to 35372 kB
2018-09-08T14:46:16+02:00 vmsvr01 ovs-vswitchd: ovs|00707|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:16
2018-09-08T17:46:03+02:00 vmsvr01 ovs-vswitchd: ovs|00708|memory|INFO|peak resident set size grew 50% in last 10786.3 seconds, from 35372 kB to 53060 kB
2018-09-08T17:46:03+02:00 vmsvr01 ovs-vswitchd: ovs|00709|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:18
2018-09-08T23:08:45+02:00 vmsvr01 ovs-vswitchd: ovs|00720|memory|INFO|peak resident set size grew 50% in last 19362.0 seconds, from 53060 kB to 79724 kB
2018-09-08T23:08:45+02:00 vmsvr01 ovs-vswitchd: ovs|00721|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:27
2018-09-09T06:41:39+02:00 vmsvr01 ovs-vswitchd: ovs|00722|memory|INFO|peak resident set size grew 50% in last 27174.2 seconds, from 79724 kB to 119588 kB
2018-09-09T06:41:39+02:00 vmsvr01 ovs-vswitchd: ovs|00723|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:26
2018-09-09T18:02:35+02:00 vmsvr01 ovs-vswitchd: ovs|00724|memory|INFO|peak resident set size grew 50% in last 40856.1 seconds, from 119588 kB to 179516 kB
2018-09-09T18:02:35+02:00 vmsvr01 ovs-vswitchd: ovs|00725|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:37
2018-09-10T10:58:41+02:00 vmsvr01 ovs-vswitchd: ovs|00727|memory|INFO|peak resident set size grew 50% in last 60965.9 seconds, from 179516 kB to 269276 kB
2018-09-10T10:58:41+02:00 vmsvr01 ovs-vswitchd: ovs|00728|memory|INFO|handlers:5 ports:15 revalidators:3 rules:5 udpif keys:39
2018-09-11T14:54:02+02:00 vmsvr01 ovs-vswitchd: ovs|00734|memory|INFO|peak resident set size grew 50% in last 100521.3 seconds, from 269276 kB to 403916 kB
2018-09-11T14:54:02+02:00 vmsvr01 ovs-vswitchd: ovs|00735|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:16
2018-09-13T08:06:22+02:00 vmsvr01 ovs-vswitchd: ovs|00740|memory|INFO|peak resident set size grew 50% in last 148339.4 seconds, from 403916 kB to 605876 kB
2018-09-13T08:06:22+02:00 vmsvr01 ovs-vswitchd: ovs|00741|memory|INFO|handlers:5 ports:15 revalidators:3 rules:5 udpif keys:15
2018-09-15T21:54:39+02:00 vmsvr01 ovs-vswitchd: ovs|00750|memory|INFO|peak resident set size grew 50% in last 222497.4 seconds, from 605876 kB to 908948 kB
2018-09-15T21:54:39+02:00 vmsvr01 ovs-vswitchd: ovs|00751|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:20
2018-09-19T18:15:25+02:00 vmsvr01 ovs-vswitchd: ovs|00763|memory|INFO|peak resident set size grew 50% in last 332445.8 seconds, from 908948 kB to 1363556 kB
2018-09-19T18:15:25+02:00 vmsvr01 ovs-vswitchd: ovs|00764|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:46
2018-09-25T11:54:40+02:00 vmsvr01 ovs-vswitchd: ovs|00855|memory|INFO|peak resident set size grew 50% in last 495554.7 seconds, from 1363556 kB to 2045468 kB
2018-09-25T11:54:40+02:00 vmsvr01 ovs-vswitchd: ovs|00856|memory|INFO|handlers:5 ports:16 revalidators:3 rules:5 udpif keys:53
2018-10-04T08:31:40+02:00 vmsvr01 ovs-vswitchd: ovs|00888|memory|INFO|peak resident set size grew 50% in last 765420.9 seconds, from 2045468 kB to 3068204 kB
2018-10-04T08:31:40+02:00 vmsvr01 ovs-vswitchd: ovs|00889|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:42
2018-10-16T14:45:35+02:00 vmsvr01 ovs-vswitchd: ovs|00911|memory|INFO|peak resident set size grew 50% in last 1059234.5 seconds, from 3068204 kB to 4602308 kB
2018-10-16T14:45:35+02:00 vmsvr01 ovs-vswitchd: ovs|00912|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:27
2018-11-04T06:27:02+01:00 vmsvr01 ovs-vswitchd: ovs|01015|memory|INFO|peak resident set size grew 50% in last 1615287.6 seconds, from 4602308 kB to 6903596 kB
2018-11-04T06:27:02+01:00 vmsvr01 ovs-vswitchd: ovs|01016|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:1
2018-12-01T19:28:14+01:00 vmsvr01 ovs-vswitchd: ovs|01092|memory|INFO|peak resident set size grew 50% in last 2379671.3 seconds, from 6903596 kB to 10355396 kB
2018-12-01T19:28:14+01:00 vmsvr01 ovs-vswitchd: ovs|01093|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:18
2019-01-12T13:12:25+01:00 vmsvr01 ovs-vswitchd: ovs|01234|memory|INFO|peak resident set size grew 50% in last 3606251.1 seconds, from 10355396 kB to 15533228 kB
2019-01-12T13:12:25+01:00 vmsvr01 ovs-vswitchd: ovs|01235|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:46

Another example below, from a two-week run that took OVS memory usage to around 1.5 GB:

2019-01-30T19:28:30+01:00 vmsvr01 ovs-vswitchd: ovs|00030|memory|INFO|10240 kB peak resident set size after 10.0 seconds
2019-01-30T19:28:30+01:00 vmsvr01 ovs-vswitchd: ovs|00031|memory|INFO|handlers:5 ports:2 revalidators:3 rules:5 udpif keys:20
2019-01-30T19:43:36+01:00 vmsvr01 ovs-vswitchd: ovs|00031|memory|INFO|11248 kB peak resident set size after 10.0 seconds
2019-01-30T19:43:36+01:00 vmsvr01 ovs-vswitchd: ovs|00032|memory|INFO|handlers:5 ports:2 revalidators:3 rules:5 udpif keys:21
2019-01-30T20:48:49+01:00 vmsvr01 ovs-vswitchd: ovs|00045|memory|INFO|peak resident set size grew 51% in last 3912.4 seconds, from 11248 kB to 16996 kB
2019-01-30T20:48:49+01:00 vmsvr01 ovs-vswitchd: ovs|00046|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:33
2019-01-30T22:29:33+01:00 vmsvr01 ovs-vswitchd: ovs|00047|memory|INFO|peak resident set size grew 51% in last 6044.0 seconds, from 16996 kB to 25708 kB
2019-01-30T22:29:33+01:00 vmsvr01 ovs-vswitchd: ovs|00048|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:33
2019-01-31T00:21:27+01:00 vmsvr01 ovs-vswitchd: ovs|00072|memory|INFO|peak resident set size grew 50% in last 6714.3 seconds, from 25708 kB to 38644 kB
2019-01-31T00:21:27+01:00 vmsvr01 ovs-vswitchd: ovs|00073|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:27
2019-01-31T03:26:12+01:00 vmsvr01 ovs-vswitchd: ovs|00074|memory|INFO|peak resident set size grew 51% in last 11085.3 seconds, from 38644 kB to 58180 kB
2019-01-31T03:26:12+01:00 vmsvr01 ovs-vswitchd: ovs|00075|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:37
2019-01-31T08:43:03+01:00 vmsvr01 ovs-vswitchd: ovs|00077|memory|INFO|peak resident set size grew 50% in last 19010.4 seconds, from 58180 kB to 87484 kB
2019-01-31T08:43:03+01:00 vmsvr01 ovs-vswitchd: ovs|00078|memory|INFO|handlers:5 ports:15 revalidators:3 rules:5 udpif keys:43
2019-01-31T16:27:51+01:00 vmsvr01 ovs-vswitchd: ovs|00082|memory|INFO|peak resident set size grew 50% in last 27888.1 seconds, from 87484 kB to 131308 kB
2019-01-31T16:27:51+01:00 vmsvr01 ovs-vswitchd: ovs|00083|memory|INFO|handlers:5 ports:15 revalidators:3 rules:5 udpif keys:53
2019-02-01T03:39:07+01:00 vmsvr01 ovs-vswitchd: ovs|00086|memory|INFO|peak resident set size grew 50% in last 40276.1 seconds, from 131308 kB to 197044 kB
2019-02-01T03:39:07+01:00 vmsvr01 ovs-vswitchd: ovs|00087|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:21
2019-02-01T21:15:17+01:00 vmsvr01 ovs-vswitchd: ovs|00094|memory|INFO|peak resident set size grew 50% in last 63369.9 seconds, from 197044 kB to 295780 kB
2019-02-01T21:15:17+01:00 vmsvr01 ovs-vswitchd: ovs|00095|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:67
2019-02-02T22:12:54+01:00 vmsvr01 ovs-vswitchd: ovs|00102|memory|INFO|peak resident set size grew 50% in last 89856.8 seconds, from 295780 kB to 443884 kB
2019-02-02T22:12:54+01:00 vmsvr01 ovs-vswitchd: ovs|00103|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:37
2019-02-04T15:42:00+01:00 vmsvr01 ovs-vswitchd: ovs|00115|memory|INFO|peak resident set size grew 50% in last 149345.9 seconds, from 443884 kB to 665908 kB
2019-02-04T15:42:00+01:00 vmsvr01 ovs-vswitchd: ovs|00116|memory|INFO|handlers:5 ports:16 revalidators:3 rules:5 udpif keys:61
2019-02-07T05:18:08+01:00 vmsvr01 ovs-vswitchd: ovs|00134|memory|INFO|peak resident set size grew 50% in last 221768.8 seconds, from 665908 kB to 999076 kB
2019-02-07T05:18:08+01:00 vmsvr01 ovs-vswitchd: ovs|00135|memory|INFO|handlers:5 ports:15 revalidators:3 rules:5 udpif keys:9
2019-02-10T19:45:16+01:00 vmsvr01 ovs-vswitchd: ovs|00142|memory|INFO|peak resident set size grew 50% in last 311227.8 seconds, from 999076 kB to 1498828 kB
2019-02-10T19:45:16+01:00 vmsvr01 ovs-vswitchd: ovs|00143|memory|INFO|handlers:5 ports:15 revalidators:3 rules:5 udpif keys:59
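
In case it's useful, this is roughly how I'm pulling the resident set size values out of those log lines to follow the trend over time (a quick sketch assuming the syslog output above ends up in /var/log/messages; the file path may differ on other setups):

    # Extract the "from X kB to Y kB" pairs from the periodic memory growth messages
    grep 'memory|INFO|peak resident set size grew' /var/log/messages | \
        sed -n 's/.*from \([0-9]*\) kB to \([0-9]*\) kB.*/\1 \2/p'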

Let me know if any other log information or command output would help to understand what's going on here.
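For reference, these are the commands I can also run against the daemon on demand if more detail would help (as far as I know these are all standard ovs-appctl/ovs-dpctl commands, but please correct me if a different one would be more useful):

    # Per-subsystem memory counters (same figures as the periodic log lines)
    ovs-appctl memory/show
    # Upcall handler and datapath flow status
    ovs-appctl upcall/show
    # Internal coverage counters, in case they hint at what keeps growing
    ovs-appctl coverage/show
    # Flows currently installed in the kernel datapath
    ovs-dpctl dump-flows
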
Thanks.

On Sat, Feb 16, 2019 at 9:12 PM, Ben Pfaff <blp at ovn.org> wrote:
It's not normal. OVS should be logging information about memory usage periodically. What do the logs say about this?

On Sat, Feb 16, 2019 at 10:23:55AM +0000, Fernando Casas Schössow wrote:
Hi,

I'm running OVS on a Qemu/KVM virtualization host for a while now. The virtualization host is running around 12 VMs without huge network activity. The OVS configuration is very simple, nothing fancy: four VLANs, two bond ports with two slaves each, three internal ports, and then one port for each VM (I can provide the output of ovs-vsctl show if needed).

Recently I noticed that when the ovs-vswitchd process starts it consumes around 10 MB of RAM, but after two weeks (the server runs 24x7) ovs-vswitchd can be consuming around 2.5 GB of RAM, and after, let's say, a month or so it will be around 5 GB or even a bit more.

I'm new to OVS, but in other circumstances I would suspect either a memory leak or a misconfiguration. Is this normal behavior? Should ovs-vswitchd continuously allocate more and more memory without ever releasing it?

What I'm doing now to keep ovs-vswitchd memory usage under control is to restart the service every couple of weeks, but this is far from ideal.

Please find below some additional information about my setup, and feel free to ask for any other details I may have missed.

OVS command line: /usr/sbin/ovs-vswitchd --pidfile=/var/run/openvswitch/ovs-vswitchd.pid --detach --monitor --mlockall unix:/var/run/openvswitch/db.sock
OVS version: 2.10.1
Distro: Alpine Linux 3.9.0
Linux kernel version: 4.19.18
Qemu version: 3.1
Libvirt version: 4.10.0

Thanks in advance.

Fernando

