I have a multidomain Drupal site that is starting to have quite a lot of pages in it. Because our service on web315 seems a little slow today (2012-09-19 7:42 GMT), I started wondering how memory limits are configured on WebFaction's servers, and whether the slowness could come from hitting a memory limit.
What I mean is: if my default subscription plan includes 256MB of memory, is it automatically configured so that my Drupal PHP process can use all of it (temporarily) if needed? There are no other applications (atm) on my subscription. Or do I need to do some php.ini configuration and/or submit a ticket myself?
It would be awesome if WF would give us feedback when we hit the memory consumption limit, so we can react to it (with Drupal: buy more memory).
asked Sep 19 '12 at 02:47
WebFaction does not enforce a "hard" (real-time) limit on memory usage. The limit is still strict, but it is enforced by checking the RAM usage of every user's processes every couple of minutes and then killing all of the processes of any user who is over the limit.
For a PHP application such as WordPress, Drupal, etc., the memory usage generally does not count against your own, since the requests are served by the system-wide Apache server rather than by processes running under your own user on the machine. However, if any request takes a long time (over a minute), then that request becomes a "long-running" process and does count against your memory usage.
For a daemon-based application that runs as its own server, such as Django, Rails, Zope, etc., the memory usage counts directly against your limit. This has its own benefits, though, especially on CentOS 6 servers.
If your memory usage does exceed the limit and your processes are terminated, a new Open Issue is created in the Control Panel, and a notification is sent via email as well. The system only sends a new email notification if all previous Open Issues for the same problem have been marked as resolved.
In the case of PHP, the memory limit is set to 128MB by default. However, you can change this by setting the `memory_limit` directive in your php.ini file.
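For example, the override looks like the fragment below in whichever php.ini applies to your application (the exact file location depends on your setup, and the value shown is just an illustration, not a recommendation):

```ini
; php.ini — raise PHP's per-request memory ceiling
; Illustrative value; pick one that fits your plan's total memory
memory_limit = 256M
```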
To determine how much memory you are using, you can use the following command (within an SSH session) to download a memory-usage Python script:
Then, simply run it:
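If you don't have the script handy, a rough equivalent — my own one-liner, not WebFaction's script — is to sum the resident memory (RSS) of all processes owned by your user, which approximates what the periodic monitor measures:

```shell
# Sum the resident memory (RSS, reported in KB by ps) of every process
# owned by the current user, and print the total in MB.
ps -u "$(id -un)" -o rss= | awk '{sum += $1} END {printf "%.1f MB\n", sum/1024}'
```

Note that RSS slightly overcounts memory shared between processes, so treat the result as an upper-bound estimate rather than an exact figure.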
Hope that helps!