Optimize App Memory Usage


Tune the JSA Application Framework to optimize app memory usage.

Use any of the following methods to help prevent your app from using an excessive amount of memory.

  • Avoid allocating large amounts of memory at once by chunking (or staggering) the work so that it runs within a small memory footprint.

  • Change the memory model that is used by the Application Framework.

  • Call for garbage collection when you're finished with code that uses large amounts of memory.
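As a sketch of the chunking approach (the chunk size and file handling here are illustrative, not part of the Application Framework), a generator can process a large input in fixed-size pieces instead of reading it all into memory at once:

```python
def read_in_chunks(file_obj, chunk_size=1024 * 1024):
    """Yield successive fixed-size chunks of a file-like object.

    Only one chunk is held in memory at a time, which keeps the
    footprint small regardless of the input size.
    """
    while True:
        chunk = file_obj.read(chunk_size)
        if not chunk:
            break
        yield chunk

# Usage: process a large file one chunk at a time, e.g. to total its size.
# with open('large_input.dat', 'rb') as f:
#     total_bytes = sum(len(chunk) for chunk in read_in_chunks(f))
```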

Changing the Application Framework Memory Model

By default, the Application Framework configures the Werkzeug WSGI web application server that Flask uses to run as a single process, and each request is handled in a separate thread. You can instead configure the application server to create a separate process for each new request. When the request is completed, the process terminates, and all of the memory that the Python interpreter allocated to handle that request is released back to the operating system.

To override this behavior, edit the run.py file and add threaded=False and processes=N, where N is greater than 1. In the following example, a value of processes=3 allocates approximately 25 MB per interpreter and leaves some room for growth.

    __author__ = 'IBM'

    from app import app
    from app.qpylib import qpylib

    qpylib.create_log()
    app.run(debug=True, host='0.0.0.0', threaded=False, processes=3)

Include your run.py source file in the template folder of your app archive (.zip) file. The run.py file that is created during installation is then overwritten with your settings.

Note

When you package an app with the SDK, the run.py file is skipped, so you must manually add it to your app archive (.zip) file.

For more information about parameters that can be passed to the Werkzeug WSGI web application server, see http://werkzeug.pocoo.org/docs/0.11/serving/.

Calling for Garbage Collection

The Python interpreter might not free memory promptly on its own. You can trigger garbage collection explicitly by placing the following code immediately after sections where large amounts of memory are no longer needed:

    import gc

    gc.collect()
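For example (the working-set size and function name here are illustrative), drop your own references to the large object before collecting, so that the collector actually has something to reclaim:

```python
import gc


def process_large_dataset():
    """Build a large temporary working set, then release it promptly."""
    data = [object() for _ in range(100000)]  # large temporary allocation
    result = len(data)
    del data       # drop the reference so the memory becomes collectable
    gc.collect()   # reclaim it now, rather than at the interpreter's discretion
    return result
```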

Note

Python does not guarantee that memory used by your code is returned to the operating system. Garbage collection only ensures that the memory used by an object is freed for reuse by other Python objects at some future time. This is why changing the Application Framework memory model is important for long-running apps: terminating the per-request process ensures that its memory is freed for use by other components.

Tools

The following tools can help you identify memory problems:

  • Memory Profiler--A Python module for monitoring memory consumption of a process. For more information, see https://pypi.python.org/pypi/memory_profiler.

  • Linux utilities--The command-line utility top can be used to monitor all Python processes running on the machine:

    top -p $(pgrep -d',' python)

    You can also use the following command to get the total memory, in KB, used by all Python interpreters on your system:

    ps -e -o pid,comm,rss | awk '/python/{sum+=$3} END {print sum}'

  • Resource Module--You can log the amount of memory your process uses by adding the following code to your module:

    import resource

    print('Memory usage: %s (kb)' % resource.getrusage(resource.RUSAGE_SELF).ru_maxrss)
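Building on the snippet above, a small helper (illustrative, not part of the Framework) can report peak usage in megabytes. Note that on Linux, ru_maxrss is reported in kilobytes:

```python
import resource


def peak_memory_mb():
    """Return this process's peak resident set size in MB.

    On Linux, ru_maxrss is expressed in kilobytes, so divide by 1024.
    """
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0
```

You can call this helper periodically from within your app and write the result to your app's log to track memory growth over time.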
