Defining Your Application Memory Model

As the author of the Sherlock Holmes stories indicates in the preceding quote, there is a need to partition the resources you keep close at hand from those you place into longer-term storage. For mobile devices, this means having a good memory model.

The most important thing about a memory model is having one. It is all too easy to allow the memory utilization of the applications you are developing to grow incrementally, organically, and in a casual and unplanned way. For desktop applications, this results in confusing code that is often hard to maintain and upgrade, and in applications that do not perform as reliably or efficiently as they could. For mobile device applications, a sloppy memory model will result in applications hitting a "performance wall" that makes them perform unacceptably. Once you are in this situation, it is usually hard to fix the problems without large-scale redesign. Having a well-defined memory utilization model for your applications avoids this morass and enhances design flexibility.

It is useful to think of application memory utilization on two levels:

Macro "application-level" memory management
This refers to the application-level data and resources your application maintains while it is running. This data is generally long-lived, with scope outside of specific functions. Having a good model for how much of this data is kept in memory at a given time, and for when to throw out data and resources your application does not need for immediate use, is essential for building mobile applications that perform well. Holding too much long-lived state crowds out memory that could otherwise be used for caching JITed code or as working memory for your functions, and forces a managed application's garbage collector to run often and inefficiently.

Micro "algorithm-level" memory allocation
Functions allocate temporary memory to execute the instructions specified by your algorithms. Whether this is done efficiently or inefficiently is determined by your algorithm implementation strategy. For example, when writing code that will be executed in loops, it is important to write it in as resource-efficient a way as possible so as not to incur unnecessary overhead in execution. Paying close attention to the memory allocation efficiency of the algorithms you create can have a dramatic effect on the overall performance of your application. (Short code sketches below illustrate both levels.)

Desktop applications that hold a large memory working set (working set = memory in use) will typically push more and more data into a disk-based paging file. This pushes rarely used data out of memory and helps mitigate wasteful application-level (macro) memory management; the result is relatively linear application performance over a wide range of memory usage. In addition, desktop managed-code systems are capable of sophisticated garbage-collection mechanisms that can partially offset the inefficiencies of wasteful algorithms, which helps performance at the micro level. Desktop applications still suffer from poor memory management, but the effects are dampened by the large computing environment.

Mobile devices have smaller RAM budgets, and they generally do not have large secondary storage mechanisms that allow memory to be paged in and out quickly. Further, to run on these more resource-constrained systems, mobile device runtime garbage collectors are often simpler.
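To make the two levels concrete, here are two short sketches. They are written in Java as a stand-in for any managed, garbage-collected mobile runtime of the kind discussed here; the names (ThumbnailCache, CsvJoiner, joinNaive, joinEfficient) are illustrative inventions, not APIs from this text. The first sketch shows one possible macro-level approach: long-lived cached data is held through SoftReferences so the runtime can reclaim it under memory pressure, and the application can explicitly shed it when it is not needed for immediate use.

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of application-level ("macro") memory management.
// Cached data is reachable only through SoftReferences, so the garbage
// collector may reclaim it when memory gets tight, and releaseAll()
// lets the application explicitly drop state it does not need for
// immediate use (for example, when it moves to the background).
public final class ThumbnailCache {
    private final Map<String, SoftReference<byte[]>> cache = new HashMap<>();

    public void put(String key, byte[] thumbnail) {
        cache.put(key, new SoftReference<>(thumbnail));
    }

    // Returns the cached data, or null if it was never cached
    // or has already been reclaimed by the garbage collector.
    public byte[] get(String key) {
        SoftReference<byte[]> ref = cache.get(key);
        return (ref == null) ? null : ref.get();
    }

    // Explicitly shed all long-lived cached state.
    public void releaseAll() {
        cache.clear();
    }
}
```

The second sketch illustrates the micro level. The naive loop creates a new intermediate string on every iteration, producing garbage the collector must clean up; the alternative reuses a single growing buffer, so the same work incurs far fewer allocations.

```java
// Minimal sketch of algorithm-level ("micro") memory allocation.
public final class CsvJoiner {
    // Allocation-heavy: each += builds and discards an intermediate String.
    public static String joinNaive(String[] items) {
        String result = "";
        for (String item : items) {
            result += item + ",";
        }
        return result;
    }

    // Allocation-aware: one StringBuilder buffer grows in place across iterations.
    public static String joinEfficient(String[] items) {
        StringBuilder sb = new StringBuilder();
        for (String item : items) {
            sb.append(item).append(',');
        }
        return sb.toString();
    }
}
```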
Because mobile devices lack these cushions, sloppy memory management has a disproportionate effect on mobile device applications at both the macro and micro levels; mobile devices are far less tolerant of poor memory management. Both desktop and mobile device application development benefit greatly from well-thought-out memory management, but on mobile devices this kind of planning is imperative. Without it, a desktop application tends to become increasingly clunky and sluggish. In contrast, a mobile device application without good macro and micro memory management strategies will quickly cross a threshold where it becomes too painful to use.