
Linux Memory Management

Linux uses demand paging: virtual pages are loaded into physical memory only when a process actually needs them. When physical memory runs short, swapping moves pages between physical memory and swap space on disk.

Introduction

Memory management is an important part of a computer's operating system. Even in the early days of computing, programs tended to require more memory than physically existed. One strategy to overcome this is virtual memory, which makes the system appear to have more memory than it really does by sharing physical memory among processes. The memory management subsystem in Linux provides:

  • Large address spaces
  • Protection
  • Memory mapping
  • Fair physical memory allocation
  • Shared virtual memory

In a virtual memory system all of the addresses a process uses are virtual, not physical; the page table converts virtual addresses into physical ones. Both virtual and physical memory are divided into pages, usually of the same size. On Alpha AXP systems the page size is 8 KB, and on Intel x86 systems it is 4 KB.
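As a rough illustration of that translation (a minimal sketch, not kernel code: the flat single-level table, the PAGE_SHIFT value, and the translate() helper are all assumptions chosen for clarity), a virtual address splits into a virtual page number and an offset within the page:

    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative sketch only, not Linux kernel code: a single-level
     * page table translating 32-bit virtual addresses with 4 KB pages.
     * The table layout and names here are assumptions for clarity. */

    #define PAGE_SHIFT 12                       /* 4 KB pages: 2^12 bytes */
    #define PAGE_SIZE  (1u << PAGE_SHIFT)
    #define PAGE_MASK  (PAGE_SIZE - 1)
    #define NUM_PAGES  (1u << (32 - PAGE_SHIFT))

    /* Each entry holds a physical frame number; 0 means "not mapped". */
    static uint32_t page_table[NUM_PAGES];

    uint32_t translate(uint32_t vaddr)
    {
        uint32_t vpn    = vaddr >> PAGE_SHIFT;  /* virtual page number   */
        uint32_t offset = vaddr & PAGE_MASK;    /* offset within page    */
        uint32_t pfn    = page_table[vpn];      /* physical frame number */

        if (pfn == 0) {
            /* In a real system this is a page fault: with demand paging
             * the kernel loads the page now and retries the access. */
            fprintf(stderr, "page fault at 0x%08x\n", vaddr);
            return 0;
        }
        return (pfn << PAGE_SHIFT) | offset;
    }

    int main(void)
    {
        page_table[5] = 42;   /* map virtual page 5 to physical frame 42 */
        printf("0x%08x -> 0x%08x\n", 0x5123u, translate(0x5123u));
        return 0;
    }

Real hardware caches these lookups in a TLB, and Linux uses multi-level tables rather than one flat array, but the split into page number and offset is the same.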



Linux uses a page-aging technique to choose efficiently which pages should be removed from memory. Every page in the system has an age that changes as the page is accessed: the more frequently a page is accessed, the younger it becomes; the less it is accessed, the older it becomes. Old pages are good candidates for swapping out. The kernel itself is not paged like an ordinary process; its memory is allocated permanently in physical memory. The page table also controls access to code as well as mapping it: the processor has two modes, user and kernel, and user-mode code is denied access to kernel code and data to protect the system.
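A minimal sketch of that aging policy, assuming a small fixed set of pages with an 8-bit age counter and a per-page referenced flag (the names and constants here are illustrative assumptions, not the kernel's actual data structures):

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Sketch of the page-aging policy described above; an illustration
     * of the idea, not the kernel's implementation. Referenced pages
     * grow younger each pass, idle pages grow older, and the oldest
     * page is the best candidate for swapping out. */

    #define NPAGES    8
    #define AGE_MAX   255
    #define AGE_BOOST 64    /* how much younger a referenced page gets */

    struct page {
        bool    referenced; /* set when the page is accessed           */
        uint8_t age;        /* 0 = just used, AGE_MAX = long unused    */
    };

    static struct page pages[NPAGES];

    void age_pages(void)   /* one pass of the aging scan */
    {
        for (size_t i = 0; i < NPAGES; i++) {
            if (pages[i].referenced) {
                pages[i].age = pages[i].age > AGE_BOOST
                             ? pages[i].age - AGE_BOOST : 0;
                pages[i].referenced = false;
            } else if (pages[i].age < AGE_MAX) {
                pages[i].age++;
            }
        }
    }

    size_t pick_victim(void)   /* oldest page = swap-out candidate */
    {
        size_t victim = 0;
        for (size_t i = 1; i < NPAGES; i++)
            if (pages[i].age > pages[victim].age)
                victim = i;
        return victim;
    }

    int main(void)
    {
        for (int pass = 0; pass < 4; pass++) {
            pages[3].referenced = true;   /* page 3 stays busy */
            age_pages();
        }
        size_t v = pick_victim();
        printf("swap out page %zu (age %u)\n", v, pages[v].age);
        return 0;
    }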

Conclusion

Disadvantages:

  • The drawback of using caches, whether in hardware or software, is that Linux must spend extra time and memory maintaining them, and if a cache becomes corrupted the system will crash.

Advantages:

  • Page tables are flexible enough that Linux uses the same page-table manipulation code for the Alpha processor, which has three levels of page tables, and for Intel x86 processors, which have two levels (see the sketch after this list).
  • Linux memory management can suit older computers well, since it can require less processing power than the memory management of some other operating systems.
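To illustrate how one body of code can walk tables of different depths, here is a hedged sketch of a generic multi-level page-table walk; the 10-bit per-level index, the node layout, and the walk() helper are assumptions chosen for brevity, not Linux's actual code:

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Sketch (assumptions throughout): a generic page-table walk that
     * works for any number of levels, echoing how one body of code can
     * serve both a two-level (x86-style) and a three-level
     * (Alpha-style) layout. */

    #define PAGE_SHIFT  12               /* 4 KB pages                   */
    #define INDEX_BITS  10               /* index bits consumed per level */
    #define FANOUT      (1u << INDEX_BITS)

    struct node {
        /* Interior levels point at child nodes; the last level stores
         * physical frame numbers directly (0 = not mapped). */
        union { struct node *child; uintptr_t pfn; } slot[FANOUT];
    };

    /* Walk `levels` levels of table, consuming INDEX_BITS of the
     * virtual page number at each level, most-significant bits first. */
    uintptr_t walk(struct node *root, uintptr_t vaddr, int levels)
    {
        uintptr_t vpn = vaddr >> PAGE_SHIFT;
        struct node *n = root;

        for (int lvl = levels - 1; lvl > 0; lvl--) {
            unsigned idx = (vpn >> (lvl * INDEX_BITS)) & (FANOUT - 1);
            n = n->slot[idx].child;
            if (!n)
                return 0;                /* hole in the tree: page fault */
        }
        return n->slot[vpn & (FANOUT - 1)].pfn;
    }

    int main(void)
    {
        struct node *root = calloc(1, sizeof *root);
        struct node *leaf = calloc(1, sizeof *leaf);

        /* Map virtual page 0x00401 -> frame 7 in a two-level table. */
        root->slot[0x00401 >> INDEX_BITS].child = leaf;
        leaf->slot[0x00401 & (FANOUT - 1)].pfn = 7;

        uintptr_t va = (uintptr_t)0x00401 << PAGE_SHIFT;
        printf("frame %lu\n", (unsigned long)walk(root, va, 2));
        return 0;
    }

Calling walk(root, vaddr, 2) matches the x86-style two-level layout and walk(root, vaddr, 3) the Alpha-style three-level one; only the depth changes, not the code.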

