Locality of Reference

Locality of reference refers to the tendency of a program to access the same set of memory locations repeatedly over a short period of time. This property is exhibited mainly by loops and subroutine calls in a program.

What is 'the principle of locality'?

- It is a term for the phenomenon in which the same values or related storage locations are frequently accessed.

- It is also known as the locality of reference.

- The next data item or instruction to be accessed is likely to be close to the current data item or instruction.

- For example, the next block of a file being read sequentially is often the one needed next. The OS can read that block ahead of time so it is already available when the actual read request is issued.

There are two types of locality of reference:


1. Temporal locality

2. Spatial locality

Temporal locality

Temporal locality means that a recently accessed memory location is likely to be accessed again soon. This type of optimization brings frequently accessed data into a nearby, faster level of the memory hierarchy for a short duration so that future accesses are much faster.

For example, if a program accesses a variable very frequently, the compiler can keep that variable in a CPU register, the level of the memory hierarchy nearest the processor, for faster access.

Spatial locality

Spatial locality assumes that if a memory location has been accessed, it is highly likely that a nearby or consecutive memory location will be accessed as well, so those neighbouring memory references are brought into faster memory too.

For example, sequential traversal of a one-dimensional array benefits from this optimization.

These optimizations can greatly improve program efficiency, and they can be implemented at the hardware level or the software level.

Let us now look at how locality of reference relates to cache memory and the hit ratio.


What is Cache Memory?

Definition of Cache Memory – Cache memory is a small, fast memory built from SRAM (static RAM). It is embedded in the CPU or placed between the CPU and main memory, and it holds copies of frequently used data and instructions so that the processor can access them with much lower latency than main memory. It works as temporary storage: like main memory, it is volatile, so its contents are lost when the power is turned off.



Relationship with Cache memory

Cache is a specially designed faster but smaller memory area, generally used to keep recently referenced data and data near recently referenced data, which can lead to potential performance increases.

Data in cache does not necessarily correspond to data that is spatially close in main memory. However, data is brought into cache one cache line at a time, so spatial locality still matters: when one element is referenced, a few neighbouring elements are brought into cache along with it.

Finally, temporal locality plays a role at the lowest level, since results that are referenced very close together in time can be kept in the machine registers. Programming languages such as C allow the programmer to suggest that certain variables be kept in registers.

Relationship with Hit ratio

The hit ratio is defined for any two adjacent levels of a memory hierarchy. It is the probability that a requested item will be found in the faster (upper) level we are looking at.

So good locality of reference yields a high hit ratio, which in turn achieves a fast average access time.




