Computer Storage Devices

Primary Memory

Memory (RAM, ROM, EDO RAM, SDRAM)

Memory is the faculty of the brain by which data or information is encoded, stored, and retrieved when needed. It is the retention of information over time for the purpose of influencing future action. If past events could not be remembered, it would be impossible for language, relationships, or personal identity to develop. Memory loss is usually described as forgetfulness or amnesia.

Memory is often understood as an informational processing system with explicit and implicit functioning that is made up of a sensory processor, short-term (or working) memory, and long-term memory. This can be related to the neuron. The sensory processor allows information from the outside world to be sensed in the form of chemical and physical stimuli and attended to various levels of focus and intent. Working memory serves as an encoding and retrieval processor. Information in the form of stimuli is encoded in accordance with explicit or implicit functions by the working memory processor. The working memory also retrieves information from previously stored material. Finally, the function of long-term memory is to store data through various categorical models or systems.

Declarative, or explicit, memory is the conscious storage and recollection of data. Under declarative memory resides semantic and episodic memory. Semantic memory refers to memory that is encoded with specific meaning, while episodic memory refers to information that is encoded along a spatial and temporal plane. Declarative memory is usually the primary process thought of when referencing memory. Non-declarative, or implicit, memory is the unconscious storage and recollection of information. An example of a non-declarative process would be the unconscious learning or retrieval of information by way of procedural memory, or a priming phenomenon. Priming is the process of subliminally arousing specific responses from memory and shows that not all memory is consciously activated, whereas procedural memory is the slow and gradual learning of skills that often occurs without conscious attention to learning.

Memory is not a perfect processor, and is affected by many factors. The ways by which information is encoded, stored, and retrieved can all be corrupted. The amount of attention given new stimuli can diminish the amount of information that becomes encoded for storage. Also, the storage process can become corrupted by physical damage to areas of the brain that are associated with memory storage, such as the hippocampus. Finally, the retrieval of information from long-term memory can be disrupted because of decay within long-term memory. Normal functioning, decay over time, and brain damage all affect the accuracy and capacity of the memory.

RAM

RAM (random-access memory) is hardware that allows information to be stored and retrieved on a computer. RAM is usually implemented as DRAM chips mounted on a memory module. Because information is accessed randomly instead of sequentially, as it is on a CD or hard drive, access times are much faster. However, unlike ROM, RAM is a volatile memory and requires power to keep the data accessible. If the computer is turned off, all data contained in RAM is lost.

Main Types of RAM

There are two main types of RAM:

  • DRAM (Dynamic Random Access Memory)
  • SRAM (Static Random Access Memory)

DRAM (Dynamic Random Access Memory) – The term dynamic indicates that the memory must be constantly refreshed or it will lose its contents. DRAM is typically used for the main memory in computing devices. If a PC or smartphone is advertised as having 4 GB or 16 GB of RAM, those numbers refer to the DRAM, or main memory, in the device.

More specifically, most of the DRAM used in modern systems is synchronous DRAM, or SDRAM. Manufacturers also sometimes use the acronym DDR (or DDR2, DDR3, DDR4, etc.) to describe the type of SDRAM used by a PC or server. DDR stands for double data rate, and it refers to the memory’s ability to transfer data twice in each clock cycle, on both the rising and falling edges of the clock.
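
As a rough illustration of these figures, the peak transfer rate follows from the memory clock, the number of transfers per clock cycle, and the width of the data bus. The short sketch below makes the arithmetic explicit; the 200 MHz clock and 64-bit bus width are assumed example values rather than the specification of any particular module.

    # Illustrative sketch: how "double data rate" affects peak transfer figures.
    # The 200 MHz memory clock and 64-bit module width are assumed example values.

    def peak_bandwidth_bytes_per_sec(clock_hz, bus_width_bits, transfers_per_clock):
        # Peak bandwidth = transfers per second * bytes moved per transfer.
        return clock_hz * transfers_per_clock * (bus_width_bits // 8)

    clock_hz = 200_000_000        # assumed 200 MHz memory clock
    bus_width_bits = 64           # typical DIMM data bus width

    sdr = peak_bandwidth_bytes_per_sec(clock_hz, bus_width_bits, 1)  # one transfer per clock
    ddr = peak_bandwidth_bytes_per_sec(clock_hz, bus_width_bits, 2)  # both clock edges used

    print(f"SDR: {sdr / 1e9:.1f} GB/s, DDR: {ddr / 1e9:.1f} GB/s")
    # SDR: 1.6 GB/s, DDR: 3.2 GB/s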

In general, the more RAM a device has, the faster it tends to perform, because more data and programs can be kept in fast memory instead of slower storage.

SRAM (Static Random Access Memory) – While DRAM is typically used for main memory, today SRAM is more often used for system cache. SRAM is said to be static because it doesn’t need to be refreshed, unlike dynamic RAM, which needs to be refreshed thousands of times per second. As a result, SRAM is faster than DRAM. However, both types of RAM are volatile, meaning that they lose their contents when the power is turned off.

ROM

Read-only memory (ROM) is a type of non-volatile memory used in computers and other electronic devices. Data stored in ROM cannot be electronically modified after the manufacture of the memory device. Read-only memory is useful for storing software that is rarely changed during the life of the system, sometimes known as firmware. Software applications for programmable devices can be distributed as plug-in cartridges containing read-only memory.

Strictly, read-only memory refers to memory that is hard-wired, such as diode matrix or a mask ROM integrated circuit, which cannot be electronically changed after manufacture. Although discrete circuits can be altered in principle, integrated circuits (ICs) cannot. Correction of errors, or updates to the software, require new devices to be manufactured and to replace the installed device.

Erasable programmable read-only memory (EPROM) and electrically erasable programmable read-only memory (EEPROM) can be erased and re-programmed, but usually this can only be done at relatively slow speeds, may require special equipment to achieve, and is typically only possible a certain number of times.

The term “ROM” is sometimes used to mean a ROM device containing specific software, or a file with software to be stored in EEPROM or flash memory. For example, users modifying or replacing the Android operating system describe files containing a modified or replacement operating system as “custom ROMs”, after the type of storage chip such files were once written to.

EDO RAM

Extended data out random access memory (EDO RAM/DRAM) is an early type of dynamic random access memory (DRAM) chip that was designed to improve the performance of the fast page mode DRAM (FPM DRAM) used in the 1990s. Its main feature was that it eliminated wait times by allowing a new cycle to start while the data output of the previous cycle was still active, which allowed a degree of pipelining (overlap in operation) that improved performance.

Extended data out dynamic random access memory was introduced in 1994 and began to replace fast page mode DRAM in 1995, when Intel introduced the 430FX chipset, which supported EDO DRAM. Before that, EDO DRAM could replace FPM DRAM, but if the memory controller was not specifically designed for EDO, the performance remained the same as with FPM.

Single-cycle EDO DRAM is able to carry out an entire memory transaction in a single clock cycle; otherwise, it completes a transaction in two cycles instead of three, once the page has been selected. This capability allowed EDO to stand in for the slow L2 caches of PCs at the time, reducing the large performance loss associated with the lack of an L2 cache while making PCs cheaper to build overall. A system using EDO together with an L2 cache was also much faster than the FPM and L2 cache combination, while still being cheaper to build.

EDO was rated for 40 MHz maximum clock rate, 64 bits of bus bandwidth, 320 MBps peak bandwidth and ran at 5 volts. It was tangibly faster than the older FPM DRAM that had only 25 MHz max clock rate and 200 MBps peak bandwidth. However, it was superseded by the faster SDRAM starting in 1996, after only two years of major use.
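
The 320 MBps figure quoted above can be checked directly from the clock rate and bus width, assuming one 64-bit transfer per 40 MHz clock cycle:

    # Quick check of the EDO figures quoted above, assuming one 64-bit
    # transfer per 40 MHz clock cycle.
    clock_hz = 40_000_000
    bytes_per_transfer = 64 // 8
    peak_bytes_per_sec = clock_hz * bytes_per_transfer
    print(peak_bytes_per_sec / 1e6, "MB/s")   # 320.0 MB/s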

SDRAM

SDRAM, which is short for Synchronous DRAM, is a type of memory that synchronizes itself with the computer’s system clock. Being synchronized allows the memory to run at higher speeds than earlier asynchronous DRAM; the original SDRAM supported system bus speeds of up to 133 MHz. Since about 1993, this has been the prevalent type of memory used in computers around the world, typically packaged as DIMM modules. The original single data rate type and the later DDR generations are all derivatives of the SDRAM memory type.

Synchronous dynamic random-access memory (SDRAM) is any dynamic random-access memory (DRAM) where the operation of its external pin interface is coordinated by an externally supplied clock signal.

The first commercial SDRAM was the Samsung KM48SL2000 chip, introduced in 1992. SDRAM is widely used in computers. Beyond the original SDRAM, further generations of double data rate RAM have entered the mass market: DDR (also known as DDR1), DDR2, DDR3, and DDR4, with the next generation (DDR5) scheduled for commercial release in the 2020–2021 timeframe.

Secondary Memory

You know that processor memory, also known as primary memory, is expensive as well as limited in capacity. The faster primary memory is also volatile. If we need to store a large amount of data or programs permanently, we need cheaper, permanent memory. Such memory is called secondary memory. Here we will discuss secondary memory devices that can be used to store large amounts of data, audio, video and multimedia files.

Characteristics of Secondary Memory

These are some characteristics of secondary memory, which distinguish it from primary memory:

  • It is non-volatile, i.e. it retains data when power is switched off
  • It has large capacity, to the tune of terabytes
  • It is cheaper as compared to primary memory

Depending on whether the secondary memory device is fixed inside the computer or can be detached and carried around, secondary memory is of two types – fixed and removable.

Hard Disk Drive

A hard disk drive is made up of a series of circular disks called platters, stacked one over the other about ½ inch apart on a spindle. The platters are made of non-magnetic material such as aluminium alloy and are coated with 10–20 nm of magnetic material.

Common platter diameters are 3.5 inches for desktop drives and 2.5 inches for laptop drives, and they rotate at speeds varying from 4200 rpm (rotations per minute) in personal computers to 15000 rpm in servers. Data is stored by magnetizing or demagnetizing the magnetic coating. A read/write head mounted on an actuator arm is used to read data from and write data to the platters. A typical modern HDD has a capacity measured in terabytes (TB).

CD Drive

CD stands for Compact Disk. CDs are circular disks that are read and written optically, usually with a laser. They are very cheap, as you can get 700 MB of storage space for less than a dollar. CDs are inserted into CD drives built into the computer case. They are portable, as you can eject the tray, remove the CD and carry it with you. There are three types of CDs:

  • CD-ROM (Compact Disk – Read Only Memory): The data on these CDs is recorded by the manufacturer. Proprietary software, audio and video are released on CD-ROMs.
  • CD-R (Compact Disk – Recordable): Data can be written by the user once on the CD-R. It cannot be deleted or modified later.
  • CD-RW (Compact Disk – Rewritable): Data can be written and deleted on these optical disks again and again.

DVD Drive

DVD stands for Digital Versatile Disk (originally Digital Video Disk). DVDs are optical disks that can store many times the data held by a CD: a single-layer DVD holds about 4.7 GB against a CD’s 700 MB, and dual-layer and double-sided variants hold more. They are usually used to store rich multimedia files that need high storage capacity. DVDs also come in three varieties – read only, recordable and rewritable.

Pen Drive

A pen drive is a portable memory device that uses solid-state (flash) memory rather than magnetic fields or lasers to record data. It uses a technology similar to RAM, except that it is non-volatile. It is also called a USB drive, key drive or flash drive.

Blu-ray Disk

Blu-ray Disk (BD) is an optical storage medium used to store high definition (HD) video and other multimedia files. BD uses a shorter-wavelength laser compared to CD/DVD. This enables the laser to be focused more tightly on the disk and hence pack in more data. BDs can store up to 128 GB of data.

Optical Disk

An optical disk is any computer disk that uses optical storage techniques and technology to read and write data. It is a computer storage disk that stores data digitally and uses laser beams (transmitted from a laser head mounted on an optical disk drive) to read and write data.

An optical disk is primarily used as a portable and secondary storage device. It can store more data than the previous generation of magnetic storage media, and has a relatively longer lifespan. Compact disks (CD), digital versatile/video disks (DVD) and Blu-ray disks are currently the most commonly used forms of optical disks. These disks are generally used to:

  • Distribute software to customers
  • Store large amounts of data such as music, images and videos
  • Transfer data to different computers or devices
  • Back up data from a local machine

Cache Memory

Cache Memory is a special very high-speed memory. It is used to speed up memory access and to keep pace with the high-speed CPU. Cache memory is costlier than main memory or disk memory but more economical than CPU registers. Cache memory is an extremely fast memory type that acts as a buffer between RAM and the CPU. It holds frequently requested data and instructions so that they are immediately available to the CPU when needed.

Cache memory is used to reduce the average time to access data from the main memory. The cache is a smaller and faster memory which stores copies of the data from frequently used main memory locations. There are usually several independent caches in a CPU, which store instructions and data separately.

Levels of memory:

  • Level 1 or Register

    Registers are memory locations inside the CPU that hold the data and instructions being worked on at that instant. Commonly used registers include the accumulator, the program counter and address registers.

  • Level 2 or Cache memory

    It is a very fast memory with a short access time, in which frequently used data is temporarily stored for quick access.

  • Level 3 or Main Memory

    It is the memory the computer is currently working on. It is small compared to secondary memory, and once the power is off the data no longer stays in this memory.

  • Level 4 or Secondary Memory

    It is external memory which is not as fast as main memory but data stays permanently in this memory.

Cache Performance:

When the processor needs to read or write a location in main memory, it first checks for a corresponding entry in the cache.

  • If the processor finds that the memory location is in the cache, a cache hit has occurred and the data is read from the cache.
  • If the processor does not find the memory location in the cache, a cache miss has occurred. For a cache miss, the cache allocates a new entry and copies in data from main memory, then the request is fulfilled from the contents of the cache.

The performance of cache memory is frequently measured in terms of a quantity called Hit ratio.

Hit ratio = hits / (hits + misses) = no. of hits / total accesses

We can improve cache performance by using a larger cache block size, higher associativity, reducing the miss rate, reducing the miss penalty, and reducing the time to hit in the cache.
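
As a minimal sketch, the hit ratio defined above can be combined with a hit time and a miss penalty to estimate the average time to access memory; the counts and latencies below are assumed, illustrative values, not measurements.

    # Illustrative numbers only: the counts and latencies below are assumed,
    # not measured on any real system.
    hits, misses = 950, 50
    hit_time_ns = 1.0          # time to read from the cache on a hit
    miss_penalty_ns = 50.0     # extra time to fetch the block from main memory

    hit_ratio = hits / (hits + misses)   # hits / total accesses, as defined above
    miss_rate = 1.0 - hit_ratio

    # Standard average memory access time: hit time + miss rate * miss penalty.
    average_access_ns = hit_time_ns + miss_rate * miss_penalty_ns

    print(f"hit ratio = {hit_ratio:.2%}, average access time = {average_access_ns:.1f} ns")
    # hit ratio = 95.00%, average access time = 3.5 ns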

Cache Mapping:

There are three different types of mapping used for the purpose of cache memory which are as follows: Direct mapping, Associative mapping, and Set-Associative mapping. These are explained below.

  1. Direct Mapping

    The simplest technique, known as direct mapping, maps each block of main memory into only one possible cache line.

    In direct mapping, each memory block is assigned to a specific line in the cache. If that line is already occupied when a new block needs to be loaded, the old block is overwritten. An address is split into two parts, an index field and a tag field; the tag is stored in the cache along with the block, while the index selects the cache line. Direct mapping's performance is directly proportional to the hit ratio.

i = j modulo m

where

i = cache line number
j = main memory block number
m = number of lines in the cache

For purposes of cache access, each main memory address can be viewed as consisting of three fields. The least significant w bits identify a unique word or byte within a block of main memory; in most contemporary machines, the address is at the byte level. The remaining s bits specify one of the 2^s blocks of main memory. The cache logic interprets these s bits as a tag of s − r bits (the most significant portion) and a line field of r bits. This latter field identifies one of the m = 2^r lines of the cache.
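
The address split described above can be sketched in a few lines of code. The cache geometry used here (a 16-bit byte address, 4-byte blocks and a 16-line cache) is assumed purely for illustration:

    # Sketch of the direct-mapping address split described above.
    # Assumed geometry (for illustration only): 16-bit byte address,
    # 4-byte blocks (w = 2) and a 16-line cache (r = 4), so the tag is
    # the remaining s - r high-order bits.

    ADDRESS_BITS = 16
    W = 2                        # bits identifying the byte within a block
    R = 4                        # bits identifying the cache line (m = 2**R lines)
    S = ADDRESS_BITS - W         # bits of the main memory block number

    def split_address(addr):
        word  = addr & ((1 << W) - 1)    # byte within the block
        block = addr >> W                # main memory block number j
        line  = block % (1 << R)         # i = j modulo m
        tag   = block >> R               # remaining S - R high-order bits
        return tag, line, word

    tag, line, word = split_address(0xBEEF)
    print(f"tag={tag:#x} line={line} word={word}")   # tag=0x2fb line=11 word=3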

  2. Associative Mapping:

    In this type of mapping, associative memory is used to store both the content and the address of the memory word. Any block can go into any line of the cache. This means that the word ID bits are used to identify which word in the block is needed, and the tag becomes all of the remaining address bits. This enables the placement of any block at any place in the cache memory. It is considered to be the fastest and the most flexible mapping form.

  3. Set-associative Mapping:

    This form of mapping is an enhanced form of direct mapping in which the drawbacks of direct mapping are removed. Set-associative mapping addresses the problem of possible thrashing in the direct mapping method. It does this by grouping a few lines together into a set, so that instead of having exactly one line that a block can map to in the cache, a block in memory can map to any one of the lines of a specific set. Set-associative mapping thus allows two or more blocks of main memory that share the same index to be present in the cache at the same time. It combines the best of the direct and associative cache mapping techniques; a small sketch of how the set is selected follows below.
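
As a small sketch of the idea (the 2-way, 8-set geometry is assumed for illustration), the set index and tag are computed much as in direct mapping, except that the block may then occupy any line within its set:

    # Sketch of set selection in a set-associative cache. The geometry
    # (2-way, 8 sets, 4-byte blocks) is assumed for illustration.
    # Direct mapping is the special case of 1-way sets, and a fully
    # associative cache is the special case of a single set.

    BLOCK_BYTES = 4
    WAYS = 2
    NUM_SETS = 8

    def set_and_tag(addr):
        block_number = addr // BLOCK_BYTES
        set_index = block_number % NUM_SETS   # which set the block must go in
        tag = block_number // NUM_SETS        # identifies the block within its set
        return set_index, tag

    # The block may then be placed in any of the WAYS lines of its set.
    print(set_and_tag(0x1A4))   # (1, 13)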

Application of Cache Memory:

  1. Usually, the cache memory can store a reasonable number of blocks at any given time, but this number is small compared to the total number of blocks in the main memory.
  2. The correspondence between the main memory blocks and those in the cache is specified by a mapping function.

Types of Cache:

  • Primary Cache:

    A primary cache is always located on the processor chip. This cache is small and its access time is comparable to that of processor registers.

  • Secondary Cache:

    Secondary cache is placed between the primary cache and the rest of the memory. It is referred to as the level 2 (L2) cache. Often, the Level 2 cache is also housed on the processor chip.

Locality of reference:
Since the size of cache memory is small compared to main memory, which part of main memory should be given priority and loaded into the cache is decided on the basis of locality of reference.

Types of Locality of reference

  1. Spatial Locality of reference

    This says that if a memory location is referenced, there is a high chance that locations in close proximity to it will be referenced soon after, which is why the words around the point of reference are brought into the cache along with it.

  2. Temporal Locality of reference

    This says that a word that has been referenced recently is likely to be referenced again in the near future, so recently used blocks are kept in the cache. Replacement policies such as Least Recently Used (LRU) exploit temporal locality by evicting the block that has gone unused for the longest time when room is needed for a new block.
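
A common way to exploit temporal locality is a Least Recently Used (LRU) replacement policy, as mentioned above. The sketch below is a generic illustration rather than the scheme of any particular processor; the LRUCache class and its method names are made up for the example.

    from collections import OrderedDict

    # Minimal LRU cache sketch (the class and names are made up for this
    # example): the block unused for the longest time is evicted first.

    class LRUCache:
        def __init__(self, capacity):
            self.capacity = capacity
            self.blocks = OrderedDict()     # key -> cached block, oldest first

        def access(self, key, load_from_memory):
            if key in self.blocks:          # cache hit: mark as most recently used
                self.blocks.move_to_end(key)
                return self.blocks[key]
            data = load_from_memory(key)    # cache miss: fetch the block
            if len(self.blocks) >= self.capacity:
                self.blocks.popitem(last=False)   # evict least recently used block
            self.blocks[key] = data
            return data

    cache = LRUCache(capacity=2)
    for block in [1, 2, 1, 3, 1]:           # block 2 is evicted; blocks 1 and 3 remain
        cache.access(block, load_from_memory=lambda k: f"block {k}")
    print(list(cache.blocks))               # [3, 1]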
