Only once in a lifetime does a new invention come about that touches every aspect of our lives. A device that changes the way we work, live, and play is a special one. This report will discuss the evolution and history of two RISC processors, SPARC and ARM, and then compare and contrast them with the CISC processors developed by Intel and Motorola, covering instruction sets, addressing modes, arithmetic units, processor acceleration techniques, and support for operating systems and high-level programming languages.

Processors are a specific expression of what's known as an integrated circuit. Look at an old-fashioned radio or TV and you'll see discrete wires connecting the parts. Newer models have circuit boards in which lines of metal embedded in the board serve as wires connecting the components. In an integrated circuit, both the parts (in this case, transistors) and the wires are etched photographically into the chip. A microprocessor is really just a specific kind of integrated circuit. While the design of microprocessors has grown more sophisticated, improvements in manufacturing are the key to the way Intel has been able to double its processors' performance roughly every 18 months. By making parts of the chip smaller, they use less electricity, and more parts can be packed into the same space.
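The compounding effect of that doubling claim is easy to underestimate. As a quick sketch (taking the quoted 18-month doubling period at face value), the cumulative performance multiplier after a given number of years is:

```python
# Performance multiplier implied by doubling every 18 months (1.5 years).
def moore_factor(years, doubling_period_years=1.5):
    """Return the cumulative performance multiplier after `years`."""
    return 2 ** (years / doubling_period_years)

print(moore_factor(3))    # 3 years  -> 4x
print(moore_factor(15))   # 15 years -> 1024x
```

Fifteen years of such doubling yields a thousand-fold improvement, which is roughly the gap between the early chips described below and the Pentium-era parts.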

Semiconductor manufacturing may be unique among high-technology industries in combining three characteristics, each of which has parallels in other industries but which arguably occur together only in semiconductors. The first is the sheer complexity of the product and process technologies, which means a manufacturer's ability to predict the performance of a new manufacturing process is very limited.

A year later Intel introduced the 8080. With its 8-bit data bus, the 8080 could process 0.64 MIPS using its 6,000 transistors, built on 6-micron technology. The line is worth mentioning primarily because IBM chose its descendant, the 8088, for its first personal computer: the 8088's 8-bit external bus let IBM use existing 8-bit hardware, which was more cost-effective. Later IBM began using the 8086 in its newer systems (Rosch, 68).

In 1982 Intel released the 80286. The 286 family was available in clock speeds of 8, 10, and 12 MHz, executing roughly 1.2, 1.5, and 1.66 MIPS respectively. The 80286 contained 134,000 transistors built with 1.5-micron technology. These chips all used a 16-bit data bus and were used by IBM in its AT models. The 286 was also the first Intel chip to support virtual memory, that is, using disk space as an extension of RAM (Random Access Memory). To allow full backward compatibility, the 286 was designed with two operating modes.
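The clock-speed and MIPS figures quoted above imply how many clock cycles the 286 needed, on average, per instruction. A quick sketch of that arithmetic (using only the numbers from this paragraph):

```python
# Implied average cycles per instruction (CPI) for the 80286:
# CPI = clock frequency (MHz) / throughput (MIPS).
def implied_cpi(clock_mhz, mips):
    return clock_mhz / mips

for f, m in [(8, 1.2), (10, 1.5), (12, 1.66)]:
    print(f"{f} MHz at {m} MIPS -> ~{implied_cpi(f, m):.1f} cycles/instruction")
```

The figures work out to roughly 7 cycles per instruction across the family, a useful baseline when comparing CISC designs like this against RISC processors, which aim for far fewer cycles per instruction.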

These modes are real mode and protected mode. Real mode mimics the operation of an 8086. Protected mode allows multiple applications to run simultaneously without interfering with each other (Rosch, 70-71).
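The "disk space as RAM" idea can be illustrated with a toy address-translation sketch. This is a deliberately simplified paging-style model, not the 286's actual segment-based scheme: a virtual address is split into a page number and an offset, and a page not present in RAM is loaded from disk on demand (a page fault).

```python
# Toy illustration of virtual memory (hypothetical simplification,
# not the 80286's real segment-based mechanism).
PAGE_SIZE = 4096

ram = {}                          # frame number -> page contents
disk = {0: b"code", 1: b"data"}   # pages currently swapped out to disk
page_table = {0: None, 1: None}   # page number -> frame (None = on disk)
next_frame = 0

def access(vaddr):
    """Translate a virtual address; fault pages in from disk as needed."""
    global next_frame
    page, offset = divmod(vaddr, PAGE_SIZE)
    frame = page_table[page]
    if frame is None:             # page fault: copy the page into RAM
        frame = next_frame
        next_frame += 1
        ram[frame] = disk[page]
        page_table[page] = frame
    return ram[frame], offset

contents, offset = access(PAGE_SIZE + 7)   # touches page 1 at offset 7
```

Because each process gets its own page table in a real system, one application cannot address another's memory, which is the isolation that protected mode provides.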

The next member of the Intel family, the 80386, was added in November 1985. These chips were offered at speeds of 16, 20, 25, and 33 MHz, processing 5.5, 6.5, 8.5, and 11.4 MIPS respectively. The 80386 contains 275,000 transistors built with 1.5-micron technology. The 386 family doubled the register size to 32 bits. The 386 also has a 16-byte prefetch queue in which the chip stores the next few instructions.
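The prefetch mechanism is essentially a small FIFO buffer that the bus unit keeps topped up ahead of execution, so the decoder rarely has to wait on memory. As a rough illustration (a sketch of the idea, not Intel's actual microarchitecture):

```python
# Toy sketch of a 16-byte instruction prefetch queue.
from collections import deque

memory = list(range(64))       # stand-in for a stream of instruction bytes
queue = deque(maxlen=16)       # the 16-byte prefetch queue
fetch_ptr = 0                  # next address the bus unit will fetch

def refill():
    """Bus unit: fill the queue ahead of the decoder."""
    global fetch_ptr
    while len(queue) < 16 and fetch_ptr < len(memory):
        queue.append(memory[fetch_ptr])
        fetch_ptr += 1

def next_byte():
    """Decoder: consume the next instruction byte from the queue."""
    refill()
    return queue.popleft()

first = next_byte()            # bytes come out in fetch order
```

The trade-off is that a taken branch invalidates the queued bytes, which is why later processors paired prefetching with branch prediction.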

The 386 came in three models: the 386DX, 386SX, and 386SL. The 386DX was the original and most powerful. The 386SX is a more economical sibling to the DX, essentially a scaled-down, less powerful DX, and it uses a 16-bit data bus.

Ever since this first processor was introduced, the market has done nothing but soar to unbelievable highs. The first processor common in personal computers was the 8088, introduced in June of 1978. It could be purchased at three different clock speeds, starting at 5 MHz and going up to 10 MHz. This CPU had 29,000 transistors. The 80286 and 80386 processors then evolved; the 386 was the first processor to be introduced in DX, SX, and SL versions. Next came the 80486 processors, which offered even more choices. The first 486 processor had 1.2 million transistors and the latest have 1.4 million. Their speeds varied anywhere from 16 MHz on the first ones to 100 MHz on the most recent 486 processors, some of which are still in use in homes all around the country.

In March 1993, Intel introduced the Pentium processor. It ran at clock speeds of 60 and 66 MHz. These first Pentium processors had 3.1 million transistors and a 32-bit architecture.

The P7 (a designation originally given to a 64-bit 80x86 design, which was dropped in favour of IA-64) was first released as the Pentium 4 in December 2000.

The SPARC and ARM processors are a mere page in the great book of processor history, and many new and very different processors will appear in the near future. A tremendous amount of time and money has been put into making and improving the processor, and billions of pounds of investment continue to go toward refining it. The processor will continue to evolve for the better until a much faster and more efficient electronic device is invented; this in turn will create a whole new and powerful generation of computers. Hopefully this report has given the reader some insight into the world of these two RISC processors and how they compare to the CISC architectures developed by Intel and Motorola.
