01101011
The sign (most significant) bit is 0, therefore the number is positive.
The exponent bits are 110. In three-bit excess (excess-4) notation, (110)2 = (6)10 and 6 - 4 = 2, so the exponent is +2.
The bit pattern representing the mantissa in this number is 1011. The radix point is always assumed to lie to the left of the leftmost bit of the mantissa. The mantissa is therefore always a fraction, and it should be interpreted as being equal to
.1011
Since the exponent is +2, the real position of the radix point is found by moving it two digits to the right in the mantissa. This gives
10.11
The overall number in binary is then (+10.11)2. This is equal to (2 3/4)10, i.e. 2.75 in decimal.
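The steps above can be sketched in Python. This is a minimal sketch assuming exactly the format described: one sign bit, three exponent bits in excess-4 notation, and a four-bit mantissa with the radix point assumed to the left of its leftmost bit. The function name is invented for illustration.

```python
def decode_float8(bits):
    """Decode an 8-bit pattern: 1 sign bit, 3 excess-4 exponent bits,
    4 mantissa bits (radix point assumed left of the mantissa)."""
    sign = -1 if bits[0] == "1" else 1       # sign bit 0 means positive
    exponent = int(bits[1:4], 2) - 4         # excess-4: stored value minus 4
    mantissa = int(bits[4:], 2) / 16         # e.g. 1011 -> .1011 = 11/16
    return sign * mantissa * 2 ** exponent

value = decode_float8("01101011")  # the worked example above: 2.75
```

Running it on the worked example gives 0.6875 × 2² = 2.75, matching the hand calculation.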
The next layer is the Microcode layer. Microcode is the code that directly controls the central processing unit; a single machine instruction is carried out by a sequence of much simpler microcode instructions. Device drivers are also part of this layer: they are very low-level instructions to the hardware devices in the computer, e.g. graphics card drivers. The job of the microcode layer is to translate instructions into the signals that control the functions of the various chips. As a machine instruction is executed, the microcode layer reads it and loads the sequence of microcode instructions that corresponds to it; that sequence then carries out everything the original instruction required.
The next layer is the Machine layer. Machine code is the native language of the computer and consists entirely of 0s and 1s. The output of any programming language is ultimately machine code: after a program has been written, its statements are translated into machine code, which is stored as an executable file so that it can be run. The processor in a modern computer is designed to know which bit patterns make it perform which operations, so it can look at a set of bits and interpret them in the correct sequence.
The next layer is the operating system layer. The operating system is the program that loads and runs all of the other software on a computer; these programs are often referred to as applications. Applications can make requests to the operating system for system resources such as memory. Users communicate with the operating system, and run applications, through either a command line interface or a graphical user interface. The operating system is also responsible for deciding which applications should run and for how long before another takes its turn; it manages internal memory, deciding how much each application should receive; and it handles input and output devices and hardware such as graphics cards and printers.
The next layer is the assembly layer. An assembler is a program that converts the computer's basic instructions, written as mnemonics, into the patterns of bits that the processor uses to perform its basic operations. Assembler instructions can be used to write a complete program; the sequence of instructions is given to the assembler program when it is started. The assembler works through the program statement by statement, generating for each one a corresponding binary code of a fixed length.
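A toy assembler along these lines can be sketched in a few lines of Python. The mnemonics and their bit patterns here are entirely invented for a hypothetical machine with a 4-bit opcode and a 4-bit operand address; the point is only that each statement becomes one fixed-length binary word.

```python
# Invented encoding for a hypothetical machine: 4-bit opcode + 4-bit operand.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

def assemble(statement):
    """Turn one assembler statement into one 8-bit binary word."""
    mnemonic, operand = statement.split()
    return OPCODES[mnemonic] + format(int(operand), "04b")

program = ["LOAD 5", "ADD 6", "STORE 7"]
machine_code = [assemble(s) for s in program]  # one word per statement
```

Each source statement maps to exactly one binary word, which is why assembly programming is so close to the machine layer below it.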
The next layer is the high-level language layer. This layer supports procedural languages such as COBOL, FORTRAN and C. A high-level language is designed mainly for humans, being based on our language rather than on the machine language of the lower layers. High-level languages check data types and data ranges and support complicated expressions and statements.
The high-level language layer relies on a compiler. A compiler is a special program that processes statements written in a particular language and turns them into machine language. The file the programmer creates contains source statements; the programmer then runs the appropriate compiler, specifying the name of the file that contains those statements. The compiler analyses the source statements one after the next and builds the output in stages, making sure that references between statements end up in the correct order in the final code. The output of the compilation is usually called object code, or sometimes an object module. The processor can then execute the object code one instruction at a time.
High-level languages also provide floating point types. Floating point is used to represent numbers with fractional parts, which can be either very small or very large; in high-level languages it lies behind variable types such as float or real. Because the machine works only in binary, writing complex numerical code would be very hard without some way of handling fractional values, and floating point is the most common method of overcoming this limitation. Floating point representation is the hardest to understand by inspection, so examples are usually given first in decimal notation and then developed in binary.
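The same sign/exponent/mantissa idea from the 8-bit example scales up to the format real machines use. As a sketch, Python's standard `struct` module can expose the IEEE 754 single-precision bit pattern (1 sign bit, 8 exponent bits, 23 mantissa bits) behind an ordinary float such as 2.75:

```python
import struct

def float_bits(x):
    """Return the 32-bit IEEE 754 single-precision pattern for x."""
    (packed,) = struct.unpack(">I", struct.pack(">f", x))
    return format(packed, "032b")

bits = float_bits(2.75)
sign, exponent, mantissa = bits[0], bits[1:9], bits[9:]
# 2.75 = 1.011 x 2^1, stored with a biased (excess-127) exponent of 128
```

The excess notation reappears here: the stored exponent 10000000 is 128, and 128 - 127 = 1, matching 2.75 = 1.011 × 2¹ in binary.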
The next layer is the application layer. This is the topmost layer and the one most computer users are familiar with. It is the layer where you browse the internet, write e-mails and use the various programs, or applications. An application is any program designed to perform a function directly for the user; examples include Microsoft Word and Microsoft Excel. Applications run on the operating system and use the computer's resources.
Principle of the Von Neumann cycle
The Von Neumann cycle is named after John von Neumann. Von Neumann studied mathematics and in 1946 co-authored a report on computer construction. His main principles were that information is stored on a slow-to-access storage medium, that information is stored as binary digits, and that information can be interpreted as either instructions or data and is worked on in fast volatile memory. He also said that each cell in memory should have a unique address and that a processor manipulates the data according to built-in operations.
Von Neumann said that the computer system should consist of separate parts, unlike some of the earlier single-unit computer systems. The five main parts should be memory, the processor, input, output and the control unit. The memory is responsible for storing data that is to be either read or written, and two important registers are associated with it: the memory address register, which stores the address of a memory location, and the memory data register, which holds either the data just read or the data about to be written. The processing unit carries out the actual processing of data; it consists of different units which determine which operations to perform, and a basic processing unit consists of an ALU (arithmetic logic unit) and temporary storage (TEMP). Input and output refer to devices which move data into or out of the computer system. These are not part of the main processing components and may be called peripherals. Input devices, such as the keyboard and mouse, transfer information to the processor; output devices, such as the monitor and printer, transfer information from it. The control unit executes instructions and keeps track of instructions that have not yet been completed. Its components are the instruction register, which stores the instruction currently being executed, and the program counter, which stores the address of the next instruction. The control unit decodes instructions, breaking each one down into steps and cycles; it issues control signals telling each unit what to do in a cycle, and it fetches instructions from memory to ready them for execution.
The main phases of the Von Neumann cycle are the fetch phase, where an instruction is fetched from memory, the decode phase, where the instruction is interpreted by the CPU, and the execute phase, where the instruction is carried out. The Von Neumann cycle has influenced the design of conventional machines: even today the Von Neumann architecture remains the basis of high-level programming languages and of all modern microprocessors. Its influence is clear because computers today still work with binary numbers, operate electronically and perform one operation at a time. A conventional machine also contains the key components Von Neumann stated a computer system needed, such as a central control unit, memory and input units. All of this shows that Von Neumann's principles have guided the design of modern computer systems.
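The fetch, decode and execute phases can be sketched as a small Python loop. The instruction format here, a pair of (operation, operand address), and the three operations are invented for illustration; the point is the cycle itself: fetch the instruction at the program counter, advance the counter, then decode and execute.

```python
# Instructions and data share one memory, as Von Neumann's design requires.
memory = {0: ("LOAD", 10), 1: ("ADD", 11), 2: ("HALT", 0),
          10: 3, 11: 4}                   # data stored at addresses 10 and 11
pc, acc, running = 0, 0, True             # program counter and accumulator

while running:
    op, addr = memory[pc]                 # fetch the instruction at pc
    pc += 1                               # the control unit advances the counter
    if op == "LOAD":                      # decode and execute
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "HALT":
        running = False
```

After the loop the accumulator holds 3 + 4 = 7, produced one instruction at a time, exactly as the cycle describes.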
Difficulties of assembly languages
Every computer language must eventually be converted into machine or assembly language, both of which are low-level languages because they are closer to the machine than to the user. There are, however, difficulties in using low-level languages directly. The language is designed primarily for the computer, so it is difficult for people to learn and master. The way of solving this problem is a series of layers, each built on the one directly below: you design a new set of instructions that is convenient for people to use, and these new instructions become a new language. Because instructions can only be executed at the machine level, a method had to be devised to let the computer understand the languages of the levels above it.
The components that do this are called translators and interpreters. Low-level languages are also poorly suited to application requirements: because the language is hard to use, it takes a long time to develop programs with it, and the programmers have to understand how the machine works in detail. Machine language programming is also slow because one program statement corresponds to a single machine operation, which makes writing a long program very tedious.
Debugging is also a difficult process with machine language, because the problem could lie in either the hardware or the software, so much more time must be spent testing both. Another difficulty is that machine language is cryptic code that humans find hard to read; we prefer pictures and familiar language constructs, because that is what we are used to.
GUIs and multi-user operating systems
The way users perform functions within the operating system depends on the type of user interface they are using. The user interface sits between the user and the programs they are running. The two main types of interface are graphical and text based, better known as the graphical user interface and the command line interface.
A GUI is a graphical user interface to the computer. Most operating systems around today, such as Windows XP, use a graphical interface. A GUI can include various graphical elements such as windows, pull-down menus, buttons and wizards. An operating system that is quite popular at the moment is UNIX. UNIX is a multi-user operating system, which means it allows many different users to use the available system resources at the same time. The operating system makes sure that resources are shared fairly and that the programs being run have adequate, separate resources, so that a problem with one user's work does not affect another's. One advantage of a multi-user operating system is that the machine and software are shared between multiple users, reducing cost. Another is that individual users do not have to spend time backing up the system, because a system supervisor usually makes regular back-ups. UNIX has the further advantage that it is written in the high-level language C, which makes it easy to install and run on new computer systems; the original UNIX, designed at AT&T's Bell Labs in the late 1960s, was written in assembly language, which limited its portability to other systems. UNIX was designed by a small group of programmers to run their own applications, and it uses a command line, or shell, interface.
With a command line interface you execute commands by typing the name of the command at a command prompt; after a command has been entered, the CLI waits for the user to enter more. Some CLIs provide facilities for editing and recalling commands, so that commands already entered can be repeated. Some interfaces let you run multiple commands at the same time, which is useful when a task requires several commands. More sophisticated interfaces reduce the amount of typing required, so that only part of a command name has to be entered. Interfaces can also let users create shortcuts to the commands they use most often by putting those commands into a file; when the file is run, the commands are executed. A sequence of commands stored this way is known as a script file, and some interfaces support scripting languages that let the user automate complex tasks.
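The prefix-matching facility described above can be sketched as a tiny command interpreter in Python. The command table and its two entries are invented for illustration; a real shell's lookup is far richer, but the idea of accepting any unambiguous prefix is the same.

```python
# Hypothetical command table: name -> action.
COMMANDS = {"list": lambda: "listing files", "quit": lambda: "bye"}

def run_command(name):
    """Run a command by full name or by any unambiguous prefix."""
    matches = [c for c in COMMANDS if c.startswith(name)]
    if len(matches) == 1:          # exactly one command fits: run it
        return COMMANDS[matches[0]]()
    return "unknown or ambiguous command"
```

Typing just "li" is enough to select "list", while a name matching no command, or more than one, is rejected.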
A personal computer running Windows 95 or 98 probably isn't shared unless it is on a network, or Local Area Network, where many microcomputers are connected together; the LAN allows the sharing of data and peripherals, much like a multi-user operating system. The disadvantage of the older multi-user operating systems is that the more users share the system, the slower it becomes, whereas on a LAN there is no slow-down even when many users are connected at once, because each computer has its own operating system. The multi-user operating system isn't as popular as it once was and is being replaced by the LAN, because personal computers have become cheaper and more efficient.
Business Problems
Most modern businesses now use computer systems to store all of their information and data, so it is vital that the information is up to date, that the business can understand the data, and that storing it is cost effective. The data must also be easy to access, to save time. A business can lose a lot of money, or even go out of business, if vital data is lost or corrupted, so it is important that the data is organised and backed up. In the earlier days of computing, when technology was less advanced, mainframe computers dominated and the data file was the main method of storage. Information was obtained through file processing, using third-generation programming languages such as COBOL, and data was usually stored on magnetic tape. Modern systems use databases and disc-based systems to store and access important data. A disc system is more efficient than magnetic tape because records can be accessed randomly, whereas on magnetic tape records must be read one at a time, which takes much longer.
Thanks to disc technology, databases have become more advanced and more efficient. Most business users need accurate and reliable data to run their business successfully, especially if the business handles money, e.g. banking or insurance; some users may only need data that is accurate to a certain degree, as in weather forecasting. Data files can be accessed in three different ways: sequentially, by primary key, or by secondary key. Sequential access is the method used with magnetic tape. It takes a long time because every record must be processed until the required record is found, but it works if you are prepared to wait until you arrive at the record you need. It was used in the early days of business computing, on tape-based systems; it is not used in modern businesses because there is no longer any need to process data sequentially and tape systems are no longer used for this purpose.
Primary key access is another method of file processing. To access a record directly you use a primary key to identify it: a primary key is a field within a record that uniquely identifies it, e.g. a student id number could be the primary key of the entity student details. The other method is secondary key access, which processes records in groups sharing the same attribute values, e.g. extracting the records of all students who take a computing course or who live in a certain area. Businesses can organise their data files in several ways.
One method is sequential file organisation, where records are stored in order of their primary key. No index file is used, so records must be read one at a time; this method is poorly suited to applications that need direct access, and since magnetic tape is the only medium that supports it, its use is limited. Another method is direct organisation, which gives the user random access to records and allows records to be amended in real time for a small number of requests. Its main feature is the relative record address, which is derived from the record's primary key by a process called hash coding: the primary key is divided by a prime number and the remainder is taken as the address.
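The hash coding step is easy to sketch. The prime and the two student id numbers below are invented; the example also shows why a real system needs some way of handling collisions, since two different keys can land on the same address.

```python
PRIME = 101  # hypothetical prime chosen for a small file

def relative_address(primary_key):
    """Hash coding: remainder after dividing the key by a prime."""
    return primary_key % PRIME

addr_a = relative_address(20231234)   # invented student id
addr_b = relative_address(20231335)   # 101 greater, so it collides with addr_a
```

Both keys hash to address 25, because they differ by exactly the prime; in practice the file system must resolve such collisions, for example by overflow areas.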
The last method is indexed file organisation. With this method a separate index file stores the address of each record in the primary file, so the actual address of a record is looked up in the index.
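A minimal sketch of indexed organisation, with the index and primary file modelled as Python dictionaries (the keys and record contents are invented):

```python
# Primary file: record address -> record contents.
primary_file = {0: "record for key 1001", 1: "record for key 1002"}
# Separate index file: primary key -> record address.
index = {1001: 0, 1002: 1}

def fetch(key):
    """Look up the record's address in the index, then read the record."""
    address = index[key]
    return primary_file[address]
```

The index adds one extra lookup but lets any record be reached directly by its primary key, without scanning the file sequentially.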
Because it is important that businesses do not lose their data, they usually back it up at regular intervals; a back-up is an exact copy of the original data. On a large system the systems administrator should make regular back-ups, ideally when the system is not in use, which is usually at night. Then, if the system crashes, the back-up medium can be used to restore it to how it was before the data was lost or corrupted.
Another back-up method used in businesses is a RAID system set up for disc mirroring. Whatever is written to the first hard disc is automatically written to the second, so if one disc crashes the data still exists on the other. This safeguards against data loss because it acts as a continuous back-up process, though it should not be relied upon on its own, since other hardware may still become faulty. The main advantage of a mirrored RAID system is that it tolerates disc faults: built-in diagnostics detect a fault, switch all operations over to the surviving disc, and report the fault to the administrator, who can then arrange to fix it when the business isn't busy.
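The mirroring idea can be sketched with two in-memory "discs" standing in for real hardware (the class and its data are invented for illustration): every write goes to both copies, so either copy alone can satisfy a read after the other fails.

```python
class MirroredStore:
    """Toy sketch of RAID 1 disc mirroring with two dictionaries as discs."""

    def __init__(self):
        self.disc_a, self.disc_b = {}, {}

    def write(self, block, data):
        self.disc_a[block] = data   # every write goes to both discs
        self.disc_b[block] = data

    def read(self, block):
        # If disc_a has failed, the mirror still holds the block.
        return self.disc_a.get(block, self.disc_b.get(block))

store = MirroredStore()
store.write(0, "payroll data")
store.disc_a.clear()                # simulate a crash of the first disc
```

Even after the simulated crash, the block can still be read from the surviving copy, which is exactly the continuous-back-up property described above.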
Conclusion
I have come to the conclusion that all of the aspects I have investigated play an important role in a computer system. The principal layers matter because, without a layered system, it would be very hard to write programs for a computer, since the machine layer is not designed for humans. The Von Neumann cycle is important because its structure forms the basis of the computer system and of instruction execution. I have also discovered the importance of high-level languages, because low-level languages are difficult and tedious to use, being designed for the machine rather than for people. The development of graphical user interfaces has made computer systems much easier to use: instead of the old command line interfaces, where all you could do was type text at a prompt, you now have pictures and graphics and a mouse cursor to select things. I have also discovered the importance of data to businesses, because losing important data can cost a great deal of money and potentially bankrupt a business, and I have looked at the ways businesses back up their data using tape drives and RAID systems. From investigating the tasks for this assignment I have gained a greater knowledge of operating system principles, machine architectures and file handling.