Software Evaluation
An evaluation report can be written to help decide which software to buy. Factors to consider include:
- Cost
- Ease-of-use (including the HCI)
- Training needs
- Availability of technical support
- Documentation
- Compatibility with existing software and hardware
- Upgradability
- Portability
- Robustness
- Results of benchmark tests of performance
- Reputation of manufacturer
Each factor is given a weighting according to the needs of the organisation.
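The weighting idea above can be sketched as a simple weighted-scoring calculation. The factor names, weights and scores below are illustrative, not taken from any real evaluation:

```python
# Weighted-scoring sketch for a software evaluation report.
# Weights reflect how important each factor is to the organisation;
# scores (1-10) reflect how well each package performs on that factor.
weights = {"cost": 5, "ease_of_use": 4, "support": 3, "compatibility": 4}

def weighted_score(scores, weights):
    """Sum of score * weight over every factor considered."""
    return sum(scores[factor] * weight for factor, weight in weights.items())

package_a = {"cost": 7, "ease_of_use": 6, "support": 8, "compatibility": 5}
package_b = {"cost": 5, "ease_of_use": 9, "support": 6, "compatibility": 8}

print(weighted_score(package_a, weights))  # 103
print(weighted_score(package_b, weights))  # 111 - package B wins here
```

Changing the weights to match a different organisation's priorities could easily reverse the result, which is the point of weighting the factors in the first place.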
At the start of the systems life cycle, decisions have to be made about how to acquire the software that is needed. The options are:
- Software can be written by the end user
- A specialist department could design, write, test and evaluate the software
- External consultants could be called in to write and test the software
- An off-the-shelf package could be bought
- Software could be leased, with an annual fee payable for use.
You need to understand the option of software emulation and the problems associated with it.
Software can be written by the end-user (maybe using an application generator). What are the dangers here?
You need to understand the difference between Alpha Testing and Beta Testing.
Databases and Networks
You need to understand the advantages of RDBMS software and the principle of normalisation.
DBA - Database Administration
The database administrator should:
- Evaluate the design of the database and arrange for necessary changes to be made
- Inform users of changes
- Maintain the data dictionary (see below)
- Set Access Levels
- Allocate passwords
- Provide training
The Data Dictionary
This is a database about the database. It includes information about:
- Tables
- Data lengths and field types
- Data validation restrictions
- Descriptions of fields
- Relationships
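A data dictionary entry can be pictured as metadata about a table, held separately from the data itself. A minimal sketch, with illustrative table and field names:

```python
# Sketch of a data-dictionary entry: information ABOUT the data
# (types, lengths, validation, descriptions), not the data itself.
# Table and field names are made up for illustration.
data_dictionary = {
    "GUESTS": {
        "Surname": {
            "type": "text",
            "length": 30,
            "validation": "presence check",
            "description": "Guest's family name",
        },
        "DateOfBirth": {
            "type": "date",
            "length": 8,
            "validation": "must be a date in the past",
            "description": "Used to confirm the guest's age",
        },
    }
}

# The DBMS can consult the dictionary, e.g. to look up a field's type:
print(data_dictionary["GUESTS"]["Surname"]["type"])  # text
```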
The DBMS
This is an interface between the operating system and the user, which aims to make access to data as simple as possible. Its other functions are:
- It allows users to store, retrieve and update data
- It maintains the data dictionary
- It allows sharing of data (it can ensure that problems do not occur if the same record is accessed by two people at once)
- Backup and recovery
- Security
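The "two people accessing the same record" problem above can be sketched with a lock: the second user must wait until the first has finished updating. This is a deliberately simplified model; real DBMSs use far more sophisticated concurrency control:

```python
import threading

# Minimal sketch of record locking. One lock guards one record, so
# two concurrent updates cannot interleave and lose each other's work.
record = {"balance": 100}
record_lock = threading.Lock()

def update_balance(amount):
    with record_lock:  # a second user waits here until the first finishes
        current = record["balance"]
        record["balance"] = current + amount

threads = [threading.Thread(target=update_balance, args=(10,)) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(record["balance"])  # 150 - all five updates applied, none lost
```

Without the lock, two threads could both read 100 and both write 110, silently losing an update.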
Normalisation
Normalisation is a process whereby the database designer works out the most efficient structure for the database.
You need to remember why relational databases are preferable to flat files. Flat file databases usually suffer from data redundancy (the same data stored unnecessarily in several places) and data inconsistency (copies of the same item holding different values, which is likely to result in inaccurate information).
However, these problems can also exist in relational databases that have not been normalised because there might be many-to-many relationships.
For example, a hotel might have a database with two entities (tables) called GUESTS and ROOMS. This would be a many-to-many relationship because many guests would stay in many rooms. This problem could be resolved by having a "Junction Table" called BOOKINGS to establish when each guest was staying in each room. There are now three tables and two one-to-many relationships.
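The hotel example above can be sketched with SQLite (bundled with Python). The BOOKINGS junction table turns the many-to-many relationship into two one-to-many relationships; the column names here are illustrative:

```python
import sqlite3

# Sketch of the hotel example: GUESTS and ROOMS linked through a
# BOOKINGS junction table. Column names are made up for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE GUESTS (guest_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE ROOMS  (room_no  INTEGER PRIMARY KEY, floor INTEGER);
    CREATE TABLE BOOKINGS (
        guest_id INTEGER REFERENCES GUESTS(guest_id),
        room_no  INTEGER REFERENCES ROOMS(room_no),
        arrival  TEXT
    );
""")
con.execute("INSERT INTO GUESTS VALUES (1, 'Smith')")
con.execute("INSERT INTO ROOMS  VALUES (101, 1)")
con.execute("INSERT INTO BOOKINGS VALUES (1, 101, '2024-05-01')")

# Each BOOKINGS row links one guest to one room on one date.
row = con.execute("""
    SELECT g.name, b.room_no FROM BOOKINGS b
    JOIN GUESTS g ON g.guest_id = b.guest_id
""").fetchone()
print(row)  # ('Smith', 101)
```

Guest and room details are each stored once, so there is no redundancy; only the small BOOKINGS rows repeat the keys.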
The advantages of normalisation are:
- Elimination of data redundancy
- Elimination of data inconsistency
- Program-Data Independence
Client-Server Database
Here the DBMS software runs on the network server. The server software processes requests for searches, sorts and reports that originate from the DBMS client software (on the workstation). This saves the entire database from having to be copied to workstations whenever people want to perform queries.
Advantages of a client-server database are:
- An expensive resource can be made available to a large group
- Client stations can update the database
- Data consistency is ensured
- Processing is carried out by the server
- Communication time is reduced
- Reports can be held on the workstation and customised
- Strong centralised security
Disadvantages of client-server based networks are that the network is dependent on the server. Regular maintenance and backup are essential.
Client-Server v Peer-to-Peer Networks
Most networks operate on a client-server system, except for small networks of not more than 10 machines. These days, clients are normally not "dumb terminals" because they have their own processor. Applications may be loaded into the client's RAM but printing tasks may be handled by the server. This includes managing print queues. Backup and security are also handled by the server.
Peer-to-Peer networks are usually small networks of not more than 10 machines. Software may be held on any of the computers and it is made available to any other computer. An office can share disk space, software and data.
Dispersed Systems
Because the price of hardware has fallen, it has become more cost-effective to move the processing power to where it is needed, i.e. on desktop machines. Word processing and spreadsheet software has made desktop processing very popular. Standalone machines have now been replaced by networked workstations, to allow data to be shared.
Thin Client Computing
This means having little or no processing power in the client computer. Thin clients are more secure because users cannot install their own software or introduce viruses. Thin client networks are easier to administer.
Distributed Databases
A distributed database is a database that consists of two or more data files located at different sites on the network.
Because the database is distributed, different users can access it without interfering with one another. However, the scattered data must be periodically synchronised to ensure data consistency.
Advantages: Can provide local autonomy, gives the advantage of being able to share data.
Disadvantages: Distributed systems are likely to be more complex and expensive to install and maintain. The need to transfer data increases the security risks.
LANs - Local Area Networks
No telecommunication lines are needed because computers on one site are linked together.
- Files, printers, scanners and disk space can be shared
- Users can communicate using software e.g. Lotus Notes
- Software on the server can be used by anyone
- All users can access the same database
- Backup can be done automatically across the network
You should also be able to give an example of a WAN (Wide Area Network).
To have a network, you need:
- File Server(s) - a powerful computer that stores and distributes files across the network.
- Network Adapter - each workstation needs a network card
- Cabling - to physically connect the computers
- Network Operating System
- Network Accounts - each user has a user ID and password to access their space on the network.
Access Levels
Typical levels of access are:
- No access
- Read only
- Read and Copy
- Read and Update
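The access levels above can be sketched as a simple ordered check: each user ID maps to a level, and the network compares levels before allowing an operation. User names and level labels are illustrative:

```python
# Sketch of access levels: higher numbers grant more rights.
# The user IDs and level names are made up for illustration.
ACCESS_LEVELS = {"none": 0, "read_only": 1, "read_copy": 2, "read_update": 3}

users = {"jsmith": "read_only", "dbadmin": "read_update"}

def can_update(user):
    """True only if the user's level grants update rights."""
    level = users.get(user, "none")  # unknown users get no access
    return ACCESS_LEVELS[level] >= ACCESS_LEVELS["read_update"]

print(can_update("jsmith"))   # False - read only
print(can_update("dbadmin"))  # True
```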
Network Topology
Network Topology means the layout or shape of the network. The most common topologies are Ring, Bus and Star.
Protocols and Standards
Speed of Transmission
You should understand the different types of connection that are possible, e.g. ISDN and broadband, which can increase the speed of data transmission.
You should remember the difference between serial and parallel transmission: serial transmission sends one bit at a time, while parallel transmission sends several bits simultaneously over separate wires and is suitable only over short distances.
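Serial transmission can be sketched by breaking a byte into the sequence of bits that would travel down a single wire. Sending the least significant bit first is one common convention (used, for example, by UART serial links):

```python
# Sketch of serial transmission: one byte becomes a stream of eight
# bits sent one at a time, least significant bit first.
def to_serial_bits(byte):
    return [(byte >> i) & 1 for i in range(8)]

# The letter 'A' is 65, i.e. 01000001 in binary.
bits = to_serial_bits(ord("A"))
print(bits)  # [1, 0, 0, 0, 0, 0, 1, 0] - LSB first

# Parallel transmission would place all eight bits on eight separate
# wires at once - faster, but only practical over short distances.
```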
You should understand the difference between analogue and digital transmission and the advantages of digital transmission.
Protocols
Protocols allow the use of "open systems" i.e. systems that are independent of the manufacturer and the platform. Therefore, the user is not restricted to equipment from one manufacturer.
Standards
Examples of De Facto standards:
- ASCII
- Betamax v VHS video recorders (Betamax machines became useless once tape producers standardised on VHS)
- QWERTY keyboards (why don't we adopt a different layout?)
- Light switches (down is on in Britain but off in America)
- Windows and DOS (see case study p.328-329)
De Jure standards - these are defined by industry groups or governments. The ISO (International Organization for Standardization) lays down the standards. Example: the JPEG standard for compressing images was laid down by the ISO.
Remember though that standards (unlike protocols) are not binding.
Advantages and Disadvantages of Standards
+ There is a wider marketplace for computers that comply with standards.
+ Standards allow the development of "open systems".
- Some major manufacturers have the power to dictate "de facto" standards.
- Standards can slow down technological advancement because standards have to be agreed before changes can be made.
Internet Protocols
On the Internet, there is every kind of computer, even WAP phones and cable TVs. The Internet would not work without protocols.
TCP/IP is the de facto standard for allowing different computers to communicate over the Internet.
HTTP is the standard for requesting and receiving HTML pages
FTP allows files to be transferred over the Internet
POP3 allows e-mail to be retrieved from a mail server.
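Part of why protocols make the Internet work is that a message like an HTTP/1.1 request is just plain text in an agreed layout, so any client and server that follow the rules can communicate. A minimal sketch (the host name is illustrative):

```python
# Sketch of an HTTP/1.1 GET request: a request line, then headers,
# then a blank line marking the end of the headers. Any web server
# that follows the protocol can understand this text.
def build_get_request(host, path="/"):
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"  # blank line: end of headers
    )

request = build_get_request("www.example.com", "/index.html")
print(request)
```

Every line ends with the CRLF pair (`\r\n`) because that is what the protocol specifies; a client that used a different line ending would not be understood.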
The OSI Model
The OSI (Open Systems Interconnection) model was developed by the ISO (International Organization for Standardization). OSI was developed as a guide to developing standards between computers of different origins. If a manufacturer ensures that their products obey a set of standards based on the OSI model, they can be connected to another manufacturer's machines relatively easily.
Human-Computer Interaction
Remember that interaction means much more than just the interface.
Physical interaction with the computer can include the whole work environment and, in particular, lighting, seating, furniture, room size and space.
Psychological factors include intuitiveness, ease-of-learning, methods of alerting the user (vision or hearing).
A guiding principle for good interface design is not to contradict our mental image of how things should be e.g. red means danger, green means go.
A good interface provides:
- Help for novice users
- Short-cuts for experienced users
- Metaphors or images (e.g. a picture of a printer on a print button)
- Consistent behaviour, which makes use of long-term memory e.g. always using F1 for Help or ESC to stop a process. There are certain functions that have become de facto standards.
- Clear and helpful error messages.
- Uncluttered screens with effective use of colour.
Impact on System Resources
A fancy interface is likely to have an impact on "system resources". This means:
- Processing power (processing power is needed to draw the interface, leaving less for the application itself)
- Backing Store (a GUI takes up more disk space than a command line interface)
- Immediate Access Store (a GUI will hog RAM)
Customising Software
An experienced user might want to hide certain features e.g. unwanted buttons or unnecessary prompts and warnings. Experienced users often want to "turn things off".
Examples of common annoyances are splash screens, "beeps", "Do you really want to do this?" prompts, etc. Consider the demise of the Office Assistant.