NCC International Advanced Diploma in Computer Studies


NCC International Advanced Diploma – ISA Assignment        Due date: 30 July, 2004


Internet System Administration

Prepared by:


Table of Contents


Task 1

1A) Robots or spiders

Robots or spiders (also called web robots or web spiders) are automated programs used by search engines to visit web sites. Some robots are designed to gather content for indexing into search databases, while others perform link checking, HTML validation and other tasks.

Web-wide search engines such as Go Network, AltaVista and HotBot rely on robots because the pages they index are held on remote servers: a robot automatically visits and downloads those pages so that the engine can process them.

How Robots Follow Links to Find Pages?
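
A robot starts from a seed page that it already knows about, downloads it, extracts the hyperlinks it contains, and then visits each linked page in turn; because documents link to one another, this simple process eventually leads the robot through large parts of the web. The following is only a rough sketch of that idea, assuming the curl and GNU grep tools are available and using an example seed URL:

#!/bin/sh
# Minimal one-level crawler sketch (assumes curl and GNU grep are available).
SEED="http://www.example.com/"          # example seed URL, an assumption

# Download the seed page, pull out the href targets, and visit each one.
curl -s "$SEED" |
grep -o 'href="[^"]*"' |                # crude link extraction
sed 's/^href="//; s/"$//' |
while read -r LINK; do
    echo "Robot follows: $LINK"         # relative links are not resolved here
    curl -s -o /dev/null "$LINK"        # fetch the linked page
done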

Robots.txt

Before indexing a site, robots should also check a special file in the root of each server called robots.txt, a plain-text file (not HTML). Robots.txt implements the Robots Exclusion Protocol, which allows the web site administrator to define which parts of the site are off-limits to specific robot user-agent names.

Here is a simple robots.txt file that prevents all spiders from indexing certain directories on my web server:

User-agent: *
Disallow: /secure/
Disallow: /assets/
Disallow: /cssstyles/
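
A well-behaved robot downloads this file before crawling and skips any path covered by a Disallow line that applies to its user-agent name. The fragment below is only a rough sketch of such a check; the site, the target path and the crude prefix matching are assumptions made for illustration, and real robots also match the User-agent field:

#!/bin/sh
# Sketch: consult robots.txt before fetching a page.
# Site and target path are examples only; user-agent matching is omitted.
SITE="http://www.example.com"
TARGET="/secure/login.html"

curl -s "$SITE/robots.txt" |
grep '^Disallow:' |
sed 's/^Disallow:[ ]*//' |
while read -r PREFIX; do
    case "$TARGET" in
        "$PREFIX"*) echo "Skip $TARGET (disallowed by robots.txt)" ;;
    esac
done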

Figure: inter-related documents link to one another to form a spider web.


1B) Updating Unix distribution and kernel

Unix Structure

The UNIX kernel is the heart of the UNIX system. The structure of UNIX consists of a user level and a kernel level (with a hardware level beneath them).

The UNIX distribution sits at the user level, where packaged software is installed; users are free to choose their favourite packages (their distribution release). The UNIX kernel, by contrast, provides the system services bundled with the UNIX operating system, and without these services the operating system cannot run at all. Very often the kernel is supplied (or chosen) by the UNIX developer, such as HP, IBM or Sun (Solaris). Every UNIX system has its default kernel, and it usually does not need to be updated unless bugs are hit.

As in Microsoft Windows, the UNIX distribution refers to the software installed on UNIX by users themselves or pre-installed with the operating system (bundled software). Users can add or remove this software themselves from the command line or with an installer, for example by the RPM method or other means.
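
To see the two levels in practice, the running kernel and the installed distribution packages can be inspected separately. The commands below are a sketch for an RPM-based system such as Red Hat Linux; other UNIX flavours use different package tools, and the package and file names shown are examples only:

# Kernel level: report the version of the running kernel.
uname -r                      # e.g. 2.2.5-15

# User level: list the distribution packages currently installed.
rpm -qa | more

# Add, query or remove an individual package with RPM
# (package and file names below are examples only).
rpm -q apache                 # is the package installed?
rpm -ivh apache-1.3.x.rpm     # install from an RPM file
rpm -e apache                 # remove the package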

Updating the UNIX distribution and kernel

Unlike Microsoft Windows, almost all UNIX systems do not provide an automatic “live update” facility for the distribution software and the kernel. When known bugs in the operating system or system enhancements are released by the developers, UNIX administrators are required to update the system from the command prompt with the appropriate commands.
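
As a rough illustration, on an RPM-based system a patch released by the vendor is fetched and applied from the command prompt along the following lines (the download location and file names are examples only):

# Download the errata package released by the vendor
# (the URL and file name are examples only).
wget ftp://updates.example.com/6.0/openssl-0.9.x.i386.rpm

# Apply the update: -U upgrades or installs, -F only freshens packages
# that are already present; -v and -h print progress.
rpm -Uvh openssl-0.9.x.i386.rpm

# Confirm that the new version is now installed.
rpm -q openssl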


Security and stability reasons for updating the kernel and distribution

If a UNIX system is installed on a closed network, it usually does not need system patches or bug fixes for its distribution. Nowadays, however, rising hacker activity and growing security concerns mean that holes in UNIX systems are exposed to risk on the Internet, so an old kernel and an out-dated distribution need to be brought up to date. For example, Red Hat Linux (a UNIX-like system) release 6.0 (Hedwig), which ships with kernel 2.2.5-15, can be updated to kernel 2.4.20 to enhance system stability and security.
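
The sketch below illustrates such a kernel upgrade on an RPM-based system; it assumes a pre-built kernel RPM and the LILO boot loader used by older Red Hat releases, and the exact file name is illustrative only:

# Install the new kernel package alongside the old one (-i, not -U,
# so the old kernel stays available as a fall-back).
rpm -ivh kernel-2.4.20-xx.i386.rpm

# Add an entry for the new kernel to the boot loader configuration
# (/etc/lilo.conf on older Red Hat releases) and re-install the boot map.
vi /etc/lilo.conf
/sbin/lilo

# Reboot into the new kernel and confirm the running version.
shutdown -r now
uname -r                      # should now report 2.4.20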

Procedure difference among UNIX ...
