
AO3 Investigate how search engines work


Introduction

Unit 2: Collaborative Working. AO3: Investigate how search engines work.

Web crawlers - A web crawler is a bot used by a search engine to automatically visit sites on the web and keep them in order so they can be matched against your search. It works by visiting sites, processing details on each site such as keywords and hyperlinks, and saving a copy of the page for later use. The crawler starts with a set of URLs it needs to visit; it processes the information on each website, particularly the keywords, and uses that information to determine what the website is about. For example, suppose it processes a page about football matches and player statistics. If I then type in "where to find sites on football matches", the search engine finds relevant sites that use keywords such as "football matches" and ranks them in a list according to relevance. Web crawlers often have trouble with duplicate copies of the same website, which poses a real problem when there is little change in the URL. Examples of search engines that use web crawler software include Google, Yahoo and Bing (Microsoft). These search engines use crawlers every day to keep up with the changes on websites and the information on them; most save copies of the pages browsed and then index them for later use. Web crawling can be used for other applications, but it is mostly used for search engines.

Directories - Directories are search engines whose links and websites are organised and categorised by humans rather than by bots and software. They are usually more organised and point to better-quality sites, because the sites are regularly inspected and tested by humans; the listings are added, categorised and ordered by relevance by people. This sort of system often provides fewer results than a crawler and can also be a bit slower, since it is maintained by hand.
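To make the crawling process described above concrete, here is a minimal sketch in Python of how a crawler could build a keyword index. It is an illustration only, not how Google, Yahoo or Bing actually implement crawling: it starts from a list of seed URLs, fetches each page, pulls out the hyperlinks and keywords, and records which pages mention which words. The seed URL and the page limit are made-up placeholders, and a real crawler would also respect robots.txt, detect duplicate pages and store full copies for indexing.

```python
# Minimal crawler sketch: fetch pages, extract links and keywords, build an index.
from collections import defaultdict, deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen
import re

class LinkAndTextParser(HTMLParser):
    """Collects hyperlinks and visible text from one HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.text.append(data)

def crawl(seed_urls, max_pages=20):
    index = defaultdict(set)          # keyword -> set of URLs mentioning it
    queue = deque(seed_urls)          # URLs still to visit
    seen = set()                      # avoid fetching the same URL twice
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except (OSError, ValueError):
            continue                  # skip pages that fail to load or bad URLs
        parser = LinkAndTextParser()
        parser.feed(html)
        for word in re.findall(r"[a-z]+", " ".join(parser.text).lower()):
            index[word].add(url)      # record which pages mention each keyword
        for link in parser.links:
            queue.append(urljoin(url, link))  # follow hyperlinks found on the page
    return index

if __name__ == "__main__":
    idx = crawl(["https://example.com"])          # placeholder seed URL
    print(sorted(idx.get("example", set())))      # pages mentioning "example"
```

A query such as "football matches" could then be answered by looking up the pages stored under each keyword and combining them, which is roughly the starting point for the ranking step described later.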

Middle

Some site owners try to raise their ranking by putting their own links on already high-ranking websites. Another quite widely used method is called click-through measurement. Essentially, this means that search engines monitor the number of clicks that high-ranking sites receive from a particular search. If a certain site receives few clicks while a site below it receives many more, the first site is moved down the search results while the other site is ranked higher for its popularity. No two search engines use the same method, as they have all developed their own unique algorithms, and they only use the method above to some degree.

How to search - Keywords are the most effective way of searching online, since search engines use specific keywords to find sites that match the search. The most effective way of doing this is to use nouns or objects; for example, instead of searching "cat needs washing", you search "cat wash". Verbs are best avoided. Also, most search engines work best with around six or seven keywords. Keywords can differ quite dramatically depending on the type of search engine you are using: if you compare a kids' search engine with a standard adult search engine, you will see a dramatic difference in the search results that come up. In the picture below you can see that when you type the keyword "car" into Ask Kids, you get results containing animated car films and games involving cars. But if you go to a standard search engine and search the term "car", you get a different host of websites containing cars for sale, images of cars, concept cars and online magazines.

Boolean operators - If you go into a library and ask the librarian for "shoes", this will probably confuse them, and they will respond by asking questions about shoes to narrow down exactly what kind of book you want. Boolean operators are basically a search engine's way of narrowing down your search in the same way, so you get exactly what you want.
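To illustrate how these operators narrow a search, the sketch below applies AND, OR and NOT to a toy keyword index using Python set operations. The index entries (pageA, pageB and so on) are made-up examples, not real search engine data.

```python
# Toy keyword index: keyword -> set of page URLs (made-up examples).
index = {
    "shoes":   {"pageA", "pageB", "pageC"},
    "running": {"pageB", "pageD"},
    "repair":  {"pageC"},
}

def pages(word):
    """Pages that mention a keyword (empty set if none do)."""
    return index.get(word, set())

# "shoes AND running": pages that mention both keywords.
print(sorted(pages("shoes") & pages("running")))   # ['pageB']

# "shoes OR repair": pages that mention either keyword.
print(sorted(pages("shoes") | pages("repair")))    # ['pageA', 'pageB', 'pageC']

# "shoes NOT repair": pages about shoes, excluding repair pages.
print(sorted(pages("shoes") - pages("repair")))    # ['pageA', 'pageB']
```

Each operator either shrinks or widens the set of matching pages, which is exactly the narrowing effect the librarian's follow-up questions have in the analogy above.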

Conclusion

Google also offers the option of limiting the reading level, from an advanced reading level down to a basic reading level. Other advanced search options offer a large selection of languages into which the results can be translated, making search open to people all over the world. You can also filter how much content is displayed in your search results. There are three levels of content filtering: safe search on, moderate search and safe search off. The safe search option strictly filters out all inappropriate content; the moderate setting sits in the middle, so some inappropriate content appears but the most extreme content is filtered out; and with safe search off, no inappropriate content is filtered out whatsoever. Lastly, you can restrict the results to match only websites operating within your country or area. This can be useful for people looking for, say, a local job, a club or a society, who want to filter out all content coming from Australia or America, or even narrow it down to places only in Nottingham, for example.

Conclusion - To conclude, web crawlers show a variety of information about NYC, from maps to pictures to guides and wikis. This is an array of information that would be useful to many people wanting to know about New York City. On the other hand, DMOZ (a directory search engine) shows an organised set of categories of information on New York; it also shows a little less information than the web crawler search. As you can see in the picture, the web crawler also includes more maps of how to get to New York, while the directory does not, and the directory has fewer results than the web crawler. Dogpile (a meta search engine) shows very similar results to those of the web crawler, ordered in exactly the same way. Lastly, the Ask Kids search engine shows relatively different results compared with the other search engines: its results show more educational information about New York City, and documentaries, while the other search engines focus on holidays and tours in NYC.
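As a rough illustration of the filtering described above, the sketch below keeps or drops results according to a safe search level and an optional country restriction. The result records and their rating and region fields are invented for the example; they are not how Google actually labels its results.

```python
# Made-up result records; "rating" and "region" are assumed fields for illustration.
results = [
    {"url": "siteA", "rating": "safe",     "region": "UK"},
    {"url": "siteB", "rating": "explicit", "region": "US"},
    {"url": "siteC", "rating": "moderate", "region": "UK"},
]

def filter_results(results, safe_search="strict", region=None):
    """Apply a safe-search level and an optional country/area restriction."""
    blocked = {
        "strict":   {"moderate", "explicit"},  # filter anything questionable
        "moderate": {"explicit"},              # only block the most extreme content
        "off":      set(),                     # no content filtering at all
    }[safe_search]
    kept = [r for r in results if r["rating"] not in blocked]
    if region is not None:
        kept = [r for r in kept if r["region"] == region]
    return kept

print(filter_results(results, safe_search="strict", region="UK"))
# -> only siteA passes both the content filter and the region filter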

