Build a web crawler, C/C++ Programming

Develop a web crawler such that, given a base URL, it traverses the entire web tree and builds an index of keywords and the URLs on which they appear. The web crawler is a server-side program; the URL where it begins is called the seed. As the crawler visits each URL, it identifies all the hyperlinks in the page (e.g. by detecting "<a href ...>" links) and adds them to the list of URLs still to visit, called the crawl frontier. URLs from the frontier are then visited in turn by the server-side program, which looks for a few keywords (e.g. raffles, award, alumni) on each page and stores in a file the URLs where those keywords were found.
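A minimal sketch of that crawl loop, maintaining a frontier and writing keyword/URL pairs to an index file. Here fetch_page() is a hypothetical stub standing in for a real downloader such as libcurl, and the seed URL is purely illustrative:

    #include <fstream>
    #include <queue>
    #include <set>
    #include <string>
    #include <vector>

    // Stub for the page downloader. A real crawler would fetch the URL's
    // HTML here, e.g. with libcurl; the empty body only keeps the sketch
    // compilable.
    std::string fetch_page(const std::string& url) {
        (void)url;
        return "";
    }

    // Collect the value of every href="..." attribute in the HTML.
    // (A real crawler would also resolve relative links against the base URL.)
    std::vector<std::string> extract_links(const std::string& html) {
        std::vector<std::string> links;
        std::size_t pos = 0;
        while ((pos = html.find("href=\"", pos)) != std::string::npos) {
            pos += 6;                                  // skip past href="
            std::size_t end = html.find('"', pos);
            if (end == std::string::npos) break;
            links.push_back(html.substr(pos, end - pos));
            pos = end + 1;
        }
        return links;
    }

    int main() {
        const std::vector<std::string> keywords = {"raffles", "award", "alumni"};
        std::queue<std::string> frontier;              // URLs still to visit
        std::set<std::string> visited;                 // pages already crawled
        std::ofstream index("index.txt");              // keyword/URL pairs

        frontier.push("http://www.example.com/");      // illustrative seed URL

        while (!frontier.empty()) {
            std::string url = frontier.front();
            frontier.pop();
            if (!visited.insert(url).second) continue; // skip repeat visits

            std::string html = fetch_page(url);

            // Record every keyword that appears on this page.
            for (const std::string& kw : keywords)
                if (html.find(kw) != std::string::npos)
                    index << kw << ' ' << url << '\n';

            // Grow the crawl frontier with the page's hyperlinks.
            for (const std::string& link : extract_links(html))
                frontier.push(link);
        }
    }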

Below is a sample of the data stored on the server for subsequent quick search.
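One plausible layout, matching what the crawl sketch above writes, keeps one keyword/URL pair per line; all URLs here are hypothetical:

    raffles  http://www.example.com/history.html
    raffles  http://www.example.com/about.html
    award    http://www.example.com/news/2012.html
    alumni   http://www.example.com/alumni.html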

A separate pair of interactive client and server socket programs, written in C++, can then be developed to let clients query the server with search keywords and receive in reply a list of URLs where each keyword is found. The server program must be able to handle multiple queries, searching through its data file for the correct response each time. It should also allow continuous enquiries until the client enters quit.

The communication between the client and server machines can use any bidirectional interactive protocol. An example is socket programming, where network endpoints (an IP address and port number) are represented as sockets.
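On the client side, such an endpoint might be opened as in the sketch below; the server address 127.0.0.1 and port 5000 are assumptions chosen to match the server sketches later in this text, not values fixed by the assignment:

    #include <arpa/inet.h>
    #include <cstdio>
    #include <iostream>
    #include <netinet/in.h>
    #include <string>
    #include <sys/socket.h>
    #include <unistd.h>

    int main() {
        // Create a TCP socket and connect it to the server's endpoint.
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(5000);                     // assumed port
        inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr); // assumed address
        if (connect(fd, (sockaddr*)&addr, sizeof(addr)) < 0) {
            perror("connect");
            return 1;
        }

        // Send keywords typed by the user until they enter "quit",
        // printing the server's reply after each query.
        std::string keyword;
        while (std::cout << "keyword> ", std::getline(std::cin, keyword)) {
            write(fd, keyword.c_str(), keyword.size());
            if (keyword == "quit") break;
            char buf[4096];
            ssize_t n = read(fd, buf, sizeof(buf));
            if (n <= 0) break;
            std::cout.write(buf, n);
        }
        close(fd);
    }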

 

When creating the server application, you must follow these steps (a code sketch follows the list):

  • Create a new socket by calling socket().
  • Bind an address (IP address and port number) to the socket by calling bind(). This step identifies the server so that the client knows where to connect.
  • Listen for new connection requests on the socket by calling listen().
  • Accept new connections by calling accept().
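A minimal sketch of these four steps with the POSIX sockets API; the port number 5000 and the backlog of 5 are arbitrary illustrative choices:

    #include <arpa/inet.h>
    #include <cstdio>
    #include <cstring>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main() {
        // 1. Create a new TCP socket.
        int listen_fd = socket(AF_INET, SOCK_STREAM, 0);
        if (listen_fd < 0) { perror("socket"); return 1; }

        // 2. Bind an address (any local IP, port 5000) to the socket.
        sockaddr_in addr;
        std::memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(5000);
        if (bind(listen_fd, (sockaddr*)&addr, sizeof(addr)) < 0) {
            perror("bind"); return 1;
        }

        // 3. Listen for new connection requests.
        if (listen(listen_fd, 5) < 0) { perror("listen"); return 1; }

        // 4. Accept new connections, one at a time.
        for (;;) {
            int client_fd = accept(listen_fd, nullptr, nullptr);
            if (client_fd < 0) { perror("accept"); continue; }
            // ... service the client here, then:
            close(client_fd);
        }
    }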

Often, servicing a request on behalf of a client may take a considerable length of time. In such cases it is more efficient to accept and deal with new connections while a request is being processed. The most common way of doing this is for the server to fork a new copy of itself after accepting each new connection.
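A sketch of that fork-per-connection pattern, using the same illustrative setup as above; handle_client() is a simple echo stand-in for the real request handler, and error checks on the setup calls are omitted for brevity:

    #include <arpa/inet.h>
    #include <csignal>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>

    // Stand-in request handler: echoes one message back to the client.
    void handle_client(int client_fd) {
        char buf[1024];
        ssize_t n = read(client_fd, buf, sizeof(buf));
        if (n > 0) write(client_fd, buf, n);
    }

    int main() {
        signal(SIGCHLD, SIG_IGN);       // let the kernel reap finished children

        int listen_fd = socket(AF_INET, SOCK_STREAM, 0);
        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(5000);    // same arbitrary port as above
        bind(listen_fd, (sockaddr*)&addr, sizeof(addr));
        listen(listen_fd, 5);

        for (;;) {
            int client_fd = accept(listen_fd, nullptr, nullptr);
            if (client_fd < 0) continue;
            if (fork() == 0) {          // child: serve this one client
                close(listen_fd);       // the child does not accept connections
                handle_client(client_fd);
                close(client_fd);
                _exit(0);
            }
            close(client_fd);           // parent: go back to accepting
        }
    }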

The responding server listens on a port and waits for a client's request. Based on the client's query, the server responds appropriately by looking up the data file stored at the server's end.
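As a sketch, the echo stand-in above could be replaced by a lookup loop such as the following, assuming the illustrative index.txt layout shown earlier; the client ends its enquiries by sending quit:

    #include <fstream>
    #include <sstream>
    #include <string>
    #include <unistd.h>

    // Scan the index file and return every URL recorded for `keyword`,
    // one per line. Assumes the one keyword/URL pair per line layout
    // used in the earlier sketches.
    std::string lookup(const std::string& keyword) {
        std::ifstream index("index.txt");
        std::string line, result;
        while (std::getline(index, line)) {
            std::istringstream fields(line);
            std::string kw, url;
            if ((fields >> kw >> url) && kw == keyword)
                result += url + "\n";
        }
        return result.empty() ? "keyword not found\n" : result;
    }

    // Serve one client: answer queries until the client sends "quit".
    void handle_client(int client_fd) {
        char buf[256];
        ssize_t n;
        while ((n = read(client_fd, buf, sizeof(buf) - 1)) > 0) {
            buf[n] = '\0';
            std::string keyword(buf);
            while (!keyword.empty() &&
                   (keyword.back() == '\n' || keyword.back() == '\r'))
                keyword.pop_back();     // trim any trailing line ending
            if (keyword == "quit") break;
            std::string reply = lookup(keyword);
            write(client_fd, reply.c_str(), reply.size());
        }
    }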
