Build a web crawler, C/C++ Programming

Assignment Help:

Develop a web crawler such that, given a base URL, it traverses the entire web tree and builds an index of keywords and the URLs on which they appear.  The web crawler is a server-side program; the URL at which it begins is called the seed.  As the crawler visits a URL, it identifies all the hyperlinks in the page (e.g. by detecting the "<a href ..." links) and adds them to the list of URLs to visit, called the crawl frontier.  URLs from the frontier are then visited by your server-side program, which looks for a few keywords (e.g. raffles, award, alumni) on each page and stores in a file the URLs where these keywords are found.

Below is a sample of the data stored on the server for subsequent quick searches.

An interactive pair of client and server socket programs, also developed in C++, allows clients to query the server with search keywords and receive in reply a list of URLs where each keyword is found.  The server program must be able to handle multiple queries, searching its data file for the correct response each time.  It should also allow continuous enquiries until the client enters quit.

The communication between the client and server machines can use any bidirectional interactive protocol.  One example is socket programming, where network endpoints (IP address and port number) are represented as sockets.

 

When creating the server application, you must follow these steps:

  • Create a new socket with socket().
  • Bind an address (IP address and port number) to the socket with bind(). This step identifies the server so that the client knows where to connect.
  • Listen for new connection requests on the socket with listen().
  • Accept new connections with accept().

Often, servicing a request on behalf of a client takes a considerable length of time. In such cases it is more efficient to accept and deal with new connections while a request is being processed. The most common way of doing this is for the server to fork a new copy of itself after accepting each new connection.

The responding server listens on a port and waits for client requests.  Based on the client's questions, the server responds appropriately by looking up a data file stored at the server's end.

