K-nearest neighbor for text classification


Assignment 2: K-nearest neighbor for text classification.

The goal of text classification is to identify the topic of a piece of text (a news article, a blog post, etc.). Text classification has obvious utility in the age of information overload, and it has become a popular testbed for machine learning algorithms. In this project, you will have the opportunity to implement the k-nearest neighbor algorithm and apply it to text classification on the well-known Reuters news collection.

1.       Download the dataset from my website. It was created from the original collection and contains a training file, a test file, the list of topics, and a description of the train/test file format.

2.       Implement the k-nearest neighbor algorithm for text classification. Your goal is to predict the topic for each news article in the test set. Try the following distance or similarity measures with their corresponding representations.

a.        Hamming distance: each document is represented as a boolean vector, where each bit represents whether the corresponding word appears in the document.

b.       Euclidean distance: each document is represented as a numeric vector, where each number represents how many times the corresponding word appears in the document (it could be zero).

c.         Cosine similarity with TF-IDF weights (a popular metric in information retrieval): each document is represented by a numeric vector as in (b). However, now each number is the TF-IDF weight for the corresponding word (as defined below). The similarity between two documents is the dot product of their corresponding vectors, divided by the product of their norms.
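To make (a) and (b) concrete, here is a minimal Python sketch of the two representations and their distance measures. It assumes each document has already been tokenized into a list of lower-cased words, and all function names (build_vocabulary, bool_vector, count_vector, hamming_distance, euclidean_distance) are illustrative choices rather than part of the assignment; the cosine/TF-IDF measure in (c) is sketched after the TF-IDF definition in item 3.

    import math
    from collections import Counter

    def build_vocabulary(documents):
        """Map every word seen in the training documents to a vector index."""
        vocab = {}
        for doc in documents:                      # doc is a list of word tokens
            for word in doc:
                if word not in vocab:
                    vocab[word] = len(vocab)
        return vocab

    def bool_vector(doc, vocab):
        """Representation (a): 1 if the word appears in the document, else 0."""
        vec = [0] * len(vocab)
        for word in doc:
            if word in vocab:
                vec[vocab[word]] = 1
        return vec

    def count_vector(doc, vocab):
        """Representation (b): number of occurrences of each word in the document."""
        vec = [0] * len(vocab)
        for word, n in Counter(doc).items():
            if word in vocab:
                vec[vocab[word]] = n
        return vec

    def hamming_distance(u, v):
        """Number of positions at which two boolean vectors differ."""
        return sum(1 for a, b in zip(u, v) if a != b)

    def euclidean_distance(u, v):
        """Straight-line distance between two count vectors."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))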

3.        Let w be a word, d be a document, and N(d,w) be the number of occurrences of w in d (i.e., the number in the vector in (b)). TF stands for term frequency, and TF(d,w) = N(d,w)/W(d), where W(d) is the total number of words in d. IDF stands for inverse document frequency, and IDF(w) = log(D/C(w)), where D is the total number of documents and C(w) is the number of documents that contain the word w; note that IDF depends only on the word, not on the document, and the base of the logarithm is irrelevant (you can use e or 2). The TF-IDF weight for w in d is TF(d,w)*IDF(w); this is the number you should put in the vector in (c). TF-IDF is a clever heuristic that accounts for the "information content" each word conveys, so that frequent words like "the" are discounted and document-specific words are amplified. You can find more details online or in any standard information retrieval textbook.
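Continuing the sketch above, the TF-IDF weights and the cosine similarity could be computed as follows. This is again only an illustration under the same assumptions (count_vector and the imports come from the previous sketch), and it uses the natural logarithm, which is fine since the base does not matter.

    def idf_weights(train_count_vectors):
        """IDF(w) = log(D / C(w)), computed once over the training collection."""
        D = len(train_count_vectors)
        doc_freq = [0] * len(train_count_vectors[0])
        for vec in train_count_vectors:
            for i, n in enumerate(vec):
                if n > 0:
                    doc_freq[i] += 1
        # Guard against C(w) = 0 so that unused vocabulary entries cause no error.
        return [math.log(D / c) if c > 0 else 0.0 for c in doc_freq]

    def tfidf_vector(count_vec, idf):
        """Representation (c): TF-IDF(d,w) = (N(d,w) / W(d)) * IDF(w)."""
        total = sum(count_vec)                     # W(d), the document length
        if total == 0:
            return [0.0] * len(count_vec)
        return [(n / total) * idf[i] for i, n in enumerate(count_vec)]

    def cosine_similarity(u, v):
        """Dot product of the two vectors divided by the product of their norms."""
        dot = sum(a * b for a, b in zip(u, v))
        norm_u = math.sqrt(sum(a * a for a in u))
        norm_v = math.sqrt(sum(b * b for b in v))
        return dot / (norm_u * norm_v) if norm_u > 0 and norm_v > 0 else 0.0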

4.       You should try k = 1, k = 3, and k = 5 with each of the representations above. Notice that with a distance measure, the k nearest neighbors are the training documents with the smallest distances from the test document, whereas with a similarity measure, they are the ones with the highest similarity scores.
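Putting the pieces together, one possible k-nearest-neighbor prediction routine is sketched below: it ranks the training documents by the chosen distance or similarity measure, takes the k nearest, and predicts the majority topic among them (ties broken arbitrarily). The helper functions and variable names come from the earlier sketches and are assumptions of this illustration, not requirements.

    def knn_predict(test_vec, train_vecs, train_labels, k, measure, use_similarity):
        """Predict a topic for one test document.

        measure is one of hamming_distance, euclidean_distance, or
        cosine_similarity; use_similarity=True means larger scores are closer.
        """
        scores = [measure(test_vec, train_vec) for train_vec in train_vecs]
        order = sorted(range(len(scores)), key=lambda i: scores[i],
                       reverse=use_similarity)     # nearest training documents first
        votes = Counter(train_labels[i] for i in order[:k])
        return votes.most_common(1)[0][0]          # majority topic among the k neighbors

    # Illustrative usage with representation (c) and k = 3:
    # vocab = build_vocabulary(train_docs)
    # train_counts = [count_vector(d, vocab) for d in train_docs]
    # idf = idf_weights(train_counts)
    # train_tfidf = [tfidf_vector(v, idf) for v in train_counts]
    # for doc in test_docs:
    #     vec = tfidf_vector(count_vector(doc, vocab), idf)
    #     topic = knn_predict(vec, train_tfidf, train_labels, k=3,
    #                         measure=cosine_similarity, use_similarity=True)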
