Computer Processing:

Most of the earliest computer memories were based on physical elements that can exist in one of just two states (on or off); each such element corresponds to one bit of information. Binary symbols are used because electronic devices can store and process them rapidly and cheaply. The physical devices themselves have changed greatly, but the principles of representing and manipulating information within digital computers have remained essentially the same.

The size of a computer's memory can be described by the number of bits (binary digits) it contains, but for most purposes larger units of storage are used to characterise machines. These are the byte, which normally consists of 8 bits and is sufficient to represent one character, and the word, which may be 8, 16, or 32 bits long and is the smallest unit of storage to which most of the computer's instructions can be applied. Memory sizes are usually quoted in kilobytes (KB) or megabytes (MB). The patterns of bits that make up bytes and words are used to represent all information within a computer, whether program instructions or data items.
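To make these units concrete, here is a minimal Python sketch (the function names are our own, for illustration only) that counts the distinct patterns a group of bits can hold and converts a byte count into larger units, using the definitions above (8 bits per byte, 1 KB = 1024 bytes, 1 MB = 1024 KB):

    BITS_PER_BYTE = 8

    def patterns(bits):
        # Each bit doubles the number of distinct on/off combinations.
        return 2 ** bits

    def describe(n_bytes):
        # Express a memory size in KB and MB (1 KB = 1024 bytes).
        kb = n_bytes / 1024
        mb = kb / 1024
        return "%d bytes = %g KB = %g MB" % (n_bytes, kb, mb)

    print(patterns(BITS_PER_BYTE))        # 256 patterns in one 8-bit byte
    print(patterns(16), patterns(32))     # 16-bit and 32-bit words
    print(describe(4 * 1024 * 1024))      # a 4 MB memory: 4194304 bytes

Running it confirms the figure used below: an 8-bit byte admits exactly 256 distinct bit patterns.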

The exact method of representation varies from one machine to another, particularly with respect to instruction formats. An 8-bit byte can accommodate 256 different bit patterns, which is sufficient for most of the characters that might need to be printed: the upper- and lower-case letters A-Z and a-z, the digits 0-9, and a range of punctuation symbols, with allowance for non-printing characters such as end-of-line. The set of bit patterns corresponding to a set of characters is called a character code; ASCII and EBCDIC are standard codes. Numbers are normally represented by one or more computer words. In the case of integers, the set of 8, 16, or 32 bits (depending on the computer's word length) is treated as a binary integer, i.e., encoded in the notation of binary arithmetic. A real number is represented by dividing a computer word into two components: a fractional value (the mantissa) together with the power of the base by which it must be multiplied (the exponent).
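These representations can be inspected directly from Python; the following sketch (our own illustration, not from the original text) uses ord() to reveal a character's ASCII code, format() to show an integer's bit pattern in a 16-bit word, and math.frexp() to split a real number into mantissa and exponent:

    import math

    # Character code: each character corresponds to a bit pattern (ASCII).
    for ch in "A9?":
        print(ch, ord(ch), format(ord(ch), "08b"))   # e.g. A 65 01000001

    # Integer: the binary integer 300 held in a 16-bit word.
    print(format(300, "016b"))                       # 0000000100101100

    # Real number: mantissa m and exponent e with x == m * 2**e.
    m, e = math.frexp(6.5)
    print(m, e, m * 2 ** e)                          # 0.8125 3 6.5

Note that frexp() reports a base-2 mantissa in [0.5, 1), which matches the mantissa-times-power-of-the-base scheme described above, even though actual hardware formats (such as IEEE 754) pack the two components into fixed-width fields of a word.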


