Computer processing, Basic Computer Science

Computer Processing:

Most early computer memories were based on physical elements that can exist in just one of two states (on or off); such an element corresponds to one bit of information. Binary symbols are used because electronic devices can store and process them rapidly and cheaply. The physical devices used have changed greatly, but the principles of information representation and manipulation within digital computers have remained essentially the same.

The size of a computer can be described by the number of bits (binary digits) its memory contains, but for most purposes larger units of storage are used to characterise machines. These are usually the byte, which normally consists of 8 bits and is sufficient to represent one character, and the word, which may be 8, 16, or 32 bits long and is the smallest unit of storage to which most of the computer's instructions can be applied. Memories are usually described as having a size measured in kilobytes (K bytes) or megabytes (M bytes). The patterns of bits that make up bytes and words are used to represent all information within a computer, whether program instructions or data items.
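The relationship between bits, bytes, and the larger storage units described above can be sketched in a few lines of Python (an illustrative example, not part of the original text; the names `patterns`, `KILOBYTE`, and `MEGABYTE` are ours):

```python
# A unit of n bits can hold 2**n distinct bit patterns.
BITS_PER_BYTE = 8

def patterns(bits):
    """Number of distinct patterns that `bits` binary digits can represent."""
    return 2 ** bits

print(patterns(1))               # a single bit: 2 states (on or off)
print(patterns(BITS_PER_BYTE))   # one 8-bit byte: 256 patterns

# Conventional storage units, expressed in bytes.
KILOBYTE = 1024
MEGABYTE = 1024 * KILOBYTE
print(MEGABYTE // KILOBYTE)      # 1024 kilobytes in a megabyte
```

The same formula explains why a 16-bit word can name 65,536 distinct values while a 32-bit word can name over four billion.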

The exact method of representation varies from one machine to another, particularly with respect to instruction formats. An 8-bit byte can accommodate 256 different bit patterns, which is sufficient to allow for most of the characters that might need to be printed: upper- and lower-case letters A-Z, the digits 0-9, and a range of punctuation symbols, with allowance for non-printing characters such as end-of-line. The set of bit patterns corresponding to a set of characters is called a character code; standard codes include ASCII and EBCDIC. Numbers are normally represented by one or more computer words. In the case of integers, the set of 8, 16, or 32 bits (depending on the computer's word length) is treated as a binary integer (i.e., encoded in the notation of binary arithmetic). A real number is represented by dividing a computer word into two components: a fractional part (the mantissa) together with the power of the base by which it must be multiplied (the exponent).
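These three representations, character codes, binary integers, and mantissa/exponent reals, can be illustrated with standard Python. This is a sketch using Python's own facilities (`ord`, binary literals, and `math.frexp`), not a description of any particular machine's word format:

```python
import math

# Character code: in ASCII, each character corresponds to a bit
# pattern that fits in one byte.
print(ord('A'))                  # 65 — the ASCII code for 'A'
print(format(ord('A'), '08b'))   # 01000001 — the same code as 8 bits

# Integer: a pattern of bits read as a binary number.
word = 0b0000000000101010        # a 16-bit pattern
print(word)                      # 42

# Real number: math.frexp splits x into a mantissa m and an exponent e
# such that x == m * 2**e, with 0.5 <= |m| < 1.
m, e = math.frexp(6.5)
print(m, e)                      # 0.8125 3, since 6.5 == 0.8125 * 2**3
```

Actual hardware uses a fixed layout (e.g., IEEE 754 packs sign, exponent, and mantissa fields into one 32- or 64-bit word), but the principle of splitting a real number into a mantissa and an exponent is the same.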




All rights reserved! Copyrights ©2019-2020 ExpertsMind IT Educational Pvt Ltd