### Software Engineering Methods and Tools for Soft Computing


Software development effort estimation is one of the most important activities in software project management. A number of models have been proposed to construct a relationship between software size and effort; however, many problems remain, because the project data available in the initial stages of a project are often incomplete, inconsistent, uncertain and unclear. Effort estimates may be used as input to project plans, iteration plans, budgets, investment analyses and pricing processes, so accurate estimates are very important.

Software effort prediction models fall into two main categories: algorithmic and non-algorithmic. The most popular algorithmic estimation models include Boehm's COCOMO, Putnam's SLIM and Albrecht's Function Point analysis. These models require as inputs accurate estimates of certain attributes, such as lines of code (LOC) and complexity, which are difficult to obtain during the early stages of a software development project. The models also have difficulty capturing the inherently complex relationships between the contributing factors, are unable to handle categorical data, and lack reasoning capabilities. These limitations of algorithmic models led to the exploration of non-algorithmic techniques, which are based on soft computing. These include artificial neural networks, evolutionary computation, fuzzy logic models, case-based reasoning and combinational models.

**METHODOLOGIES USED**

**1. Neural Networks**

Neural networks are nets of processing elements that are able to learn the mapping between input and output data. Each neuron computes a weighted sum of its inputs and generates an output if the sum exceeds a certain threshold. This output then becomes an excitatory (positive) or inhibitory (negative) input to other neurons in the network, and the process continues until one or more final outputs are generated. Neural networks have been applied to predicting software reliability, including experiments with both feed-forward and Jordan networks trained with a cascade-correlation learning algorithm.

Each neuron in the network computes a nonlinear function of its inputs and passes the resulting value along its output. Among the several available training algorithms, error back-propagation is the one most used by software-metrics researchers. A drawback of this method is that the analyst cannot manipulate the net once the learning phase has finished. These and other limitations have prevented neural networks from being widely adopted in effort estimation.

A neural network is a 'black box' approach, so it is difficult to understand what is going on internally, and justifying the rationale behind a prediction is hard. Neural networks are known for their ability to tackle classification problems, whereas what effort estimation needs is generalization capability. At the same time, there is little guidance available for constructing neural network topologies.
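The back-propagation training described above can be sketched in a few lines of NumPy. The single-hidden-layer topology, learning rate, and synthetic size-to-effort data below are illustrative assumptions, not taken from any published estimation model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data (assumed): normalized "size" input -> normalized "effort".
X = rng.uniform(0.0, 1.0, size=(40, 1))
y = 0.3 + 0.5 * X  # hypothetical linear size-effort relation

# One hidden layer of sigmoid units; each neuron sees a weighted sum of inputs.
W1 = rng.normal(0, 0.5, size=(1, 4))
b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, size=(4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # Forward pass: each neuron applies a nonlinear function to its weighted sum.
    h = sigmoid(X @ W1 + b1)
    out = h @ W2 + b2            # linear output unit for regression
    err = out - y
    # Backward pass: propagate the error and update the weights.
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1 - h)
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Predict effort for a mid-sized project.
pred = sigmoid(np.array([[0.5]]) @ W1 + b1) @ W2 + b2
```

Note that the trained weights `W1` and `W2` carry no interpretable meaning for an analyst, which is exactly the 'black box' criticism: the net predicts, but cannot justify.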

**2. Fuzzy Logic**

Fuzzy logic is a valuable tool that can be used to solve highly complex problems for which a mathematical model is too difficult or impossible to create. It is also used to reduce the complexity of existing solutions and to make control theory more accessible.

The development of software has always been characterized by parameters that possess a certain level of fuzziness. Studies have shown that fuzzy logic models have a place in software effort estimation, and the application of fuzzy logic can overcome some of the problems inherent in existing effort estimation techniques.

Fuzzy logic is not only useful for effort prediction; it can also improve the quality of current estimating models. Fuzzy logic enables linguistic representation of the input and output of a model, so the model tolerates imprecision. It is particularly suitable for effort estimation because many software attributes are measured on a nominal or ordinal scale, which is a particular case of linguistic values.
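A minimal sketch of the idea: triangular membership functions fuzzify a crisp size measure into linguistic terms ("small", "medium", "large"), a tiny rule base maps each term to a representative effort, and a weighted average defuzzifies the result. All ranges, rules, and effort values here are illustrative assumptions, not a published model:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic terms for project size in KLOC (assumed ranges).
size_terms = {
    "small":  (0, 0, 20),
    "medium": (10, 30, 50),
    "large":  (40, 60, 60),
}

# Rule base (assumed): each size term implies a representative effort
# in person-months.
effort_for = {"small": 5.0, "medium": 25.0, "large": 80.0}

def estimate_effort(kloc):
    # Fuzzify: degree of membership in each linguistic term.
    mu = {term: tri(kloc, *abc) for term, abc in size_terms.items()}
    # Defuzzify with a weighted average, tolerating imprecise inputs.
    num = sum(mu[t] * effort_for[t] for t in mu)
    den = sum(mu.values())
    return num / den if den else 0.0
```

A 15-KLOC project is partly "small" and partly "medium" here, so its estimate blends the two rules rather than jumping between categories; that smooth handling of borderline inputs is the tolerance of imprecision described above.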

**3. Genetic Programming**

Genetic programming is one of the evolutionary methods used for effort estimation. Evolutionary computation techniques are characterized by the fact that a solution is achieved through a cycle of generations of candidate solutions that are pruned by the criterion of 'survival of the fittest'. When a genetic algorithm (GA) is applied to a real-world problem, a population of random individuals is generated and evaluated during the evolution process. Each individual receives a rating reflecting its degree of adaptation to the environment. A percentage of the most adapted individuals is kept, while the others are discarded.

The individuals kept in the selection process can undergo modifications to their basic characteristics through a mechanism of reproduction. This mechanism is applied to the current population to explore the search space and find better solutions, using crossover and mutation operators to generate new individuals for the next generation. The reproduction process is repeated until a satisfactory solution is found.
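The generational cycle above can be sketched as follows. This hypothetical example evolves the two coefficients of an assumed effort = a · size^b model against synthetic data; the data points, population size, and operator choices are all illustrative assumptions:

```python
import random

random.seed(1)

# Synthetic (size-KLOC, effort) observations generated from an assumed
# "true" model with a = 2.5 and b = 1.05.
data = [(10, 2.5 * 10 ** 1.05), (25, 2.5 * 25 ** 1.05), (50, 2.5 * 50 ** 1.05)]

def fitness(ind):
    a, b = ind
    # Lower summed squared estimation error => fitter individual.
    return -sum((a * s ** b - e) ** 2 for s, e in data)

def evolve(generations=200, pop_size=40, keep=10):
    # Initial population: a random set of individuals (a, b).
    pop = [(random.uniform(0, 5), random.uniform(0.5, 1.5))
           for _ in range(pop_size)]
    for _ in range(generations):
        # Evaluate and rank; keep the most adapted, discard the rest.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:keep]
        children = []
        while len(children) < pop_size - keep:
            p1, p2 = random.sample(survivors, 2)
            # Crossover: blend the parents' genes.
            child = tuple((g1 + g2) / 2 for g1, g2 in zip(p1, p2))
            # Mutation: small random perturbation of each gene.
            child = tuple(g + random.gauss(0, 0.05) for g in child)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

Keeping the `keep` best individuals unchanged each generation (elitism) guarantees the best fitness never degrades, while crossover and mutation keep exploring the (a, b) space.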

**4. Particle Swarm Optimization**

Particle swarm optimization (PSO) is a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. Such methods are commonly known as metaheuristics, as they make few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. PSO shares many similarities with evolutionary computation techniques such as genetic algorithms: the system is initialized with a population of random solutions and searches for optima by updating generations.

However, unlike a GA, PSO has no evolution operators such as crossover and mutation. In PSO, the potential solutions, called particles, fly through the problem space by following the current optimum particles. One proposed method uses PSO to tune the parameters of the Constructive Cost Model (COCOMO) for better effort estimation. The performance of the resulting models was tested on NASA software project data, and they provided good estimation capability compared to traditional model structures. An algorithm named the Particle Swarm Optimization Algorithm (PSOA) has also been developed to fine-tune fuzzy estimates for the development of software projects.
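A sketch of PSO tuning the two parameters of a COCOMO-style model, effort = a · KLOC^b, against synthetic observations. The cited NASA data set is not reproduced here; the data points and all hyperparameters below are illustrative assumptions:

```python
import random

random.seed(2)

# Synthetic (size-KLOC, effort person-months) observations (assumed).
data = [(10, 28.0), (25, 73.0), (50, 152.0)]

def cost(pos):
    a, b = pos
    return sum((a * s ** b - e) ** 2 for s, e in data)

def pso(iters=300, n=20, w=0.7, c1=1.5, c2=1.5):
    # Particles "fly" through the (a, b) parameter space.
    pos = [[random.uniform(0, 5), random.uniform(0.5, 1.5)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]          # each particle's best position so far
    gbest = min(pbest, key=cost)         # best position found by the swarm
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                # No crossover or mutation: the velocity pulls each particle
                # toward its personal best and the swarm's global best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=cost)
    return gbest

a, b = pso()
```

The contrast with the GA section is visible in the update rule: there is no selection or recombination step, only a velocity term steering each particle toward the best solutions seen so far.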
