Write the following programs:
1. Write a function that takes an integer and calculates and returns the factorial of the integer. The factorial of a number "n" is computed as (n * (n-1) * (n-2) * ... * 1). For example, the factorial of 3 is 3 * 2 * 1 = 6. In the "main" function, ask the user to enter a positive integer between 1 and 10 and pass the number entered by the user as a parameter to the function. Display the value of the factorial returned by the function.
2. Write a program to accept 10 characters from the user and store them in an array. Then, ask the user to enter a character to search for. Look for the character in the stored array. If it is found, print the position of the element in the array. Otherwise, print an appropriate message telling the user it was not found. Repeat this search with different inputs as long as the user wishes to continue.
3. Write a program to define a matrix (double-dimensional array) of a pre-defined size containing integers. Use #define statements to define the number of rows and columns in the matrix (the matrix size must be greater than 2x2). Accept values into the matrix by asking for user input. Transpose the matrix and print the resultant matrix. (Note: the transpose of a matrix interchanges its rows and columns; the original matrix need not be square.)
For example, the transpose of the matrix
2 3 4
5 6 7
is
2 5
3 6
4 7