arrays - Error while calling Matrix Multiplication Function in C#
I created this function to multiply a simple matrix
    int[] MatrixMul(string[] input, int[,] key)
    {
        int[] result = new int[input.Length];
        for (int row = 0; row < key.GetLength(0); row++)
        {
            for (int col = 0; col < key.GetLength(0); col++)
            {
                result[row] += key[row, col] * AtoZ.IndexOf(input[0][col]);
            }
        }
        return result;
    }

However, when I try to call it, it gives me some errors.
Calling function:
    total = MatrixMul(OutPut[i], key);
Error 1: The best overloaded method match for 'Hill_Cipher_CSharp.Hill_Cipher.MatrixMul(string[], int[,])' has some invalid arguments. c:\users\wael\documents\visual studio 2013\Projects\Hill_Cipher_CSharp\Hill_Cipher.cs 34 23 Hill_Cipher_CSharp

Error 2: Argument 1: cannot convert from 'string' to 'string[]'. c:\users\wael\documents\visual studio 2013\Projects\Hill_Cipher_CSharp\Hill_Cipher.cs 34 33 Hill_Cipher_CSharp
The problem is with "OutPut", which is declared here:
    IEnumerable<string> output = Enumerable.Range(0, Input.Length / key.GetLength(0))
        .Select(x => Input.Substring(x * key.GetLength(0), key.GetLength(0)));
    string[] OutPut = output.ToArray();
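For reference, that expression just chops Input into consecutive blocks whose length equals the key's dimension. Here is a minimal standalone sketch of the same chunking, using a hypothetical plaintext and a literal block size of 2 in place of key.GetLength(0):

    using System;
    using System.Linq;

    class ChunkDemo
    {
        static void Main()
        {
            string Input = "HELLOWORLD";   // hypothetical plaintext, length a multiple of the block size
            int blockSize = 2;             // stands in for key.GetLength(0)

            // Produce blockSize-sized substrings: "HE", "LL", "OW", "OR", "LD"
            string[] OutPut = Enumerable.Range(0, Input.Length / blockSize)
                .Select(x => Input.Substring(x * blockSize, blockSize))
                .ToArray();

            Console.WriteLine(string.Join(", ", OutPut));
        }
    }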
The function should take a single string instead of a string[], like this:
    int[] MatrixMul(string input, int[,] key)
    {
        int[] result = new int[input.Length];
        for (int row = 0; row < key.GetLength(0); row++)
        {
            for (int col = 0; col < key.GetLength(0); col++)
            {
                result[row] += key[row, col] * AtoZ.IndexOf(input[col]);
            }
        }
        return result;
    }

Calling function:

    for (int i = 0; i < OutPut.Length; i++)
    {
        encChars = MatrixMul(OutPut[i], key);
    }
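Put together, the fixed function can be exercised end to end. Below is a minimal, self-contained sketch, assuming AtoZ is the uppercase alphabet, using a hypothetical 2x2 key and sample blocks, and reducing each product mod 26 to map the result back to letters (a step the question does not show):

    using System;
    using System.Linq;

    class HillDemo
    {
        const string AtoZ = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";   // assumed alphabet lookup

        // Multiply the key matrix by the vector of letter indices for one block.
        static int[] MatrixMul(string input, int[,] key)
        {
            int[] result = new int[input.Length];
            for (int row = 0; row < key.GetLength(0); row++)
            {
                for (int col = 0; col < key.GetLength(0); col++)
                {
                    result[row] += key[row, col] * AtoZ.IndexOf(input[col]);
                }
            }
            return result;
        }

        static void Main()
        {
            int[,] key = { { 3, 3 }, { 2, 5 } };   // hypothetical invertible 2x2 Hill key
            string[] OutPut = { "HE", "LP" };      // hypothetical plaintext blocks

            for (int i = 0; i < OutPut.Length; i++)
            {
                // Reduce each component mod 26 and map it back to a letter.
                char[] encChars = MatrixMul(OutPut[i], key).Select(v => AtoZ[v % 26]).ToArray();
                Console.WriteLine(OutPut[i] + " -> " + new string(encChars));
            }
        }
    }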