I wrote a program that keeps generating random numbers between 0 (inclusive) and 10,000 (exclusive) until the generated number equals a predetermined value (in this case 4377). The program repeats this process N times, then calculates the average number of tries it takes the computer to get 4377.

I first implemented this with an infinite while loop containing an if statement that breaks out of the loop when the number equals 4377. Then I thought the if statement was redundant: I could just use (y != 4377) as the while condition.

Contrary to my expectations, I got different results for each case. The first version consistently takes around 10,000 tries on average to get 4377, while the second version takes far fewer tries on average, sometimes as few as 3! That's a huge difference! (N = 1000.)
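For context on the 10,000 figure: each call to x.Next(0, 10000) hits 4377 with probability 1/10,000, so the number of draws until a hit follows a geometric distribution with mean 1/p = 10,000. Here is a minimal standalone sketch (the class and method names are mine, not from my program) that re-runs the draw-until-hit experiment with y reset before every run:

```csharp
using System;

class GeometricCheck
{
    // Runs the "draw until we hit 4377" experiment `runs` times and
    // returns the average number of draws needed. With a hit probability
    // of 1/10000 per draw, this mean should land near 10,000.
    public static double AverageTries(int runs)
    {
        Random x = new Random();
        double totalTries = 0;
        for (int p = 0; p < runs; p++)
        {
            int y = -1;     // reset before every run
            int tries = 0;
            while (y != 4377)
            {
                y = x.Next(0, 10000);   // 0 inclusive, 10000 exclusive
                tries++;
            }
            totalTries += tries;
        }
        return totalTries / runs;
    }

    static void Main()
    {
        Console.WriteLine(AverageTries(400));
    }
}
```

With 400 runs the sample mean still fluctuates by several hundred, but it stays in the vicinity of 10,000 rather than anywhere near 3.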

Case 1 – infinite while loop with a conditional break inside:

```
using System;

namespace Codeac
{
    class Program
    {
        static void Main(string[] args)
        {
            Random x = new Random();
            int i, size;
            Console.Write("Enter sample size: ");
            size = Convert.ToInt32(Console.ReadLine());
            double[] k = new double[size];
            int y = 0;
            for (int p = 0; p < size; p++)
            {
                i = 0;
                while (true)
                {
                    y = x.Next(0, 10000);
                    i++;
                    if (y == 4377) { break; }
                }
                k[p] = i;
                Console.WriteLine(y);
            }
            Console.WriteLine("\n \n \n");
            Console.WriteLine("Mean of tries is: ");
            Console.WriteLine(Mean(k));
            Console.WriteLine("\n \n \n");
            Console.WriteLine("Standard deviation is: ");
            Console.WriteLine(Math.Round(StandardDeviation(k), 3));
        }

        public static double Mean(double[] samples)
        {
            double mean = 0;
            for (int r = 0; r < samples.Length; r++)
            {
                mean += samples[r];
            }
            mean /= samples.Length;
            return mean;
        }

        public static double StandardDeviation(double[] samples)
        {
            double mean = Mean(samples);
            double standardDeviation = 0;
            for (int r = 0; r < samples.Length; r++)
            {
                standardDeviation += Math.Pow(samples[r] - mean, 2);
            }
            // divide by the sample count, not the mean
            standardDeviation = Math.Sqrt(standardDeviation / samples.Length);
            return standardDeviation;
        }
    }
}
```

Case 2 – negated condition used as the while condition:

```
using System;

namespace Codeac
{
    class Program
    {
        static void Main(string[] args)
        {
            Random x = new Random();
            int i, size;
            Console.Write("Enter sample size: ");
            size = Convert.ToInt32(Console.ReadLine());
            double[] k = new double[size];
            int y = 0;
            for (int p = 0; p < size; p++)
            {
                i = 0;
                while (y != 4377)
                {
                    y = x.Next(0, 10000);
                    i++;
                }
                k[p] = i;
                Console.WriteLine(y);
            }
            Console.WriteLine("\n \n \n");
            Console.WriteLine("Mean of tries is: ");
            Console.WriteLine(Mean(k));
            Console.WriteLine("\n \n \n");
            Console.WriteLine("Standard deviation is: ");
            Console.WriteLine(Math.Round(StandardDeviation(k), 3));
        }

        public static double Mean(double[] samples)
        {
            double mean = 0;
            for (int r = 0; r < samples.Length; r++)
            {
                mean += samples[r];
            }
            mean /= samples.Length;
            return mean;
        }

        public static double StandardDeviation(double[] samples)
        {
            double mean = Mean(samples);
            double standardDeviation = 0;
            for (int r = 0; r < samples.Length; r++)
            {
                standardDeviation += Math.Pow(samples[r] - mean, 2);
            }
            // divide by the sample count, not the mean
            standardDeviation = Math.Sqrt(standardDeviation / samples.Length);
            return standardDeviation;
        }
    }
}
```

Please ignore the standard deviation for now.

If anyone has any ideas about why this is happening, or any remarks or thoughts about it, please reply to this topic. I appreciate all responses.