Hello Got|Apex! I was trying to work through something today that, for the life of me, I can't remember how to do... and then I remembered all of the math gurus we had here, so I thought I'd give it a shot.
The problem is set up as follows:
You have X cases to test. During each trial, or iteration, you randomly test one of the X cases. The process has no memory and can't use logic to choose the next case, so each trial picks one of the X cases uniformly at random. In other words, it is possible to hit the same case any number of times in a row.
Obviously, the minimum number of trials needed to test all X cases is X, and there is no finite maximum, since you could keep re-hitting cases you've already tested indefinitely. I am trying to determine the average, or expected, number of trials that need to be executed before all X test cases have been hit at least once.
If this is easier to do using an actual value, we can pretend that X is 5, although it will more likely end up being somewhere between 30 and 50.
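In the meantime, here's a quick brute-force simulation I threw together to get an empirical estimate (a minimal sketch in Python, assuming each trial draws uniformly over the X cases; the function names are just mine):

    import random

    def trials_to_hit_all(x):
        # Draw cases uniformly at random until every one of the x cases
        # has been tested at least once; return how many draws it took.
        seen = set()
        draws = 0
        while len(seen) < x:
            seen.add(random.randrange(x))
            draws += 1
        return draws

    def estimate_expected_trials(x, runs=100000):
        # Average the draw count over many independent runs.
        return sum(trials_to_hit_all(x) for _ in range(runs)) / runs

    for x in (5, 30, 50):
        print(x, estimate_expected_trials(x))

For X = 5 this hovers around 11.4 for me, and for 30 and 50 it lands near 120 and 225, but I'd really like the exact formula rather than just an estimate.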
If anybody has insight on this, it would be greatly appreciated!
Miss you guys!