Usually, we call something "random" when we cannot predict it and it has no recognizable patterns. Like coin flips. But are they actually unpredictable? If you think about it, you may realize that they are not. Indeed, if you knew the initial position, the velocity, and all the forces acting on a coin you toss, you would be able to calculate the outcome. In fact, there are robots that can do it with a $100\%$ success rate.
So you can argue that coin flips are not random. Nor are dice, the weather, or financial markets. In each of those cases, if we knew enough, then "the future, just like the past, would be present before its eyes," as Pierre-Simon Laplace once said. Or did he... Anyway, is there something no one could predict even if we knew everything? I.e., is there a process determined by nothing? One of Albert Einstein's most famous quotes is "God does not play dice with the universe", which is essentially a criticism of those who answer that question positively. And since that guy was quite smart, maybe we should trust him...?
Also, if Einstein and Laplace are right, then we do not have free will :( Think about it: since we consist of quadrillions of little particles, knowing everything about each of them should tell us exactly what we are going to think, feel, and do in the future. This sounds scary, doesn't it?
Exercise 1.
There are online random number generators. But if everything is so "deterministic", then how do they work and are they really random? Do you have any ideas on how to construct one?
(This exercise is not easy, it is more of a discussion provoker: so take your guess and reveal the box below if you want to)
Explanation and comments
click on blur to reveal
There are actually a lot of ways to build a random number generator. Some of them (e.g., random.org) use things like "atmospheric noise" (which sounds quite random) to produce a random number. But most of the others are different and are actually called pseudo-random, because rerunning them will give you the same random number(s). One specific example is related to the so-called logistic map, i.e., the equation $x_{n+1} = rx_n(1-x_n)$. Surprisingly, this simple sequence behaves totally chaotically when roughly $3.5699 < r < 4$, and this chaos is what is used to generate random numbers. The details are not that important; what is important here is that a simple deterministic sequence helps generate random numbers.
Those of you who have programmed before might remember a thing called a "seed", which is any number you wish (e.g., in Python the command goes like "random.seed(78)"). Each time you run your code, that pseudo-random number generator will give you the same number(s), unless you change the seed (i.e., that number $78$ in the example).
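To make this concrete, here is a toy pseudo-random generator in Python built from the logistic map (a sketch for illustration only, not a real generator; the function name and the choice $r=3.99$ are ours):

```python
def logistic_prng(seed, n, r=3.99):
    """Toy pseudo-random generator based on the logistic map.

    The "seed" is just the initial value x_0, any number in (0, 1).
    Same seed in -> same sequence out, which is exactly what
    "pseudo-random" means.
    """
    x = seed
    numbers = []
    for _ in range(n):
        x = r * x * (1 - x)   # logistic map step: x_{n+1} = r * x_n * (1 - x_n)
        numbers.append(x)
    return numbers

# Rerunning with the same seed gives the same "random" numbers:
print(logistic_prng(0.123, 3))
print(logistic_prng(0.123, 3))  # identical to the line above
```

Change the seed ($0.123$ here plays the role of the $78$ from the Python example) and you get a completely different-looking sequence.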
Let's go back to the question of whether there is a process determined by nothing. So, all cards on the table — at this point in time, humanity still does not know whether such a process in nature exists. However, we think Quantum Mechanics gives a positive answer to this question.
Of course, we do not expect you to know much about it. Well, probably you know nothing about it, because it is a very difficult subject. So here is a quick description: it is a fundamental theory in physics that describes the physical properties of nature at the scale of atoms. I.e., it is the physics of tiny particles. One of the interesting and confusing things about it is that you work with probabilities of finding those little particles at different locations rather than with their exact positions. And this is not because of a lack of knowledge about where a particle is, but because of the uncertainty encoded in the model of the universe, which we believe to be true.
Putting philosophy aside, we must admit that even if Quantum Mechanics is wrong and everything is theoretically $100\%$ predictable, we are far from knowing how to predict everything. This is because properly analysing all the information that influences the outcome of something is often impossible. In particular, because of the "butterfly effect" (which stands for "a slight change in initial conditions dramatically affects the future"), in many cases the best we can do is to say that a certain event is just random. Even dice or coins are so sensitive to the initial conditions that predicting the outcome based on the first second of a throw and knowledge of the surroundings is utterly difficult. So, no wonder that, for example, financial markets, which are millions of times more complex than one coin, are also often viewed as random processes.
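The butterfly effect can be seen directly in the logistic map mentioned earlier. In this small sketch (the specific starting values are arbitrary), two trajectories that start $0.0000001$ apart end up completely unrelated:

```python
# A toy illustration of the "butterfly effect" using the logistic map
# x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime (r close to 4).
r = 3.99
x, y = 0.2, 0.2000001   # two almost identical initial conditions

max_gap = 0.0
for step in range(100):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    max_gap = max(max_gap, abs(x - y))

# The tiny initial difference grows step by step until the two
# trajectories look like they have nothing to do with each other.
print(max_gap)
```

Both trajectories are $100\%$ deterministic, yet after a few dozen steps the only honest summary of where either one is going is a probabilistic one.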
Thus, in many situations of all kinds, applying probability theory is our best shot at predicting what is going to happen.
Let's move on to maths and numbers. In the previous lesson, we talked about the probability of an event, like seeing a $1$ when rolling a die. Just like the probability of any event, it is a number between 0 and 1. We argued that it should be $\frac{1}{6}$ (since there are 6 possibilities and "rolling a 1" is just one of them), but then said that it is not the best argument.
The key reason this argument is not the best is that it hides a fundamental truth: we ourselves decide which probabilities we assign to which events. We decide which probability space we are working in. Depending on the circumstances, we may pick a different probability space. I.e., we ourselves assign those numbers (called probabilities) to some events, and sometimes we might think that assigning the same numbers to all "seemingly similar events" is not the best idea. For example, it is almost never useful to argue "well, the price of Amazon's stock will go up or down, so it is $50-50$". Tell this to your boss at your finance job and see what happens ;)
Definition [Probability Model]
Let's put it all in one place. Before talking about probabilities of some events, we need to make sure we agree on a probability model. This means that: we list all the simple events that can happen; we assign a number between $0$ and $1$ (called a probability) to each of them; and we make sure those numbers sum up to $1$.
Once we agree on all of this, we say we have a probability space, which can also be called a probability model (the two terms are interchangeable). As an example: when talking about one fair die, people usually talk about a model with six simple events, {rolling a 1, ..., rolling a 6}, all having equal probability $\frac{1}{6}$ (because of the word "fair"). Therefore the question "what is the probability of getting a 1 when rolling a fair die?" is quite silly, since the answer is "it is $\frac{1}{6}$ by definition of the model we have chosen".
Exercise 2.
How would you answer the following question: "why is the probability of rolling two threes with two fair dice equal to $\frac{1}{36}$?" Think about the model, simple events, etc...
(An answer like "It is just the product rule" does not count here. What do you mean exactly? Note that we are starting from scratch, so what rules can we even talk about?)
Explanation and comments
click on blur to reveal
There are $36$ simple events, namely "roll 1, roll 1", "roll 1, roll 2", ..., "roll 6, roll 6". You can count that there are $36$ of them or, if you want to be fancy, say that there are $36$ of them because of the product rule from counting combinatorics (NOT from probability theory). Finally, each of the $36$ simple events gets the same probability assigned, i.e., probability $\frac{1}{36}$, because that is how it feels right to interpret the word "fair" in the problem statement. That is why, in particular, rolling two threes has probability $\frac{1}{36}$: by what we chose to be true, i.e., by the definition of our model, which seems reasonable in this case.
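If you like, you can let Python do the counting. A small sketch of the model just described:

```python
from itertools import product

# All simple events for two fair dice, built with the product rule
# from counting combinatorics:
events = list(product(range(1, 7), repeat=2))
print(len(events))          # 36 simple events

# In this model every simple event is assigned probability 1/36,
# so "two threes", the single event (3, 3), gets exactly 1/36:
p_two_threes = sum(1 for e in events if e == (3, 3)) / len(events)
print(p_two_threes)         # 1/36, about 0.0278
```

Nothing here is computed "by probability theory": we just listed the simple events and read off the number our chosen model assigns.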
Exercise 3.
There is a coin such that the probability of getting heads when tossing it is $3$ times larger than the probability of getting tails. If you toss this coin $3$ times, what is the probability you will get exactly two heads?
(This exercise is a bit bashy if you do it properly, but it is good to go through something like this at least once)
Explanation and comments
click on blur to reveal
If there is a coin that we toss only once, then the probability model is clear (it has two simple events). Let $p$ be the probability of tossing tails. Then $1-p$ is the probability of tossing heads. We know that $1-p = 3p$, therefore $p=0.25$. To keep things shorter, I will write $p$ and $q=1-p$ instead of $0.25$ and $0.75$ (the probabilities of tossing tails and heads, respectively); the exact numbers don't matter much for our purposes. Now, when we toss the coin $3$ times, we will consider a probability model with $8$ simple events and the following probabilities (H stands for "Heads", T stands for "Tails"):
\begin{array}{ c | c | c | c }
HHH & HHT & HTH & HTT \\
q \cdot q \cdot q & q \cdot q \cdot p & q \cdot p \cdot q & q \cdot p \cdot p \\
\hline
THH & THT & TTH & TTT \\
p \cdot q \cdot q & p \cdot q \cdot p & p \cdot p \cdot q & p \cdot p \cdot p
\end{array}
You can check that the sum of all $8$ probabilities is indeed $1$. One way to do it is to note that this sum is actually $(p+q)(p+q)(p+q)$ (expand everything if you don't feel certain about this), and that is equal to $1^3=1$.
Finally, let's answer the main question of the problem. For this, let's find the three simple events which have two H-s in them. Their probabilities are all equal to $q^2p$, therefore the answer is $3q^2p$. Recalling what $p$ and $q$ were, you get the numerical answer $3 \cdot 0.75^2 \cdot 0.25 = 0.421875 \approx 0.42$.
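Here is a short Python check of this computation; it is just the table above written as code, enumerating the $8$ simple events:

```python
from itertools import product

p, q = 0.25, 0.75   # P(tails) and P(heads) for the biased coin

# The probability of each of the 8 simple events is the product of
# the per-toss probabilities, exactly as in the table:
prob = {}
for outcome in product("HT", repeat=3):
    pr = 1.0
    for toss in outcome:
        pr *= q if toss == "H" else p
    prob[outcome] = pr

print(sum(prob.values()))   # 1.0, since this sum is (p + q)^3 = 1

# "Exactly two heads" covers HHT, HTH, THH, each with probability q*q*p:
two_heads = sum(pr for outcome, pr in prob.items()
                if outcome.count("H") == 2)
print(two_heads)            # 0.421875, i.e. 3 * q^2 * p
```

Enumerating simple events like this scales badly (for $30$ tosses there are $2^{30}$ of them), which is one motivation for the shortcut formulas you will meet later.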
Exercise 4.
In another universe, there are five companies called Pineapple, Nanosoft, Nikola, Frodosung and Nosebook that have been in close competition with each other for $70$ years already. Five years ago, the companies got bored, so they decided that each month a special award would be given to the one that earned the most money that month, just to motivate self-development even more. Below is the information on how many awards each company has obtained:
\begin{array}{ c | c | c }
Pineapple & Nanosoft & Nikola \\
14 & 9 & 11 \\
\hline
Frodosung & Nosebook \\
15 & 11
\end{array}
You do not know which company will get the next award; in fact, you barely know anything about any of them. Based on the table above, how would you go about betting $100$ dollars on which company will win the coming month? You double the money you put on the winner and lose everything you put on a non-winner. You can split the $100$ dollars and put a non-zero amount on several of them.
(This is an open question and there is no single perfect answer. Once you think you have reasonable ideas, feel free to reveal the box below)
Explanation and comments
click on blur to reveal
Even intuitively (without knowing anything about probability), it makes sense to bet at least something on Frodosung, since it has won the most times. But let's think in terms of probabilities. Here is a way you could do it: out of the past $60$ awards, Frodosung got $15$, so you could assign it the probability $\frac{15}{60}=0.25$ of winning next month, and similarly $\frac{14}{60}$, $\frac{9}{60}$, $\frac{11}{60}$, and $\frac{11}{60}$ to the other four; then decide how to split your money based on these probabilities.
If this is approximately how you thought about the question, good. If you got to solving a certain maximisation problem to explain your answer, even better. Reasonable ideas were the point of the exercise; there is no need for one final answer. Finally, let's stress again that we ourselves assigned probabilities to some events based on our knowledge of the past. This makes sense, and it is an extremely useful approach in finance. Almost the only one, really...
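As a small sketch of the idea (the proportional split at the end is just one hypothetical strategy, not the "right" answer):

```python
# Turn the past 60 awards into a probability model: each company's
# probability of winning next month is its share of past awards.
awards = {"Pineapple": 14, "Nanosoft": 9, "Nikola": 11,
          "Frodosung": 15, "Nosebook": 11}
total = sum(awards.values())            # 60 monthly awards in 5 years

probs = {name: wins / total for name, wins in awards.items()}
print(probs["Frodosung"])               # 15/60 = 0.25

# One possible (hypothetical) strategy: split the 100 dollars in
# proportion to these empirical probabilities.
bets = {name: round(100 * pr, 2) for name, pr in probs.items()}
print(bets["Frodosung"])                # 25.0 dollars on Frodosung
```

Note that these probabilities are a modelling choice, not a fact of nature: someone with extra knowledge (say, about Nanosoft's upcoming product) might reasonably pick a different model.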
The philosophy of randomness is fascinating and a bit scary. Quantum Mechanics? No free will?! But independently of all that, fluency with probabilistic models is what helps people and companies reason under uncertainty, even when that uncertainty comes merely from a lack of knowledge.
We have also taken further steps into the maths of all this. From now on, you are one of the people who understand probability at the conceptual level better than most.
The next step is going to be an introduction to the world of lies—I mean "Statistics." The fact that there are millions of junk pseudo-scientific studies and misuses of statistics is both concerning and a good topic for a stand-up. You will learn more soon, but first, please try solving the problems from the problem set, at least the first four of them (the last one is a bit challenging).