
So throughout the course, we'll be using some concepts and notation from discrete probability. I want to make sure everybody is familiar with the notation, so I'll do a very, very quick crash course in discrete probability. Even if you're not familiar with what I'm saying here, it's not the end of the world: I'll provide an online resource where you can read a little bit, it's not too much reading material, and you can familiarize yourself with what we'll use from discrete probability. We'll mostly be using the notation, and it should be fairly easy to catch up and learn the background [inaudible]. Okay, so

let's start with the basics: the discrete probability space. For us, it's just going to be a finite set, which I'll denote by U, and most commonly we'll use the set of all n-bit strings, written {0,1}^n, as our probability space. This means all strings of length n over the letters 0 and 1. A probability distribution, which we'll denote by P, over this set U is just a function that assigns to every element of U a weight in the interval [0, 1]. The only requirement is that the sum of all the weights is one.
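To make the definition concrete, here is a minimal Python sketch (my own illustration, not part of the lecture): a distribution over {0,1}^n represented as a table of weights, with the two defining conditions checked explicitly. The particular weights are arbitrary, chosen only so they sum to one.

```python
from itertools import product

n = 2
# The universe U = {0,1}^n, as strings over the letters '0' and '1'.
U = ["".join(bits) for bits in product("01", repeat=n)]

# A probability distribution P assigns each element a weight in [0, 1].
# These particular weights are arbitrary; they just have to sum to 1.
P = dict(zip(U, [0.5, 0.25, 0.125, 0.125]))

# The two defining conditions of a probability distribution:
assert all(0.0 <= w <= 1.0 for w in P.values())  # every weight lies in [0, 1]
assert abs(sum(P.values()) - 1.0) < 1e-12        # the weights sum to 1
```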

That's basically what a probability distribution is. I want to quickly mention two examples. The first is the uniform distribution, which assigns to every element in the space the same weight, namely 1/|U|. So if we sample from this distribution, every element in the space is equally likely. The other distribution I want to mention is the point distribution at x0: here all the probability mass is concentrated on the single point x0, and all other points have zero mass. Now, because our

probability space is finite, you can literally just think of the function P as a vector: we can write down all its values, giving a vector of dimension 2^n, that is, with 2^n components, when our universe is {0,1}^n. And I just want to mention what it means for two distributions to be equal. Say we have two distributions over {0,1}^n: we say they are equal if the vector corresponding to one distribution is exactly identical to the vector corresponding to the other. In that case, the two distributions are the same. Okay, so far, so good.
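As a quick sketch of this vector view (my own illustration, not from the lecture), writing down all 2^n values of a distribution gives a vector, and two distributions are equal exactly when these vectors coincide componentwise:

```python
from itertools import product

n = 2
U = ["".join(bits) for bits in product("01", repeat=n)]  # 2**n elements

# Each distribution over {0,1}^n, written out, is a vector of dimension 2**n.
uniform = [1 / 2**n for _ in U]                    # the uniform distribution
point_00 = [1.0 if x == "00" else 0.0 for x in U]  # point distribution at x0 = "00"

# Equality of distributions is exact equality of the corresponding vectors.
assert len(uniform) == 2**n
assert uniform != point_00  # different vectors, hence different distributions
```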

Now, just some more notation. If you give me a subset A of the universe, we can define the probability of that subset as the sum of the probabilities of all the elements in the set. This set A is called an event. Just as an example, suppose we look at all the n-bit strings that happen to end in 1, 1. So we're looking at all strings of length n, made up only of the letters 0 and 1, that end in 11. That's our event. Now imagine we're looking at the uniform distribution over {0,1}^n. What do you think is the probability assigned to this event? That is, what is the weight of this entire event under the uniform distribution? Well, I imagine everybody knows the weight would be one-fourth: the probability that the last bit is 1 is one-half, the probability that the second-to-last bit is 1 is also one-half, and the two bits are independent, so the probability of ending in 11 is 1/2 × 1/2 = 1/4.
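The event computation can be checked directly by counting. Here is a small sketch (my own illustration, not from the lecture) that computes the weight of the event under the uniform distribution:

```python
from itertools import product
from fractions import Fraction

n = 5  # any n >= 2 gives the same answer
U = ["".join(bits) for bits in product("01", repeat=n)]

# The event A: all n-bit strings that end in 11.
A = [x for x in U if x.endswith("11")]

# Under the uniform distribution each string has weight 1 / 2**n,
# so Pr[A] = |A| / |U|. Requiring the last two bits to be 11 fixes
# 2 of the n bits, leaving 2**(n-2) strings in A.
pr_A = Fraction(len(A), len(U))
assert pr_A == Fraction(1, 4)
```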

