Friday, November 26, 2010

Quark-gluon plasma detected at the LHC!

I've been watching experiments running all month on the LHC web site, including those from ATLAS. You too can do so here. You can select what you want to see via the dropdown menu in the top left.

A paper from the ATLAS experiment reporting the creation of a quark-gluon plasma at the LHC was accepted for publication today. A related experiment, CMS, will try to confirm the result soon.

Quarks are the particles that make up protons, neutrons, and so forth. Gluons are the force particles that keep quarks stuck together. If you move two quarks apart, more gluons are created; unlike gravity and electromagnetism, which get weaker over long distances, the strong force (which is what we call the force that gluons carry) is stronger at longer distances. So you have to heat your quarks up a lot to get them to move apart significantly, and when you do, you get a lot of gluons being created. If you heat things up enough, the quarks from different nuclei will all mingle together in a sea of gluons, sort of like salt dissolving in water. This is called a quark-gluon plasma.

This result is important because it gives us details about the first few millionths of a second of the Universe. These measurements might help explain why we have a lot of matter, but not much antimatter. New particles might be formed in the aftermath of such a plasma... perhaps dark matter particles, or the Higgs boson. We might even be able to discover new fundamental forces in such plasmas!

In related news, a suggestion was recently published that it might be possible to prove or disprove the hypothesis of supersymmetry very soon. Several groups of researchers have figured out what to look for, and it's within the capacity of the LHC to do it.

Exciting days ahead in the world of physics. You can get a schedule of the LHC's planned events online, if you're interested in the current state of affairs.

Thursday, November 25, 2010

Non-mathematical paradoxes

Someone wrote to me:


I always thought a paradox was two physicians.
But I'd like it explained why "we're damned if we do, and damned if we
don't."
I'd like a satisfactory explanation of "jumbo shrimp."
How about "Government Intelligence."
What about "lipstick?" It comes right off.


My responses:


Damned if we do, damned if we don't because I'm not a Christian.
Jumbo shrimp because the ones with floppy ears can't pronounce the letter D.
Government intelligence because intelligence implies penetrating thoughts, and
the government is always looking for ways to screw you.
Lipstick because it makes women look hotter, and therefore kisses last longer.


(Obviously not serious.)

Wednesday, November 24, 2010

Mathematical paradoxes -- The Two Envelopes Problem

The problem


Suppose you are given two envelopes with money in them. You are told that you may pick and keep only one of them. You are also told that one envelope contains twice as much money as the other, but not which envelope has more money. You choose one of the envelopes, and you find that it contains $10. Now, you are given the option of taking the other envelope instead. Is there any advantage to switching envelopes? How much money can you expect to gain/lose?



You might like to spend some time trying to figure it out. If I offered you the chance to buy the other envelope for $12, would you take it?


Math background


You need the ability to add and multiply. Dividing by 2 might help also. That's it! (It's great that some very complex problems don't need much knowledge of what most people think of as "math.")

The paradox


There are two common answers.

Answer 1: There are two options for what's in the other envelope. It either contains $5 or $20, with a 50% chance of each. On average, then, if you switch envelopes you will get ($5 + $20)/2 = $12.50. Keeping the $10 envelope only gets you $10. Therefore, you should switch, and you will get on average $12.50, for an average gain of $2.50.


Answer 2: Since you picked the first envelope randomly, it's impossible to gain by switching. You might have taken the higher number, and you might have taken the lower number. If you took the higher one, then you stand to lose the difference between the two envelopes. If you took the lower one, then you stand to gain that difference. Since there was a 50/50 chance of taking each, the differences balance out, and the average gain from switching envelopes is $0.



Which do you think is correct?


A more precise statement of the problem



  1. There are two envelopes with dollar amounts x and 2x. We'll say that envelope L has x and envelope H has 2x.

  2. There is a probability of 50% of choosing each envelope.

  3. In one of the envelopes, the amount is $10.


The question: on average, how much money do you gain or lose by switching envelopes? (You gain if you chose envelope L, and you lose if you chose envelope H.)


You might like to take a second to think about it again. Does that change your idea of what the answer is?


The solution


As with the Monty Hall problem, stating the problem more precisely makes it easier to figure out what the answer is. If you took envelope L, then you will gain x dollars if you switch. If you took envelope H, then you will gain x - 2x = -x dollars if you switch. (That is, you lose x dollars.) The average is (x + (-x))/2 = 0.
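
If you'd rather see this numerically, here is a minimal simulation sketch in Python (the value x = $10 and the trial count are just illustrative assumptions, not part of the problem). It repeatedly picks one of the two envelopes at random and averages the gain from switching; the average hovers around zero.

import random

def average_switching_gain(x=10.0, trials=100_000):
    """Two-envelope setup: the envelopes hold x and 2x dollars, you pick
    one at random, then switch to the other. Returns the average gain."""
    total_gain = 0.0
    for _ in range(trials):
        envelopes = [x, 2 * x]
        random.shuffle(envelopes)        # 50/50 chance of picking L or H
        chosen, other = envelopes
        total_gain += other - chosen     # gain (or loss) from switching
    return total_gain / trials

print(average_switching_gain())          # typically within a few cents of 0.0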

But we can't accept this answer until we know why the other possible answer is wrong. After all, it's pretty intuitive. There is either double the amount in the other envelope, or there's half, and there is a 50% chance of each, right?

Well, no. To be precise (and you really have to be!), there is either a 100% chance of the other envelope having twice as much money as the one you chose, or there is a 100% chance of it having half as much. But that's not the same as saying there is a 50% chance of each. If you did an experiment where you had a constant amount for the first envelope chosen (call it $10), and then either doubled it or halved it for the other envelope, then you would indeed benefit from switching envelopes. But that's not what is happening in the Two Envelopes Problem. That is a different problem entirely.
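
To make the distinction concrete, here is a companion sketch of that other experiment (again, the $10 starting amount is just an illustrative assumption): the opened envelope always contains $10, and the other envelope is then filled with either $5 or $20 on a coin flip. In that experiment switching really does gain about $2.50 on average, exactly as Answer 1 computes; it just isn't the Two Envelopes Problem.

import random

def average_gain_double_or_halve(first=10.0, trials=100_000):
    """The *other* experiment: the opened envelope always holds `first`,
    and the second envelope is then set to half or double that, 50/50."""
    total_gain = 0.0
    for _ in range(trials):
        other = random.choice([first / 2, 2 * first])
        total_gain += other - first
    return total_gain / trials

print(average_gain_double_or_halve())    # typically close to +2.50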


There's another way of looking at it. If you say,
  1. I have a chance of gaining $10, but I only have a chance of losing $5,

then you are assigning undue weight to the gains. It's more accurate to say
  2. If the other envelope has $5 in it, then I stand to lose $5 by switching. But if I'd chosen the other envelope first, I'd stand to gain $5.

  3. If the other envelope has $20 in it, then I stand to gain $10 by switching. But if I'd chosen the other envelope first, I'd stand to lose $10 by switching.

Do you see the difference? If you make statement 1, then you are assigning double the weight to statement 3 compared to statement 2. You are saying something like "there is a 2/3 chance of statement 3 being correct, and a 1/3 chance of statement 2 being correct." (The 50% chance pertains only to whether you took H or L, so you can't even guess that statements 2 and 3 each have a 50% probability.) However, you simply don't know which of statements 2 and 3 is correct, so you cannot assign odds to them. Because statement 1 assigns odds where it is impossible to do so, it isn't true.


Was your first idea right? Did you change your mind to the right answer after the problem was reworded?


Conclusion


In the Two Envelopes Problem, there is on average no gain to be had by switching envelopes. You will either gain x dollars or lose x dollars by switching, and there's a 50% chance of each.

This is probably a great way to scam people. Put $10 and $20 in two envelopes, then let them look in one. Ask them to pay a bit under $12.50 if they chose the $10 envelope and want to switch, or a bit under $25 if they chose the $20 envelope. They'll think it's a great deal--on average, how can they lose?--but your gains from the people who choose the $20 envelope first will outweigh your losses from the people who choose the $10 envelope first. (Of course, you have to stipulate that they can't keep the first envelope they chose, or they're just going to take your money and run. Maybe. Some people don't like gambling.)
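
Here is a rough sketch of the scam's bookkeeping, under the assumptions that every customer agrees to switch and that the asking prices are $12 and $24 (a bit under the naive $12.50 and $25 expectations; all of these numbers are illustrative). On average the operator comes out ahead, even though each customer thinks the price is at worst fair.

import random

def average_operator_profit(trials=100_000, price_if_10=12.0, price_if_20=24.0):
    """Illustrative sketch of the envelope scam: the customer opens one of
    the $10/$20 envelopes at random, pays the asking price, hands the opened
    envelope back, and walks away with the other one. The opened envelope's
    cash returns to the operator, so the operator's net per game is simply
    the fee collected minus the amount paid out in the other envelope."""
    profit = 0.0
    for _ in range(trials):
        envelopes = [10.0, 20.0]
        random.shuffle(envelopes)
        seen, other = envelopes
        fee = price_if_10 if seen == 10.0 else price_if_20
        profit += fee - other
    return profit / trials

print(average_operator_profit())         # averages around +$3 per customer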

In practical terms, the amount of money you can offer is limited. If Alice offers two envelopes to Bob, and Bob opens an envelope to find $10 000, and he knows Alice's net worth is only $25 000, then he knows he must have opened the envelope with more money. For a similar reason, the amounts in both envelopes should be even numbers: an odd amount would give away that it's the smaller one. I don't know if there's a way around this. Maybe selecting the amounts according to a distribution that puts emphasis on low values would work... but knowledge of the distribution might allow a savvy player to beat the system. I think this is a good research opportunity for someone (maybe me, after I'm done with the current batch of papers I'm working on). Part of the problem is that money isn't infinitely divisible. You can't offer someone 1/9 dollars. It's not obvious how finely divisible money has to be to make this scheme work.

Tuesday, November 23, 2010

Mathematical paradoxes -- The Monty Hall Problem

The problem



There was (and apparently still is! Wayne Brady runs it now!) a game show called Let's Make a Deal. It had a game in which the host--Monty Hall, after whom this problem is named--would show a contestant three doors. (I'm going to assume it's a female contestant here for the sake of clarity, to distinguish her from the distinguished Monty.) Behind one door was a car; behind each of the other two doors was a goat. The contestant would choose a door, and the host would open one of the other doors to reveal a goat. At this point, the contestant was given the choice to keep her original choice or to switch to the remaining unopened door. If the door she decided on in the end had the car behind it, the contestant won the car. (I don't know if they were allowed to keep the goat if they chose it. Probably they got to substitute another prize.) The question is: is there any advantage to switching doors?


What do you think? Would you switch doors, or would you keep the same one, or does it not matter?


Math background


All you really need to know is how to add and multiply. Easy enough, right? You could teach this to elementary school kids. (If anyone wants me to come to their classroom and teach it to their kids, I will do it!)

The paradox


There are three answers that people commonly come up with.

Answer 1: There can be no advantage. The door was originally chosen randomly, and the car is behind a random door. There are two doors after a goat is revealed, so the contestant has a 50% chance of having chosen the correct door.

Answer 2: Switching doors will give you a 2/3 chance of winning. There was a 1/3 chance of choosing the car in the first place, and then a goat was revealed, but that doesn't change the odds of finding the car behind the chosen door. If the contestant switches, there is a 2/3 chance of finding the car behind the other door.

Answer 3: Every door has something random behind it. No matter which door is chosen in the end, there is only a 1/3 chance of winning.


Which of these three answers do you think is best? Can you figure out why the others are wrong?


A more precise statement of the problem



  1. Two goats and a car are randomly placed behind 3 doors.

  2. The contestant picks any one door.

  3. If the contestant's door has the car behind it, the host picks either of the other doors. If the contestant's door has a goat behind it, the host picks the remaining door with a goat behind it.

  4. The contestant is given the option of keeping her door or switching her choice to the remaining unopened door.

  5. The contestant wins if her final choice of door has the car behind it.


What are the odds of winning if the contestant chooses to keep the same door? What if she switches?

The assumptions here are that all placements of the car are random, all episodes of the show are shown on TV (no bias in selection), and that the host always reveals a door with a goat behind it in step 3 (this is a bias, however). This is critical to the problem. If you assume that some episodes are never seen on TV, then you have an incomplete (and very possibly biased) view of the probabilities. In particular, if you exclude any scenario in which a car might be revealed, you get a different answer than if you don't.


Take a minute and think about it again. Do you agree with your earlier idea?


The solution


The more precise statement of the problem should clue you in to what's going on, especially step 3. If the host always had to choose to reveal one of the other doors at random, then each door would indeed be equally likely to have the car behind it. But that's not the case. However, just for the sake of comparison, let's look at each of the possibilities for this scenario. I've drawn these out in a tree form, because it's way easier to understand this way.

In the first column, I've shown the possibilities for which door the contestant first chooses. There is a 2/3 chance of choosing a goat, and a 1/3 chance of choosing the car. In the second branch, I've shown the chances of the host picking either the car or a goat to reveal. Note that this doesn't happen in the game show; I'm just showing it for the purpose of comparison. In the final branch, I've shown what happens if the contestant switches doors or not. If she wins the car, that's labelled "win". If she gets a goat, it's "lose".

For each winning possibility, I've multiplied all the probabilities from the left side of the tree to the win, and put that number at the end. For each loss, I put down 0 as the result. To get the total odds of winning by switching, add up each of the switch branches. To get the odds of winning by not switching, add up those branches.

In this case, there is a 1/3 chance of winning by switching, and a 1/3 chance of winning by not switching. There is also a 1/3 chance of the host opening the door with the car, and in that case you're not going to win no matter what you do. So there's no advantage to switching or not. Note that this diagram covers two of the commonly given answers above. If someone claims that there's a 50% chance of winning by switching, odds are she has noticed that switching and staying give the same chance of winning and concluded that each must therefore be 50%; she has neglected the games in which the host reveals the car and it is impossible to win. If someone claims that the chance of winning is 1/3 no matter what, then she has also used this diagram, but has remembered to count those unwinnable games.
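
Here is a small simulation sketch of this comparison scenario, where the host opens one of the other two doors at random (the door labels and trial count are illustrative). It tallies the same three outcomes as the tree: wins by switching, wins by staying, and games where the host accidentally reveals the car.

import random

def random_host(trials=100_000):
    """Comparison scenario: the host opens one of the other doors at random."""
    switch_wins = stay_wins = car_revealed = 0
    for _ in range(trials):
        doors = ["goat", "goat", "car"]
        random.shuffle(doors)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3) if d != pick])
        if doors[opened] == "car":
            car_revealed += 1            # no way to win this game
            continue
        remaining = next(d for d in range(3) if d not in (pick, opened))
        stay_wins += doors[pick] == "car"
        switch_wins += doors[remaining] == "car"
    return switch_wins / trials, stay_wins / trials, car_revealed / trials

print(random_host())                     # roughly (0.33, 0.33, 0.33)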

However, I'd argue that that's not what the problem really is. On the TV show, the host always opened a door with a goat. It would make sense that if the contestant chose a goat in the first place, they always opened the other door with the goat. Otherwise, they'd have to throw out a third of their footage! (However, since we don't know for sure that this is the case, I won't argue too vigorously against it.) This distinction will become important when I discuss the Two Coins Paradox. In fact, it is a fundamental feature of experimental design: you have to know what your experiment is going to be before you do the experiment. In this case, the experiment might be a test to see if the "always switching" strategy works. To figure that out, you must know exactly how the test was done, and in particular, whether or not any possibilities are being discarded.

This diagram shows the correct solution. It is almost the same as the last diagram, except that the option of the host revealing the car is removed. You can see clearly that the odds of winning when you switch are 2/3, while the odds of winning if you don't switch are only 1/3. That is because whenever your initial choice was a goat (which happens 2/3 of the time), the host's reveal leaves the car behind the remaining door, so switching always wins. The bottom branch of the first diagram, where the host revealed the car, didn't offer any chance of winning at all.
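
And here is the corresponding sketch for the problem as actually played, where the host always reveals a goat. The switch and stay win rates come out near 2/3 and 1/3, matching the diagram.

import random

def host_always_shows_goat(trials=100_000):
    """The real game: the host always opens one of the other doors hiding a goat."""
    switch_wins = stay_wins = 0
    for _ in range(trials):
        doors = ["goat", "goat", "car"]
        random.shuffle(doors)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3)
                                if d != pick and doors[d] == "goat"])
        remaining = next(d for d in range(3) if d not in (pick, opened))
        stay_wins += doors[pick] == "car"
        switch_wins += doors[remaining] == "car"
    return switch_wins / trials, stay_wins / trials

print(host_always_shows_goat())          # roughly (0.67, 0.33)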


Was your first idea right? Did you change your mind to the right answer after the problem was reworded?


Conclusion


Experimental design is important! You need to clearly define what you're trying to measure, or you can't do statistics on it!

If you clearly define the Monty Hall problem as "no matter what the contestant chooses, the host always picks a door with a goat," then the contestant doubles her chances of winning by switching doors. If you think, "the host opened a door at random, but they just didn't show the episodes where the host revealed the door with the car behind it," then there is no advantage to switching, and the contestant has a 1/3 chance of winning the car no matter what.

This is related to a statistical problem called Berkson's Paradox, and the "selection bias" that leads to the wrong answer is more formally called an ascertainment bias. The difference between this problem and Berkson's is that there is supposed to be an ascertainment bias here.


Sources:
1. Too much TV as a kid.
2. Image of Monty Hall from http://www.letsmakeadeal.com/images/mh-1975.jpg. I'm calling it fair use, for educational purposes.

Sunday, October 3, 2010

Blood colours

Just some fun facts that were sitting in a draft article that I will never finish. It's been a year since I last posted on this blog, mostly because I was busy finishing my master's thesis, and partly because I was getting bored with the whole "reporting news" thing. So I'm going to change things up a bit, I think. I'll start posting single interesting articles as I come across them, rather than a weekly digest (which was something of a chore), and I also want to do some fun math/science things.

Most mammals have the familiar red blood that we all know and love. The colour is mostly due to the presence of the iron-based molecule hemoglobin. Spiders have copper-based blood (hemocyanin, the same as crabs), which is blue or green or sometimes yellowish. Some worms (Polychaeta) have blood that is green or bright red (chlorocruorin, iron-based), depending on conditions. Some other worms (mostly deep sea worms) have a different colour of red (hemerythrin, iron). Sea squirts (tunicates) have a pale green blood pigment (a vanadium chromogen; clearly vanadium-based, but with the vanadium possibly in several different oxidation states) which can, depending on other chemicals present, be blue, orange, yellow, or pink (or basically anything). Some molluscs have brown blood (pinnaglobin, manganese).

There are many others, some of which are mentioned in a neat little book about extraterrestrials: Xenology: An Introduction to the Scientific Study of Extraterrestrial Life, Intelligence, and Civilization.