Friday, November 23, 2012

Thanksgiving!

Happy Thanksgiving!


For the past two weeks, I have been trying to see if I can calculate how many samples it takes to determine the key with a certain probability. The good news is that in this process, I was able to understand large deviation theory better. The bad news is that I don't feel like I am making significant progress. It takes me a long time to figure out what I am doing; I think I spend most of my time just staring at my paper. One of my professor's grad students tried to assure me that this always happens at the beginning of research. While I have hope that I will get better at this, I think I am doomed for the poster presentation I have to give in 13 days.

The ideas I am working on involve finding the probability that two nibbles of the key have a specified difference when XOR-ed together. In other words, I am trying to figure out the probability distribution of Y = K0 ^ K4, which can take on the values y0, y1, y2, ..., y15.

I am assuming that for each plaintext I know the time it takes to get encrypted, which we will call X. This plaintext has a key difference of, say, Z. I want to figure out what Y is given that we know X. The weird thing about this model is that X is only useful for determining the probability that Y = y_i if Z = y_i. If Z = y_j for j != i, X tells us as much about Y = y_j as it does about any Y != y_i. Therefore, we can just set

P( Y = y_i | Z = y_j, j != i ) = ( 1 - P( Y = y_j | Z = y_j ) ) / 15.
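For concreteness, that assignment can be sketched in a few lines of Python (the value of p_match and the index j are made-up placeholders, not anything measured):

```python
import numpy as np

K = 16          # Y ranges over the 16 nibble differences y_0 .. y_15
j = 5           # condition on Z = y_j (index chosen arbitrarily)
p_match = 0.4   # hypothetical value of P(Y = y_j | Z = y_j)

# Conditional distribution of Y given Z = y_j: the matching difference
# gets p_match, and the remaining mass is split evenly over the other 15.
p_y_given_z = np.full(K, (1 - p_match) / 15)
p_y_given_z[j] = p_match

# A valid probability distribution: the entries sum to 1.
print(p_y_given_z.sum())
```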

Using this model along with Bayes' rule, and assuming the X samples are iid and a uniform prior on Y (and that my math is correct), we can derive that

P( Y = y_i | X^n ) = C * P(X^n | Y = y_i) / P(X^n | Y = y_j, j != i)

where C is a normalizing constant and n is the number of plaintext timings X we have such that Z = y_i. If we write that in terms of relative entropy, we get

P( Y = y_i | X^n ) = C * 2 ^ ( -n * ( D( P_x || P_{y_i} ) - D( P_x || P_{y_j} ) ) )

where P_x is the empirical distribution of X^n and P_{y_i} is shorthand for the distribution of X given Y = y_i. This relative entropy form may be helpful for applying some known theorems; however, I am still not entirely sure what to do with it. The current plan is to see if I can use this to calculate a lower bound on the probability of a given key difference, using a simple Bernoulli distribution to model P(X | Y = y_i, Z = y_i) and P(X | Y = y_j, Z = y_i, j != i). Hopefully there will be interesting results.
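To get a feel for the two forms of the posterior before attempting the lower bound, here is a Python sketch under the planned Bernoulli model. Every number in it (n, k, p1, p0, the candidate index i) is a made-up placeholder rather than measured data; the point is only to check that the Bayes-rule posterior and the relative-entropy exponent agree.

```python
import numpy as np

K = 16             # candidate key differences y_0 .. y_15
i = 3              # candidate the samples test: every sample has Z = y_i
n, k = 200, 120    # made-up counts: k "slow" timings out of n samples
p1, p0 = 0.6, 0.5  # hypothetical Bernoulli parameters for
                   # P(X | Y = y_i, Z = y_i) and P(X | Y = y_j, Z = y_i)

def loglik2(p):
    # log2 P(X^n | slow-probability p) for k slow samples out of n
    return k * np.log2(p) + (n - k) * np.log2(1 - p)

# Bayes' rule with a uniform prior: the data look identical under every
# hypothesis y_j != y_i, so those 15 candidates share one likelihood.
logL = np.full(K, loglik2(p0))
logL[i] = loglik2(p1)
post = 2.0 ** (logL - logL.max())
post /= post.sum()

def D(p, q):
    # binary relative entropy D(Ber(p) || Ber(q)) in bits
    return p * np.log2(p / q) + (1 - p) * np.log2((1 - p) / (1 - q))

# Relative-entropy form: with phat the empirical distribution of X^n,
# the likelihood ratio is 2^(-n (D(phat||p1) - D(phat||p0))).
phat = k / n
ratio = 2.0 ** (-n * (D(phat, p1) - D(phat, p0)))

# The method-of-types identity says the two forms agree exactly.
assert np.isclose(ratio, 2.0 ** (loglik2(p1) - loglik2(p0)))
print(post[i], ratio)
```

With these placeholder numbers the matching candidate ends up with most of the posterior mass, and the relative-entropy exponent reproduces the likelihood ratio, which is the consistency check I wanted.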

Question for this week, which I got from our group meeting:

You are in a room with 3 light switches. Each of these switches controls one of three lights in an adjacent room. From your room, you can't see which switch controls which light. You are allowed to flip some combination of switches and then walk into the adjacent room to see which lights are on. How can you determine which switch controls which light while making only one trip to the adjacent room?

1 comment:

  1. The computation sounds intense, but thanks for simplifying it down!

Answer to the teaser: Turn switch #1 on for ~30 mins, then turn it off and turn switch #2 on. Go into the adjacent room; switch #2 controls the light that is on, switch #1 controls the warm (but off) light bulb, and switch #3 controls the cool light bulb. Physics class FTW :)
