Defining some terms in quantum information theory
Taken from the Wikipedia article.
a "bit" is the basic unit of information. In computing, a bit can be defined as a variable that can have only two possible values. In information theory, one bit is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability or the information that is gained when the value of such a variable becomes known. If the value of a bit is completely predictable, the reading of that value provides no information and has 0 entropy
The number of bits needed to represent the result of an uncertain event is given by its entropy. In information theory, entropy is a measure of the amount of uncertainty associated with the value of a random variable X. In this context, the Shannon entropy is the average unpredictability of a random variable, which is equivalent to its information content.
Entropy can be thought of as the number of "bits" of information a variable carries. If a variable can give one bit of information, it has an entropy of one (a single toss of a fair coin has an entropy of one bit). So, to determine the entropy, you have to figure out how much information can be gotten from the variable. As uncertainty decreases, entropy decreases. When Google makes a guess at what you're searching for, it is weighing all the possible things you could be typing and then making an "educated" guess using probability to determine what is most likely. With fewer possibilities, the search has lower entropy; with more possibilities, it has higher entropy.
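As a concrete check on this, here is a minimal Python sketch using the standard formula H(X) = -Σ p(x) log₂ p(x) (the function name is just illustrative): a fair coin comes out at exactly one bit, and a completely predictable outcome at zero.

```python
# A minimal sketch of Shannon entropy, assuming the standard formula
# H(X) = -sum_x p(x) * log2(p(x)).
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution given as probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0   -- a fair coin: one full bit of uncertainty
print(shannon_entropy([0.9, 0.1]))  # ~0.47 -- a biased coin is more predictable
print(shannon_entropy([1.0]))       # 0.0   -- a predictable value carries no information
```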
Noisy-channel coding theorem: reliable communication is possible over a noisy channel as long as the rate of communication is below a threshold called the "channel capacity". Channel coding is about finding codes that can transmit over a noisy channel with as little error as possible (you want to be coding at a rate near the channel capacity).
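For a concrete textbook example (not something from the article itself): a binary symmetric channel that flips each transmitted bit with probability p has capacity C = 1 - H(p), where H is the binary entropy. A quick sketch:

```python
# A sketch of the capacity of a binary symmetric channel (BSC), assuming the
# standard result C = 1 - H(p), where p is the probability of a bit flip.
import math

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob):
    return 1.0 - binary_entropy(flip_prob)

print(bsc_capacity(0.0))  # 1.0   -- a noiseless channel carries one bit per use
print(bsc_capacity(0.1))  # ~0.53 -- reliable communication is possible below this rate
print(bsc_capacity(0.5))  # 0.0   -- pure noise: nothing gets through reliably
```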
Coding theory attempts to create codes that reduce the error rate on a noisy channel, aiming for the lowest error rate achievable on that given channel.
The source coding theorem holds that it is impossible to compress data such that the code rate (average number of bits per symbol) is less than the Shannon entropy of the source without losing information.
You are trying to encode the data with as few bits as possible without losing any of the information you hope to communicate, and the source coding theorem tells you how close to that lower bound you can get.
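A rough way to see this in action (using Python's zlib as a stand-in for a good source code, which is only an approximation of the ideal codes the theorem talks about): compute the per-symbol entropy of some data and compare it with the compressed size.

```python
# A rough sketch comparing the entropy of a source with what a general-purpose
# compressor achieves. zlib is only a stand-in here: the source coding theorem
# is about ideal codes, not any particular compressor.
import math
import random
import zlib
from collections import Counter

def entropy_bits_per_symbol(data):
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
# A heavily biased source (mostly one symbol) versus a uniformly random one.
biased = bytes(random.choices(b"ab", weights=[95, 5], k=10_000))
uniform = bytes(random.getrandbits(8) for _ in range(10_000))

for name, data in [("biased", biased), ("uniform", uniform)]:
    compressed_bits = 8 * len(zlib.compress(data, 9))
    print(name,
          "entropy %.3f bits/symbol," % entropy_bits_per_symbol(data),
          "compressed to %.3f bits/symbol" % (compressed_bits / len(data)))
# The uniform source sits near 8 bits/symbol and barely compresses at all;
# the biased source has much lower entropy and compresses accordingly.
```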
Lossless data compression: the data has to be reconstructed exactly.
Lossy data compression: the data only needs to be reconstructed within a specified bound; it doesn't have to be exact.
Error-correcting codes (channel coding): add some redundancy so that the data can be transmitted efficiently and faithfully across a noisy channel.
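A minimal sketch of the channel-coding idea, using the 3-bit repetition code (each bit is sent three times and decoded by majority vote). Real codes use far less redundancy; this is just the simplest possible example.

```python
# A minimal sketch of channel coding: the 3-bit repetition code.
# Each bit is sent three times and decoded by majority vote, so any single
# flip within a block is corrected.
import random

def encode(bits):
    return [b for b in bits for _ in range(3)]

def noisy_channel(bits, flip_prob, rng):
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

def decode(bits):
    blocks = [bits[i:i + 3] for i in range(0, len(bits), 3)]
    return [1 if sum(block) >= 2 else 0 for block in blocks]

rng = random.Random(42)
message = [rng.randint(0, 1) for _ in range(1000)]
received = decode(noisy_channel(encode(message), flip_prob=0.05, rng=rng))
errors = sum(m != r for m, r in zip(message, received))
print("residual errors:", errors, "out of", len(message))
# With a 5% flip rate, the per-bit error rate drops from 0.05 to roughly
# 3 * 0.05**2 ≈ 0.007, at the cost of tripling the number of transmitted bits.
```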
Joint entropy is the entropy of two discrete random variables taken together (the total amount of uncertainty associated with the pair of values). If they are independent, their joint entropy is the sum of their individual entropies.
Mutual information measures the amount of information that can be obtained about one random variable by observing another.
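A small sketch computing both quantities from a joint distribution, using the identity I(X;Y) = H(X) + H(Y) - H(X,Y); the joint distribution here is made up purely for illustration.

```python
# A sketch of joint entropy and mutual information for two binary variables,
# using H(X,Y) = -sum p(x,y) log2 p(x,y) and I(X;Y) = H(X) + H(Y) - H(X,Y).
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# p(x, y) for x, y in {0, 1}: X and Y tend to agree, so they share information.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

p_x = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
p_y = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

h_joint = entropy(joint.values())
mutual_info = entropy(p_x) + entropy(p_y) - h_joint

print("H(X,Y) =", h_joint)      # ~1.722 bits
print("I(X;Y) =", mutual_info)  # ~0.278 bits
# If X and Y were independent, H(X,Y) would equal H(X) + H(Y) = 2 bits here,
# and the mutual information would be exactly 0.
```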
Quantum states are an algorithm for assigning probabilities to what might result from a measurement (see http://www.informationphilosopher.com/presentations/Milan/papers/Mohrhoff_on_Stapp.pdf).
Quantum states are not actual states of affairs but probabilities about possible states of affairs
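A tiny sketch of that "probability algorithm" reading for a single qubit, using the Born rule (outcome probability = squared magnitude of the amplitude); the particular state is just an example.

```python
# A tiny sketch of a quantum state as a rule for assigning probabilities:
# for one qubit, the Born rule gives P(outcome) = |amplitude|^2.
# The particular state below is chosen only for illustration.
import math

# |psi> = sqrt(0.8)|0> + sqrt(0.2)|1>, written as a list of complex amplitudes.
psi = [complex(math.sqrt(0.8)), complex(math.sqrt(0.2))]

probabilities = [abs(a) ** 2 for a in psi]
print("P(measure 0) =", probabilities[0])  # ~0.8
print("P(measure 1) =", probabilities[1])  # ~0.2
print("total =", sum(probabilities))       # ~1.0 -- a valid state is normalized

# The state itself does not say the qubit "is" 0 or 1; it only fixes the
# probabilities of what a measurement might yield.
```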