Classical information theory answers fundamental questions about the storage and transmission of data. Suppose the goal is to transmit events that occur according to some specified probability distribution. How much information is inherent in the outcome of each event? How can the events be encoded (into bits, say) for efficient transmission of this information? Intuitively, a more likely outcome is less surprising and less informative, and so requires less storage. In this talk, we'll introduce the fundamental quantity of entropy (and its variations), which provides a foundation for precise answers to these and related questions. We illustrate its use in the analysis of data encoding and transmission. We also set the stage for a comparison of classical information sources with quantum sources.
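As a minimal sketch of the quantity the talk introduces: the Shannon entropy of a distribution measures, in bits, the average information per event, and it bounds how compactly outcomes can be encoded. (The function name and example distributions below are illustrative, not from the talk.)

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing, by the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally surprising: 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is less surprising, so it carries less information
# per flip (about 0.47 bits) and admits a more compact encoding.
print(shannon_entropy([0.9, 0.1]))
```

This matches the intuition in the abstract: the more predictable the source, the lower its entropy and the fewer bits per event an optimal code needs.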
The seminar will be held in hybrid mode. Join the Zoom meeting at https://cuboulder.zoom.us/j/94002553301 (passcode: 790356).
Category theory studies things and the relationships between those things. In many settings, however, there are also relationships between those relationships: for example, homotopies between continuous maps between spaces, or natural transformations between functors between categories. This data is naturally described in the language of higher categories. In this talk, I aim to motivate higher category theory by explaining its relationship to homotopy theory, and to provide some examples of results from higher category theory.