Information theory is a mathematical framework for quantifying, storing, and communicating information, founded by Claude Shannon with his 1948 paper "A Mathematical Theory of Communication". Its central concepts include entropy, mutual information, and channel capacity, with applications ranging from data compression and error-correcting codes to cryptography and artificial intelligence. The field draws on mathematics, electrical engineering, and computer science.
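As a concrete illustration of the first of these concepts, Shannon entropy measures the average uncertainty of a source in bits, H(X) = Σ p(x) log₂(1/p(x)). The sketch below is a minimal, hedged example, not drawn from the text above; the function name and sample strings are invented for illustration.

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Empirical Shannon entropy of a sequence, in bits per symbol.

    Uses H = sum(p * log2(1/p)) over the observed symbol frequencies.
    """
    counts = Counter(data)
    n = len(data)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A fair coin carries 1 bit of uncertainty per flip; a constant
# source carries none; a biased source falls in between.
print(shannon_entropy("HTHT"))  # 1.0 bit
print(shannon_entropy("HHHH"))  # 0.0 bits
print(shannon_entropy("HHHT"))  # ~0.811 bits
```

The middle value also hints at the link to data compression: an optimal code for a source needs, on average, about H bits per symbol.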