Information theory is a mathematical framework for quantifying, storing, and communicating information, established by Claude Shannon in the 1940s. It involves concepts such as entropy, mutual information, and channel capacity, with applications ranging from data compression to cryptography and artificial intelligence. The field combines insights from various disciplines including mathematics, engineering, and computer science.
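Entropy, the central quantity above, is defined as H = -Σ p(x) log2 p(x). As a minimal illustration (not tied to any project mentioned here), the following sketch computes the Shannon entropy of a byte string in bits per byte:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: H = -sum(p * log2(p))."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Repetition minimizes entropy; a uniform byte distribution maximizes it.
print(shannon_entropy(b"aaaaaaaa"))        # -> 0.0
print(shannon_entropy(bytes(range(256))))  # -> 8.0
```

The two extremes (0 bits for a constant string, 8 bits for a uniform byte distribution) bracket the compressibility of any byte stream.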
LLMc is a novel compression engine that uses large language models (LLMs) to achieve superior data compression through rank-based encoding. It reports higher compression ratios than traditional tools such as ZIP and LZMA, along with improvements to processing and decompression efficiency. The project is open-source and aims to encourage contributions from the research community.
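The core idea behind rank-based encoding is that if a model predicts each next token well, storing the *rank* of the actual token within the model's prediction yields a stream of mostly-small integers that compresses cheaply. The toy sketch below substitutes a bigram character model for the LLM (LLMc itself queries a real LLM's token distribution; the `BigramModel` class and its behavior are illustrative assumptions, not LLMc's implementation):

```python
from collections import defaultdict

class BigramModel:
    """Toy stand-in for an LLM: ranks candidate next characters by
    how often they followed the current character in the training text."""
    def __init__(self, corpus: str):
        self.counts = defaultdict(lambda: defaultdict(int))
        for a, b in zip(corpus, corpus[1:]):
            self.counts[a][b] += 1

    def ranking(self, context: str) -> list[str]:
        seen = self.counts[context]
        ordered = sorted(seen, key=lambda c: -seen[c])
        # Append unseen printable ASCII so every character has a rank.
        rest = sorted(set(map(chr, range(32, 127))) - set(ordered))
        return ordered + rest

def encode(model: BigramModel, text: str) -> list[int]:
    # Replace each character with its rank in the model's prediction;
    # a good model yields mostly small ranks, which entropy-code well.
    return [model.ranking(prev).index(ch)
            for prev, ch in zip(text, text[1:])]

def decode(model: BigramModel, first: str, ranks: list[int]) -> str:
    out = first
    for r in ranks:
        out += model.ranking(out[-1])[r]
    return out

corpus = "the theory of the thing"
model = BigramModel(corpus)
ranks = encode(model, corpus)
assert decode(model, corpus[0], ranks) == corpus
```

Because encoder and decoder share the same model, the rank stream plus the first character is lossless; the compression win comes from entropy-coding the skewed rank distribution afterward.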
Entropy triage is a novel method developed by MOXFIVE to repair files corrupted by failed ransomware encryption, using Shannon entropy to select usable data blocks. By automating the reconstruction process, the technique has achieved over 90% success in restoring virtual disks that standard decryptors cannot fix. However, it requires specialized skills, and the entropy-based selection limits the kinds of data it can recover.
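The selection step rests on a simple observation: properly encrypted blocks look like random noise (entropy near 8 bits/byte), while intact plaintext blocks score lower. A minimal sketch of that classification follows; the 512-byte block size and 7.5-bit threshold are illustrative assumptions, not MOXFIVE's actual parameters:

```python
import math
from collections import Counter

def block_entropy(block: bytes) -> float:
    """Shannon entropy of a block, in bits per byte."""
    if not block:
        return 0.0
    n = len(block)
    counts = Counter(block)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def triage(disk: bytes, block_size: int = 512, threshold: float = 7.5) -> list[int]:
    """Return offsets of blocks whose entropy is below the threshold.

    Near-maximal entropy suggests an encrypted (or natively compressed)
    block; lower-entropy blocks are candidates for reconstruction.
    """
    usable = []
    for off in range(0, len(disk), block_size):
        if block_entropy(disk[off:off + block_size]) < threshold:
            usable.append(off)
    return usable

# One plaintext-like block of zeros, one uniform (random-looking) block.
disk = bytes(512) + bytes(range(256)) * 2
print(triage(disk))  # -> [0]: only the low-entropy block is kept
```

This also makes the stated limitation concrete: data that is legitimately high-entropy before encryption (compressed archives, media files) is indistinguishable from ciphertext by this test alone.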