
What Is Distributed Source Coding?

By Jean Marie Asta
Updated Feb 02, 2024

In communication and information theory, distributed source coding (DSC) addresses the compression of multiple correlated information sources that cannot communicate with one another. In video coding, DSC enables schemes that swap the traditional complexity balance between encoders and decoders, representing a conceptual shift in video processing. Because the correlation among sources can be modeled with channel codes at the decoder side, DSC can shift computational complexity from the encoder to the decoder. This makes it an appropriate framework for applications with a complexity-constrained sender, such as a sensor network or low-power video compression.

In 1973, David Slepian and Jack K. Wolf proposed a theoretical bound for the lossless compression of correlated information sources, now called the Slepian-Wolf theorem, or bound. The bound is stated in terms of entropy. Among their results, they showed that two separate, isolated sources can compress their data as efficiently as if the two sources communicated directly with each other. In 1975, Thomas M. Cover extended this theorem to the case of more than two sources.

In distributed source coding, multiple dependent sources are encoded separately and decoded jointly. The Slepian-Wolf theorem, which represents these sources as two random variables, assumes that two separate but correlated signals come from sources that do not communicate with one another. Each signal is compressed by its own encoder, and both are transmitted to a receiver, whose decoder performs joint decoding of the two streams. The theorem characterizes the rates at which the receiver's probability of a decoding error can be made to approach zero; for the combined rate, that limit is the joint entropy of the sources. As Slepian and Wolf proved in 1973, even when the correlated signals are encoded separately, a combined rate equal to the joint entropy is sufficient.
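To make the bound concrete, here is a small sketch in Python. It uses an assumed toy model, not one from the article: X is a fair bit, and Y equals X flipped with an illustrative crossover probability of 0.1. It compares the Slepian-Wolf sum-rate bound H(X, Y) against the two bits per pair needed when each source is compressed independently.

```python
from math import log2

# Toy correlated pair (assumed example): X is a fair bit, and Y equals
# X flipped with crossover probability p. p = 0.1 is an arbitrary
# illustrative value.
p = 0.1

def h2(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

H_X = 1.0                 # entropy of a fair bit
H_Y_given_X = h2(p)       # uncertainty left in Y once X is known
H_XY = H_X + H_Y_given_X  # joint entropy via the chain rule

# Compressing each source on its own costs H(X) + H(Y) = 2 bits per
# pair, while the Slepian-Wolf bound on the combined rate is H(X, Y).
print(f"H(X,Y) = {H_XY:.3f} bits/pair vs. 2.000 bits/pair separately")
```

With p = 0.1, the joint entropy is about 1.469 bits per pair, so separate encoders that are jointly decoded can, in principle, save roughly half a bit per pair compared with fully independent compression.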

Though the theorem postulates that this is theoretically achievable in distributed source coding, the limits of the theory long went unrealized, and were not even closely approached, in practical applications. Two other researchers, Pradhan and Ramchandran, worked on reaching this theoretical limit and demonstrating the practicality of the Slepian-Wolf theorem. They did so by providing a particular constructive solution for the case of two encoded signals separated by a maximum distance.
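A classic way to make such a construction concrete is syndrome-based coding: the encoder transmits only the syndrome of its block under a channel code, and the decoder resolves the remaining ambiguity using the correlated side information. The sketch below is a toy illustration of that general idea using the (7,4) Hamming code, under the assumed correlation model that the encoder's block and the decoder's side information differ in at most one bit; it is a simplified teaching example, not the specific scheme from Pradhan and Ramchandran's work.

```python
# Toy syndrome-based distributed coding sketch. Assumed model: the
# encoder's 7-bit block x and the decoder's side information y differ
# in at most one bit position.

H = [
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]  # parity-check matrix of the (7,4) Hamming code

def syndrome(bits):
    """3-bit syndrome H * bits^T (mod 2)."""
    return tuple(sum(h * b for h, b in zip(row, bits)) % 2 for row in H)

def encode(x):
    """Encoder sends only the 3-bit syndrome instead of all 7 bits."""
    return syndrome(x)

def decode(s_x, y):
    """Decoder combines the received syndrome with side information y."""
    s_y = syndrome(y)
    # By linearity, s_x XOR s_y is the syndrome of the difference x XOR y.
    diff = tuple(a ^ b for a, b in zip(s_x, s_y))
    if diff == (0, 0, 0):
        return list(y)  # x and y agree
    # For a single-bit difference, the syndrome value (read as a binary
    # number) is the 1-based index of the differing position.
    pos = diff[0] * 4 + diff[1] * 2 + diff[2] - 1
    x_hat = list(y)
    x_hat[pos] ^= 1
    return x_hat

x = [1, 0, 1, 1, 0, 0, 1]  # source block at the encoder
y = [1, 0, 1, 0, 0, 0, 1]  # side information at the decoder, 1 bit off
assert decode(encode(x), y) == x  # 3 transmitted bits recover all 7
```

In this toy setting the encoder sends 3 bits instead of 7; the saving comes entirely from the side information exploited at the decoder, mirroring how distributed source coding moves complexity from the encoder to the decoder.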
