>[!warning]
>This content has not been peer reviewed.
# The capacity of a noisy channel
**Shannon (1948):** Information can be transmitted over a noisy channel with arbitrarily small error probability at any rate below the capacity $C = B \log_2(1 + S/N)$, where $B$ is bandwidth, $S$ is signal power, and $N$ is noise power; above $C$, errors cannot be made arbitrarily rare. This is a fundamental limit of information theory and has been proven rigorously (Shannon 1948; Cover & Thomas 2006).
---
## What it is
- **Shannon capacity:** $C = B \log_2(1 + \mathrm{SNR})$. The signal-vs-noise trade-off is exact: for a given bandwidth, more noise means a lower achievable rate of reliable throughput.
- **Entropy and resolution:** Shannon entropy $\sigma = -\sum p_j \log_2 p_j$ measures the effective bit-count of a distribution; channel capacity is the maximum *rate* of bits that can be sent and recovered in the presence of noise.
- **Implication:** There is no “free” signal on a noisy medium; every bit costs a share of the finite capacity.
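The capacity and entropy formulas above can be sketched numerically. This is a minimal illustration; the function names are mine, not from the source:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr: float) -> float:
    """Channel capacity C = B * log2(1 + S/N), in bits per second (SNR is linear, not dB)."""
    return bandwidth_hz * math.log2(1 + snr)

def shannon_entropy(probs: list[float]) -> float:
    """Entropy sigma = -sum p_j log2 p_j, in bits (terms with p_j = 0 contribute 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A 1 MHz channel at linear SNR = 3 carries at most 2 Mbit/s:
print(shannon_capacity(1e6, 3))        # 2000000.0

# A fair coin carries exactly 1 bit per symbol:
print(shannon_entropy([0.5, 0.5]))     # 1.0
```

Note that capacity is a *rate* (bits per second), while entropy is a per-symbol bit-count; the two meet when each transmitted symbol is drawn from the capacity-achieving distribution.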
---
## How RRT/RST uses it
- **A2 (Resolution), [[Resource Triangle]]:** The substrate is a noisy channel. Signal $\Omega$ and noise $N$ share a finite budget $W$, so the throughput of structure (resolution) is limited by the same trade-off: $\mu = \Omega/W$ and $\eta = \Omega/N$ play the role of signal-to-noise, and the fidelity $\mu(\eta,n)$ is the substrate’s “reliable rendering” curve.
- **[[Fidelity Derivation]]:** The budget identity $\mu^n + \nu^n = 1$ and the form $\mu = \eta/(1+\eta^n)^{1/n}$ encode how much “signal” can be maintained for a given noise floor — the same conceptual limit as Shannon’s, in a scale-free (power-law) setting. So the triangle is the **relational version** of the noisy-channel limit.
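As a numerical check of the budget identity: with $\mu = \Omega/W$, $\nu = N/W$, and $W^n = \Omega^n + N^n$, dividing through by $N$ gives $\mu = \eta/(1+\eta^n)^{1/n}$ and $\nu = 1/(1+\eta^n)^{1/n}$, so $\mu^n + \nu^n = 1$ for any $\eta > 0$. A short sketch under those definitions:

```python
def fidelity(eta: float, n: float) -> float:
    """mu(eta, n) = eta / (1 + eta**n)**(1/n): signal share of the budget W."""
    return eta / (1 + eta**n) ** (1 / n)

def noise_share(eta: float, n: float) -> float:
    """nu(eta, n) = 1 / (1 + eta**n)**(1/n): noise share of the budget W."""
    return 1 / (1 + eta**n) ** (1 / n)

# The identity mu^n + nu^n = 1 holds across signal-to-noise ratios and exponents:
for eta in (0.1, 1.0, 10.0):
    for n in (1, 2, 3):
        mu, nu = fidelity(eta, n), noise_share(eta, n)
        assert abs(mu**n + nu**n - 1) < 1e-12
```

At $\eta = 1$ (signal power equal to noise power), $\mu = \nu = 2^{-1/n}$, so the budget is split evenly, as the triangle picture suggests.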
---
## Links
| Concept | Note |
|:---|:---|
| Information, entropy, resolution in RST | **[[expanded theory applied/foundation/Information and Entropy/Information and Entropy (RST)]]** |
| Budget $W^n = \Omega^n + N^n$ | **[[Resource Triangle]]** |
| Rendering quality $\mu(\eta,n)$ | **[[Fidelity]]** |
---
## References
- Shannon, C. E. (1948). *A mathematical theory of communication.* Bell Syst. Tech. J. **27**, 379–423, 623–656. [DOI](https://doi.org/10.1002/j.1538-7305.1948.tb01338.x)
- Cover, T. M. & Thomas, J. A. (2006). *Elements of Information Theory*, 2nd ed. Wiley, Hoboken. (Standard reference for capacity theorems and proofs.)