CSAIL Research Abstracts - 2005

The Capacity of Information Networks

Nick Harvey, Robert Kleinberg & April Rasala Lehman

Introduction

How much information can be transmitted through a network? This fundamental question has recently received renewed attention. Information in a network is often viewed as an immutable commodity: data packets are modeled in the same way as cars in a highway system or water in a system of pipes. However, information differs fundamentally from traffic or a fluid in two respects. A car reaching an intersection can proceed along only one road, whereas information can be duplicated and transmitted across many channels in a network simultaneously. In addition, information can be manipulated in ways with no physical analogue. For example, if a network router receives bits x and y, it can transmit a function of the two bits, say x XOR y.

Does coding information in this way provide any benefit, in terms of the transmission rate? Ahlswede et al. [1] recently constructed a small example showing the surprising result that coding can in fact provide some benefit. How large can this benefit be in general? These questions have led to a new field of research called network coding. Our research seeks to characterize the benefit of network coding in general information networks.
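The small example of Ahlswede et al. [1] is the now-classic butterfly network: two bits must each reach two receivers, but one shared edge has room for only a single bit. Sending the XOR of the two bits on that edge lets both receivers decode both bits. The sketch below simulates this (an illustrative toy, not code from the paper under discussion):

```python
# Butterfly network sketch: sources emit bits x and y; the shared
# bottleneck edge carries x XOR y; each receiver also sees one bit
# directly via a side edge, and recovers the other by XOR-ing.

def butterfly(x, y):
    """Route bits x and y through the butterfly network with coding."""
    middle = x ^ y                 # the single coded bit on the bottleneck
    rx1 = (x, x ^ middle)          # receiver 1: sees x directly, derives y
    rx2 = (y ^ middle, y)          # receiver 2: sees y directly, derives x
    return rx1, rx2

# Every input pair is decoded correctly by both receivers.
for x in (0, 1):
    for y in (0, 1):
        rx1, rx2 = butterfly(x, y)
        assert rx1 == (x, y) and rx2 == (x, y)
```

Without coding, the bottleneck edge can forward only one of the two bits per use, so some receiver is starved; with the XOR, both unit-rate sessions succeed simultaneously.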

This work is currently under submission [2].

Approach

Understanding the capacity of information networks is a long-standing open problem. Two different fields have sought to understand the question of network capacity: combinatorial optimization and information theory. In combinatorial optimization, the capacity of a network is typically determined by analyzing packing problems (e.g., flows) that are constrained by the graph structure. This theory grew out of a need to understand the shipment of cargo in transportation networks and does not capture the subtleties of information transmission. On the other hand, information theory provides a deep understanding of complex communication problems over structurally simple channels but does not yet fully extend to arbitrary graph structures. Our approach is to combine combinatorial ideas with information theoretic ideas. The combination of these tools allows us to make significant progress on upper bounds for the capacity of information networks.

Results

Our main contribution is the strongest known upper bound on the capacity of an information network. This upper bound combines strong information inequalities derived from the network structure with inequalities based on characteristics of information (e.g., the submodularity of Shannon's entropy function). Assembling all of these inequalities into a linear program, we obtain a computable upper bound on the network capacity.
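One of the information inequalities mentioned above is the submodularity of Shannon entropy: for any sets of random variables A and B, H(A) + H(B) ≥ H(A ∪ B) + H(A ∩ B). A quick numerical check on a toy joint distribution (a hypothetical example, not drawn from the paper) illustrates the kind of constraint that enters the linear program:

```python
# Verify entropy submodularity, H(A) + H(B) >= H(A|B union) + H(A&B),
# for a random joint distribution over three binary variables.

import itertools
import math
import random

random.seed(0)
outcomes = list(itertools.product((0, 1), repeat=3))
weights = [random.random() for _ in outcomes]
total = sum(weights)
p = {o: w / total for o, w in zip(outcomes, weights)}

def H(subset):
    """Entropy (in bits) of the marginal on the variable indices in subset."""
    marginal = {}
    for o, pr in p.items():
        key = tuple(o[i] for i in sorted(subset))
        marginal[key] = marginal.get(key, 0.0) + pr
    return -sum(pr * math.log2(pr) for pr in marginal.values() if pr > 0)

A, B = {0, 1}, {1, 2}
lhs = H(A) + H(B)
rhs = H(A | B) + H(A & B)
assert lhs >= rhs - 1e-12   # submodularity holds for every distribution
```

In the actual bound, inequalities like this one (instantiated for edge sets of the network) are combined with network-structure constraints into one linear program whose optimum upper-bounds the capacity.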

A key ingredient of our upper bound is a new relation which we call informational dominance: an edge set A informationally dominates an edge set B if the information transmitted on A uniquely determines the information transmitted on B. We develop a combinatorial criterion (and a corresponding polynomial-time algorithm) for determining when A informationally dominates B. We believe that this combinatorial characterization of informational dominance is likely to be of independent interest.
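The definition of informational dominance can be checked directly for a fixed network code by enumeration: A dominates B exactly when the symbols carried on A determine the symbols carried on B across all source inputs. The brute-force sketch below illustrates the definition only; it is exponential in the number of sources, which is precisely what the paper's combinatorial criterion and polynomial-time algorithm avoid.

```python
# Brute-force check of informational dominance for a fixed code.
# edge_values: one dict per source input, mapping edge name -> symbol.

import itertools

def dominates(edge_values, A, B):
    """True iff the symbols on edges A determine the symbols on edges B."""
    seen = {}
    for assignment in edge_values:
        key = tuple(assignment[e] for e in A)
        val = tuple(assignment[e] for e in B)
        if seen.setdefault(key, val) != val:
            return False   # same symbols on A, different symbols on B
    return True

# Toy code on three edges: e2 carries the XOR of the bits on e0 and e1.
codes = [{"e0": x, "e1": y, "e2": x ^ y}
         for x, y in itertools.product((0, 1), repeat=2)]

assert dominates(codes, A=["e0", "e1"], B=["e2"])   # x, y determine x^y
assert not dominates(codes, A=["e2"], B=["e0"])     # x^y does not determine x
```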

We also consider the important special case of information networks where each data stream has exactly one sender and one receiver. Characterizing the capacity of such networks is called the k-pairs communication problem. This problem has a close relationship to the well-known multicommodity flow problem. Thus, it is interesting to compare the rate when network coding is used to the multicommodity flow rate and various bounds on this rate (e.g., sparsity). We prove the following inequalities.

  • Undirected networks: In all graphs, multicommodity flow rate ≤ network coding rate ≤ sparsity.
  • Directed networks: In some graphs, multicommodity flow rate ≤ sparsity < network coding rate.

For the case of undirected graphs, understanding whether these inequalities can be strict is an important question. It is known that for some graphs, the multicommodity flow rate can be strictly less than the sparsity. Thus, for some graphs, one of the two inequalities stated above must be strict. The undirected k-pairs conjecture claims that the multicommodity flow rate always equals the network coding rate. We make progress on this conjecture by using our general upper bound on the capacity of information networks. We prove that the conjecture holds for an infinite class of graphs for which there is a gap between the sparsity and the multicommodity flow rate. This result also gives the first proof of a gap between the sparsity of an undirected graph and the network coding rate. This result was obtained independently and concurrently by Jain et al. [3].

For the case of directed graphs, we construct a family of examples where the network coding rate is larger than the sparsity by a factor that is linear in the size of the graph. This shows that the network coding rate can be significantly larger than the multicommodity flow rate. We also obtain a matching upper bound, thereby proving that our construction is optimal.

References

[1] Rudolf Ahlswede, Ning Cai, Shuo-Yen Robert Li and Raymond W. Yeung. Network Information Flow. In IEEE Transactions on Information Theory, 46(4), pp. 1204-1216, 2000.

[2] Nicholas J. A. Harvey, Robert D. Kleinberg and April Rasala Lehman. On the Capacity of Information Networks. Submitted to the Special Issue of the IEEE Transactions on Information Theory and the IEEE/ACM Transactions on Networking, 2005.

[3] Kamal Jain, Philip A. Chou, Vijay V. Vazirani, Raymond Yeung and Gideon Yuval. On the Capacity of Multiple Unicast Sessions in Undirected Graphs. Submitted Manuscript, 2005.
