ECE287A: References

Professor Young-Han Kim, UCSD, Spring Quarter 2008–09

We won’t attempt complete coverage of the literature; instead, we will sample a few key papers and review articles.

Introduction

  1. T. C. Hu, “Multi-commodity network flows,” Operations Research, vol. 11, no. 3, pp. 344–360, 1963. (pdf)

  2. L. R. Ford, Jr. and D. R. Fulkerson, “Maximal flow through a network,” Canad. J. Math., vol. 8, pp. 399–404, 1956. (link)

  3. P. Elias, A. Feinstein, and C. E. Shannon, “A note on the maximum flow through a network,” IRE Trans. Inf. Theory, vol. IT-2, no. 4, pp. 117–119, Dec. 1956. (pdf)

  4. C. E. Shannon, “A mathematical theory of communication,” Bell System Tech. J., vol. 27, pp. 379–423, 623–656, 1948. (pdf)

  5. C. E. Shannon, “Coding theorems for a discrete source with a fidelity criterion,” in IRE Int. Conv. Rec., part 4, 1959, vol. 7, pp. 142–163, reprinted with changes in Information and Decision Processes, R. E. Machol, Ed. New York: McGraw-Hill, 1960, pp. 93–126.

  6. C. E. Shannon, “Two-way communication channels,” in Proc. 4th Berkeley Sympos. Math. Statist. Prob.   Berkeley, Calif.: Univ. California Press, 1961, vol. I, pp. 611–644.

Preliminaries

  1. A. D. Wyner and J. Ziv, “A theorem on the entropy of certain binary sequences and applications — I,” IEEE Trans. Inf. Theory, vol. IT-19, pp. 769–772, 1973. (pdf)

  2. H. S. Witsenhausen, “Entropy inequalities for discrete channels,” IEEE Trans. Inf. Theory, vol. IT-20, pp. 610–616, 1974. (pdf)

  3. H. S. Witsenhausen and A. D. Wyner, “A conditional entropy bound for a pair of discrete random variables,” IEEE Trans. Inf. Theory, vol. IT-21, no. 5, pp. 493–501, 1975.

  4. S. Shamai and A. D. Wyner, “A binary analog to the entropy-power inequality,” IEEE Trans. Inf. Theory, vol. IT-36, no. 6, pp. 1428–1430, 1990. (pdf)

  5. R. Bellman, Introduction to Matrix Analysis, 2nd ed.   New York: McGraw-Hill, 1970.

  6. A. W. Marshall and I. Olkin, “A convexity proof of Hadamard’s inequality,” Amer. Math. Monthly, vol. 89, no. 9, pp. 687–688, 1982.

  7. C. E. Shannon, “A mathematical theory of communication,” Bell System Tech. J., vol. 27, pp. 379–423, 623–656, 1948. (pdf)

  8. A. J. Stam, “Some inequalities satisfied by the quantities of information of Fisher and Shannon,” Inf. Control, vol. 2, pp. 101–112, 1959. (pdf)

  9. N. M. Blachman, “The convolution inequality for entropy powers,” IEEE Trans. Inf. Theory, vol. IT-11, pp. 267–271, 1965. (pdf)

  10. M. S. Pinsker, Information and Information Stability of Random Variables and Processes.   San Francisco: Holden-Day, 1964.

  11. R. M. Gray, Entropy and Information Theory.   New York: Springer, 1990.

  12. I. Csiszár and J. Körner, Information Theory, 3rd ed.   Budapest: Akadémiai Kiadó, 1981.

  13. A. Orlitsky and J. R. Roche, “Coding for computing,” IEEE Trans. Inf. Theory, vol. IT-47, no. 3, pp. 903–917, 2001.

  14. S.-Y. Tung, “Multiterminal source coding,” Ph.D. Thesis, Cornell University, Ithaca, NY, 1978.

Point-to-point information theory

  1. C. E. Shannon, “A mathematical theory of communication,” Bell System Tech. J., vol. 27, pp. 379–423, 623–656, 1948. (pdf)

  2. A. Feinstein, “A new basic theorem of information theory,” IRE Trans. Inf. Theory, vol. IT-4, pp. 2–22, 1954. (pdf)

  3. R. G. Gallager, “A simple derivation of the coding theorem and some applications,” IEEE Trans. Inf. Theory, vol. IT-11, pp. 3–18, 1965. (pdf)

  4. R. G. Gallager, Information Theory and Reliable Communication.   New York: Wiley, 1968.

  5. R. G. Gallager, Low-Density Parity-Check Codes.   Cambridge, MA: MIT Press, 1963.

  6. T. Richardson and R. Urbanke, Modern Coding Theory.   New York: Cambridge University Press, 2008.

  7. T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed.   New York: Wiley, 2006.

  8. R. J. McEliece, The Theory of Information and Coding.   Reading, MA: Addison-Wesley, 1977.

  9. A. D. Wyner, “The capacity of the band-limited Gaussian channel,” Bell System Tech. J., vol. 45, pp. 359–395, Mar. 1966. (pdf)

  10. M. J. E. Golay, “Note on the theoretical efficiency of information reception with PPM,” Proc. IRE, vol. 37, no. 9, p. 1031, Sept. 1949. (pdf)

  11. C. E. Shannon, “Coding theorems for a discrete source with a fidelity criterion,” in IRE Int. Conv. Rec., part 4, 1959, vol. 7, pp. 142–163, reprinted with changes in Information and Decision Processes, R. E. Machol, Ed. New York: McGraw-Hill, 1960, pp. 93–126.

  12. T. Berger, “Rate distortion theory for sources with abstract alphabets and memory,” Inf. Control, vol. 13, pp. 254–273, 1968.

  13. J. G. Dunham, “A note on the abstract-alphabet block source coding with a fidelity criterion theorem,” IEEE Trans. Inf. Theory, vol. IT-24, no. 6, p. 760, 1978.

  14. J. A. Bucklew, “The source coding theorem via Sanov’s theorem,” IEEE Trans. Inf. Theory, vol. IT-33, no. 6, pp. 907–909, 1987.

  15. M. Gastpar, B. Rimoldi, and M. Vetterli, “To code, or not to code: lossy source-channel communication revisited,” IEEE Trans. Inf. Theory, vol. IT-49, no. 5, pp. 1147–1158, 2003. (pdf)

Multiple access channel

  1. R. Ahlswede, “Multiway communication channels,” in Proc. 2nd Int. Symp. Inf. Theory, Tsahkadsor, Armenian S.S.R., 1971, pp. 23–52. (pdf)

  2. H. H. J. Liao, “Multiple access channels,” Ph.D. Thesis, University of Hawaii, Honolulu, Sept. 1972. (pdf)

  3. G. Dueck, “Maximal error capacity regions are smaller than average error capacity regions for multi-user channels,” Probl. Control Inf. Theory, vol. 7, no. 1, pp. 11–19, 1978.

  4. T. M. Cover, “Some advances in broadcast channels,” in Advances in Communication Systems, A. J. Viterbi, Ed.   San Francisco: Academic Press, 1975, vol. 4, pp. 229–260.

  5. A. D. Wyner, “Recent results in the Shannon theory,” IEEE Trans. Inf. Theory, vol. IT-20, pp. 2–10, 1974. (pdf)

Distributed lossless source coding

  1. D. Slepian and J. K. Wolf, “Noiseless coding of correlated information sources,” IEEE Trans. Inf. Theory, vol. IT-19, pp. 471–480, 1973. (pdf)

  2. T. M. Cover, “A proof of the data compression theorem of Slepian and Wolf for ergodic sources,” IEEE Trans. Inf. Theory, vol. IT-21, no. 2, pp. 226–228, Mar. 1975. (pdf)

Broadcast channel

  1. J. Körner and K. Marton, “General broadcast channels with degraded message sets,” IEEE Trans. Inf. Theory, vol. IT-23, no. 1, pp. 60–64, 1977. (pdf)

  2. T. M. Cover, “Broadcast channels,” IEEE Trans. Inf. Theory, vol. IT-18, no. 1, pp. 2–14, Jan. 1972. (pdf)

  3. R. G. Gallager, “Capacity and coding for degraded broadcast channels,” Probl. Inf. Transm., vol. 10, no. 3, pp. 3–14, 1974. (pdf)

  4. P. P. Bergmans, “Random coding theorem for broadcast channels with degraded components,” IEEE Trans. Inf. Theory, vol. IT-19, no. 2, pp. 197–207, 1973. (pdf)

  5. P. P. Bergmans, “A simple converse for broadcast channels with additive white Gaussian noise,” IEEE Trans. Inf. Theory, vol. IT-20, pp. 279–280, 1974. (pdf)

  6. J. Körner and K. Marton, “Comparison of two noisy channels,” in Topics in Information Theory (Second Colloq., Keszthely, 1975).   Amsterdam: North-Holland, 1977, pp. 411–423.

  7. A. El Gamal, “The capacity of a class of broadcast channels,” IEEE Trans. Inf. Theory, vol. IT-25, no. 2, pp. 166–169, 1979. (pdf)

  8. C. Nair, “Capacity regions of two new classes of 2-receiver broadcast channels,” 2009. http://arxiv.org/abs/0901.0595

  9. G. S. Poltyrev, “The capacity of parallel broadcast channels with degraded components,” Probl. Inf. Transm., vol. 13, no. 2, pp. 23–35, 1977.

  10. G. S. Poltyrev, “Capacity for a sum of broadcast channels,” Probl. Inf. Transm., vol. 15, no. 2, pp. 40–44, 1979.

  11. A. El Gamal, “Capacity of the product and sum of two unmatched broadcast channels,” Probl. Inf. Transm., vol. 16, no. 1, pp. 3–23, 1980.

  12. T. M. Cover, “An achievable rate region for the broadcast channel,” IEEE Trans. Inf. Theory, vol. IT-21, pp. 399–404, 1975. (pdf)

  13. E. C. van der Meulen, “Random coding theorems for the general discrete memoryless broadcast channel,” IEEE Trans. Inf. Theory, vol. IT-21, pp. 180–190, 1975. (pdf)

  14. K. Marton, “A coding theorem for the discrete memoryless broadcast channel,” IEEE Trans. Inf. Theory, vol. IT-25, no. 3, pp. 306–311, 1979. (pdf)

  15. A. El Gamal and E. C. van der Meulen, “A proof of Marton’s coding theorem for the discrete memoryless broadcast channel,” IEEE Trans. Inf. Theory, vol. IT-27, no. 1, pp. 120–122, Jan. 1981. (pdf)

  16. S. I. Gelfand and M. S. Pinsker, “Capacity of a broadcast channel with one deterministic component,” Probl. Inf. Transm., vol. 16, no. 1, pp. 24–34, 1980.

  17. E. C. van der Meulen, “A survey of multi-way channels in information theory: 1961–1976,” IEEE Trans. Inf. Theory, vol. IT-23, no. 1, pp. 1–37, 1977.

  18. S. I. Gelfand, “Capacity of one broadcast channel,” Probl. Inf. Transm., vol. 13, no. 3, pp. 106–108, 1977.

  19. H. Weingarten, Y. Steinberg, and S. Shamai, “The capacity region of the Gaussian multiple-input multiple-output broadcast channel,” IEEE Trans. Inf. Theory, vol. IT-52, no. 9, pp. 3936–3964, Sept. 2006. (pdf)

  20. M. Mohseni and J. M. Cioffi, “A proof of the converse for the capacity of Gaussian MIMO broadcast channels,” submitted to IEEE Trans. Inf. Theory, 2006. (pdf)

  21. Y. Liang, G. Kramer, and H. V. Poor, “Equivalence of two inner bounds on the capacity region of the broadcast channel,” in Proc. 46th Annual Allerton Conference on Communications, Control, and Computing, Monticello, IL, Sept. 2008.

  22. C. Nair and A. El Gamal, “An outer bound to the capacity region of the broadcast channel,” IEEE Trans. Inf. Theory, vol. IT-53, no. 1, pp. 350–355, Jan. 2007. (pdf)

  23. H. Satô, “An outer bound to the capacity region of broadcast channels,” IEEE Trans. Inf. Theory, vol. IT-24, no. 3, pp. 374–377, 1978. (pdf)

  24. C. Nair and V. W. Zizhou, “On the inner and outer bounds for 2-receiver discrete memoryless broadcast channels,” in Proc. ITA Workshop, La Jolla, CA, 2008. http://arxiv.org/abs/0804.3825

  25. Y. Liang, G. Kramer, and S. Shamai, “Capacity outer bounds for broadcast channels,” in Proc. Information Theory Workshop, Porto, Portugal, May 2008, pp. 2–4.

  26. A. A. Gohari and V. Anantharam, “An outer bound to the admissible source region of broadcast channels with arbitrarily correlated sources and channel variations,” in Proc. 46th Annual Allerton Conference on Communications, Control, and Computing, Monticello, IL, Sept. 2008.

  27. C. Nair and A. El Gamal, “The capacity region of a class of 3-receiver broadcast channels with degraded message sets,” 2008.

  28. S. Borade, L. Zheng, and M. Trott, “Multilevel broadcast networks,” in Proc. IEEE International Symposium on Information Theory, Nice, France, June 2007, pp. 1151–1155.

  29. R. F. Ahlswede and J. Körner, “Source coding with side information and a converse for degraded broadcast channels,” IEEE Trans. Inf. Theory, vol. IT-21, no. 6, pp. 629–637, 1975. (pdf)

  30. I. Csiszár and J. Körner, “Broadcast channels with confidential messages,” IEEE Trans. Inf. Theory, vol. IT-24, no. 3, pp. 339–348, 1978. (pdf)

Interference channel

  1. H. Satô, “Two-user communication channels,” IEEE Trans. Inf. Theory, vol. IT-23, no. 3, pp. 295–304, 1977. (pdf)

  2. A. B. Carleial, “A case where interference does not reduce capacity,” IEEE Trans. Inf. Theory, vol. IT-21, no. 5, pp. 569–570, 1975. (pdf)

  3. H. Satô, “The capacity of the Gaussian interference channel under strong interference,” IEEE Trans. Inf. Theory, vol. IT-27, no. 6, pp. 786–788, Nov. 1981. (pdf)

  4. M. H. M. Costa and A. El Gamal, “The capacity region of the discrete memoryless interference channel with strong interference,” IEEE Trans. Inf. Theory, vol. IT-33, no. 5, pp. 710–711, 1987. (pdf)

  5. X. Shang, G. Kramer, and B. Chen, “A new outer bound and the noisy-interference sum-rate capacity for Gaussian interference channels,” submitted to IEEE Trans. Inf. Theory, Dec. 2007.

  6. V. S. Annapureddy and V. V. Veeravalli, “Gaussian interference networks: Sum capacity in the low interference regime and new outer bounds on the capacity region,” submitted to IEEE Trans. Inf. Theory, Feb. 2008.

  7. A. S. Motahari and A. K. Khandani, “Capacity bounds for the Gaussian interference channel,” submitted to IEEE Trans. Inf. Theory, Jan. 2008.

  8. R. Etkin, D. Tse, and H. Wang, “Gaussian interference channel capacity to within one bit,” submitted to IEEE Trans. Inf. Theory, 2007. http://arxiv.org/abs/cs.IT/0702045

  9. T. S. Han and K. Kobayashi, “A new achievable rate region for the interference channel,” IEEE Trans. Inf. Theory, vol. IT-27, no. 1, pp. 49–60, 1981. (pdf)

  10. H.-F. Chong, M. Motani, H. K. Garg, and H. El Gamal, “On the Han–Kobayashi region for the interference channel,” IEEE Trans. Inf. Theory, vol. IT-54, no. 7, pp. 3188–3195, July 2008. (pdf)

  11. A. El Gamal and M. H. M. Costa, “The capacity region of a class of deterministic interference channels,” IEEE Trans. Inf. Theory, vol. IT-28, no. 2, pp. 343–346, 1982. (pdf)

  12. İ. E. Telatar and D. N. C. Tse, “Bounds on the capacity region of a class of interference channels,” in Proc. IEEE International Symposium on Information Theory, Nice, France, June 2007. (pdf)

  13. S. A. Jafar and S. Vishwanath, “Generalized degrees of freedom of the symmetric K-user Gaussian interference channel.” http://arxiv.org/abs/cs.IT/0608070

  14. G. Bresler and D. N. C. Tse, “The two-user Gaussian interference channel: A deterministic view,” Euro. Trans. Telecomm., vol. 19, no. 4, pp. 333–354, June 2008.

  15. G. M. Ziegler, Lectures on Polytopes.   New York: Springer-Verlag, 1995.

Channels with state

  1. D. Blackwell, L. Breiman, and A. J. Thomasian, “The capacity of a class of channels,” Ann. Math. Statist., vol. 30, pp. 1229–1241, 1959.

  2. I. Csiszár and J. Körner, Information Theory, 3rd ed.   Budapest: Akadémiai Kiadó, 1981.

  3. A. Lapidoth and P. Narayan, “Reliable communication under channel uncertainty,” IEEE Trans. Inf. Theory, vol. IT-44, no. 6, pp. 2148–2177, 1998.

  4. D. Blackwell, L. Breiman, and A. J. Thomasian, “The capacities of certain channel classes under random coding,” Ann. Math. Statist., vol. 31, pp. 558–567, 1960. (link)

  5. R. Ahlswede and J. Wolfowitz, “Correlated decoding for channels with arbitrarily varying channel probability functions,” Inf. Control, vol. 14, pp. 457–473, 1969.

  6. R. Ahlswede and J. Wolfowitz, “The capacity of a channel with arbitrarily varying channel probability functions and binary output alphabet,” Z. Wahrsch. Verw. Gebiete, vol. 15, pp. 186–194, 1970.

  7. R. Ahlswede, “Elimination of correlation in random codes for arbitrarily varying channels,” Probab. Theory Related Fields, vol. 44, no. 2, pp. 159–175, 1978. (link)

  8. J. Wolfowitz, Coding Theorems of Information Theory, 3rd ed.   Berlin: Springer-Verlag, 1978.

  9. I. Csiszár and P. Narayan, “The capacity of the arbitrarily varying channel revisited: positivity, constraints,” IEEE Trans. Inf. Theory, vol. IT-34, no. 2, pp. 181–193, 1988.

  10. I. Csiszár and J. Körner, “On the capacity of the arbitrarily varying channel for maximum probability of error,” Z. Wahrsch. Verw. Gebiete, vol. 57, no. 1, pp. 87–101, 1981.

  11. A. J. Goldsmith and P. P. Varaiya, “Capacity of fading channels with channel side information,” IEEE Trans. Inf. Theory, vol. IT-43, no. 6, pp. 1986–1992, 1997. (pdf)

  12. C. E. Shannon, “Channels with side information at the transmitter,” IBM J. Res. Develop., vol. 2, pp. 289–293, 1958. (pdf)

  13. A. V. Kuznetsov and B. S. Tsybakov, “Coding in a memory with defective cells,” Probl. Inf. Transm., vol. 10, no. 2, pp. 52–60, 1974.

  14. S. I. Gelfand and M. S. Pinsker, “Coding for channel with random parameters,” Probl. Control Inf. Theory, vol. 9, no. 1, pp. 19–31, 1980.

  15. C. Heegard and A. El Gamal, “On the capacity of computer memories with defects,” IEEE Trans. Inf. Theory, vol. IT-29, no. 5, pp. 731–739, 1983.

  16. C. Heegard, “Capacity and coding for computer memory with defects,” Ph.D. Thesis, Stanford University, Nov. 1981.

  17. M. H. M. Costa, “Writing on dirty paper,” IEEE Trans. Inf. Theory, vol. IT-29, no. 3, pp. 439–441, 1983.

  18. A. S. Cohen and A. Lapidoth, “The Gaussian watermarking game,” IEEE Trans. Inf. Theory, vol. IT-48, no. 6, pp. 1639–1667, 2002.

  19. W. Yu, A. Sutivong, D. J. Julian, T. M. Cover, and M. Chiang, “Writing on colored paper,” in Proc. IEEE International Symposium on Information Theory, Washington D.C., 2001, p. 302.

  20. G. Caire and S. Shamai, “On the capacity of some channels with channel state information,” IEEE Trans. Inf. Theory, vol. IT-45, no. 6, pp. 2007–2019, 1999.

  21. S. I. Gelfand and M. S. Pinsker, “On Gaussian channels with random parameters,” in Proc. Sixth International Symposium on Information Theory, part 1, Tashkent, USSR, 1984, pp. 247–250 (in Russian).

  22. Y.-H. Kim, A. Sutivong, and S. Sigurjónsson, “Multiple user writing on dirty paper,” in Proc. IEEE Int. Symp. Inf. Theory, Chicago, Illinois, June/July 2004, p. 534.

  23. Y. Steinberg, “Coding for the degraded broadcast channel with random parameters, with causal and noncausal side information,” IEEE Trans. Inf. Theory, vol. IT-51, no. 8, pp. 2867–2877, 2005.