# Maximal correlation and monotonicity of free entropy.

@article{Dadoun2020MaximalCA, title={Maximal correlation and monotonicity of free entropy.}, author={Benjamin Dadoun and Pierre Youssef}, journal={arXiv: Operator Algebras}, year={2020} }

We introduce the maximal correlation coefficient $R(M_1,M_2)$ between two noncommutative probability subspaces $M_1$ and $M_2$ and show that the maximal correlation coefficient between the sub-algebras generated by $s_n:=x_1+\ldots+x_n$ and $s_m:=x_1+\ldots+x_m$ equals $\sqrt{m/n}$ for $m\le n$, where $(x_i)_{i\in \mathbb{N}}$ is a sequence of free and identically distributed noncommutative random variables. This is the free-probability analogue of a result by Dembo–Kagan–Shepp in classical probability.
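The value $\sqrt{m/n}$ mirrors the classical Dembo–Kagan–Shepp setting, where for i.i.d. square-integrable random variables the maximal correlation between the partial sums $s_m$ and $s_n$ coincides with their ordinary Pearson correlation. A minimal numpy sketch (sample size, seed, and the choice of standard Gaussians are arbitrary illustrative assumptions) checking that Pearson correlation numerically:

```python
import numpy as np

# Pearson correlation between the partial sums s_m = x_1+...+x_m and
# s_n = x_1+...+x_n of i.i.d. variables; by Dembo--Kagan--Shepp this
# equals the maximal correlation, namely sqrt(m/n).
rng = np.random.default_rng(0)
m, n, samples = 3, 10, 1_000_000
x = rng.standard_normal((samples, n))
s_m = x[:, :m].sum(axis=1)
s_n = x.sum(axis=1)
r = np.corrcoef(s_m, s_n)[0, 1]
print(f"empirical: {r:.3f}, theoretical sqrt(m/n): {np.sqrt(m / n):.3f}")
```

With a million samples the empirical value agrees with $\sqrt{3/10}\approx 0.548$ to a couple of decimal places.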

#### References

Showing 1–10 of 14 references.

Operator-valued distributions. I. Characterizations of freeness

- Mathematics
- 2001

Let $M$ be a $B$-probability space. Assume that $B$ itself is a $D$-probability space; then $M$ can be viewed as a $D$-probability space as well. Let $X$ be in $M$. We look at the question of relating…

A free analogue of Shannon's problem on monotonicity of entropy

- Mathematics
- 2005

We prove a free probability analog of a result of [S. Artstein, K. Ball, F. Barthe, A. Naor, Solution of Shannon's problem on monotonicity of entropy, J. Amer. Math. Soc. 17 (2004) 975–982].

Solution of Shannon's problem on the monotonicity of entropy

- Mathematics
- 2004

It is shown that if $X_1, X_2, \ldots$ are independent and identically distributed square-integrable random variables, then the entropy of the normalized sum, $\operatorname{Ent}\big((X_1+\cdots+X_n)/\sqrt{n}\big)$, is an increasing…

Monotonicity of Entropy and Fisher Information: A Quick Proof via Maximal Correlation

- Mathematics, Computer Science
- Commun. Inf. Syst.
- 2016

A simple proof is given for the monotonicity of entropy and Fisher information associated to sums of i.i.d. random variables. The proof relies on a characterization of maximal correlation for partial…

On the multiplication of free N-tuples of noncommutative random variables

- Mathematics
- 1996


The analogues of entropy and of Fisher's information measure in free probability theory, V: Noncommutative Hilbert Transforms

- Mathematics
- 1998

The semicircle law, free random variables, and entropy

- Mathematics
- 2000

Contents: Overview; Probability laws and noncommutative random variables; The free relation; Analytic function theory and infinitely divisible laws; Random matrices and the asymptotically free relation; Large…

The analogues of entropy and of Fisher's information measure in free probability theory, I

- Mathematics
- 1993

Analogues of the entropy and Fisher information measure for random variables in the context of free probability theory are introduced. Monotonicity properties and an analogue of the Cramér–Rao…

Remarks on the maximum correlation coefficient

- Mathematics
- 2001

The maximum correlation coefficient between partial sums of independent and identically distributed random variables with finite second moment equals the classical (Pearson) correlation coefficient…

Free probability and random matrices

- 2012

In these lecture notes we present free probability as a toolbox for studying the spectrum of polynomials in several (possibly random) matrices, and provide some applications.