
February 1, 2026

Rational Rank Conjecture?

Last year in the UBC CS Theory Seminar, Lianna Hambardzumyan from UVic presented their work on a new equivalent formulation of the log-rank conjecture. They defined a new complexity measure for Boolean matrices, the signed rectangle rank, denoted $\srr$, and showed that it is polynomially related to the rank, denoted $r$, through the following relation: $$\srr\leq O(r\log r).$$ This article discusses the proof of this result, along with some thoughts about different notions of rank of Boolean matrices and their connections to communication complexity.

Communication is Bounded by Rank (not log-rank unfortunately...)

As a warmup for the next section, I will show that $D(M)\leq \rk(M)+1$ for any Boolean matrix $M$, where $D(M)$ is the deterministic communication complexity of $M$. The main idea is to show that $M$ has at most $2^{\rk(M)}$ distinct columns. The column player can then announce which of the distinct columns they hold using $\log_2 2^{\rk(M)}=\rk(M)$ bits, after which the row player knows the answer and sends it back with one more bit, for a total of $\rk(M)+1$ bits.
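Here is a small Python sketch of this protocol (my own illustration, not from the talk or the paper): both players know $M$, Bob holds a column index and announces which distinct column he holds, and Alice replies with the answer bit.

```python
import numpy as np
from math import ceil, log2

def rank_plus_one_protocol(M, x, y):
    """Simulate the D(M) <= rank(M) + 1 protocol: Alice holds row index x,
    Bob holds column index y, and both players know the Boolean matrix M."""
    # Both players can precompute the list of distinct columns of M.
    distinct_cols = sorted({tuple(M[:, j]) for j in range(M.shape[1])})
    index_of = {col: i for i, col in enumerate(distinct_cols)}

    # Bob's message: which distinct column he holds, in ceil(log2(#columns)) bits;
    # since M has at most 2^rank(M) distinct columns, this is at most rank(M) bits.
    bob_bits = ceil(log2(len(distinct_cols))) if len(distinct_cols) > 1 else 0
    bob_message = index_of[tuple(M[:, y])]

    # Alice reads off entry x of that column and announces it with one more bit.
    answer = distinct_cols[bob_message][x]
    return answer, bob_bits + 1

rng = np.random.default_rng(1)
M = rng.integers(0, 2, size=(8, 8))
answer, cost = rank_plus_one_protocol(M, 3, 5)
assert answer == M[3, 5]
print(cost, np.linalg.matrix_rank(M) + 1)   # cost is at most rank(M) + 1
```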

We now proceed to show that a rank $r$ Boolean matrix $M$ can have at most $2^r$ distinct columns (and rows).

Proof

Take a maximal linearly independent set of rows of $M$, and put them in a matrix $R$ (so $R$ has $r$ rows). Then $M$ can be factored as $M=C\cdot R$ for a unique matrix $C$ with $r$ columns. By our choice, the matrix $R$ is Boolean, so each of its columns lies in $\{0,1\}^r$ and $R$ has at most $2^r$ distinct columns.

Now this means that $M$ also has at most $2^r$ distinct columns: each column of $M$ is $C$ applied to the corresponding column of $R$, so equal columns of $R$ give equal columns of $M$. Applying the same argument to $M^T$ gives the bound on distinct rows, as desired. $\Box$
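As a quick numerical sanity check (my own sketch, not part of the original argument), one can test the bound on Boolean matrices that are forced to have low rank by building them from a small pool of rows:

```python
import numpy as np

# Check that Boolean matrices built from a small pool of rows (hence of low
# rank) never have more than 2^rank(M) distinct columns.
rng = np.random.default_rng(0)
for _ in range(100):
    pool = rng.integers(0, 2, size=(4, 200))      # 4 random Boolean rows
    M = pool[rng.integers(0, 4, size=10)]         # 10 rows drawn from the pool
    r = np.linalg.matrix_rank(M)                  # rank over the reals
    distinct_cols = len({tuple(M[:, j]) for j in range(M.shape[1])})
    assert distinct_cols <= 2 ** r
print("all checks passed")
```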

Signed Rectangle Rank

The signed rectangle rank of a Boolean matrix $M$, denoted $\srr(M)$, is defined as the minimum number of rank $1$ Boolean matrices needed in a $\pm 1$ linear combination that sums to $M$. We see that $\rk(M)\leq \srr(M)\leq \chi_1(M)$, where the first inequality holds by subadditivity of rank, and $\chi_1(M)$ is the partition number of $M$, defined as the minimum number of primitive matrices (rank $1$ Boolean matrices) that sum to $M$.

In the original paper, they defined a set $S$ of columns of $M$ to be independent if all $2^{|S|}$ possible subset sums are distinct. The result $\srr(M)\leq O(\rk(M)\log \rk(M))$ follows from two claims.

Claim 1: For any independent set $S$ of columns of $M$, $|S|\leq O(\rk(M)\log \rk(M))$.

Proof

We show this by double counting. Put the columns in $S$ into a matrix $N$, and let $P$ be the $|S|\times 2^{|S|}$ Boolean matrix whose columns are all the distinct vectors of $\{0,1\}^{|S|}$. Then the columns of $N\cdot P$ are exactly the $2^{|S|}$ subset sums of $S$, which are pairwise distinct by independence.

Now factor the matrix $N$ as $N=C\cdot R$, where $R$ consists of a maximal linearly independent set of rows of $N$; since the columns of $N$ are columns of $M$, the matrix $R$ has $\rk(N)\leq \rk(M)$ rows. Now $N\cdot P=C\cdot (R\cdot P)$. Notice that $R\cdot P$ is a matrix with at most $\rk(M)$ rows and entries in $\{0,1,\dots,|S|\}$, so it has at most $(1+|S|)^{\rk(M)}$ distinct columns. Each column of $N\cdot P$ is $C$ applied to the corresponding column of $R\cdot P$, so $N\cdot P$ also has at most $(1+|S|)^{\rk(M)}$ distinct columns. Since the $2^{|S|}$ columns of $N\cdot P$ are pairwise distinct, we get $$2^{|S|}\leq (1+|S|)^{\rk(M)},$$ and rearranging gives $$|S|\leq O(\rk(M)\log \rk(M)),$$ as claimed. $\Box$
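To spell out the rearrangement at the end (a detail I am filling in, not taken verbatim from the paper): taking base-$2$ logarithms gives $$|S|\leq \rk(M)\log_2(1+|S|).$$ If $|S|\geq \rk(M)^2$, this would force $\sqrt{|S|}\leq \log_2(1+|S|)$, which fails once $|S|\geq 20$; hence $|S|\leq \max(20,\rk(M)^2)$, so $\log_2(1+|S|)=O(\log \rk(M))$ and therefore $|S|\leq O(\rk(M)\log \rk(M))$.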

Claim 2: Let $S$ be a maximal independent set of columns of $M$. Then every column of $M$ is a $\pm 1$ linear combination of the columns in $S$.

Proof

Columns in $S$ are trivially such combinations, so take any column $v$ of $M$ not in $S$. By maximality of $S$, the set $S\cup\{v\}$ is not independent, so two of its subset sums coincide; since $S$ itself is independent, exactly one of the two corresponding subsets contains $v$. Writing the sum of that subset without $v$ as $v_-$ and the sum of the other subset as $v_+$, we get $v_-+v=v_+$, which gives $v=v_+-v_-$, that is, $v$ is a $\pm 1$ linear combination of elements of $S$, as claimed. $\Box$

From these two claims, it's now easy to see that $M$ can be written as a $\pm 1$ linear combination of at most $O(\rk(M)\log \rk(M))$ rank $1$ Boolean matrices: fix a maximal independent set $S$ of columns and write each column of $M$ as a $\pm 1$ combination of columns in $S$. For each $s\in S$, the set $A_s$ of columns of $M$ that use $+s$ gives the rank $1$ Boolean matrix $s\cdot \mathbf{1}_{A_s}^T$, and similarly for the columns that use $-s$. This expresses $M$ as a $\pm 1$ combination of at most $2|S|=O(\rk(M)\log \rk(M))$ rank $1$ Boolean matrices, so $\srr(M)\leq O(\rk(M)\log \rk(M))$.
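Here is a small Python sketch of the whole construction (my own illustration; the greedy choice of $S$ and the brute-force search over subset sums are just the simplest things that work on tiny matrices, not an algorithm from the paper):

```python
import itertools
import numpy as np

def subset_sums(cols, nrows):
    """Map each subset sum of the 0/1 columns in `cols` (as a tuple of length
    `nrows`) to one index subset achieving it; return None if two subsets
    share a sum, i.e. if the columns are not 'independent'."""
    table = {}
    for r in range(len(cols) + 1):
        for subset in itertools.combinations(range(len(cols)), r):
            s = np.zeros(nrows, dtype=int)
            for i in subset:
                s += cols[i]
            key = tuple(s)
            if key in table:
                return None
            table[key] = subset
    return table

def signed_rectangle_decomposition(M):
    """Write M as a +/-1 combination of at most 2|S| rank-1 Boolean matrices,
    where S is a greedily chosen maximal independent set of columns."""
    nrows, ncols = M.shape
    cols = [M[:, j].astype(int) for j in range(ncols)]

    # Greedily build a maximal independent set S of columns. Independence is
    # inherited by subsets, so a single pass suffices for maximality.
    S = []
    for v in cols:
        if subset_sums(S + [v], nrows) is not None:
            S.append(v)

    # Claim 2: every column v equals (sum over B) - (sum over A) for some
    # subsets A, B of S. Find such a pair by brute force over subset sums.
    table = subset_sums(S, nrows)
    plus = [np.zeros(ncols, dtype=int) for _ in S]   # columns using +s_i
    minus = [np.zeros(ncols, dtype=int) for _ in S]  # columns using -s_i
    for j, v in enumerate(cols):
        for a_key, A in table.items():
            B = table.get(tuple(np.array(a_key) + v))
            if B is not None:
                for i in B:
                    plus[i][j] = 1
                for i in A:
                    minus[i][j] = 1
                break

    # M = sum_i s_i * plus_i^T - sum_i s_i * minus_i^T, each term a rank-1
    # Boolean matrix, at most 2|S| terms in total.
    recon = np.zeros_like(M)
    for s, p, m in zip(S, plus, minus):
        recon += np.outer(s, p) - np.outer(s, m)
    return S, recon

rng = np.random.default_rng(0)
M = rng.integers(0, 2, size=(6, 8))
S, recon = signed_rectangle_decomposition(M)
print(len(S), np.linalg.matrix_rank(M), np.array_equal(recon, M))
```

On small random matrices this reproduces $M$ exactly and reports $|S|$ alongside $\rk(M)$, matching the two claims above.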

Non-deterministic Rank

Let $M$ be an $n\times m$ Boolean matrix, viewed as a matrix over $\RR$. We say that a matrix $A$ is a non-deterministic representation (or support-representation) of $M$ if $M_{ij}=1\Leftrightarrow A_{ij}\neq 0$. Define the non-deterministic rank of $M$, denoted $\nrk(M)$, as the minimum rank over all non-deterministic representations of $M$. This notion has been studied in quantum communication complexity (see this), where it exactly characterizes the non-deterministic quantum communication complexity $NQcc(M)$ through the following tight connection: $$NQcc(M)=\lfloor \log \nrk(M)\rfloor+1.$$

Note that $\nrk(M)$ can be constant even when $\rk(M)$ is almost full. For example, consider the matrix $M=J_n-I_n$. This has $\rk(M)=\Theta(n)$, yet $\nrk(M)=O(1)$, since it can be non-deterministically represented as the following difference of rank $1$ matrices: $$\begin{pmatrix} 1 & 2 & 4 & \cdots & 2^{n-1}\\ 2 & 4 & 8 & \cdots & 2^n\\ \vdots &\vdots & \ddots &\vdots & \vdots\\ 2^{n-2} & 2^{n-1} &\cdots &\cdots & 2^{2n-3}\\ 2^{n-1} & 2^n &\cdots &\cdots & 2^{2n-2} \end{pmatrix}-\begin{pmatrix} 1 & 1 & 1 & \cdots & 1\\ 4 & 4 & 4 & \cdots & 4\\ \vdots &\vdots & \ddots &\vdots & \vdots\\ 2^{2n-4} & 2^{2n-4} &\cdots &\cdots & 2^{2n-4}\\ 2^{2n-2} & 2^{2n-2} &\cdots &\cdots & 2^{2n-2} \end{pmatrix}$$
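In coordinates (with $1$-based indexing as displayed), the first matrix is the rank $1$ outer product with entries $2^{i+j-2}$ and the second has constant rows with entries $4^{i-1}$, so their difference is $2^{i+j-2}-2^{2i-2}$, which vanishes exactly on the diagonal. A quick numerical check of this (my own sketch):

```python
import numpy as np

n = 6
u = 2 ** np.arange(n, dtype=np.int64)             # (1, 2, 4, ..., 2^{n-1})
A = np.outer(u, u)                                # A[i, j] = 2^(i+j), rank 1 (0-based indices)
B = np.outer(4 ** np.arange(n, dtype=np.int64),   # B[i, j] = 4^i, rank 1
             np.ones(n, dtype=np.int64))

D = A - B                                         # rank at most 2
support = (D != 0).astype(int)                    # support of D should be J_n - I_n
assert np.array_equal(support, np.ones((n, n), dtype=int) - np.eye(n, dtype=int))
print(np.linalg.matrix_rank(D))                   # prints 2, so nrk(J_n - I_n) <= 2
```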

Rational Rank

Motivated by the rational degree of Boolean functions, where $\rdeg(f)=\max\{\ndeg(f),\ndeg(\neg f)\}$, we can define the rational rank of $M$ as $\rrk(M)=\max\{\nrk(M),\nrk(\neg M)\}$. A natural question about this notion is whether $\rrk(M)$ is polynomially or quasi-polynomially related to $\rk(M)$. Notice that since $\rrk(M)=\rrk(\neg M)$ by definition and $|\rk(M)-\rk(\neg M)|\leq 1$, the question is unchanged if we replace $M$ by $\neg M$. I do not have any immediate counterexample for this question. Another question is whether this notion relates to any other computational complexity measures.

Something that I can currently show is that $$\rrk(M)=\max\{\nrk(M),\nrk(\neg M)\}\geq \Omega(\log \rk(M)).$$ This follows from Lemma 3.6 of this paper, which says that $M$ must contain a $k\times k$ submatrix equal to one of $I_k, \neg I_k, GT_k, \neg GT_k$ for $k=\Omega(\log \rk(M))$, where $GT_k$ is the $k\times k$ upper triangular Boolean matrix. This is very similar to the claim that every Boolean function depending on $n$ variables restricts to a non-trivial symmetric function on $\log^*(n)$ variables, where $\log^*$ is the iterated logarithm. Notice that all $4$ of these matrices have rational rank at least $k$: each of $I_k$ and whichever of $GT_k, \neg GT_k$ contains the diagonal is a triangular Boolean matrix with ones on the diagonal, so any support-representation of it is triangular with nonzero diagonal entries and hence has rank $k$, and $\rrk$ is unchanged under complementation. Since rational rank does not increase when passing to a submatrix (just restrict support-representations of $M$ and $\neg M$), the inequality follows.