Memory Complexity with Shared Registers

In his seminal paper on communication complexity, Yao considers two processes that exchange bits alternately. Each process, equipped with an unbounded amount of memory and computing power, receives a string of k bits as input. Yao studies the minimum number of bits that must be exchanged to compute a given Boolean function deterministically; Disjointness and Equality are examples of such functions. He also studies the randomised case, in which the processes are initially given a string of random bits in addition to the arguments of the function.

The approach proposed here has some similarities: the computation of Boolean functions is again considered. It differs, however, from Yao's theory in many respects. First, there are P processes, P > 2, communicating through a connected network. Communication is not bit by bit, but through shared registers. Each process initially receives a string of k bits, for some k > 0, and the processes run a distributed algorithm to compute (in a special sense) a Boolean function whose arguments are the P strings of k bits. While communication complexity measures the number of exchanged bits, this study concerns the minimum size of the registers (memory complexity).

The beauty of communication complexity theory lies in a purely combinatorial structure on matrices: the monochromatic rectangles. The aim of the internship is to study a combinatorial approach for computing the memory complexity of some class of Boolean functions.
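For concreteness, here is the usual textbook formulation of the two functions mentioned above and of the rectangle bound underlying the combinatorial approach; the notation is added here for illustration and is not part of the internship description itself.

\[
  \mathrm{EQ}_k(x,y) = \begin{cases} 1 & \text{if } x = y,\\ 0 & \text{otherwise,}\end{cases}
  \qquad
  \mathrm{DISJ}_k(x,y) = \begin{cases} 1 & \text{if } x_i\,y_i = 0 \text{ for all } i,\\ 0 & \text{otherwise,}\end{cases}
  \qquad x, y \in \{0,1\}^k.
\]

A rectangle of the 2^k x 2^k communication matrix of f is a product set A x B with A, B subsets of {0,1}^k; it is monochromatic if f is constant on it. A deterministic protocol exchanging c bits partitions the matrix into at most 2^c monochromatic rectangles, so the deterministic communication complexity of f is at least log2 of the minimum number of monochromatic rectangles needed to partition the matrix. For Equality, the 2^k diagonal 1-entries must fall into pairwise distinct rectangles, which already forces at least k bits of communication.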