In 1958 the composer Varèse, the structural engineer and composer Xenakis, the architect and artist Le Corbusier, the film cameraman and producer Agostini, and the technical team of the Philips Corporation build the Philips Pavilion for the Brussels World Fair (fig. 1). Xenakis transmutes axonometric projection into staff notation. The amalgam notation becomes four-dimensional data for electronic machines. The machines distribute sounds to 350 speakers in the pavilion. The pavilion coextends qualities of plasticity and projection in architecture and music. Point, line, plane, time, interval, and concrete make a continuous whole.

Prior to this moment, most audiences hear broadcasts privately from a unidirectional source (fig. 2), see plastic art as either static boxes on a wall or solid masses in a box (fig. 3), and, in public, experience both together while seated (fig. 4). Audiences experience art objects from outside, in linear time. In the pavilion, poly-directional sounds shroud the audience in non-linear or polyrhythmic time. These times give way to continuity, eternity, and formlessness.

Coextensive qualities of plasticity, projection, and reception intensify with the introduction of electronic machines. Despite the intricate contrivances of the pavilion’s electronic system, one could break the process into three components: the notation inputs a message into a medium that dispatches information to a receiver. But message, medium, and receiver give way to a much less discrete model, especially when one considers the complex screening processes in electro-acoustic prosthetics.

The pavilion merges notation, sound, structure, and concrete with finesse, but falls short in relating this combination to light and film projections. If one were to take the pavilion’s immersive sound model and enhance it with the increasing fluidity of digital, electro-acoustic, and audio-visual systems, one would arrive at an integrative electro-acoustic-optical-architectural system.