The full title of this project is: Life-like Behaviour in a Gestural Interface for Interacting with Sound. This paper introduces a music improvisation system that combines user-supplied physical gestures with independent generative behaviour. We avoid the notion of explicit instrumental control in favour of implicit influence, and global system behaviour emerges from blending physical and algorithmic activity at multiple levels. An evolving population of gestures interacts, mutates and evolves according to a virtual physics. Data extracted from these gestures feed a dynamic mapping scheme that influences a population of granular synthesizers. The system affords intimate human-machine musical improvisation by merging direct, speculative action with unpredictable yet coherent algorithmic behaviour in an integrated structure.
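To make the architecture concrete, the following is a minimal sketch, not the system's actual implementation: a population of gestures drifts under a simple attractor-based virtual physics with random mutation, and each gesture is mapped to hypothetical granular-synthesis parameters (the class and function names, the physics rule, and the mapping are all illustrative assumptions).

```python
import random

class Gesture:
    """A gesture as a point in parameter space with toy virtual physics."""
    def __init__(self, position, velocity):
        self.position = list(position)   # e.g. [x, y] extracted from input
        self.velocity = list(velocity)

    def step(self, attractor, pull=0.05, damping=0.98):
        # Stand-in physics: each gesture is pulled toward a shared
        # attractor with mild damping, so motion stays bounded.
        for i in range(len(self.position)):
            self.velocity[i] = damping * (
                self.velocity[i] + pull * (attractor[i] - self.position[i]))
            self.position[i] += self.velocity[i]

    def mutate(self, rate=0.1):
        # Small Gaussian perturbation, standing in for gesture mutation.
        self.position = [p + random.gauss(0, rate) for p in self.position]

def map_to_grain(gesture, lo=0.01, hi=0.2):
    # Hypothetical dynamic mapping: x coordinate -> grain duration
    # (seconds, clamped to [lo, hi]); y coordinate -> playback rate.
    x, y = gesture.position
    dur = min(max(lo + abs(x) * 0.01, lo), hi)
    rate = 1.0 + 0.1 * y
    return {"grain_dur": dur, "rate": rate}

if __name__ == "__main__":
    random.seed(1)
    population = [Gesture([random.uniform(-1, 1), random.uniform(-1, 1)],
                          [0.0, 0.0]) for _ in range(4)]
    attractor = [0.0, 0.0]   # a fixed centre; the real system evolves this
    for _ in range(50):
        for g in population:
            g.step(attractor)
            g.mutate(0.02)
    print([map_to_grain(g) for g in population])
```

In this sketch the "blend" of physical and algorithmic activity would come from letting live input perturb gesture positions while the physics and mutation run continuously; the actual system's physics, evolutionary operators, and mapping scheme are described in the body of the paper.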