Brain–machine interfaces have evolved to a stage at which robotic arms can perform complex movements, such as reaching and grasping, in response to signals from the brain. Further progress towards systems that allow dexterous control of an artificial limb will require somatosensory feedback, so that touch can inform the use of the limb. A step towards that goal has been achieved with the demonstration of an interface that multiplexes neuronal signals from the motor cortex, used to control elements of a computer display, with artificial tactile feedback delivered through microstimulation of the somatosensory cortex. Monkeys learned to use this interface to move a computer cursor or hand image and explore visual targets. To discover which target would yield a reward, the monkeys had to discriminate the pattern of microstimulation evoked from each target when the actuator touched it.

Brain–machine interfaces [1,2] use neuronal activity recorded from the brain to establish direct communication with external actuators, such as prosthetic arms. It is hoped that brain–machine interfaces can be used to restore the normal sensorimotor functions of the limbs, but so far they have lacked tactile sensation. Here we report the operation of a brain–machine–brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and signals artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex. Monkeys performed an active exploration task in which an actuator (a computer cursor or a virtual-reality arm) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in the primary motor cortex. ICMS feedback occurred whenever the actuator touched virtual objects, with temporal patterns of ICMS encoding the artificial tactile properties of each object. Neuronal recording and ICMS epochs were temporally multiplexed to avoid interference.
Two monkeys operated this BMBI to search among three visually identical objects and identify the rewarded one, using the virtual-reality arm to discriminate the unique artificial texture associated with each object. These results suggest that clinical motor neuroprostheses might benefit from the addition of ICMS feedback to generate artificial somatic perceptions associated with mechanical, robotic or even virtual prostheses.
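The temporal multiplexing described above can be sketched in code. The following is an illustrative sketch only, not the authors' implementation: all timing values and texture pulse patterns are hypothetical assumptions, chosen to show how alternating recording and stimulation windows keep ICMS artefacts out of the decoded neural signal, and how distinct temporal pulse patterns can label each virtual texture.

```python
# Hypothetical sketch of a brain-machine-brain multiplexing cycle.
# Recording and ICMS occupy non-overlapping windows; texture identity
# is carried by the temporal pattern of pulses in the ICMS window.

RECORD_MS = 50  # hypothetical decoding window duration (ms)
STIM_MS = 50    # hypothetical ICMS window duration (ms)

# Each virtual texture gets a distinct pulse pattern, expressed here as
# inter-pulse intervals (ms). These values are purely illustrative.
TEXTURES = {
    "rewarded":   [10, 10, 10, 10],  # high-frequency train
    "unrewarded": [25, 25],          # low-frequency train
    "null":       [],                # no stimulation (object untouched class)
}

def schedule(cycle_start_ms, texture, touching):
    """Return (record_window, stim_pulse_times) for one multiplex cycle.

    The recording window always comes first; ICMS pulses are delivered
    only while the actuator is touching an object, and only after the
    recording window has closed, so the two never interfere.
    """
    record = (cycle_start_ms, cycle_start_ms + RECORD_MS)
    pulses = []
    if touching:
        t = cycle_start_ms + RECORD_MS  # stimulation starts after recording
        for ipi in TEXTURES[texture]:
            pulses.append(t)
            t += ipi
    return record, pulses

rec, pulses = schedule(0, "rewarded", touching=True)
# Every pulse falls outside the recording window:
assert all(p >= rec[1] for p in pulses)
```

In this toy scheme the decoder would only consume spikes time-stamped inside `record`, so stimulation artefacts in the stimulation window cannot corrupt the motor commands derived from them.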