Abstract

Spatial orientation is a prerequisite for most behaviors. In insects, the underlying neural computations take place in the central complex (CX), the brain’s navigational center. In this region, different streams of sensory information converge to enable context-dependent navigational decisions. Accordingly, a variety of CX input neurons deliver information about different navigation-relevant cues. In bees, direction-encoding polarized-light signals converge with translational optic-flow signals that are suited to encode the animals’ flight speed. The continuous integration of speed and direction in the CX can be used to generate a vector memory of the bee’s current position in space relative to its nest, i.e., to perform path integration. This process depends on specific, complex features of the optic-flow-encoding CX input neurons, but it is unknown how this information is derived from the visual periphery. Here, we therefore aimed to gain insight into how simple motion signals are reshaped upstream of the speed-encoding CX input neurons to generate their complex features. Using electrophysiology and anatomical analyses of the halictid bees Megalopta genalis and Megalopta centralis, we identified a wide range of motion-sensitive neurons connecting the optic lobes with the central brain. While most of these neurons formed pathways with characteristics incompatible with CX speed neurons, we showed that one group of lobula projection neurons possesses some of the physiological and anatomical features required to generate the visual responses of CX optic-flow-encoding neurons. However, as these neurons cannot explain all features of CX speed cells, local interneurons of the central brain or alternative input cells from the optic lobe are additionally required to construct inputs with sufficient complexity to deliver speed signals suited for path integration in bees.
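As a purely conceptual illustration (not the model or data of this study), the path-integration computation described above can be sketched as continuously accumulating displacement from paired heading and speed samples, where heading would correspond to the compass signal derived from polarized light and speed to the translational optic-flow signal. All names here are hypothetical.

```python
import math

def integrate_path(steps):
    """Accumulate a vector memory from (heading_radians, speed) samples.

    Illustrative sketch only: in the biological system, headings would come
    from polarized-light compass signals and speeds from translational
    optic flow; their continuous integration yields a home vector.
    """
    x = y = 0.0
    for heading, speed in steps:
        # Each sample contributes a small displacement along the current heading.
        x += speed * math.cos(heading)
        y += speed * math.sin(heading)
    # The home vector points from the current position back to the nest,
    # i.e., the negation of the accumulated displacement.
    distance = math.hypot(x, y)
    home_bearing = math.atan2(-y, -x)
    return distance, home_bearing
```

For example, an outbound leg of 3 units due east (heading 0) followed by 4 units due north (heading pi/2) leaves the agent 5 units from the nest, with the home vector pointing back toward the origin.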