Electromyography (EMG) signals have long been explored for intuitive interaction with human-centered robots, yet a gap remains between scientific research and real-life applications. Previous studies have focused mainly on EMG decoding algorithms, while the dynamic relationship among the human, the robot, and the uncertain environment in real-life scenarios has received little attention. To fill this gap, this paper presents a comprehensive review of EMG-based techniques in human-robot-environment interaction (HREI) systems. The general processing framework is summarized, and three interaction paradigms, namely direct control, sensory feedback, and partially autonomous control, are introduced; EMG-based intention decoding is treated as a module within these paradigms. Five key issues in this field, involving precision, stability, user attention, compliance, and environmental awareness, are discussed. Several important directions, including EMG decomposition, robust algorithms, HREI datasets, proprioceptive feedback, reinforcement learning, and embodied intelligence, are proposed to guide future research. To the best of our knowledge, this is the first review of EMG-based methods for HREI systems. It offers a novel and broader perspective for improving the practicability of current myoelectric interaction systems by jointly considering human-robot interaction, robot-environment interaction, and state perception through human sensation, aspects that previous studies have not addressed together.