Social robots commonly rely on facial expressions and gestures to convey emotions. However, many robots follow a predetermined sequence, executing a fixed set of facial animations and movements once an emotion is identified. This rigid approach can produce unnatural behavior when a new stimulus arrives during an ongoing emotional expression: the robot either ignores the new stimulus until the current emotion is fully expressed or cuts the expression short and jumps abruptly to the next one. To address this limitation, we implemented an emotion engine with a linear dynamic affect-expression model (LDAEM) that computes the emotional state from incoming stimuli and determines the corresponding facial expression and robot movements. Based on Ekman's six basic emotions, our emotion engine drives 12 control points (CPs) for facial expression and 3 CPs for movement. Experimental results demonstrate that the emotional state adapts dynamically to stimuli. Notably, our approach produces smooth transitions between emotions, even when a different emotional stimulus is introduced during an ongoing emotional expression. Moreover, it can be seamlessly applied to other robotic systems, offering a versatile framework for emotional expression.
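To make the idea concrete, the sketch below shows one plausible reading of a linear dynamic affect-expression model: a 6-dimensional emotion state (one component per Ekman basic emotion) updated linearly from stimuli, then mapped linearly to 12 facial and 3 movement control points. Only the dimensions and CP counts come from the abstract; the matrices `A`, `B`, `C`, the discrete-time update rule, and all numeric values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch of a linear dynamic affect-expression model.
# State: 6-dim emotion vector over Ekman's basic emotions.
# Output: 12 facial control points + 3 movement control points.

EMOTIONS = ["joy", "sadness", "anger", "fear", "surprise", "disgust"]
N_FACE_CPS, N_MOVE_CPS = 12, 3

class LinearAffectModel:
    def __init__(self, decay=0.95, gain=0.3, seed=0):
        n = len(EMOTIONS)
        rng = np.random.default_rng(seed)
        # A: decays the state toward neutral; B: stimulus gain.
        # Both are illustrative diagonal choices, not fitted values.
        self.A = decay * np.eye(n)
        self.B = gain * np.eye(n)
        # C: linear map from emotion state to 15 control points
        # (random here; a real system would calibrate it to the robot).
        self.C = rng.uniform(-1.0, 1.0, size=(N_FACE_CPS + N_MOVE_CPS, n))
        self.e = np.zeros(n)  # current emotion state (neutral)

    def step(self, stimulus):
        """One discrete-time update: e[k+1] = A e[k] + B s[k]."""
        s = np.asarray(stimulus, dtype=float)
        self.e = np.clip(self.A @ self.e + self.B @ s, 0.0, 1.0)
        cps = self.C @ self.e
        return cps[:N_FACE_CPS], cps[N_FACE_CPS:]  # face CPs, movement CPs

# Usage: a "surprise" stimulus arrives while "joy" is still active;
# the state blends smoothly instead of restarting a canned animation.
model = LinearAffectModel()
joy = np.array([1, 0, 0, 0, 0, 0], dtype=float)
surprise = np.array([0, 0, 0, 0, 1, 0], dtype=float)
for k in range(10):
    face_cps, move_cps = model.step(joy if k < 5 else surprise)
print(dict(zip(EMOTIONS, model.e.round(2))))
```

Because the control points are a continuous function of a continuously evolving state, a mid-expression change of stimulus shifts the CPs gradually rather than triggering a hard cut between prerecorded animations, which is the behavior the abstract describes.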