Neuromorphic vision-based robotic tactile sensors fuse touch and vision, enabling manipulators to grip and identify objects efficiently. Precise robotic manipulation requires early detection of slip on the grasped object, which is crucial for maintaining grip stability and safety. Modern closed-loop feedback technologies use measurements from neuromorphic vision-based tactile sensors to control and prevent object slippage. Unfortunately, most of these sensors report purely data-driven rather than model-based information, which limits the efficiency of the resulting control. This work proposes physical and mathematical modeling of an in-house-developed neuromorphic vision-based robotic tactile sensor with a protruding marker design to demonstrate the model-based approach. The sensor is mounted on a UR10 robotic manipulator, enabling manipulation tasks such as approaching, pressing, and slipping. The mathematical model derived for the sensor revealed first-order system (FOS) behavior for the three manipulation-related actions under study. Grasping experiments with the robotic manipulator are conducted to verify and validate the sensor's derived FOS model. Two data analysis approaches, temporal and spatial–temporal model based, are adopted to classify the manipulator-sensor actions. A long short-term memory (LSTM) temporal classifier is engineered to exploit the sensor's derived model. In addition, an LSTM spatial–temporal classifier is designed using event-weighted centroid-of-region-of-interest features. Both LSTM methods identified the performed robotic actions with an accuracy of more than 99%. Additionally, quantitative slip rate estimation is carried out based on centroid estimation, and qualitative assessment of pressing force is performed using a fuzzy logic classifier.
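The first-order system behavior reported for the sensor can be illustrated with a minimal sketch of a FOS step response; the gain `K` and time constant `tau` below are illustrative placeholders, not parameters identified in this work:

```python
import math

def fos_step_response(t, K=1.0, tau=0.1):
    """Step response of a first-order system: y(t) = K * (1 - exp(-t / tau)).

    K is the steady-state gain and tau the time constant; both are
    placeholder values here, not the sensor's identified parameters.
    """
    return K * (1.0 - math.exp(-t / tau))

# A first-order system reaches ~63.2% of its final value at t = tau
# and is within ~2% of steady state after roughly 4 time constants.
```

In an identification setting, `K` and `tau` would be fitted to the sensor's measured response for each manipulation action (approaching, pressing, slipping).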
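The event-weighted centroid feature used by the spatial–temporal classifier can be sketched as follows; the `(x, y, weight)` event representation is an assumption for illustration, not necessarily the exact feature definition used in the paper:

```python
def event_weighted_centroid(events):
    """Weighted centroid of (x, y, weight) event tuples inside a
    region of interest. The weight is assumed to reflect event
    activity (e.g., per-pixel event count); this is illustrative.
    """
    total = sum(w for _, _, w in events)
    if total == 0:
        return None  # no events in the region of interest
    cx = sum(x * w for x, _, w in events) / total
    cy = sum(y * w for _, y, w in events) / total
    return cx, cy
```

Tracking how this centroid drifts over successive time windows gives the kind of spatial–temporal feature sequence an LSTM can classify, and its displacement rate can serve as a quantitative slip estimate.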