I have a Core ML model that works with the live camera and draws boxes around a bat and a ball. But when I apply it to a saved video, the same approach doesn't work: with the live camera, Swift delivers a continuous stream of sample buffers, whereas a saved video doesn't hand me any buffers while it plays. I can read the buffers from the saved video myself, but then I can't play the video in a player at the same time. What I need is the buffer of the frame that is currently playing in the player, so I can show the detections on it.
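For reference, this is roughly my live-camera path that already works (a minimal sketch; `BatBallDetector` is a placeholder for my actual generated model class, and the box-drawing code is omitted):

```swift
import AVFoundation
import CoreML
import Vision

final class CameraDetector: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Vision wrapper around the Core ML model (placeholder model name).
    private lazy var request: VNCoreMLRequest = {
        let model = try! VNCoreMLModel(
            for: BatBallDetector(configuration: MLModelConfiguration()).model)
        return VNCoreMLRequest(model: model) { request, _ in
            let boxes = (request.results as? [VNRecognizedObjectObservation]) ?? []
            // ... draw bounding boxes for `boxes` over the preview layer ...
        }
    }()

    // Called continuously by AVCaptureSession with each camera frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
        try? handler.perform([request])
    }
}
```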
Put simply: I want to get the pixel buffer of the frame the player is currently displaying, send it to the model, and show the object detections.
What I have tried: reading the asset with AVAssetReader and rendering each buffer into the view's content. It looks like the video is playing, but I'm really just showing images one by one and sending those same images to the model. What I actually want is to play the video in a real player and, at the player's current time (including after a seek), get the matching buffer to send to the model and show the detections.
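This is roughly my current AVAssetReader attempt (a sketch; it does give me the decoded frames, but it's not real playback):

```swift
import AVFoundation

// Read decoded frames from a saved video file (my current attempt).
func readFrames(from url: URL) throws {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    reader.startReading()

    while let sampleBuffer = output.copyNextSampleBuffer(),
          let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
        // I convert `pixelBuffer` to an image, show it in the view,
        // and send the same image to the model -- but there is no real player here.
    }
}
```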
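From the documentation it looks like `AVPlayerItemVideoOutput` might be what I need, since `copyPixelBuffer(forItemTime:itemTimeForDisplay:)` seems to return the pixel buffer for the frame the player is showing at a given time, but I'm not sure this wiring is correct (a sketch, polled from a `CADisplayLink`):

```swift
import AVFoundation
import UIKit

final class PlayerDetectionController: UIViewController {
    private let player = AVPlayer()
    private let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    private var displayLink: CADisplayLink?

    func play(url: URL) {
        let item = AVPlayerItem(url: url)
        item.add(videoOutput)                 // tap into the frames the player renders
        player.replaceCurrentItem(with: item)

        let layer = AVPlayerLayer(player: player)
        layer.frame = view.bounds
        view.layer.addSublayer(layer)

        // Fire once per screen refresh to grab the currently displayed frame.
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
        player.play()
    }

    @objc private func tick() {
        let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
        guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                            itemTimeForDisplay: nil)
        else { return }
        // `pixelBuffer` should be the frame at the player's current (or seek) time;
        // I would send it to the Core ML model exactly like the live-camera buffers.
    }
}
```

Is this the right way to get the buffer at the player's playback/seek time, or is there a better approach for running detection on a video while it plays?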