
An Online Sharing Method of Video and Audio Interactive Behavior Data Based on Multi-scene Sequence Matching

When video and audio are shared across multiple scenes, heavy interference makes sequence matching difficult. An online sharing algorithm for interactive behavior data based on unicast and multicast streams is therefore proposed to address this problem. Because the network has a self-protection mechanism that itself creates sharing obstacles, historical data are collected to establish a security protection function; video and audio texts of different sizes and privacy levels are evaluated with the network traffic surplus theorem, the traffic change during each identity authentication is recorded, and an authentication threshold is set. According to the interactive behavior characteristics of the video and audio, an observation sequence is established and the scene with the highest matching degree is found from its feature values. Combined with the resource gain produced each time the video and audio are shared, the bandwidth corresponding to that gain is calculated for the unicast and the multicast stream respectively, and the sharing sequence with the highest fitness is generated. Experiments show that the proposed method achieves accurate sharing of video and audio data in both unicast and multicast form, with good efficiency, low sharing cost, and high practical value.
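The two core steps described above, matching an observation sequence of interaction features to the closest scene and then choosing between unicast and multicast by resource gain per unit of bandwidth, can be illustrated with a minimal sketch. The paper gives no implementation; the names below (Scene, match_scene, choose_stream_mode), the distance-based matching, and the gain-per-bandwidth fitness are assumptions made only for illustration.

# Minimal sketch (not the paper's implementation): match an observation
# sequence of interaction features to the closest scene template, then pick
# unicast or multicast by resource gain per unit of consumed bandwidth.
# Scene, match_scene and choose_stream_mode are hypothetical names.
from dataclasses import dataclass
from math import dist
from typing import Sequence

@dataclass
class Scene:
    name: str
    template: Sequence[float]   # expected feature values for this scene

def match_scene(observation: Sequence[float], scenes: Sequence[Scene]) -> Scene:
    """Return the scene whose feature template is closest to the observed sequence."""
    return min(scenes, key=lambda s: dist(observation, s.template))

def choose_stream_mode(unicast_gain: float, multicast_gain: float,
                       per_receiver_bw: float, receivers: int) -> str:
    """Pick unicast or multicast by resource gain per unit of bandwidth (fitness)."""
    unicast_bw = per_receiver_bw * receivers   # one stream per receiver
    multicast_bw = per_receiver_bw             # one shared stream for all receivers
    fitness = {"unicast": unicast_gain / unicast_bw,
               "multicast": multicast_gain / multicast_bw}
    return max(fitness, key=fitness.get)

scenes = [
    Scene("classroom", [0.8, 0.1, 0.3]),
    Scene("live-training", [0.2, 0.9, 0.6]),
]
best = match_scene([0.25, 0.85, 0.55], scenes)
mode = choose_stream_mode(unicast_gain=40.0, multicast_gain=12.0,
                          per_receiver_bw=2.0, receivers=30)
print(best.name, mode)   # -> live-training multicast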

multi-scene sequence matching; interactive behavior data; online sharing; observation sequence; resource gain

王佳培、李伟、王培宏


Training Center, State Grid Zhejiang Electric Power Co., Ltd., Hangzhou, Zhejiang 310000, China


Education and Training Development Project of State Grid Zhejiang Electric Power Company

9111JP21N01P

2024

微型电脑应用 (Microcomputer Applications)
上海市微型电脑应用学会 (Shanghai Microcomputer Applications Society)


CSTPCD
Impact factor: 0.359
ISSN: 1007-757X
Year, Vol. (Issue): 2024, 40(5)