An Exploration of Criminal Risks Posed by ChatGPT-like Generative Artificial Intelligence and Their Governance Pathways
[Research purpose] With the rapid development of generative artificial intelligence exemplified by ChatGPT, numerous problems have come to the fore, including risks to privacy and data security, infringement of intellectual property rights, economic and financial crimes, and threats to social trust and information authenticity. Governing the criminal risks associated with generative AI has therefore become an urgent task of the times. This study seeks to develop a comprehensive governance framework for addressing and mitigating such risks, thereby ensuring the rational utilization and sustainable development of generative AI technologies.

[Research method] Using the case analysis method, this study examines the manifestations, causes, and harms of generative AI-related crimes in specific cases. It also employs comparative analysis to contrast the legal frameworks, policies, measures, and practical experience in managing the criminal risks of generative AI both domestically and internationally.

[Research conclusion] Several problems persist in the current framework for governing the criminal risks of generative artificial intelligence, including the lag of legal regulation behind technological development, the ambiguity of the boundaries of the principle of technological neutrality, the limitations of technical monitoring and identification capabilities, and the lack of cross-border regulatory cooperation. To realize the healthy development of generative artificial intelligence, a multi-faceted and comprehensive approach is required, encompassing the refinement of legal and regulatory frameworks, the reinforcement of ethical guidelines and industry self-discipline, the enhancement of technical monitoring and defense capabilities, the strengthening of international cooperation and information sharing, and the improvement of public awareness and education.