
Book Binding Handwriting Font Generation Based on Few-Shot Learning

Handwritten fonts carry a certain warmth and, particularly in book binding design, add humanization and expressiveness; however, designing fonts by hand is a cumbersome process that demands a high level of expertise. Artificial-intelligence-based handwritten font generation algorithms offer an effective way to assist this design work. This study employs a few-shot font generation network in which multiple style features are extracted by several sub-encoders, allowing diverse local concepts to be captured more faithfully. For training, an end-to-end approach is adopted, which significantly reduces training time and improves font generation efficiency. Several font generation methods were compared in depth through a combination of qualitative and quantitative analysis. The proposed method achieved lower FID and LPIPS values than the alternatives, demonstrating its superior generation performance: the generated fonts are clearer and highly realistic. The method provides a faster and more effective solution for book binding design, streamlining the cumbersome font design process and improving design efficiency. Future research can further optimize the quality and diversity of the generated fonts to meet the needs of different book binding designs.
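The abstract gives no implementation details, so the following is only a minimal PyTorch sketch of the multi-sub-encoder idea it describes, not the authors' network: the module names (StyleSubEncoder, MultiStyleEncoder), the number of sub-encoders k, the layer sizes, and the feature dimension are all hypothetical choices for illustration. Each sub-encoder processes the same few reference glyphs but learns its own local style concept, and the concatenated codes form the style condition for a generator.

```python
# Minimal sketch of multi-sub-encoder style extraction (assumed architecture,
# not the paper's implementation). Each sub-encoder sees the same reference
# glyphs but can specialize on a different local style concept (e.g. stroke
# thickness, slant, ink texture); their outputs are concatenated into one
# style code that would condition the generator.
import torch
import torch.nn as nn

class StyleSubEncoder(nn.Module):
    """One small CNN branch; layer sizes are hypothetical."""
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # -> (B, 64, 1, 1)
            nn.Flatten(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

class MultiStyleEncoder(nn.Module):
    """k independent sub-encoders; k is a free hyperparameter here."""
    def __init__(self, k: int = 6, feat_dim: int = 64):
        super().__init__()
        self.subs = nn.ModuleList(StyleSubEncoder(feat_dim) for _ in range(k))

    def forward(self, refs: torch.Tensor) -> torch.Tensor:
        # refs: (B, 1, H, W) grayscale reference glyphs in the target style.
        # Concatenate the k local style codes into one style vector.
        return torch.cat([sub(refs) for sub in self.subs], dim=1)

if __name__ == "__main__":
    enc = MultiStyleEncoder(k=6, feat_dim=64)
    refs = torch.randn(4, 1, 80, 80)   # a 4-glyph few-shot reference batch
    print(enc(refs).shape)             # torch.Size([4, 384])
```

Keeping the sub-encoders as independent branches, rather than one monolithic encoder, is what lets each branch specialize on a distinct local concept, which matches the abstract's motivation for using multiple sub-encoders.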
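The FID and LPIPS comparison reported in the abstract can be reproduced with off-the-shelf metric implementations. Below is a minimal sketch using torchmetrics; the paper does not state its evaluation settings, so the image size, sample count, Inception feature layer, and normalization flags are assumptions. Lower is better for both metrics.

```python
# Hedged sketch of an FID / LPIPS evaluation (requires `torchmetrics[image]`);
# this is generic metric code, not the paper's evaluation pipeline.
import torch
from torchmetrics.image.fid import FrechetInceptionDistance
from torchmetrics.image.lpip import LearnedPerceptualImagePatchSimilarity

# Random stand-ins for real handwriting glyphs and generated glyphs:
# 3-channel float images in [0, 1], shape (N, 3, H, W).
real = torch.rand(16, 3, 128, 128)
fake = torch.rand(16, 3, 128, 128)

# FID: distance between Inception feature statistics of the two sets.
# feature=64 is a small demo choice; papers typically use 2048.
fid = FrechetInceptionDistance(feature=64, normalize=True)
fid.update(real, real=True)
fid.update(fake, real=False)
print("FID:", fid.compute().item())

# LPIPS: learned perceptual distance between paired images.
lpips = LearnedPerceptualImagePatchSimilarity(net_type="alex", normalize=True)
print("LPIPS:", lpips(fake, real).item())
```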

Generative adversarial networks; Few-shot learning; Font generation; End-to-end learning

王志敏、朱磊、张媛


School of Mechanical and Electrical Engineering, Beijing Institute of Graphic Communication, Beijing 102600, China

Postal Industry Technology R&D Center, Beijing Institute of Graphic Communication, Beijing 102600, China


2024

数字印刷 (Digital Printing)
中国印刷科学技术研究所 (China Institute of Printing Science and Technology)

Peking University Core Journal (北大核心)
ISSN: 2095-9540
Year, Volume (Issue): 2024, (4)