The year 1948 witnessed the historic moment of the birth of classic information theory (CIT). Guided by CIT, modern communication techniques have approached the theoretical limits, such as the entropy function $H(U)$, the channel capacity $C=\max_{p(x)} I(X;Y)$, and the rate-distortion function $R(D)=\min_{p(\hat{x}|x):\,\mathbb{E}d(x,\hat{x})\le D} I(X;\hat{X})$. Semantic communication paves a new direction for future communication techniques, whereas the guiding theory is still missing. In this paper, we try to establish a systematic framework of semantic information theory (SIT). We investigate the behavior of semantic communication and find that synonymy is its basic feature, so we define the synonymous mapping between semantic information and syntactic information. Stemming from this core concept, the synonymous mapping $f$, we introduce the measures of semantic information, such as the semantic entropy $H_s(\tilde{U})$, the up/down semantic mutual information $I^s(\tilde{X};\tilde{Y})$ ($I_s(\tilde{X};\tilde{Y})$), the semantic channel capacity $C_s=\max_{f_{xy}}\max_{p(x)} I^s(\tilde{X};\tilde{Y})$, and the semantic rate-distortion function $R_s(D)=\min_{\{f_x,f_{\hat{x}}\}}\min_{p(\hat{x}|x):\,\mathbb{E}d_s(\tilde{x},\hat{\tilde{x}})\le D} I_s(\tilde{X};\hat{\tilde{X}})$. Furthermore, we prove three coding theorems of SIT by using random coding and (jointly) typical decoding/encoding, that is, the semantic source coding theorem, the semantic channel coding theorem, and the semantic rate-distortion coding theorem. We find that the limits of SIT are extended by using synonymous mapping, that is, $H_s(\tilde{U})\le H(U)$, $C_s\ge C$, and $R_s(D)\le R(D)$. All these results constitute the basis of semantic information theory. In addition, we discuss the semantic information measures in the continuous case. In particular, for the band-limited Gaussian channel, we obtain a new channel capacity formula, $C_s = B\log\!\left[S^4\left(1+\frac{P}{N_0 B}\right)\right]$, where the average synonymous length $S$ indicates the identification ability of information. In summary, the theoretic framework of SIT proposed in this paper is a natural extension of CIT and may reveal great performance potential for future communication.
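To make the capacity comparison concrete, the following minimal numerical sketch contrasts the classical Shannon capacity $C = B\log_2(1+P/N_0B)$ with the semantic capacity formula $C_s = B\log_2\!\left[S^4(1+P/N_0B)\right]$ stated above. The parameter values ($B$, $P$, $N_0$, $S$) and the base-2 logarithm (bits per second) are illustrative assumptions, not taken from the paper.

```python
import math

def classical_capacity(B, P, N0):
    """Shannon capacity of a band-limited Gaussian channel: C = B*log2(1 + P/(N0*B))."""
    return B * math.log2(1.0 + P / (N0 * B))

def semantic_capacity(B, P, N0, S):
    """Semantic capacity per the abstract's formula: Cs = B*log2(S**4 * (1 + P/(N0*B)))."""
    return B * math.log2(S**4 * (1.0 + P / (N0 * B)))

if __name__ == "__main__":
    # Hypothetical parameters: 1 MHz bandwidth, 1 mW signal power,
    # 1e-9 W/Hz noise PSD, average synonymous length S = 2 (all assumed).
    B, P, N0, S = 1e6, 1e-3, 1e-9, 2.0
    C = classical_capacity(B, P, N0)     # 1.000 Mbit/s (SNR = 1)
    Cs = semantic_capacity(B, P, N0, S)  # 5.000 Mbit/s with S = 2
    print(f"C  = {C / 1e6:.3f} Mbit/s")
    print(f"Cs = {Cs / 1e6:.3f} Mbit/s")
```

Note that with $S = 1$ (no synonymy) the factor $S^4$ disappears and $C_s$ reduces to the classical Shannon capacity, consistent with the claim $C_s \ge C$.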