Binary neural networks (BNNs) have hardware-friendly characteristics, yet to preserve computational accuracy, their input layer still requires floating-point or fixed-point arithmetic, which increases hardware overhead. To address this issue, this paper applies stochastic computing, another hardware-friendly method, to BNNs, enabling efficient computation in the BNN input layer and proposing a binary stochastic computing neural network (Bi-SCNN) architecture. First, high-precision stochastic computing units are used in the BNN input layer, achieving accuracy comparable to fixed-point computation. Second, by reusing random number generators within and between processing elements (PEs) and optimizing the computing units, Bi-SCNN effectively reduces hardware overhead. Finally, the weight configuration scheme is optimized according to the characteristics of the input data, reducing overall computational latency. Compared with the best-performing existing BNN accelerators, Bi-SCNN achieves 2.4× higher throughput, 12.6× higher energy efficiency, and 2.2× higher area efficiency, reaching 2.2 TOPS, 7.3 TOPS·W⁻¹, and 1.8 TOPS·mm⁻² respectively.
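
To illustrate the kind of computation the input-layer stochastic computing units perform, the sketch below simulates standard bipolar stochastic computing in Python: a multi-bit activation and a binary weight are each encoded as random bitstreams, multiplied with a bitwise XNOR, and decoded by averaging. This is a generic, assumed illustration of the technique, not the paper's actual Bi-SCNN circuit; stream length, encoding, and the RNG-sharing scheme in the real design differ.

```python
import numpy as np

def to_bipolar_stream(x, length, rng):
    """Encode x in [-1, 1] as a bipolar stochastic bitstream:
    each bit is 1 with probability (x + 1) / 2."""
    return (rng.random(length) < (x + 1) / 2).astype(np.uint8)

def from_bipolar_stream(bits):
    """Decode a bipolar bitstream back to a value in [-1, 1]."""
    return 2 * bits.mean() - 1

def sc_multiply(stream_a, stream_b):
    """Bipolar SC multiplication reduces to a bitwise XNOR of the streams."""
    return np.logical_not(np.logical_xor(stream_a, stream_b)).astype(np.uint8)

rng = np.random.default_rng(0)
length = 4096          # longer streams give higher precision
x, w = 0.6, -1.0       # multi-bit input activation, binary weight (+1 / -1)

# Uncorrelated streams need independent random sequences; sharing random
# number generators across PEs (as in Bi-SCNN) cuts this hardware cost.
sx = to_bipolar_stream(x, length, rng)
sw = to_bipolar_stream(w, length, rng)

approx = from_bipolar_stream(sc_multiply(sx, sw))
print(f"exact {x * w:+.3f}  stochastic {approx:+.3f}")
```

In hardware, each multiplication is thus a single XNOR gate driven by comparator-based stream generators, which is why reusing the random number sources dominates the area and energy savings described above.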