
BitLogic: A Training Framework for Gradient-Based FPGA-Native Neural Networks

Simon Bührer, et al. · TMLR (under review) · February 2026

Abstract

BitLogic is an end-to-end framework for training FPGA-native neural networks that replaces multiply-accumulate (MAC) operations with differentiable lookup tables (LUTs). It provides modular PyTorch APIs, hardware-aware components, and automated RTL export with bit-accurate equivalence between the trained model and the generated hardware. On CIFAR-10 it achieves 72.3% accuracy using LUT resources only, with fewer than 0.3M LUTs and an inference latency under 20 ns.
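The core idea, training lookup tables by gradient descent, can be sketched in plain PyTorch. The layer below is a hypothetical illustration, not the paper's actual API: it relaxes a k-input truth table into a learnable vector of logits and computes the expected LUT output under independent soft bits, so gradients flow to every table entry. The class name `DiffLUT` and all parameter choices are assumptions for illustration.

```python
import torch
import torch.nn as nn


class DiffLUT(nn.Module):
    """Sketch of a differentiable k-input lookup table (illustrative, not the paper's API).

    Inputs are relaxed bits in [0, 1]; the output is the expected truth-table
    value under independent Bernoulli bits, which is differentiable in both
    the inputs and the table logits.
    """

    def __init__(self, k: int = 4):
        super().__init__()
        self.k = k
        # One learnable logit per truth-table row (2^k rows for a k-input LUT).
        self.table = nn.Parameter(torch.randn(2 ** k))
        # All 2^k binary input patterns, shape (2^k, k).
        patterns = torch.tensor(
            [[(i >> j) & 1 for j in range(k)] for i in range(2 ** k)],
            dtype=torch.float32,
        )
        self.register_buffer("patterns", patterns)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, k) relaxed bits in [0, 1].
        x = x.unsqueeze(1)              # (batch, 1, k)
        p = self.patterns.unsqueeze(0)  # (1, 2^k, k)
        # Probability of each binary pattern under independent relaxed bits.
        match = x * p + (1 - x) * (1 - p)   # (batch, 2^k, k)
        prob = match.prod(dim=-1)           # (batch, 2^k)
        # Expected table output; sigmoid keeps entries in [0, 1].
        return prob @ torch.sigmoid(self.table)  # (batch,)
```

After training, each table can be hardened by thresholding `sigmoid(table)` at 0.5, yielding a binary truth table that maps directly onto one physical FPGA LUT; this is the step that makes bit-accurate RTL export plausible.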

Tags

  • FPGA-Native ML
  • Hardware-Aware Training
  • Differentiable Logic
  • RTL HDL Generation
  • PyTorch
  • Low-Latency Inference