Hardware accelerators for AI-driven healthcare
We published our perspective on integrating neural-network accelerators into clinical and outpatient workflows in IEEE Transactions on Biomedical Circuits and Systems.
Neural nets are extremely resource-intensive. Training GPT-3, for example, is estimated to have consumed on the order of 1,300 MWh of electricity, roughly the annual usage of more than a hundred US households. For continuous, portable patient monitoring, that kind of power budget simply isn't an option.
The paper examines the hardware constraints that currently limit deep learning in healthcare, and assesses how neuromorphic computing, spiking neural nets, and in-memory computing could alleviate them.
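To give a flavor of why spiking approaches can be so frugal, here is a minimal, illustrative sketch (not from the paper) of a leaky integrate-and-fire neuron, the basic unit of a spiking neural net. The parameter values (`v_thresh`, `leak`, `w`) are arbitrary choices for demonstration. The key point is that computation is event-driven: work, and therefore energy, is spent mainly when input spikes arrive, rather than on dense multiply-accumulates every timestep.

```python
def lif_neuron(input_spikes, v_thresh=1.0, leak=0.9, w=0.4):
    """Simulate one leaky integrate-and-fire neuron over a binary
    input spike train. Returns the timesteps at which it fired."""
    v = 0.0                     # membrane potential
    fired = []
    for t, s in enumerate(input_spikes):
        v = leak * v + w * s    # potential leaks toward 0, integrates input
        if v >= v_thresh:       # threshold crossed: emit an output spike
            fired.append(t)
            v = 0.0             # reset after firing
    return fired

# Bursts of input spikes push the potential over threshold;
# isolated spikes simply leak away without triggering output.
print(lif_neuron([1, 1, 1, 1, 0, 0, 1, 0, 1, 1]))
```

Dedicated neuromorphic hardware exploits exactly this sparsity: a silent neuron costs almost nothing, which is what makes always-on physiological monitoring plausible on a wearable power budget.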