Approximate Computing

Results are right, right?

Concept

Approximate computing uses imprecise logic to compute approximately correct results.

It is more power-efficient than exact computation. A simple example: a fixed-point multiplication can serve as an approximation of a floating-point multiplication, trading a small loss of precision for lower hardware cost and energy.
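The fixed-point idea above can be sketched in a few lines. This is a minimal illustration (the Q8.8 format and the sample values are my own choices, not from the original text): floats are quantized to integers with 8 fractional bits, multiplied with integer arithmetic, and rescaled, so the result is only approximately equal to the exact product.

```python
FRAC_BITS = 8          # fractional bits in the (assumed) Q8.8 format
SCALE = 1 << FRAC_BITS

def to_fixed(x: float) -> int:
    """Quantize a float to a Q8.8 fixed-point integer."""
    return int(round(x * SCALE))

def fixed_mul(a: int, b: int) -> int:
    """Multiply two Q8.8 numbers. The raw product has 16 fractional
    bits, so shift right to return to Q8.8, dropping low-order bits
    (this truncation is the source of the approximation error)."""
    return (a * b) >> FRAC_BITS

def to_float(x: int) -> float:
    return x / SCALE

x, y = 3.14159, 2.71828
approx = to_float(fixed_mul(to_fixed(x), to_fixed(y)))
exact = x * y
print(f"exact={exact:.5f}  approx={approx:.5f}  error={abs(exact - approx):.5f}")
```

The error here is on the order of the quantization step (2^-8 ≈ 0.004), which is often acceptable in error-tolerant domains such as image processing and machine learning.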

Application

Approximate computing has been applied to error-tolerant workloads such as image processing, machine learning, and deep learning.

Effort

My research on approximate computing includes, but is not limited to, algorithms, circuits, architectures, and applications.
