A bird's-eye view of hardware acceleration for training deep neural networks.
Comparison of PlaidML performance using OpenCL and Apple Metal backends.
A very brief description of Haskell tools I've encountered so far.
Observations I've made while trying to write my first non-trivial Haskell program.
How to define your Keras models in code to allow reuse.
Monitoring GPU (Graphics Processing Unit) utilization for deep learning.
Taking care of Python environments can be hard and the variety of tools overwhelming. But it doesn't have to be!
How trained models can be persisted and reused across libraries and environments.
An overview of the tools and methods I use to organize my personal and work life.
A story of how I taught myself some machine learning, with advice for fellow engineers.
Machine learning is a software-heavy field where software development know-how pays off in many situations. Let's take a look at how applying engineering practices can help build ML products faster, make them more reliable, and keep data scientists happy.