Robotics with machine learning / deep learning may be the next big thing from a venture capital perspective. Incidentally, it is also fairly complete and affordable as an indie R&D project, modular and scalable, so let’s get in! A couple of nice projects to run some tests on are the RobotArm from ftobler and the mBot from Makeblock. Since I am more focused on Python environments and computer vision / deep learning for the arm, though, something like the semi-professional Dorna, now on Kickstarter at under €1000, might be a better fit.
“Deep neural networks (DNNs) are currently widely used for many artificial intelligence (AI) applications including computer vision, speech recognition, and robotics. While DNNs deliver state-of-the-art accuracy on many AI tasks, it comes at the cost of high computational complexity. Accordingly, techniques that enable efficient processing of DNNs to improve energy efficiency and throughput without sacrificing application accuracy or increasing hardware cost are critical to the wide deployment of DNNs in AI systems. This article aims to provide a comprehensive tutorial and survey about the recent advances towards the goal of enabling efficient processing of DNNs.” Vivienne Sze, Yu-Hsin Chen, Tien-Ju Yang, Joel Emer. Full article: arXiv:1703.09039 (cs).
A new book by Francois Chollet, the creator of Keras, is due out from Manning in Jan 2018 under the title Deep Learning with Python. It presents state-of-the-art neural network architectures and implementations for supervised learning, relying on the TensorFlow library as the main backend engine of his own excellent Keras wrapper. The ebook is still a work in progress but essentially complete, and already available in pre-release at a discount; the coupon code is in the free materials on the Manning webpage itself. Short tip from the late-2017 forefront: use gradient boosting machines for shallow learning and keep deep learning for perceptual problems.
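The shallow-learning half of that tip is easy to demystify: gradient boosting fits each new weak learner, here a depth-1 decision stump, to the residuals of the current ensemble, which for squared error are exactly the negative gradient. Below is a toy from-scratch sketch of the idea on 1-D regression; all function names are my own, and for real tabular work one would of course reach for a proper library (XGBoost, LightGBM, or scikit-learn) rather than this.

```python
import numpy as np

def fit_stump(x, r):
    """Exhaustive search for the depth-1 split that best fits residual r."""
    best = (np.inf, 0.0, 0.0, 0.0)
    for t in np.unique(x)[:-1]:          # skip the max so neither side is empty
        left, right = r[x <= t], r[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    return best[1:]                      # (threshold, left_value, right_value)

def fit_gbm(x, y, n_rounds=100, lr=0.1):
    """Boost stumps on the residuals (the negative gradient of squared error)."""
    base = y.mean()
    pred = np.full(len(y), base)
    stumps = []
    for _ in range(n_rounds):
        t, lv, rv = fit_stump(x, y - pred)   # fit the current residual
        pred = pred + lr * np.where(x <= t, lv, rv)
        stumps.append((t, lv, rv))
    return base, lr, stumps

def predict(model, x):
    """Sum the base value and the shrunken contribution of every stump."""
    base, lr, stumps = model
    out = np.full(len(x), base)
    for t, lv, rv in stumps:
        out = out + lr * np.where(x <= t, lv, rv)
    return out
```

On a step-shaped target, a hundred boosting rounds drive the training error essentially to zero; the shrinkage factor `lr` is what makes each stump a small, safe correction.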
Even though PyTorch for deep learning does not support Windows natively, an unofficial version is being tracked in issue #494 on their GitHub repo. After a month away from my laptop, I may well need an update, provided it does not break my hard-earned stable Anaconda setup on Python 3.5 and its packages. PyTorch is the forefront these days, but I am not convinced feature engineering is past its best, so a little technological debt may save my stable install.
I was a fan of the Julia language a couple of years ago; I could see distributed computing and data science as natural targets for that kind of semantics. That said, I have struggled a lot to keep up with it, mostly because “Language level is what developers care about, but the majority of programmers are not developers.” and I am indeed not a developer. That is why I am now a full Python adept, and very happily so.
1000+ jobs from the latest “Who is hiring (Oct 2017)” thread on Hacker News give a read on computer science trends in Silicon Valley and the US: demand for developers is very strong again, less industry-specific focus is required, and data science seems to have gone bust. Two reasons: the top data science markets are already saturated, and specialists are more in demand than generalists because of their domain knowledge. All the rest is infrastructure and software to support a business, therefore proper computer science.
Computer science research moves so fast these days that many publications are no longer peer reviewed, partly because publishing has a cost that is too often unsustainable. Here comes arXiv.org, the really hot open-access preprint repository supported by Cornell University Library. Of course, much of it is dross or genuinely impossible to assess, peer review being slow and painful if you try something audacious. That said, deep learning is worth a dig into the magma, and here is a real recent gem: A Brief Survey of Deep Reinforcement Learning from Imperial College London, UK, covering both algorithms and implementations.
October in Turin with two nuclear projects: novel materials from data science methods (hard) and molten salt reactor engineering design (ok).
The Python environment for deep learning has a new golden boy in the PyTorch package. It seems faster and less convoluted than TensorFlow, as well as more scalable. At this stage it works on Linux and macOS only, so I am going to need a dual boot on my Win 10 laptop again, as I did for Salome Meca 2017 for nuclear engineering design. Good!
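Much of PyTorch’s appeal is its define-by-run autograd: each operation records how to backpropagate as it executes, so the graph is built on the fly by ordinary Python control flow instead of a precompiled TensorFlow-style static graph. Here is a toy, pure-Python sketch of that idea for scalars only, nothing like PyTorch’s actual implementation:

```python
class Var:
    """A scalar that records its computation so gradients can flow back."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents            # pairs of (parent, local derivative)

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, grad=1.0):
        # naive recursive chain rule; fine for small expression graphs
        self.grad += grad
        for parent, local in self.parents:
            parent.backward(grad * local)
```

For z = x * y + x with x = 2 and y = 3, calling z.backward() accumulates x.grad = 4 and y.grad = 2, and the “graph” was nothing more than the objects the expression created as it ran.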
I am reconsidering my commitment to GitHub for showcasing my data science and deep learning code. In fact, I will not put anything into my account there for the time being. I am neither a public persona living off conferences nor a developer looking for a job, so GitHub is worth very little to me. More fundamentally, I keep hearing stories about GitHub accounts being raided by unscrupulous firms that steal the code and resell it under their own copyright without consequence. I prefer to keep my work private, then, releasing only reproducible results in a formal manner.