Deep Learning needs GPU hardware, which I do not own yet, so I need a cloud service. The newest solution is an Amazon Machine Image for deep learning on Ubuntu, coupled with p2.xlarge spot instances in the Ireland zone at about $0.20/hr. In the long run, a rig of my own should contain at least an NVIDIA GTX 1060 6GB, thanks to NVIDIA's CUDA platform and the cuDNN-accelerated deep learning libraries. One user recently built his for $800, which may be a good starting point; it would be a purchase for professional use.
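As a sanity check on cloud rental vs. buying hardware, a quick break-even estimate using the figures above (the weekly utilisation is my own guess, purely illustrative):

```python
# Break-even between renting a p2.xlarge spot instance and building a rig.
SPOT_PRICE_PER_HR = 0.20   # Ireland-zone spot price quoted above, $/hr
RIG_COST = 800.0           # cost of the $800 DIY build mentioned above

break_even_hours = RIG_COST / SPOT_PRICE_PER_HR
print(f"Break-even at {break_even_hours:.0f} GPU-hours")

# At, say, 20 hours of training per week (a guess), how many weeks?
hours_per_week = 20
print(f"About {break_even_hours / hours_per_week:.0f} weeks at {hours_per_week} h/week")
```

So the rig pays for itself only after roughly 4000 GPU-hours; until then, spot instances win on cost.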
I am grateful for the free, seven-week Practical Deep Learning for Coders online course at fast.ai by Jeremy Howard and Rachel Thomas, which is helping me revise deep learning techniques after too much theory and too many books. It suits Windows-minded Python users like me very well: the GPU comes from AWS cloud hardware, instanced from an image with the Anaconda Python platform and packages. They even provide students with video lessons, an internal forum and a GitHub repository with the full Jupyter notebooks! Thank you, really; I will give back with a donation once all the materials are digested and my course is completed.
Some still ask me about the poetry I practiced as a youngster. Little interest nowadays, mainly because I no longer believe in its social importance, but it still holds as a private matter. Are you interested in personal matters? I hope not; not your business anyway. I can add, therefore, that I am following some protocols for languages, technical languages, as a way to understand whether invariants exist. Prof. Fabb wrote Meter in Poetry, still a valuable handbook to me.
The latest software concept for modularity and scalability is the container, a la Docker. If you couple containers with blockchain access, you have a pretty secure environment for doing what you need to do: transactions, exchanging messages, etc. One ecosystem trying to enter this market is Monax.io. I have an interesting upgrade on this concept and will work on a prototype together with a couple of old friends based in Pisa. MVP by this summer.
All tech troubles for innovative nuclear come from current materials not being good enough to sustain the requirements, aka bear the expectations. Is there a way to accelerate materials science research, then? Can recent computational advancements really not help? While browsing the Angel.co database, Exabyte.io emerged as a materials discovery cloud. Will their approach do the trick? My opinion is that you first need to know where you wish to find novelty, starting from local domain applications; otherwise it would be a shot in the dark: applied research vs blue-sky research.
I am trying to get involved with some startup for R&D on Angel.co and browsed the site over the weekend. It is still good value, but a lot of recent dross makes it almost unrecognisable and substantially flooded. Being there is no longer a signal of quality or global reach, just another web place to spam from anywhere. That said, surfing was fun because I picked up some trends, so thank you and see you again in a few months.
Eventful March 2017 in the US nuclear industry: -1- the NuScale SMR, a PWR, is now officially under NRC licensing review; -2- X-Energy's Xe-100 HTGR moves to the conceptual design phase; -3- the mPower consortium halted its integral SMR PWR project for lack of traction. These add to -4- the much-hyped startup Transatomic backtracking on its key promises last month and -5- Westinghouse being in trouble because of its parent Toshiba's financial issues. Wow! Who says nuclear is boring over there?
Will go to MECSPE Parma on Saturday, and I am having a look at the websites of the nearly 2000 exhibitors to prepare introductory e-mails about external R&D collaboration, where opportune. Apart from the industrial buildings, very often paraded with pride, one useful piece of info is that many shops in the mechanical field use Dassault SolidWorks as their 3D software package. A single-seat license is quite expensive to sustain without a steady stream of work, but the free, open-source FreeCAD is not ready at a commercial level yet.
The most comprehensive resource I have found on blockchain for the industrial IoT is a late-2016 paper by Bahga and Madisetti, which implements an Ethereum application for the manufacturing supply chain on top of a cloud-based model. The problem here is adoption, with big corporations like IBM working on their own versions. This solution may well stay on paper and die, but the authors are university-funded, so that is fine. For indie devs, on the other hand, the point is getting paid some fraction of gas / cryptocurrency for providing the apps, and then converting it into real money. While B2B is moving towards private networks of ledgers, B2C may stay more open, a la Bitcoin, and I want to do something here.
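The core idea behind such a supply-chain ledger is simple enough to sketch. This is not the paper's Ethereum implementation, just a toy hash-linked chain in plain Python (the "valve-42" events are made up); each block commits to the previous one, so tampering with a recorded event breaks every later link:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class Ledger:
    """Append-only, hash-linked list of supply-chain events (toy model)."""
    def __init__(self):
        genesis = {"index": 0, "event": "genesis", "prev": "0" * 64}
        self.chain = [genesis]

    def append(self, event):
        block = {
            "index": len(self.chain),
            "event": event,                      # e.g. a shipment record
            "prev": block_hash(self.chain[-1]),  # link to the previous block
        }
        self.chain.append(block)
        return block

    def verify(self):
        """Recompute links; tampering with any non-tail block breaks the chain."""
        return all(
            self.chain[i]["prev"] == block_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )

ledger = Ledger()
ledger.append({"part": "valve-42", "step": "machined"})
ledger.append({"part": "valve-42", "step": "shipped"})
print(ledger.verify())  # True

ledger.chain[1]["event"]["step"] = "scrapped"  # tamper with history
print(ledger.verify())  # False
```

What a real blockchain adds on top of this linking trick is distributed consensus on which chain is valid; that is where the gas fees and the private-vs-public network question come in.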
Can you treat deep learning as a black box? It is not advisable, but it is perfectly doable if you use Keras 2, released three days ago and now better integrated with Google's TensorFlow (besides the much simpler Theano). How to get it, then? Install Anaconda using this tutorial and you are good to go; Anaconda is the reference distribution for many Windows users, me included. What to do with deep learning in the industrial or IoT field? Told you that already, and more to come, ehehe.
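The black-box style looks like this: stack layers, compile, fit, predict, never touching the backend. A minimal sketch, assuming a working Keras install; the layer sizes and the toy data are made up for illustration:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Input

# Toy data: 200 samples, 8 features, binary labels (made up for illustration).
rng = np.random.RandomState(0)
X = rng.rand(200, 8).astype("float32")
y = (X.sum(axis=1) > 4).astype("float32")

# Black-box usage: no backend details, just the high-level Keras API.
model = Sequential([
    Input(shape=(8,)),
    Dense(16, activation="relu"),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

preds = model.predict(X, verbose=0)
print(preds.shape)  # (200, 1)
```

Whether TensorFlow or Theano does the work underneath is invisible here, which is exactly the point (and exactly why it is not advisable as your only mode of working).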