Literary critics say there are three kinds of literature today: one that entertains, one that mimics and confirms the accepted wisdom, and one that challenges that wisdom in both themes and language. Proper literature is the third kind, and it is disappearing because fewer and fewer critics (and readers) nurture the cultural means and the personal attitude needed to approach such challenging texts. In my modest opinion, nothing is more challenging and impactful today than science, which is why the humanities are losing traction and being confined to a museum: all we needed to know from humans about humans, we know now, the unknown lying elsewhere for us as a species and for humanity as a whole. What we know about ourselves, ultimately how to behave so as to avoid extinction, travels with us and may or may not help in approaching alien environments like outer space. It will be the survivors' texts again.
Nothing really useful exists yet to automatise data extraction from academic papers, so the best path is still manual. I can imagine a dictionary becoming standard for each given field and being provided with every future .pdf paper as a companion in the form of a .csv sparse matrix or similar format. Such dictionaries would be put forward by researchers to exchange data in the smoothest way, agreed as a standard after some time and then pushed into adoption worldwide, not least to feed machine learning algorithms. We will get there for sure, some fields before others; some industries, aeronautics and automotive among them, have started already.
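A minimal sketch of what such a companion file might look like, assuming a shared field dictionary of agreed terms and each paper's non-zero term counts stored as (paper_id, term, count) triplets in CSV form. Every name here is a hypothetical illustration, not an existing standard:

```python
import csv
import io
import re

# Hypothetical shared dictionary for a field: the agreed vocabulary
FIELD_DICTIONARY = {"creep", "irradiation", "fracture", "corrosion"}

def sparse_counts(paper_id, text, dictionary):
    """Count dictionary terms in a paper's text, keeping only non-zero entries."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = {term: words.count(term) for term in dictionary if words.count(term)}
    return [(paper_id, term, n) for term, n in sorted(counts.items())]

def write_companion_csv(rows):
    """Serialise the sparse triplets as the companion .csv a paper could ship with."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["paper_id", "term", "count"])
    writer.writerows(rows)
    return buf.getvalue()

rows = sparse_counts("paper-001",
                     "Creep and irradiation drive fracture; creep dominates.",
                     FIELD_DICTIONARY)
print(write_companion_csv(rows))
```

The triplet layout keeps the matrix sparse: a paper mentioning only four of a field's thousands of agreed terms stores four rows, not a full vector.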
I am still cataloguing my archive of 2,500+ papers on nuclear engineering, materials science and Gen IV plants. It seems that about one fifth will be fit for purpose. There are some broad but superficial propositions and many that go deep on specific subjects involving materials: experimental and numerical data, microscopy, critical components. This is my best shot at writing a paper for something like the recent Nuclear Data 2016 Conference in Bruges, the primary conference for the advancement of nuclear data in the interest of both science and technology.
After the Monax.io startup from a previous post here moved actively towards an apps ecosystem, I found Microsoft Azure offering blockchain as a service and NeuVector doing container security, both alive and kicking. Combined, these three pretty much cover the need for an autonomous R&D service, and I am not one for reinventing the wheel. Project off, then: I will have a proper look at industrial applications and IoT instead of software solutions. Pisan friends have been alerted and informal agreements dissolved.
Nuclear materials are mainly subjected to irradiation, creep, fracture and corrosion. The combination of loads puts the integrity of critical components at risk. We do not want radioactive releases, so cracks that may propagate and burst open must be avoided. When components break, they exhibit a characteristic fracture surface from which the type of fracture can be inferred, differing across classes of materials. The French startup Tortoise.io claims that the fracture surface of every material resembles fluid turbulence and has devised an algorithm, sold as a SaaS, that does away with material-specific expertise. We will see.
The world's first bench for testing drones is being developed in Foggia by DPM Elettronica under the name DronesBench. It is at the pre-commercialisation stage and is disseminating its DronesBench Index, an all-in-one parameter to measure the efficiency of drones which may prove useful for certification purposes. An industrial paper will be presented at the I2MTC 2017 Conference in Turin, 22-25 May 2017, and possibly at the Commercial UAV Expo 2017 in Brussels, 20-22 June 2017. See you there!
Toshiba had Westinghouse file for bankruptcy in the US last month, and the ongoing debate is about what this will mean for the nuclear industry worldwide. Westinghouse is (was?) the benchmark for large PWR reactors, the most deployed type, while the new way forward, for financial reasons, is the small modular reactors proposed by competitors and startups. More broadly, some sort of decentralisation of the business now seems to be happening in the West, which perhaps means more hope for small players and outsiders.
I wrote about the Foggia train station contest a while ago, aiming to contribute data science methods towards the social improvement of its nearby quarter. It turns out Trenitalia is planning to make Foggia a lateral hub for its new high-speed system, and a second train station will be needed to accommodate an interchange. What about the main station, then? No need for tinkering any more: just build a park at the opposite entrance and open dedicated social spaces there to attract (and control) unwanted guests from the front quarter. Submitting a project soon.
Deep learning needs GPU hardware, which I do not own yet, so I need a cloud service. The newest solution is this Amazon Machine Image for DL on Ubuntu coupled with p2.xlarge spot instances from the Ireland zone at about $0.20/hr. In the long run, a rig of one's own should contain at least an NVIDIA GTX 1060 6GB, because of NVIDIA's CUDA platform and cuDNN accelerated deep learning library. This user built his for $800 recently, which may be a good starting point; it would be a purchase for professional use.
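As a sanity check on the rent-versus-buy trade-off, a few lines of arithmetic using the prices quoted above (the function name and rounding are mine):

```python
def breakeven_hours(rig_cost_usd, spot_rate_usd_per_hr):
    """Hours of cloud GPU time whose cost equals the up-front cost of a rig."""
    return rig_cost_usd / spot_rate_usd_per_hr

# An $800 rig versus p2.xlarge spot instances at ~$0.20/hr:
hours = breakeven_hours(800, 0.20)
print(f"Break-even after {hours:.0f} GPU-hours")  # roughly 4000 hours
```

At a few hours of training per day, that break-even point sits a year or two out, which is why the spot instances win for now; electricity and the rig's resale value would shift the estimate either way.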
I am grateful for this free, seven-week Practical Deep Learning for Coders online course at fast.ai by Jeremy Howard and Rachel Thomas, which is helping me revise deep learning techniques after too much theory and too many books. It suits Windows-minded Python users like me very well, with AWS cloud GPU hardware instanced from an Anaconda Python platform-and-packages image. They even provide students with video lessons, an internal forum and a GitHub repository with the full Jupyter notebooks! Thank you, really: I will give back with a donation once all the materials are digested and my course is completed.