We already deal with computers too much rather than too little, and plenty of advanced computing is already done in materials science and nanotechnology, for example molecular dynamics (MD) and Monte Carlo simulations.[2] The molecular biologist’s programs for predicting protein folding can also count as nanotechnology. Nevertheless, all of our previous articles* concluded that we need more computing, and several mentioned statistics. This would sound predictable coming from a statistical physicist with a background in computing, advertising his skills. However, we mean more efficient computing rather than simply more.

We started the type of computing we do only recently, and for reasons not yet mentioned: given complex nano-micro compounds, materials characterization is difficult due to the three-dimensional complexity of the structures. We originally integrated image analysis with simulation in order to derive 3D structure from 2D images (SEM) and projections (TEM).[3,4] The most fruitful result, however, was the insight into how easy it is to create adaptable software that analyzes images and keeps track of all the data, calculating anything desired, such as comparisons with numerical simulations, all in one integrated system.[5,6] Many of the previously discussed issues, for example error reporting, are thereby essentially solved automatically!
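
To make this concrete, here is a minimal sketch of such an integrated analysis in Mathematica, our working environment; the image file name "sem.png" and the log-normal parameters of the simulated size distribution are invented for illustration:

    (* measure particle sizes in a micrograph and compare them
       with a simulated size distribution, all in one notebook *)
    img = ColorConvert[Import["sem.png"], "Grayscale"];
    bin = Binarize[img];
    (* equivalent-disk diameters of all connected components, in pixels *)
    diameters = 2 Values[ComponentMeasurements[bin, "EquivalentDiskRadius"]];
    (* a simulated distribution to compare against (assumed parameters) *)
    sizeModel = LogNormalDistribution[3, 0.4];
    Histogram[{diameters, RandomVariate[sizeModel, Length[diameters]]}, Automatic, "PDF"]
    DistributionFitTest[diameters, sizeModel]  (* p-value of the comparison *)

Every number, from the binarization step to the final p-value, stays in one notebook, which is what makes issues like error reporting essentially automatic.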

Adapting software sounds prohibitively difficult: who in my lab can modify software? Nowadays, everybody! Today, programming is done partially graphically, for example with LabView™, where no programming language appears anymore. We work with Mathematica and therefore with programming code, but we mostly just download pieces of code and adapt them playfully until they behave as desired. If this does not count as the ability to program, then we cannot program!
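
For example, a typical downloaded fragment for counting particles can be adapted simply by attaching a slider and watching the result until it behaves as desired; a sketch, again with a hypothetical image file:

    (* tune the binarization threshold by eye; the count updates live *)
    img = ColorConvert[Import["sem.png"], "Grayscale"];
    Manipulate[
     Module[{bin = Binarize[img, t]},
      Column[{bin, Row[{"particles found: ", Max[MorphologicalComponents[bin]]}]}]],
     {{t, 0.5, "threshold"}, 0, 1}]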

However, it can hardly count as an advanced, AI-enhanced future if scientists still need to learn how to code on top of all the increasingly numerous skills we are supposed to have. Our approach is guided by, and consistent with, the view that we participate in evolution as a process of integration, and here this means including the AI available today. We still also mean, as discussed, to learn from bio-medicine. While biologists’ microscopes are often connected to computers equipped with image recognition software that automatically counts differently shaped cells, the nanotech researcher usually establishes a particle-number density or some average diameter in images by eye, which is time consuming and too often yields irreproducible results with strong sample bias. Simply buying biologists’ systems is of limited use in nanotechnology.
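
Automating exactly those two by-eye quantities takes a few lines; a minimal sketch, with the image file and the nm-per-pixel calibration assumed:

    img = ColorConvert[Import["sem.png"], "Grayscale"];
    scale = 2.5; (* nm per pixel, hypothetical calibration *)
    radii = Values[ComponentMeasurements[Binarize[img], "EquivalentDiskRadius"]];
    area = scale^2 (Times @@ ImageDimensions[img]); (* field of view in nm^2 *)
    {Length[radii]/area, 2 scale Mean[radii]} (* number density and mean diameter *)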

Instead, we pursue the inevitable, much more flexible and true integration with AI, and this can mean neither just buying expensive computers and finished programs, nor a “The Matrix”-type nonsense future where scientists strangely still stare at screens full of flashing green code. We mean to approach the real, inevitable near future of the “Star Trek” type, where scientists concentrate on the scientific questions and simply ask the computer to display whatever they desire to have illustrated, say for example the same data in a parameter space with a different set of axes.
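
Even today, such a request is close to a one-liner; a sketch with invented data:

    (* one dataset, re-displayed at will with different axes *)
    pts = RandomVariate[BinormalDistribution[0.8], 500];
    ListPlot[pts]               (* x versus y *)
    ListPlot[Reverse /@ pts]    (* the same points, axes swapped *)
    ListLogLogPlot[Abs /@ pts]  (* the same points, log-log axes *)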

We currently must still “ask” the questions by modifying given code until we see the desired result, but this already feels like asking, because it is not the programming of 20 years ago, when computers were merely faster while everything else had to be fully understood by the programmer. Those who think that this is unscientific and without a future do not understand evolution; indeed, there is an obvious, close similarity to those who think that a super-smart creator was needed in order to come up with the higher animals’ genetic code. Evolution is not about writing the code but about integrating and modifying bits of code and selecting.

Scientific rigor is assured by proper testing (= selection). Evolution is all about getting ahead with what bio-evolutionary scientists call “kludges.” Efficient, naturally selected systems are a complex bunch of kludges, of “dirty little tricks” that happened to survive. As technology evolves, the technology itself becomes like this: not just the complex systems we synthesize, but also the methods we use to analyze them. For example, we have at times mixed the simulation of images and the capabilities of the human visual cortex into microscopy image analysis,[4] thus truly integrating the human capability into exact, computerized analysis in which the reliability of the reported accuracy is ensured.
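
A sketch of that comparison-by-eye idea: the observed particle positions are displayed next to simulations at several trial densities, and the human visual cortex picks the best match (the “observed” points are invented here; the real versions in [3,4] simulate full images):

    panel[pts_, label_] := Graphics[{PointSize[Medium], Point[pts]},
      PlotLabel -> label, Frame -> True, FrameTicks -> None];
    observed = RandomReal[1, {80, 2}]; (* stand-in for measured positions *)
    GraphicsRow[Prepend[
      Table[panel[RandomReal[1, {n, 2}], Row[{"simulated, n = ", n}]],
       {n, {40, 80, 160}}],
      panel[observed, "observed"]]]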

Technology is macro-evolution and includes the integration with AI. There has been as much hype and misunderstanding around AI as around the concept of evolution. The computer helps intelligently today; the “programming” has turned into “asking the computer to do” via graphical user interfaces (GUIs) and interpreted languages such as Mathematica. It has already almost developed into an informal, playful chat with the machine. This is part of the long hyped and misunderstood transition, not some mystic and ill-defined “singularity” as discussed in rather more pseudo-scientific circles. And of course not, because “artificial intelligence” is no more news to evolution than nanotechnology is – the terms “artificial” and “technology” merely reveal our anthropocentric perspective.

Much more could be said about how a somewhat ‘fuzzy,’ evolution-guided approach to computing has general advantages specifically for the needs of modern nanotechnology as we discussed them. However, many issues are more clearly presented with examples from practical research, as provided in the second part of [1].

--------------------
*This article continues the series “Adapting As Nano Approaches Biological Complexity: Witnessing Human-AI Integration Critically”, where the status of all this as suppressed information has already been discussed. I allow myself to actually publish the most interesting and critical parts here (edited). If citing, please nevertheless cite [1] in order to support the author.

Previous posts were “Complexity And Optimization: Lost In Design Space”, “Magic Of Complexity With Catalysts Social Or Metallic”, and “Nanotech: The Most Dangerous Science Least Carefully Done”.

------------------------

[1] S. Vongehr et al.: Adapting Nanotech Research as Nano-Micro Hybrids Approach Biological Complexity. J. Mater. Sci. Technol., 2016, 32(5), 387-401. doi:10.1016/j.jmst.2016.01.003

[2] K. Y. Yun et al.: Monte Carlo simulations of the structure of Pt-based bimetallic nanoparticles. Acta Materialia, 2012, 60, 4908-4916.

[3] S. Vongehr et al.: Microscopy Analysis by visual Comparison with simulated Images. In A. Mendez-Vilas and J. Diaz (Eds.): Microscopy: Science, Technology, Applications and Education, Formatex Microscopy Book Series, 2010, 4, 1565-1571.

[4] S. Vongehr et al.: Quantitative Analysis of Particle Distributions by Comparison with Simulations. Microsc. Microanal., 2011, 17, 61-66.

[5] S. Vongehr et al.: Computer assisted microscopy image analysis for high throughput statistical analysis and 3D structure determination in nanotechnology. In A. Mendez-Vilas (Ed.): Microscopy: Advances in scientific Research and Education, Formatex Microscopy Book Series, 2014, 6, 618-625.

[6] M. Sun et al.: Shape versus porosity: A systematic survey of cobalt oxide nanosheet calcination from 200 to 900 °C. Mater. Lett., 2015, 141, 165-167.