
Building Jarvis, part 3

Yesterday we talked about how the module that interacts with users can be constructed; today I’ll be talking about something meatier and more substantial: what I call the Conduit Module.

To accept open-ended queries that transcend the simplistic search-term-based algorithms that Google and others employ, we must shift towards a completely different, largely unexplored territory of programming, a conventional programmer’s worst nightmare: the self-modifying program.

Most programmers shy away from self-modifying code, as the concept is generally only useful for a few specialized applications such as encryption/obfuscation and viruses, and it carries the potentially disastrous risk of leaving you unable to read or alter your own source code. In Jarvis’ case, however, self-modifying code is required, and it will not be manipulated by humans once trained.
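For anyone who hasn’t bumped into the idea before, here is a tiny toy illustration (in Python rather than the JavaScript of the link below) of a program rewriting one of its own functions at runtime. It is purely conceptual, not how Jarvis itself would be built, and the names in it are made up for the example:

```python
# Toy illustration of self-modifying code: the program stores the source of
# one of its own functions as a string, rewrites that string, and re-compiles
# it while running.
source = "def respond(x):\n    return x + 1\n"

namespace = {}
exec(source, namespace)           # compile the original behaviour
print(namespace["respond"](10))   # -> 11

# The program edits its own "source" and swaps in the new behaviour.
source = source.replace("x + 1", "x * 2")
exec(source, namespace)
print(namespace["respond"](10))   # -> 20
```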

To teach a hierarchical neural network how to program answer sets and the standard functional pieces that make up a program (e.g. for loops with counters, sorting algorithms, etc.), we need to teach the network the basics of programming syntax (the boundary conditions and limits of manipulation) and then train it on a large number of interesting and unique challenges.  In the end we will have a trained neural network that not only knows how to complete a specific task requested of it, but can also take the output of the language module as an input describing its next task and program its way to a solution for a large variety of problems, hence open-ended.
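To give a rough feel for what “programming its way into a solution” means in miniature, here is a self-contained toy sketch of the automatic-programming idea. In Jarvis the trained hierarchical ANN would be the thing proposing candidate programs; in this sketch plain random search stands in for the network so the example runs on its own, and every name in it is illustrative rather than a real Jarvis internal:

```python
# Toy sketch of automatic programming: search over a tiny stack-machine
# instruction set for a program that satisfies input/output challenges.
# A trained network would propose candidates; random search stands in here.
import random

INSTRUCTIONS = ["dup", "add", "mul", "push1", "swap"]

def run(program, x):
    """Execute a tiny stack program on input x; return top of stack or None on error."""
    stack = [x]
    for op in program:
        try:
            if op == "dup":
                stack.append(stack[-1])
            elif op == "add":
                stack.append(stack.pop() + stack.pop())
            elif op == "mul":
                stack.append(stack.pop() * stack.pop())
            elif op == "push1":
                stack.append(1)
            elif op == "swap":
                stack[-1], stack[-2] = stack[-2], stack[-1]
        except IndexError:
            return None
    return stack[-1] if stack else None

def score(program, examples):
    """Fraction of the input/output challenges the candidate program satisfies."""
    return sum(run(program, x) == y for x, y in examples) / len(examples)

def search(examples, length=4, tries=20000):
    """Propose random candidate programs and keep the best one found."""
    best, best_score = None, -1.0
    for _ in range(tries):
        candidate = [random.choice(INSTRUCTIONS) for _ in range(length)]
        s = score(candidate, examples)
        if s > best_score:
            best, best_score = candidate, s
        if best_score == 1.0:
            break
    return best, best_score

# Challenge: learn f(x) = 2x + 1 from examples alone.
examples = [(0, 1), (1, 3), (2, 5), (5, 11)]
program, fitness = search(examples)
print(program, fitness)
```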

(Figure: conduit module)

A process like this is complicated to imagine, let alone create. However, such a module also has the capability of altering the processing of other parts of Jarvis, adapting them to perform better over time via testing and training, which allows another layer to be built on top of the language/conduit modules, one that enables behaviour and quasi-consciousness.

As you can imagine, for a neural-network-based program as complicated as this, supercomputers will need to be used and storage of “genes” will need to be placed in very large data storage locations. Tomorrow I will be discussing the centralized training and decentralized ownership model, and how we can leverage decentralized currencies (bitcoins, altcoins, etc.) to help perform the calculations Jarvis will need in order to learn in a reasonable amount of time.

Obfuscation – http://en.wikipedia.org/wiki/Obfuscation_(software)

Self-modifying code – http://www.i-programmer.info/programming/javascript/989-javascript-jems-self-modifying-code.html

Automatic programming (essentially what the Conduit Module employs) – http://en.wikipedia.org/wiki/Automatic_programming

Artificial Neural Networks (ANNs), for those new to the concept – http://en.wikipedia.org/wiki/Types_of_artificial_neural_networks

 

 


Building Jarvis, part 1

Sorry everyone for the lack of updates; I was able to find work for a few months as a Product Design Engineer. Unfortunately, it looks like the company might not be financially stable enough to pay me regularly, which, as a silver lining, gives me more time to work on my ANN projects.

For the next couple of weeks I will be describing a major project that my colleague @Brandon.Benvie and I have been working on and keeping under wraps in preparation for working on it full time, as it has the potential to revolutionize the internet in a way the likes of which has never been seen before.

This ANN project, named Jarvis, will be a fully functional companion AI whose computational and storage requirements will be distributed inside a digital currency similar to the Bitcoin Protocol, with some significant tweaks to enable custom kernel code computations.

I’ve also added some light reading for anyone interested in freshening up before we get into the technical details of exactly how such a system will be constructed. I promise this will be a hell of a ride.

Bitcoin Protocol – https://bitcoin.org/bitcoin.pdf

Ethereum – https://www.ethereum.org/

Companion AI systems – http://www.qrg.northwestern.edu/papers/files/fs104kforbus.pdf

 

 

Predicting material composition via Non-Destructive Testing and Neural Networks

Material selection can at times be very tricky: sometimes you want certain aspects of a material, such as high strength, but you don’t want the low ductility that generally trends along with it.
(For information on the techniques discussed in this article, please look at the wiki links below.)

To solve this problem, materials scientists and engineers develop alloys and composite materials with particular characteristics that are well suited to a particular application; however, this process is very time consuming and could take months, if not years, to completely map the strengths and weaknesses of a new alloy or composite design.

There are ways around this time sink. One of them is the following: instead of empirically measuring the resultant properties of a material for each change to the material’s chemistry or micro-structure, it may be possible to predict the resultant structure using non-destructive analysis methods and neural networks.

The first step with any neural network is to prepare the training data so the system can adapt and successfully learn the underlying natural laws that govern the kinetics. Input data would be generated from Electrochemical Impedance Spectroscopy (EIS) and/or Ultrasonic Testing (UT) of a sample training material, and the output data would come from slicing the specimen into very thin (~10 micron) plates, which would then be analysed with EDS to determine the real material parameters.
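As a rough sketch of what preparing that training set could look like, here is a minimal example in Python. It assumes each specimen’s EIS/UT scan has already been reduced to a fixed-length feature vector and that the EDS analysis of the sectioned plates yields a composition vector (e.g. weight fractions); the array shapes, the normalisation step, and the stand-in random data are all assumptions for illustration:

```python
# Minimal sketch of pairing non-destructive (EIS/UT) inputs with destructive
# (EDS) ground truth. Real data would be loaded from the instruments; random
# stand-ins keep the example self-contained.
import numpy as np

rng = np.random.default_rng(0)
n_specimens, n_ndt_features, n_elements = 200, 128, 5

X = rng.random((n_specimens, n_ndt_features))         # EIS/UT features per specimen
Y = rng.dirichlet(np.ones(n_elements), n_specimens)   # EDS weight fractions per specimen

# Normalise the inputs so the network trains on comparable scales.
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Hold out some specimens to check the trained network against unseen data.
split = int(0.8 * n_specimens)
X_train, X_test = X[:split], X[split:]
Y_train, Y_test = Y[:split], Y[split:]
```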

The data would then be fed into a neural network designed to handle 3D input parameters (i.e. it would have to have at least two hidden neuron layers), and a trained neural network will be generated. This network will then be able to take NDT data from a new material and generate a 3-D Finite Element Model of what it predicts the composition and micro-structure to be.
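Continuing from the data-preparation sketch above, here is what the predictive network itself could look like, using scikit-learn’s MLPRegressor as a stand-in with two hidden layers. The layer sizes and other hyperparameters are illustrative assumptions rather than tuned values, and building the 3-D finite element model from the predicted composition is left out:

```python
# Sketch of the predictive network: NDT features in, predicted composition out.
# X_train, Y_train, X_test, Y_test come from the data-preparation sketch above.
from sklearn.neural_network import MLPRegressor

model = MLPRegressor(hidden_layer_sizes=(64, 64),   # two hidden layers
                     max_iter=2000, random_state=0)
model.fit(X_train, Y_train)

# Predict the composition of held-out (or freshly scanned) specimens.
Y_pred = model.predict(X_test)
print("mean absolute error per element:", abs(Y_pred - Y_test).mean(axis=0))
```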

This process can be used alongside existing NDT QA techniques, but it can also be used as an initial step with my previous article (see https://learningann.wordpress.com/2014/03/10/how-machine-learning-unlocks-the-secrets-of-materials-science/) to create a material optimisation program.

Electrochemical Impedance Spectroscopy (EIS) – http://en.wikipedia.org/wiki/Dielectric_spectroscopy

Ultrasonic Testing (UT) – http://en.wikipedia.org/wiki/Ultrasonic_testing

Non-Destructive Testing (NDT) – http://en.wikipedia.org/wiki/Non_Destructive_Testing

Energy-Dispersive X-ray Spectroscopy (EDS) – http://en.wikipedia.org/wiki/Energy-dispersive_X-ray_spectroscopy

I hope this article was informative and helpful. Follow me if you like what I write about, and if you want to see more articles like this one, please like the post!