
Building Jarvis, part 3

Yesterday we talked about how the module that interacts with users can be constructed; today I’ll be talking about something meatier and more substantial, which I call the Conduit Module.

To accept open-ended queries that transcend the simplistic search-term-based algorithms that Google and others employ, we must shift towards a completely different and largely unexplored territory of programming, a conventional programmer’s worst nightmare: the self-modifying program.

Most programmers shy away from self-modifying code, as the concept is generally only useful for a few specialized applications such as encryption/obfuscation and viruses, and it carries the potentially disastrous risk of leaving you unable to read or alter your own source code. In Jarvis’ case, however, self-modifying code is required, and it will not be manipulated by humans once trained.

To teach a hierarchical neural network how to program answer sets and the standard functional pieces that make up a program (i.e. for loops with counters, sorting algorithms, etc.), we need to teach the network the basics of programming syntax (boundary conditions and limits of manipulation) and then train it by giving it a large number of interesting and unique challenges. In the end we will have a trained neural network that not only understands how to complete a specific task requested of it, but can also take the output of the language module as an input telling it what its next task could be, and program its way to a solution for a large variety of problems, hence open-ended.

(Figure: conduit module)
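
To make that training loop a little more concrete, here is a minimal sketch of the scoring half of it: candidate “programs” (represented as plain C++ callables purely for illustration) are run against a set of training challenges and ranked by how many they solve. The Challenge struct, the toy candidates, and the fitness function below are all my own placeholders, not Jarvis code.

#include <algorithm>
#include <functional>
#include <iostream>
#include <string>
#include <utility>
#include <vector>

// A training challenge: an input and the output we expect a candidate program to produce.
struct Challenge {
    int input;
    int expected;
};

// A candidate "program". In a real automatic-programming setup this would be an
// evolved instruction sequence or network-generated code; a callable stands in here.
using Candidate = std::function<int(int)>;

// Score a candidate by counting how many challenges it solves.
int fitness(const Candidate& prog, const std::vector<Challenge>& challenges) {
    int solved = 0;
    for (const auto& c : challenges)
        if (prog(c.input) == c.expected) ++solved;
    return solved;
}

int main() {
    // Hypothetical challenge set: "double the input".
    std::vector<Challenge> challenges = {{1, 2}, {3, 6}, {10, 20}, {-4, -8}};

    // Toy candidate pool; the real training loop would generate and mutate these.
    std::vector<std::pair<std::string, Candidate>> pool = {
        {"adds one", [](int x) { return x + 1; }},
        {"doubles",  [](int x) { return x * 2; }},
        {"squares",  [](int x) { return x * x; }},
    };

    // Rank candidates by fitness and report the best one.
    std::sort(pool.begin(), pool.end(), [&](const auto& a, const auto& b) {
        return fitness(a.second, challenges) > fitness(b.second, challenges);
    });
    std::cout << "best candidate: " << pool.front().first << " ("
              << fitness(pool.front().second, challenges) << "/"
              << challenges.size() << " challenges solved)\n";
}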


A process like this is complicated to imagine, let alone create; however, such a module also has the capability of altering the processing of other parts of Jarvis, adapting them to perform better over time via testing and training, and enabling another layer to be built on top of the language/conduit modules that provides behaviour and quasi-consciousness.

As you can imagine, for a neural-network-based program as complicated as this, supercomputers will need to be used and the “genes” will need to be placed in very large data storage locations. Tomorrow I will be discussing the centralized training and decentralized ownership model, and how we can leverage decentralized currencies (bitcoins, altcoins, etc.) to help perform the calculations Jarvis will need in order to learn in a reasonable amount of time.

Obfuscation  – http://en.wikipedia.org/wiki/Obfuscation_(software)

self-modifying code – http://www.i-programmer.info/programming/javascript/989-javascript-jems-self-modifying-code.html

Automatic Programming (Essentially what the Conduit Module employs) – http://en.wikipedia.org/wiki/Automatic_programming

more information on ANNs (Artificial Neural Networks) for those new to the concept – http://en.wikipedia.org/wiki/Types_of_artificial_neural_networks


Building Jarvis, part 2

An artificial intelligence created from neural networks must be multi-layered; this kind of structure is considered a “Hierarchical Neural Network” and is required because the information from lower-level functions needs to feed into the higher-level abstract functions, much like how the human brain operates.

Any AI that plans on interfacing with humans will need to have one fundamental module, a language interpreter.

(Figure: query front end net)

Starting from the beginning, we see the multiple layers of neural networks that must feed into each other to complete the open-ended query/solution tasks that Jarvis will need to be adept at solving. Word parsing is by far the easiest of these steps: it is a simple English (or equivalent) word-parsing module that is trained via dictionary/thesaurus data and by reading the responses to strings on webpages (forums are a great resource for natural-language datasets). By training the network on data mined from these resources, we can program a network that fully understands not only words and grammar but entire sentence structures and query/answer exchanges in the natural language of conversation.
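
As a very rough illustration of the first stage of that pipeline, here is a sketch of how mined text (forum threads, dictionary entries, and so on) could be split into word tokens and mapped to the integer IDs a network actually consumes. The corpus string and the encoding scheme are placeholders of my own; a real setup would obviously work on far more data and feed a proper language network.

#include <cctype>
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

// Split raw text into lowercase word tokens, dropping punctuation.
std::vector<std::string> tokenize(const std::string& text) {
    std::vector<std::string> tokens;
    std::string word;
    for (char ch : text) {
        if (std::isalpha(static_cast<unsigned char>(ch))) {
            word += static_cast<char>(std::tolower(static_cast<unsigned char>(ch)));
        } else if (!word.empty()) {
            tokens.push_back(word);
            word.clear();
        }
    }
    if (!word.empty()) tokens.push_back(word);
    return tokens;
}

int main() {
    // Placeholder "mined" sentence standing in for scraped forum/dictionary text.
    std::string corpus = "How do I waterproof a cable? Silicone sealant works well.";

    // Build a vocabulary: each new word gets the next integer ID.
    std::unordered_map<std::string, int> vocab;
    std::vector<int> encoded;
    for (const auto& tok : tokenize(corpus)) {
        auto it = vocab.find(tok);
        if (it == vocab.end()) it = vocab.emplace(tok, static_cast<int>(vocab.size())).first;
        encoded.push_back(it->second);
    }

    // These IDs are what a language network would take as input.
    for (int id : encoded) std::cout << id << ' ';
    std::cout << "\nvocabulary size: " << vocab.size() << '\n';
}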

Once the language is interpreted and the program understands what you’re asking right now, it may go back and look into previous queries to uncover patterns in your questions. For example, if you were looking for a waterproofing device to fit around a cable, and your previous question was about silicone sealants, Jarvis might ask in response “Are you looking for other waterproofing techniques as well?”, and by going back and forth multiple times you will end up with a solution you didn’t even know you were asking for.
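
A trained network would pick up these associations on its own, but as a minimal sketch of the mechanism, assuming a hand-written keyword-to-topic map stands in for what the network would learn: the session keeps previous queries around and, when the current query shares a topic with an earlier one, offers a follow-up question.

#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

// Hand-written topic associations standing in for what a trained network would learn.
const std::unordered_map<std::string, std::string> kTopicOf = {
    {"silicone", "waterproofing"},
    {"sealant",  "waterproofing"},
    {"waterproofing", "waterproofing"},
    {"grommet",  "waterproofing"},
};

// Return the topic of a query, if any known keyword appears in it.
std::string topicOf(const std::string& query) {
    for (const auto& [keyword, topic] : kTopicOf)
        if (query.find(keyword) != std::string::npos) return topic;
    return "";
}

int main() {
    std::vector<std::string> history;  // previous queries in this session
    std::vector<std::string> session = {
        "what silicone sealant should I use outdoors",
        "is there a waterproofing device that fits around a cable",
    };

    for (const auto& query : session) {
        std::string topic = topicOf(query);
        // If an earlier query shares this topic, propose a follow-up.
        for (const auto& past : history) {
            if (!topic.empty() && topicOf(past) == topic) {
                std::cout << "Are you looking for other " << topic
                          << " techniques as well?\n";
                break;
            }
        }
        history.push_back(query);
    }
}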


Tomorrow I will talk about how such an open-ended solver could be used not only for data retrieval but also to actually act upon something. Eventually I will be able to say “Jarvis, please install yourself onto my USB drive” and in response Jarvis will say “Sure James, do you need me to call the movers? They seem to be running an hour late.”

Building Jarvis, part 1

Sorry everyone for the lack of updates; I was able to find work for a few months as a Product Design Engineer. Unfortunately it looks like the company might not be financially stable enough to pay me regularly, which, as a silver lining, gives me more time to work on my ANN projects.

For the next couple of weeks I will be describing a major project that my colleague @Brandon.Benvie and I have been working on and keeping under wraps in preparation for working on it full time, as it has the potential to completely revolutionize the internet in a way the likes of which has never been seen before.

This ANN project, named Jarvis, will be a fully functional companion AI whose computational and storage requirements are distributed inside a digital currency similar to the Bitcoin Protocol, with some significant tweaks to enable custom kernel code computations.

I’ve also added some light reading for anyone interested in brushing up before we get into the technical details of exactly how such a system will be constructed. I promise this will be a hell of a ride.

Bitcoin Protocol – https://bitcoin.org/bitcoin.pdf

Ethereum – https://www.ethereum.org/

Companion AI systems – http://www.qrg.northwestern.edu/papers/files/fs104kforbus.pdf


I finished it!

I’ve just finished testing my foundational neural network function in OpenCL C and can confirm that it works! As you can see above, my run goes through its selection process in batches of 7000 individuals each (due to scalability constraints), selects the best one from each batch, and then holds a “battle royale” of the top performers.

The results you see are based on my training data of the XOR table (i.e. inputs 0,0 = 0; 0,1 = 1; 1,0 = 1; 1,1 = 0), and the result for 0,0 is displayed so it can be compared against its fitness data.
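
For anyone curious what that batch-then-battle-royale selection looks like in outline, here is a plain C++ sketch (no OpenCL, random search only, tiny 2-2-1 networks) that scores individuals against the XOR table, keeps the best of each batch, and then pits the batch winners against each other. It illustrates the selection scheme, not my actual kernel code.

#include <array>
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

// A tiny 2-2-1 feedforward network with sigmoid activations; 9 weights including biases.
struct Net {
    std::array<double, 9> w;
    double run(double a, double b) const {
        auto sig = [](double x) { return 1.0 / (1.0 + std::exp(-x)); };
        double h0 = sig(w[0] * a + w[1] * b + w[2]);
        double h1 = sig(w[3] * a + w[4] * b + w[5]);
        return sig(w[6] * h0 + w[7] * h1 + w[8]);
    }
};

// Fitness: negative squared error over the XOR table (higher is better).
double fitness(const Net& n) {
    const double xorTable[4][3] = {{0, 0, 0}, {0, 1, 1}, {1, 0, 1}, {1, 1, 0}};
    double err = 0.0;
    for (const auto& row : xorTable) {
        double d = n.run(row[0], row[1]) - row[2];
        err += d * d;
    }
    return -err;
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> dist(-8.0, 8.0);
    const int kBatches = 5, kBatchSize = 7000;  // batch size matches the post

    // Select the best individual from each batch.
    std::vector<Net> winners;
    for (int b = 0; b < kBatches; ++b) {
        Net best{};
        double bestFit = -1e9;
        for (int i = 0; i < kBatchSize; ++i) {
            Net n;
            for (double& wi : n.w) wi = dist(rng);
            double f = fitness(n);
            if (f > bestFit) { bestFit = f; best = n; }
        }
        winners.push_back(best);
    }

    // "Battle royale": compare the batch winners against each other.
    Net champion = winners[0];
    for (const Net& n : winners)
        if (fitness(n) > fitness(champion)) champion = n;

    std::cout << "champion output for 0,0: " << champion.run(0, 0)
              << " (fitness " << fitness(champion) << ")\n";
}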

I’m about halfway through converting the extremely obtuse and unwieldy code into C++ class-based structures so that I and others can use it. However, I’m not entirely sure I still want to make my first practice project trading in bitcoins after my exchange of choice (Mt. Gox) decided to fold last month, so if anyone has any good ideas I’m all ears!

I’ve also been in an interview process for a really amazing job! There is a very real possibility that I might become a PLM (product lifecycle management) engineering consultant for a really neat tech startup that an internet friend of mine referred me to. As you can imagine, after looking for work for 6 months it’s a breath of fresh air to finally find someone who values your skills, knowledge, and determination!

Let me know of any projects that could use NNs you’d like me to talk about! As you can tell I’m pretty enthusiastic about the topic.

Predicting material composition via Non Destructive Testing and Neural Networks

Material selection can at times be very tricky: sometimes you want certain aspects of a material, such as high strength, without the low ductility that generally trends along with it.
(For information on the techniques discussed in this article, please look at the wiki links below.)

To solve this problem, materials scientists and engineers develop alloys and composite materials with particular characteristics that are well suited to a particular application; however, this process is very time-consuming and could take months, if not years, to completely map the strengths and weaknesses of a new alloy or composite design.

There are ways around this time sink. One of them is, instead of empirically measuring the resultant properties of a material for each change in its chemistry or micro-structure, to predict the resultant structure using non-destructive analysis methods and neural networks.

The first step with any neural network is to prepare the training data so the system can adapt and successfully capture the underlying natural laws that govern the kinetics. Input data would be generated from the Electrochemical Impedance Spectroscopy (EIS) and/or Ultrasonic Testing (UT) of a sample training material, and the output data would come from slicing the specimen into very thin (~10 micron) plates, which would then be analysed with EDS to determine the real material parameters.
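
To make the data-preparation step a bit more concrete, here is a small sketch of the pairing I have in mind: each training sample couples an NDT feature vector (EIS impedance values and/or UT amplitudes) with the composition fractions measured by EDS on the sectioned plates. The file name, column layout, and feature count are placeholders of my own, not a fixed format.

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// One training sample: NDT measurements in, EDS-measured composition out.
struct Sample {
    std::vector<double> ndtFeatures;   // e.g. EIS impedance magnitudes, UT echo amplitudes
    std::vector<double> composition;   // e.g. weight fractions of each element from EDS
};

// Load samples from a hypothetical CSV: the first numFeatures columns are NDT features,
// the remaining columns are the EDS composition for that specimen slice.
// (No error handling here; malformed cells would need to be caught in practice.)
std::vector<Sample> loadSamples(const std::string& path, std::size_t numFeatures) {
    std::vector<Sample> samples;
    std::ifstream file(path);
    std::string line;
    while (std::getline(file, line)) {
        std::stringstream ss(line);
        std::string cell;
        Sample s;
        std::size_t col = 0;
        while (std::getline(ss, cell, ',')) {
            double value = std::stod(cell);
            if (col < numFeatures) s.ndtFeatures.push_back(value);
            else                   s.composition.push_back(value);
            ++col;
        }
        if (!s.ndtFeatures.empty()) samples.push_back(s);
    }
    return samples;
}

int main() {
    // "ndt_training.csv" is a placeholder name for the assembled EIS/UT + EDS dataset.
    auto samples = loadSamples("ndt_training.csv", 16);
    std::cout << "loaded " << samples.size() << " training samples\n";
}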

The data would then be fed into a neural network designed to handle 3D input parameters (i.e. it would need at least two hidden neuron layers), and a trained neural network would be generated. This network would then be able to take NDT data from a new material and generate a 3-D Finite Element Model of what it predicts the composition and micro-structure to be.
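
As a rough sketch of the network shape described above (two hidden layers between the NDT feature vector and the predicted composition), here is just the forward pass; the layer sizes and random weights are placeholders, and the network would of course need training before its outputs mean anything.

#include <cmath>
#include <iostream>
#include <random>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;

// One fully connected layer with a sigmoid activation.
Vec layer(const Vec& in, const Mat& weights, const Vec& bias) {
    Vec out(bias.size());
    for (std::size_t j = 0; j < out.size(); ++j) {
        double sum = bias[j];
        for (std::size_t i = 0; i < in.size(); ++i) sum += weights[j][i] * in[i];
        out[j] = 1.0 / (1.0 + std::exp(-sum));
    }
    return out;
}

// Randomly initialised weight matrix of shape (outputs x inputs).
Mat randomWeights(std::size_t outputs, std::size_t inputs, std::mt19937& rng) {
    std::uniform_real_distribution<double> dist(-1.0, 1.0);
    Mat w(outputs, Vec(inputs));
    for (auto& row : w) for (double& x : row) x = dist(rng);
    return w;
}

int main() {
    // Placeholder sizes: 16 NDT features in, two hidden layers, 4 composition fractions out.
    const std::size_t nIn = 16, nH1 = 12, nH2 = 8, nOut = 4;
    std::mt19937 rng(7);

    Mat w1 = randomWeights(nH1, nIn, rng), w2 = randomWeights(nH2, nH1, rng),
        w3 = randomWeights(nOut, nH2, rng);
    Vec b1(nH1, 0.0), b2(nH2, 0.0), b3(nOut, 0.0);

    // A dummy NDT feature vector stands in for a real EIS/UT measurement.
    Vec features(nIn, 0.5);
    Vec prediction = layer(layer(layer(features, w1, b1), w2, b2), w3, b3);

    for (double p : prediction) std::cout << p << ' ';
    std::cout << " <- untrained composition prediction\n";
}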

This process can be used alongside existing NDT QA techniques, but it can also serve as an initial step, together with my previous article (see https://learningann.wordpress.com/2014/03/10/how-machine-learning-unlocks-the-secrets-of-materials-science/), in creating a material optimisation program.

EIS – http://en.wikipedia.org/wiki/Dielectric_spectroscopy

UT – http://en.wikipedia.org/wiki/Ultrasonic_testing

NDT – http://en.wikipedia.org/wiki/Non_Destructive_Testing

EDS – http://en.wikipedia.org/wiki/Energy-dispersive_X-ray_spectroscopy

I hope this article was informative and helpful. Follow me if you like what I write about, and if you want to see more articles like this one, please like the post!