Building Jarvis, part 3

Yesterday we talked about how the module that interacts with users can be constructed; today I’ll be talking about something meatier and more substantial, what I call the Conduit Module.

To accept open-ended queries that transcend the simplistic search-term-based algorithms Google and others employ, we must shift into a completely different, largely unexplored territory of programming, a conventional programmer’s worst nightmare: the self-modifying program.

Most programmers shy away from self-modifying code: the concept is generally only useful for a few specialized applications such as encryption/obfuscation and viruses, and it carries the potentially disastrous risk of a program corrupting its own source code. In Jarvis’ case, however, self-modifying code is required, and it will not be manipulated by humans once trained.

To teach a hierarchical neural network to program answer sets and the standard functional pieces that make up a program (e.g. for loops with counters, sorting algorithms, etc.), we need to teach the network the basics of programming syntax (boundary conditions and limits of manipulation) and then train it on a large number of interesting and unique challenges. In the end we will have a trained neural network that understands not only how to complete a specific task requested of it, but also how to take the output of the language module as an input describing its next task, programming its way to a solution for a wide variety of problems; hence, open-ended.

Conduit module




A process like this is complicated to imagine, let alone create. However, such a module also has the capability of altering the processing of other parts of Jarvis, adapting them to perform better over time via testing and training, and enabling another layer to be built on top of the language/conduit modules, one that enables behaviour and quasi-consciousness.

As you can imagine, a neural-network-based program as complicated as this will require supercomputers, and its “genes” will need to be stored in very large data storage locations. Tomorrow I will be discussing the centralized-training/decentralized-ownership model and how we can leverage decentralized currencies (bitcoins, altcoins, etc.) to help perform the calculations Jarvis will need in order to learn in a reasonable amount of time.

Obfuscation –

self-modifying code –

Automatic Programming (Essentially what the Conduit Module employs) –

more information on ANNs (Artificial Neural Networks) for those new to the concept –




Building Jarvis, part 2

An artificial intelligence created from neural networks must be multi-layered. This kind of structure is called a “hierarchical neural network”, and it is required because information from the lower-level functions must feed into the higher-level abstract functions, much like how the human brain operates.

Any AI that plans on interfacing with humans will need to have one fundamental module, a language interpreter.

Query front-end net

Starting from the beginning, we see that multiple layers of neural networks feeding into each other are required to complete the open-ended query/solution tasks Jarvis will need to be adept at solving. Word parsing is by far the easiest of all steps: a simple English (or equivalent) word-parsing module trained on a dictionary/thesaurus and on the responses to strings on webpages (forums are a great resource for natural-language datasets). By training the network with data mined from these resources, we can program a network that fully understands not only words and grammar but entire sentence structures and the query/answer patterns of natural conversation.

Once the language is interpreted and the program understands what you’re asking right now, it may go back and look at previous queries to uncover patterns in your questions. For example, if you were looking for a waterproofing device to fit around a cable, and your previous question was about silicone sealants, Jarvis may ask in response “Are you looking for other waterproofing techniques as well?”, and by going back and forth multiple times you will end up with a solution you didn’t even know you were asking for.


Tomorrow I will talk about how such an open-ended solver could be used not only for data retrieval but also to actually act upon something. Eventually I will be able to say “Jarvis, please install yourself onto my USB drive”, and in response Jarvis will say “Sure James, do you need me to call the movers? They seem to be running an hour late.”

Building Jarvis, part 1

Sorry everyone for the lack of updates. I was able to find work for a few months as a Product Design Engineer; unfortunately, it looks like the company might not be financially stable enough to pay me regularly, which as a silver lining gives me more time to work on my ANN projects.

For the next couple of weeks I will be describing a major project that my colleague @Brandon.Benvie and I have been working on and keeping under our wings in preparation for working on it full time, as it has the potential to completely revolutionize the internet, the likes of which have never been seen before in history.

This ANN project, named Jarvis, will be a fully functional companion AI whose computational and storage requirements will be distributed inside of a digital currency similar to the Bitcoin protocol, with some significant tweaks to enable custom kernel-code computations.

I’ve also added some light reading for anyone interested in freshening up before we get into the technical details of exactly how such a system will be constructed. I promise this will be a hell of a ride.

Bitcoin Protocol –

Ethereum –

Companion AI systems –



I finished it!

I’ve just finished testing my foundation neural network function with OpenCL C and can confirm that it works! As you can see above, the run goes through its selection process in batches of 7000 individuals each (due to scalability constraints), selects the best one from each batch, and then holds a “battle royale” of the top performers from each batch.

The results you see are based on my training data, the XOR table (i.e. inputs 0,0 = 0; 0,1 = 1; 1,0 = 1; 1,1 = 0), and the result for 0,0 is displayed to compare against its fitness data.

I’m about halfway through converting the extremely obtuse and unwieldy code into C++ class-based structures so that I and others can use it. However, I’m not entirely sure I still want to make my first practice project trading in bitcoins after my exchange of choice (MtGox) decided to fold last month, so if anyone has any good ideas I’m all ears!

I’ve also been in an interview process for a really amazing job! There is a very real possibility that I might become a PLM (product lifecycle management) engineering consultant for a really neat tech startup that an internet friend of mine referred me to. As you can imagine, after looking for work for 6 months it’s a breath of fresh air to finally find someone who values your skills, knowledge, and determination!

Let me know of any projects that could use NNs that you’d like me to talk about! As you can tell, I’m pretty enthusiastic about the topic.

How machine learning unlocks the secrets of materials science

Imagine, for a second, that you had the power to simulate atoms and molecules and how they interact with each other to create a material. Sounds like it would make material selection pretty easy, right?

Unfortunately, the technology doesn’t yet exist to model that many individual elements, and FEMAP (Finite Element Modelling And Postprocessing) can only do so much; it still requires something to interpret the resultant data.

Now imagine if you were able to teach a computer to look for the “natural laws” that govern the mechanical/chemical/electronic properties of a material based on its alloy (or composite) content.

This idea brings us to machine learning and neural nets, which are capable of discovering patterns within existing materials and using those results as training data to predict the mechanical, chemical, and electronic properties of a yet-to-be-tested material.

It took us almost a millennium to completely comprehend steel-making; imagine if we were able to cut that time down to a few days via material simulations and predictive modelling!

Once this technology comes of age, it won’t take long before we’re able to build space elevators and room-temperature superconductors.

A new beginning.

Hello! I’m James and welcome to my blog!

I’m a little new at this, but I think it’s about time that I started to develop this page. I’ve completed a lot, and there is a significant amount of work left for me to do; however, let’s start with introductions before we delve into the objective of this blog.

I’m a spry 23-year-old who has recently completed his materials engineering degree at Dalhousie University (whew!), and I’ve been working for my professor as a research assistant and loving it. Unfortunately, that will be coming to an end in April when the grant money runs out, and I’ll be left without any form of income or further academic job prospects. I also happen to be taking care of my 76-year-old father, whom I love dearly and who is relying on me for financial support, which at the very least I owe him for all of the hardship we’ve fought through together over the years, including my mother passing away in 2001.

Which brings us to my job search. I’ve been applying for jobs locally and internationally, writing custom cover letters and resumes (like the one below) to find positions at companies that would be interested in materials engineers. However, I’ve essentially been screaming “hire me!” into the void and expecting a result; not a single person wants to hire me. I’ve applied for over 300 engineering positions on LinkedIn, Kijiji, CareerBeacon, even in person. No one wants a materials engineer.

My Resume (March 6th, 2014)

So as you can imagine, when you’re 23 and taking care of your ailing father, who’s had prostate cancer for as long as you can remember, and both of you are rapidly running out of money, spending all of your waking hours job hunting suddenly doesn’t seem like a valuable use of your only remaining resource: your time.

At this point all I wanted was a semi-permanent job; I wasn’t looking for the holy grail. Even at the time of writing this, if someone offered me ~$40k a year to paint the Mona Lisa upside down and with my toes, I would try my hardest to make sure he got his money’s worth. However, in my city it seemed as if all work suddenly dried up upon graduating, so I had to try something else, which brings me to neural networks.

On the Bitcoin forums (I’ve been an active Bitcoin advocate for the past couple of years, but sold too early, har har!) someone posted about Artificial Neural Networks (ANNs) and machine learning systems designed to act as smart commodity agents, and linked to Jeff Heaton’s videos on neural network design and fundamental structures. I immediately realized the value of such a program, went to my university library to find any books I could on the topic, and began studying.

Initially it was hard; I wasn’t used to working at such a pace. With time, however, I’ve become used to the frantic “burning the candle at both ends” level of work, and I’ve transformed from a somewhat lazy engineering graduate into one of the hardest workers I know.

That was on February 1st; it’s now March 6th, and a lot has changed. I’ve successfully completed a working prototype in object-oriented C++ on my CPU, and I’m very nearly finished with the OpenCL C version of my prototype, which I call (creatively) “Foundation”.

My ideas aren’t completely revolutionary, but with time I hope to completely retool myself into a very good engineer and machine learning scientist hybrid, and maybe land a job I could be proud of.

Thanks for sticking with me through this rant. I really do want to hear your thoughts on what I’m doing, and I’ll be updating this blog hopefully bi-weekly, so there should be some interesting case studies next week on what I’ve learned about OpenCL and neural network design.