This repository has been archived by the owner on Feb 15, 2022. It is now read-only.

Three strategies have been coded to the best of my ability. + Lightning Fast Threading #709

Closed
wants to merge 8 commits into from

Commits on Nov 15, 2017

  1. Three strategies have been coded to the best of my ability.

    Trendline: buys when the trend value rises above 1 (uptrend) and sells when it falls below 1 (downtrend), where 1 means no trend at all.
    Standard Deviation: buys when the mean of the last short-period trades exceeds the mean plus standard deviation of the last long-period trades.
    Neural: uses neural regression learning to predict the next price. Can be resource-intensive, so be careful. Based on sliding max/minimum weather prediction. Buys when the mean of the last two periods' predicted prices exceeds the mean of the last three periods' actual prices. I call it the sliding 2/3rds averaging prediction signal.
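    For concreteness, the three entry rules above can be sketched as pure functions. This is a rough sketch only; the names `trendSignal`, `stddevSignal`, `neuralSignal`, and the helpers `mean`/`stddev` are illustrative, not zenbot's actual identifiers.

    ```javascript
    // Illustrative helpers -- not zenbot code.
    function mean(xs) {
      return xs.reduce(function (a, b) { return a + b; }, 0) / xs.length;
    }

    function stddev(xs) {
      var m = mean(xs);
      return Math.sqrt(mean(xs.map(function (x) { return (x - m) * (x - m); })));
    }

    // Trendline: > 1 is an uptrend (buy), < 1 a downtrend (sell), 1 is flat.
    function trendSignal(trend) {
      if (trend > 1) return 'buy';
      if (trend < 1) return 'sell';
      return null;
    }

    // Standard deviation: buy when the short-window mean clears the
    // long window's mean plus its standard deviation.
    function stddevSignal(shortCloses, longCloses) {
      return mean(shortCloses) > mean(longCloses) + stddev(longCloses) ? 'buy' : null;
    }

    // "Sliding 2/3rds": buy when the mean of the last two predicted prices
    // exceeds the mean of the last three actual prices.
    function neuralSignal(lastTwoPredicted, lastThreeActual) {
      return mean(lastTwoPredicted) > mean(lastThreeActual) ? 'buy' : null;
    }
    ```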
    
    Also:
    I attempted a fix for zenbot's single-threaded Node limitation by including the cluster module: the whole zenbot instance is forked from the master process for neural and other processor-intensive strategy simulations. Be careful: cluster.setMaxListeners(0) removes the listener cap, so it can mask a listener memory leak.
    The cluster worker processes, if killed, should exit with:
      cluster.on('exit', function (worker, code, signal) {
        console.log('worker ' + worker.process.pid + ' died');
      });
    
    -----------------------------------------------------
    The entire code for clustering:
    -----------------------------------------------------
    var cluster = require('cluster');
    // 0 means "no cap" -- it also silences the listener-leak warning, so use with care.
    cluster.setMaxListeners(0);
    var numCPUs = require('os').cpus().length;
    if (cluster.isMaster) {
      // Fork one worker per CPU core.
      for (var i = 0; i < numCPUs; i++) {
        cluster.fork();
      }
      cluster.on('exit', function (worker, code, signal) {
        console.log('worker ' + worker.process.pid + ' died');
      });
    } else {
    
    ////////////////////////zenbot.js code///////////////////////
    
    }
    
    -----------------------------------------------------
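    On the setMaxListeners(0) caveat: setting the cap to 0 (unlimited) only silences Node's listener-leak warning; if a handler is re-attached on every fork or restart, the listener list still grows. A minimal illustration with Node's built-in events module (the emitters and handlers here are hypothetical, not zenbot code):

    ```javascript
    var EventEmitter = require('events');

    // Anti-pattern: re-attaching a handler in a loop (e.g. once per fork)
    // accumulates listeners; setMaxListeners(0) merely hides the warning.
    var leaky = new EventEmitter();
    leaky.setMaxListeners(0);
    for (var i = 0; i < 3; i++) {
      leaky.on('exit', function () {});
    }
    console.log(leaky.listenerCount('exit')); // 3 -- and still growing

    // Safer: register the handler once, outside the fork loop, or use
    // once() so it removes itself after firing.
    var safe = new EventEmitter();
    safe.once('exit', function () {});
    safe.emit('exit');
    console.log(safe.listenerCount('exit')); // 0 -- removed after firing
    ```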
    Jacob McQueen committed Nov 15, 2017 · 84e7333
  2. Added node-prowl, updated package-lock.json

    Jacob McQueen committed Nov 15, 2017 · 3dffb01
  3. Removed unnecessary line.

    TheRoboKitten authored Nov 15, 2017 · 67eb7f3
  4. Alphabetical order correction

    TheRoboKitten authored Nov 15, 2017 · 135f642
  5. Updated Readme with 3 strategies added

    Jacob McQueen committed Nov 15, 2017 · d6318d1
  6. Threading code does not function.

    The threading code did not work correctly at all.
    
    I'm thinking through solutions but not sure where to go with this. I need advice from an experienced Node developer.
    
    ```
    
          if (s.lookback[s.options.min_periods]) {
              for (let i = 0; i < s.options.min_periods; i++) { tll.push(s.lookback[i].close) }
              for (let i = 0; i < s.options.min_predict; i++) { tlp.push(s.lookback[i].close) }
              s.my_data = tll.reverse()
              var learn = function (s) {
                s.done = 0
                for (var j = 0; j < s.trains; j++) {
                  if (cluster.isMaster) {
                    cluster.fork();
                  } else {
                    // Fixed: the original for-loop ended in a stray semicolon, so its
                    // "body" ran once with i already past the end of the data.
                    for (var i = 0; i < s.my_data.length - s.neural.neuralDepth; i++) {
                      var data = s.my_data.slice(i, i + s.neural.neuralDepth);
                      var real_value = [s.my_data[i + s.neural.neuralDepth]];
                      var x = new convnetjs.Vol(data);
                      s.neural.trainer.train(x, real_value);
                      var predicted_values = s.neural.net.forward(x);
                      // Fixed: i never reaches s.my_data.length - s.neural.neuralDepth
                      // under the loop condition, so test the last index instead, and
                      // assign (=) rather than compare (===) to `calculating`.
                      if (i === s.my_data.length - s.neural.neuralDepth - 1) {
                        s.done = j + 1 // was `= j`, which could never equal s.trains
                        calculating = 'False'
                      }
                    }
                  }
                }
              }
              var predict = function (data) {
                  x = new convnetjs.Vol(data);
                  predicted_value = s.neural.net.forward(x);
                  return predicted_value.w[0];
              }
              if (s.lookback[s.options.min_periods] && calculating === 'False') {
                calculating = 'True'
                learn(s);
              }
              if (s.done === s.trains) {
                var item = tlp.reverse();
                s.prediction = predict(item)
                s.mean = math.mean(tll[0], tll[1], tll[2])
                s.meanp = math.mean(s.prediction, oldmean)
                s.sig0 = s.meanp > s.mean
                oldmean = s.prediction
              }
    
    ```
    
    The code above is why. How can I move it into a properly clustered function?
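    One possible restructuring, sketched below under stated assumptions: `trainChunk`, the message shape, and the worker/round counts are all hypothetical, and the stubbed loop stands in for the real convnetjs calls. The idea is to keep the training loop in a pure function, have the master partition the `s.trains` rounds across forked workers, and collect completion counts over the IPC channel instead of sharing `s.done` state between processes.

    ```javascript
    var cluster = require('cluster');

    // Stand-in for the convnetjs training loop: runs `rounds` passes over
    // the data and reports how many passes completed.
    function trainChunk(data, depth, rounds) {
      var completed = 0;
      for (var j = 0; j < rounds; j++) {
        for (var i = 0; i < data.length - depth; i++) {
          // s.neural.trainer.train(...) would go here
        }
        completed++;
      }
      return completed;
    }

    if (cluster.isMaster) {
      var trains = 4, workers = 2, done = 0;
      for (var w = 0; w < workers; w++) {
        var worker = cluster.fork();
        worker.on('message', function (msg) {
          done += msg.completed;
          this.kill(); // `this` is the worker that sent the message
          if (done === trains) console.log('all ' + done + ' training rounds finished');
        });
        // Split the training rounds evenly across workers.
        worker.send({ data: [1, 2, 3, 4, 5], depth: 2, rounds: trains / workers });
      }
    } else {
      process.on('message', function (msg) {
        process.send({ completed: trainChunk(msg.data, msg.depth, msg.rounds) });
      });
    }
    ```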
    TheRoboKitten authored Nov 15, 2017 · a9f9432
  7. Attempted threading, created fancy ./install.sh file.

    Had to roll back to Node 7.x, but added napajs (Microsoft's multi-threading npm module) to package.json for future use.
    Updated package-lock.json.
    Install goes smoothly.
    Included an install log in install.sh for diagnostics.
    Jacob McQueen committed Nov 15, 2017 · ca0b215
  8. ALMOST THERE!

    Jacob McQueen committed Nov 15, 2017 · 3cf97a9