OPEN-SOURCE SCRIPT

NAND Perceptron

By auroagwei
An experimental NAND Perceptron, based on a Python template, that aims to predict NAND gate outputs. A Perceptron is one of the foundational building blocks of nearly all advanced neural network layers and models used in algo trading and machine learning.

The goal behind this script was threefold:
  • To demonstrate that an ACTUAL working neural net can be implemented in Pine, even if incomplete.
  • To pave the way for other traders and coders to iterate on this script and push the boundaries of TradingView strategies and indicators.
  • To see whether a self-contained neural network component for parameter optimization within Pine Script is hypothetically possible.


NOTE: This is a highly experimental proof of concept - this is NOT a ready-made template to include or integrate into existing strategies and indicators yet (emphasis on YET - neural networks have a lot of potential utility when implemented properly).


Hardcoded NAND Gate outputs with Bias column (X0):
// NAND Gate + X0 Bias and Y-true
// X0 // X1 // X2 // Y
// 1 // 0 // 0 // 1
// 1 // 0 // 1 // 1
// 1 // 1 // 0 // 1
// 1 // 1 // 1 // 0
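
For reference outside of Pine, the same table can be written as plain arrays. This is a minimal Python sketch; the names X and y_true are illustrative and are not taken from the script:

import numpy as np

# Rows are [X0 bias, X1, X2]; y_true is the NAND output for each row
X = np.array([[1, 0, 0],
              [1, 0, 1],
              [1, 1, 0],
              [1, 1, 1]])
y_true = np.array([1, 1, 1, 0])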


  • Column X0 is the bias feature/input
  • Columns X1 and X2 are the NAND gate inputs
  • Column Y holds the y-true values for the NAND gate
  • yhat is the prediction at that timestep
  • F0, F1, F2, F3 are the dot products of the weights (W0, W1, W2) and the input features (X0, X1, X2) - a Python sketch of this forward pass follows this list
  • Learning rate and activation function threshold are exposed by default as input parameters
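
As a rough illustration of the bullets above, here is a hedged Python sketch of a single forward pass: F is the dot product of the weights with one input row, and yhat comes from a step activation against the threshold. The weight and threshold values are illustrative, not the script's defaults:

import numpy as np

W = np.array([1.25, -0.5, -0.25])   # W0, W1, W2 - one set of weights that solves NAND
threshold = 0.5                     # illustrative activation function threshold

x_row = np.array([1, 1, 0])         # one row of the table above: X0 bias, X1, X2
F = np.dot(W, x_row)                # the dot product (F0..F3 for rows 0..3)
yhat = 1 if F > threshold else 0    # step activation; here F = 0.75, so yhat = 1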

    Uncomment the marked sections for more training iterations/epochs:
  • A loop with a selectable number of training iterations/epochs would be amazing to have, but I'm not sure if it's possible in Pine with how this script is currently structured. A Python sketch of such an epoch loop follows below.
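
As a point of comparison, here is roughly what such an epoch loop looks like in Python, using the standard perceptron update rule from the referenced articles; the learning rate, threshold, and epoch count are illustrative values, not the script's defaults:

import numpy as np

X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]])  # X0 bias, X1, X2
y_true = np.array([1, 1, 1, 0])                              # NAND outputs

W = np.zeros(3)        # W0, W1, W2
learning_rate = 0.25   # illustrative
threshold = 0.5        # illustrative activation threshold
epochs = 20            # the selectable number of training iterations

for _ in range(epochs):
    for x_row, y in zip(X, y_true):
        F = np.dot(W, x_row)                       # dot product of weights and features
        yhat = 1 if F > threshold else 0           # step activation
        W += learning_rate * (y - yhat) * x_row    # perceptron weight update

preds = [1 if np.dot(W, x) > threshold else 0 for x in X]
print(W, preds)   # with these settings the loop converges to [1, 1, 1, 0]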


  • Error metrics and loss have not been implemented due to difficulty with script length and iterations vs. epochs. I haven't been able to configure the input parameters to correctly predict all four y-true values of the NAND gate (only 3/4 so far; if you can get all four predictions correct, please let me know). A sketch of a Sum of Squared Error readout follows below.
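
For completeness, a Sum of Squared Error readout over the four rows (added to the script in a later version, per the release notes below) can be sketched like this in Python; y_pred stands for whatever the perceptron currently outputs, using the 3/4 case described above as the example:

import numpy as np

y_true = np.array([1, 1, 1, 0])
y_pred = np.array([1, 1, 1, 1])        # a 3/4 result: the last row is mispredicted

sse = np.sum((y_true - y_pred) ** 2)   # Sum of Squared Error over the four rows
print(sse)                             # 1 here; 0 once all four predictions match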


// //---- REFERENCE for final output
// A3 := 1, y0 true
// B3 := 1, y1 true
// C3 := 1, y2 true
// D3 := 0, y3 true

PLEASE READ: Source article/template and main code reference:
* towardsdatascience.com/6-steps-to-write-any-machine-learning-algorithm-from-scratch-perceptron-case-study-335f638a70f3
* towardsdatascience.com/what-the-hell-is-perceptron-626217814f53
* towardsdatascience.com/how-to-build-your-own-neural-network-from-scratch-in-python-68998a08e4f6
Release Notes
// v5.6c - activation function error fix (was F > 0.25; corrected to 1), line 99
Release Notes
// v5.6d - correction to activation function variable z not being keyed in, plus W0/W1/W2 not being factored in for the initial iterations
Release Notes
// v6.4 - Dot product operation error for F0-F3 and W0-W2 fixed. Testing a for-loop iterator for training.
// v6.5d -
// Loop Iteration for epoch training implemented
// Sum of Squared Error (SSE) implemented
// Y-pred vs Y-true color coded output option function (green/red)
// Custom input options for all arrays, including W0-W2
// Allows for custom input features, weights, and bias - default is the NAND gate.
// Placeholder "========" separator for input options in the settings panel
// 3x infopanel components for display output + match color (green/orange/red)
// v6.6
// Gate detection including XOR/XNOR (despite not being able to converge/solve with SLP neurons - MLP + nonlinear activations are required for XOR/XNOR training and detection)
Release Notes
// v6.6b
// Missing XOR/XNOR MLP + nonlinear activation warning/message in yellow upon detection - fixed.
Tags: ann, Centered Oscillators, deep learning, experimental, Linear Regression, machine learning, neural net, nn, perceptron, template
