Feedback

Please leave your feedback here or join Fann2MQL group on Facebook.

92 Responses to Feedback

  1. ari says:

    Hi.. I am trying to compile your project because I want to add a few things.. I downloaded FANN and TBB and updated the include paths in my project.. But now I am getting unresolved externals errors because of the calls to FANN internal functions.. What’s the trick to make it work?

    For now, I comment out f2M_train_fast.

  2. emsi says:

    The same way you updated the include paths, you need to update the library dependencies. In your project properties, under Linker -> Input -> Additional Dependencies, make sure there are proper paths to fanndoubleMT.lib and tbb.lib.

  3. mt4 says:

    I got similar errors:

    1>Fann2MQL.obj : error LNK2001: unresolved external symbol _fann_update_weights
    1>Fann2MQL.obj : error LNK2001: unresolved external symbol _fann_backpropagate_MSE
    1>Fann2MQL.obj : error LNK2001: unresolved external symbol _fann_compute_MSE

    I am using tbb22 and fann2.1.0beta. I am sure I configured the right library dependencies in Visual Studio. Any suggestions? Thanks.

    • emsi says:

      An unresolved symbol error while linking indicates a problem with your fann.lib library file:
      your FANN library is not exporting those missing symbols.
      Make sure they are declared as FANN_EXTERNAL in fann_internal.h, as follows:
      FANN_EXTERNAL void fann_compute_MSE(struct fann *ann, fann_type * desired_output);
      FANN_EXTERNAL void fann_backpropagate_MSE(struct fann *ann);
      FANN_EXTERNAL void fann_update_weights(struct fann *ann);

  4. Darek says:

    I think I can successfully use the Fann2MQL neural network package from Excel VBA. Here is an example.
    After the public declaration:
    Public Declare Function create_standard _
    Lib "C:\Fann\Fann2MQL.dll" _
    Alias "f2M_create_standard" _
    (ByVal num_layers As Integer, _
    ByVal num_enter_neurons As Integer, _
    ByVal num_hidden_1_neurons As Integer, _
    ByVal num_hidden_2_neurons As Integer, _
    ByVal num_exit_neurons As Integer) As Integer

    I can use the function:
    ann = create_standard(4, 10, 10, 5, 1)
    to create a NN.
    I am testing other functions now.

  5. Darek says:

    I made a mistake in one line of the previous post, but it does not matter for the idea. The right line should be:
    ann = create_standard(4, 10, 10, 5, 1)

  6. Julien says:

    Hi,

    Could you create a few other simpler examples?

    Like a really basic indicator for example.

    Thanks.

    • emsi says:

      Sure. I'll try to write something more, but the NN topic is not an easy one in general, and you should educate yourself a bit on the subject before you start using Fann2MQL.

      • Julien says:

        I have bought and read 2 books, one of which is about NN applied to finance; I wrote a language analyzer and an OCR program (not in MQL4, of course). I have also written a few MQL programs, but in MQL my programs never work the way I anticipate.
        That's why it would be nice to have an "official" example :) It would also be useful for other beginners!

      • Julien says:

        I’ve written an abstraction layer and was able to create applications such as a linear predictor, non-linear predictor, logic gates (XOR, AND, OR), arithmetic operators (1+4=5)…
        http://fann4mt.thetradingtheory.com/

        So everything is good now :)

  7. kevin07 says:

    I have integrated this neural network into my template EA, but the external multithreading setting produces an error whether Parallel is set to true or to false.

    if (Parallel) {
    anns_run_parallel (AnnsNumber, AnnsArray, InputVector);
    }

    Does this tell me that this program was created on a Duo or Quad Core computer? Please tell me which processor works well and how much RAM I should provide.

    • emsi says:

      Intel TBB is platform independent. No matter what your CPU is, it will work. Whether it is AMD or Intel, with a single core, two or more, it should always work fine.

  8. Dave says:

    What is the training algorithm used with FANN? LM, BFG?

  9. Chris says:

    Thank you

  10. Ivan Tonev says:

    Training the networks takes more time than execution. Why is there f2M_run_parallel() and no f2M_train_parallel() ?

    • emsi says:

      Training in parallel would in most cases make no sense, at least not the way parallel runs are done. While training you change the internal structure of a net by updating the weights, so you cannot run multiple threads simultaneously, as they would overwrite each other's changes.

  11. Ivan Tonev says:

    I appreciate the reply, but I actually meant training a few networks at a time on the same data (just like running a few networks on the same data). Sure, a single network cannot be trained by multiple threads, but it would be extremely useful to be able to train a group of networks in parallel.

  12. astic says:

    I have another stupid question: I get error 127 when loading the DLL (I put the library contents manually into windows/system32; I also checked the web, and people seem to talk about a .def file…).

    Could you help me then ?
    (sorry if this question su**s)

  13. astic says:

    Ok, here are the solutions to the issues I faced:

    My system was lacking some DLLs required to fully import the Fann2MQL DLL.

    I used the "Dependency Walker" tool to check exactly which DLLs were used by Fann2MQL. It turned out I was missing these DLLs:

    MSJAVA.dll
    MSVCR90.dll
    MSVCP90.dll
    EFSADU.dll

    That's it, it now seems to work :)
    Hope this helps some other noobs ;)
    Regards

    • emsi says:

      That's odd, as the installation package should contain all the libraries from MSVC. Probably it's due to Windows 7. I'll try to rebuild the packages for this version, but I'm not able to test it, as I do not have W7 (and I do not intend to get one soon).

      • astic says:

        I am currently using a Windows XP version (under VirtualBox), so I do not think it's due to W7.

        I must also add that it still doesn't work, in fact. Even after clearing up the DLL issues, MT4 tells me that the Fann2MQL DLL unsuccessfully tries to load a C runtime library.
        I've not found the solution to this issue so far… If you have any suggestions I'd like to know.
        Thanks for your reply.

      • emsi says:

        For whatever reason the runtime library was not installed on your computer.
        Please try the 0.1.4beta version or install the MSVC 2008 runtime libraries.

  14. chris f says:

    I read an article on MQL4.com about this and I wanted to play around with it. I tried to download all 4 versions of the file, but nothing downloaded. Is this still available? Could you send me a copy or point me in the direction where I could find it?

    Thanks

    • emsi says:

      I'm sorry, there was a problem with the server serving the MSI installation packages. It's solved now.

  15. nima says:

    What kind of training algorithm are you using in your library?
    Backpropagation training (RPROP, Quickprop, Batch, Incremental)?

    • emsi says:

      Please refer to the FANN documentation (link in the Documentation section).

      However, all advanced training algorithms make sense only on data sets, so you should collect those sets first, then use FannTool or FannExplorer to perform the training.

  16. Haidzatul says:

    FANN works on MT5!
    I just tried this library on MT5 and it works. Installation was done on Windows 7 (32-bit), Intel Core 2 Duo CPU P8700 2.53 GHz, using MT5 build 338.

    But it does NOT work on a genuine Intel CPU 2140 @ 1.60 GHz, Windows XP 32-bit, MT5 build 334.

    It also does NOT work on Windows 7 64-bit (Intel Xeon Quad-Core X5667 3.06 GHz, MT5 build 338), because 64-bit is not supported. Any idea or solution for 64-bit in the future?

    • emsi says:

      Probably you made some mistake. If it works with MT5 it should work regardless of the Windows version.

      As for the 64-bit version, the situation is a bit different. With 32-bit MetaTrader, Fann2MQL works properly, but a 64-bit build of MT5 won't be able to load 32-bit libraries.

    • emsi says:

      I have just made a native x64 build of Fann2MQL. It can be used only with MetaTrader 5 x64 builds. Please try it if you like :)

  17. chris f says:

    Hello,

    I have tried to use f2M_train(), but I don't think it is correctly training on my data, since it only does one iteration. Even when I loop the function it doesn't really change my MSE. Looking at the FANN website, I think what I need is fann_train_on_data(), which trains on the entire data set over a period of time. I didn't see anything like this in the f2M list of functions. Is there a way I can get similar results?

    • emsi says:

      Yes. You can use FannTool or FannExplorer to train on data sets.
      As MQL works on a chart and is tick based, you cannot obtain whole sets of data. And I would strongly discourage training in loops, as it would take too much time to complete before the next tick arrives.

  18. Haidzatul says:

    Thank you for releasing the 64-bit version for MT5. Thanks a lot.

    • emsi says:

      You’re welcome. :)
      Please let me know how it works. It is not fully tested yet, so your feedback is more than welcome, especially if there are any issues with multithreading support or the MT5 tester.

      • Haidzatul says:

        I found a big difference in the results. Maybe a bug.

        MT5 EURUSD H1
        Before (fann 0.1.3, 32-bit) on 64-bit Windows 7:
        Next Targer Price : 1.38345 MSE : 0.00000451
        (The MSE value is very small, but it works to predict the next price.)

        After installing the new release, fann 0.1.4 beta, on Windows 7:
        Next Targer Price : 9233356666666777.38345 MSE : 9233356666666777.38345451

        Any idea?

      • Haidzatul says:

        Correction:
        Before: fann 0.1.3, 32-bit, on 32-bit Windows 7 MT5
        After: fann 0.1.4 beta, 64-bit, on 64-bit Windows 7 MT5

      • emsi says:

        Currently I'm verifying it (trying to replicate).
        Could you please verify the result with Fann2MQL 0.1.4 beta, 32-bit, on 32-bit Windows?

      • Haidzatul says:

        It works well with Fann2MQL 0.1.4beta2, 32-bit, on Windows 7 32-bit, MT5 build 342 (after recompiling).

  19. emsi says:

    The 0.1.4beta2 version is released (it fixes buggy definitions in the .mqh file).
    Please make sure to recompile your scripts when moving from the 32-bit to the 64-bit version.

  20. arantis says:

    Hello,

    thanks for your work and for sharing it. I have a little problem with compilation in MT4 with the examples:

    NeuroMACD-fixed.mq4 (10.1 Kb)
    NeuroMACD.mq4 (10.1 Kb)

    Line :

    int f2M_train_on_file(int ann, string filename, int max_epoch, float desired_error);

    message: 'float' - parameter definition expected C:\Program Files\OANDA - MetaTrader\experts\include\Fann2MQL.mqh (53, 64)

    I have downloaded the beta version of fann2.

    Thanks

  22. asutrader says:

    Hello

    Is input/output normalization mandatory?
    What are feasible normalization formulas?
    Thanks

  23. David says:

    Fann2MQL 0.1.4_x64-beta2.msi
    The download link is dead.

  24. David says:

    Sorry,
    http://www.pipscomfort.com/Fann2MQL/Fann2MQL%200.1.4_x64-beta2.msi
    is dead. I cannot download it from South Korea.

  25. Pete says:

    A quick comment: I have noticed a documentation issue. The f2M_create_standard function describes "l2num, l3num, l4num – number of neurons in hidden and output layers". That is three parameters but only two descriptions. It would help to know what all three parameters do. I know I can find it elsewhere, but this is the documentation, and it should be accurate and clear about what every parameter is for.

    • emsi says:

      It says:

      l2num, l3num, l4num – number of neurons in hidden and output layers (depending on num_layers).

      Depending on the number of layers, the last used argument is the output layer and the previous ones are hidden layers.

  26. Daniel Aragonés says:

    Hi,

    I'm not able to get the FANN directory populated, either on Windows XP or Windows 7, with either version 0.1.3 or 0.1.4beta, either in forward mode or in tester mode, either as a user or as the Administrator. Always on the same computer, under different root partitions.

    However, Fann2MQL.dll is effectively loaded, as the log states 15 times in all cases:

    08:29:53 2012.10.31 23:00 torero12 EURUSD,H1: ANN: ‘F:\FANN\torero12-1446436.15-short.net’ created successfully with handler 15

    I'm running MetaTrader 4 on a partition different from C:\ because otherwise backtesting is not possible under Windows 7.

    Any ideas?

    • emsi says:

      You need to run the MT terminal with admin rights unless you disable UAC (which is not recommended).
      Right-click terminal.exe in the installation folder, open the Compatibility tab, and check "Run this program as an administrator" at the bottom. (This should also fix the testing issues for you, so MT on C:\ will work ;)

      Also look for your files in:

      C:\Users\[YOUR USERNAME]\AppData\Local\VirtualStore

      where [YOUR USERNAME] is your local windows login name.

  27. Daniel Aragonés (danarag) says:

    Hello,

    I have opened this thread:

    http://www.mql5.com/en/forum/12290

    with some additional information about the problem. The only way I can use the library successfully is by giving every ANN a name that begins with a different letter. This way I get the installation directory populated by the ANNs, and they are later retrieved from that place. It is a bizarre workaround, but it works.

    I suspect that the problem is a limited size of the path string that is fed to fopen inside the library. Which header file in the sources must be modified to overcome that limitation?

    Thanks for your reply.

    • emsi says:

      There is no such limitation. The save function uses a char *path string, so the path length is not limited. You must be experiencing some other phenomenon.

      Please note that Fann2MQL is not using terminal.exe's built-in MetaQuotes Language functions to save files, but system functions instead. As a consequence it should be able (as long as other system limitations allow) to save files in an arbitrary location like C:\ANN. In some example MQL scripts I provided a path in the form "C:\\ANN\\"; please note the double '\', as it is the escape character.

      • emsi says:

        Oh, and make sure that the folder you want to save your networks to exists. You cannot save a file C:\ANN\fann-long1.net if the C:\ANN folder does not exist.

  28. When I save a network, the name of the generated file is always one character long, always the first character of the path.

    For example:
    f2M_save(ann,"C:\\ANN\\Testing.net"); // Generates a "C" file in the MT root directory.
    f2M_save(ann,"Testing.net"); // Generates a "T" file in the MT root directory.

    Any advice?

    PS: I'm using Wine on a Mac.

  29. nodon says:

    Something strange happens when running the old NeuroMACD example in tester mode on MT4 build 646, on Windows 7.
    The log reports:
    The log reports:

    ANN: ‘C:\ANN\NeuroMACD-1446436.0-long.net’ created successfully with handler 0
    ANN: ‘C:\ANN\NeuroMACD-1446436.1-short.net’ created successfully with handler 1

    f2M_save(1, C:\ANN\NeuroMACD-1446436.1-short.net) returned: 0
    f2M_destroy(1) returned: 0
    f2M_save(0, C:\ANN\NeuroMACD-1446436.0-long.net) returned: 0
    f2M_destroy(0) returned: 0

    Actually, no net files had been created.
    I tried it even in visual mode, to catch anything appearing in the ANN folder at least temporarily. Definitely, no files were created.
    I tried to run the terminal explicitly as an administrator, turn UAC off, change the destination disc, run this example on XP, etc.

    The simple question is how to make it work on MT4 646?

    • emsi says:

      It seems that MT4 build 6xx uses UTF-16LE strings. When such a string is passed to Fann2MQL, the file-open function expects ASCII. Try to edit the MQL file in an ASCII editor prior to compilation and fix the path (or preferably use a HEX editor).

  30. nodon says:

    I downloaded a fresh copy of NeuroMACD. Before that I installed Fann2MQL and the VC++ 10 redistributable. By the way, it generated an error related to x64; as the setup installs two versions at a time, there is no way to choose the correct one. All the same, I removed the x64 one. After making sure in HEX that the NeuroMACD source contains a pure ASCII path definition, I performed the test again. Unfortunately it resulted in an empty ANN folder again. Another try on disc D: gave the same result.
    The 0 returned by f2M_save is most annoying; a failure should return -1 instead.
    The conclusion is unclear: either it is a question of the OS or of MT4 6xx.
    D: is not the system disc, so it is rather not a security issue.
    Any other ideas?

    • zi10ge says:

      for MT4 6xx+:

      1. In Fann2MQL.mqh, replace

      int f2M_create_from_file(string path);
      int f2M_save(int ann,string path);

      with these:
      int f2M_create_from_file(char &path[]);
      int f2M_save(int ann,char &path[]);

      2. In your code, for example:

      uchar p[];

      string path = "C:\\ANN\\me.nn";

      StringToCharArray(path,p,0,-1,CP_ACP);
      int ret = f2M_save(ann, p);

      • DANIEL says:

        Great, it works! It took me 4 weeks to make it work on Linux (using Wine 1.9.7 on Debian stable), and I thought it was the operating system. Actually, it was due to this issue. Your solution works like a charm :).

      • Darek says:

        zi10ge, good work!
        And so you don't have to remember this conversion ( StringToCharArray(path,p,0,-1,CP_ACP); )
        every time, after the change from point 1 you can add new functions to your Fann2MQL.mqh file after the import section:

        int f2M_create_from_file_1(string path) {
        uchar p[];
        StringToCharArray(path,p,0,-1,CP_ACP);
        int ret=f2M_create_from_file(p);
        return ret;
        }
        int f2M_save_1(int ann,string path) {
        uchar p[];
        StringToCharArray(path,p,0,-1,CP_ACP);
        int ret=f2M_save(ann,p);
        return ret;
        }

        and now you can use them like the previous ones, without converting the "path" string.

  31. fabs says:

    Hello
    this forum doesn't look very active recently, but I will try to post anyway, since I'm having issues running the 64-bit version.
    After installing the latest 64-bit package (Fann2MQL 0.1.5_x64.msi) and copying the DLLs into the MT5 library directory, I got the error message "Fann2MQL.dll is not a 64-bit version".

    I also tried to download the previous 64-bit version, with the same result.
    I've checked the DLL with Visual Studio's dumpbin and it seems to be x86.

    any indication much appreciated.
    thanks

  32. Titus says:

    Hello,

    great work!
    A question about the f2M_create_standard(l1num, l2num, l3num, l4num) function: Fann2MQL allows 4 layers, but I want to use more than 4 layers, for example 6 (Input, Hidden1, Hidden2, Hidden3, Hidden4, Output). With C++ it's possible, but with Fann2MQL the maximum is 4 layers…
    Can anyone make a DLL with more than 4 layers?

    regards

    • emsi says:

      It's not possible. If you need more layers you need to use FannTool or some external code to create the network, then save it to a file and use that inside Fann2MQL.

      • zzmeta4 says:

        Could I customize it by adding more layers in Fann2MQL.cpp, like
        fanns[_ann]=fann_create_standard(num_layers, l1num, l2num, l3num, l4num, l5num, l6num);
        ?
        More generally, I found the function set is not complete; for example fann_get_rprop_increase_factor is not included in Fann2MQL. Could I add it in Fann2MQL.cpp and recompile it to Fann2MQL.dll?
        Thank you.

      • emsi says:

        If you need more layers you should create your ANN outside of MT and just load the network(s).
        If you need more functions, feel free to add them. That's what open source is for :) You may even contribute your code back so others can benefit from it.

  33. asut says:

    Hello. I have problems with f2M_save(). No matter what I put in the second argument, the result is always < 0 (error). I tried with:
    c:\ann\file.net
    c:\\ann\\file.net
    file.net
    file.net is never created.
    I also tried older versions of Fann2MQL.
    In the past, with an older version of MetaTrader, it worked OK.
    What can I do?

  34. Wowa says:

    Hello,
    I really appreciate the author's work on this page! That must be said first! But I have some problems with my network in MQL for the MT4 terminal.
    I have trouble creating an ANN from a file. I created a quite big network with ca. 40 inputs, 100 neurons in the second layer and 50 neurons in the third layer. If I do not train the network and save it with random weights, it can be loaded successfully. But if I train the network, it is saved fine, yet can't be loaded. Here is my code:
    string path = "C:\\ANN\\NN06.nn"; // (global)
    uchar p[];
    // ANN creation
    StringToCharArray(path,p,0,-1,CP_ACP);
    ann=f2M_create_from_file(p);

    Saving:
    uchar p[];
    int ret;
    StringToCharArray(path,p,0,-1,CP_ACP);
    ret=f2M_save(ann,p);

    What could be the reason, and how do I deal with it?
    Thank you very much!

    • emsi says:

      What do you mean: “can’t be loaded”?

      • Wowa says:

        I mean it cannot be created from the file: in my case the value ann gets a -1 handle back, so I think it isn't created. If it is successful, ann gets a handle of 0 back.

    • emsi says:

      Most probably loading takes too much time and the MQL engine is terminating the function. AFAIR, MQL requires a function to return in something like 500 ms or so. If that is the case there's hardly anything you can do about it in this setup (other than trimming down your network).
      An alternative could be a client-server approach, with a server handling the ANN stuff and MQL just interfacing with it through a client (the server would do the heavy lifting). However, that's a different story that would require a different approach than Fann2MQL. :/

    • Wowa says:

      thanks for the advice.
      You may be right: giving the net a smaller size increases the chance of it being loaded. But why does the saving process never give me a negative return? It should also kick back if the net is too large.
      Also, saving and loading an untrained network works fine, and it has the same file size as the trained one.
      I tried to handle the creation from file in an iteration loop, to give it more attempts, but for now it doesn't help: either it can be loaded on the first attempt, or it cannot be loaded at all.
      I also thought about the weights: in my trained network some of them may go close to zero, and maybe there are some conversion problems between variable types? When I looked into the saved trained network file I found this for some weights:
      (34, -1.#QNAN000000000000000e+000)
      Might that cause the problem?

      • Wowa says:

        Bingo: exchanging all #QNAN entries in the net file with zeros does the job. So now I need to balance the net in a better way so as not to produce such weights. I have actually normalized all inputs. I take it my net is trying to tell me that some inputs are not necessary, but I may also not have trained the net properly on diverse data. Anyway, I know where to look now. If you have any other ideas, feel free to share.

    • DANIEL says:

      I guess that you got #QNAN because you used the wrong activation functions. The first thing you need to do is change the fann_set_activation_function settings for either the hidden or output layers.
      I have observed that if you train your networks from FANN code directly, you need to write code to initialize the different parameters, like activation functions, error functions, steepness, etc. If they are not initialized, in some particular cases you can get odd results.
      However, if you use FANNTool, its developers have gone through the trouble of initializing all the relevant parameters.

      • Wowa says:

        Thank you. I already found the problem: improper normalization of the input parameters. This is how I understand it; correct me if I'm wrong. Some of my inputs were much bigger than 1. Due to the sigmoid activation function, the rate of change after the first layer is then very small, so the learning algorithm tries to increase the weights to take this small change into account. That is where the weights exceeded the range of the double type and blew up the network. Normalization of the inputs helps a lot :D!
        But my other question for everybody is: how do you use the network for trading? I tried to find sources for neural network trading EAs or publications in the field, but it seems the youngest publications are all 3-5 years old. Is there a decline of neural networks in trading applications? Did they not prove themselves in trading? Do you have some sources for good/bad examples of neural network applications in this field?

      • emsi says:

        You should think about batch normalization. Take a look at this paper:
        https://arxiv.org/abs/1502.03167

        There's a colossal difference between what was possible and well known in the NN field prior to ~2012 and what is possible now. Think of the Keras/TensorFlow and Torch frameworks as a good starting point for learning.
        Following the https://www.udacity.com/course/viewer#!/c-ud600 and https://www.udacity.com/course/viewer#!/c-ud730/ courses might be a good source of inspiration as well.

  35. DANIEL says:

    I usually train my networks by writing C code and then use the saved trained network in Fann2MQL.
    As for your normalization issues, my C code looks like this:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <math.h>
    #include <time.h>
    #include "fann.h"
    double average(double data[], int n)
    {
    double mean=0.0;
    int i;
    for(i=0; i<n;++i)
    {
    mean+=data[i];
    }
    mean/=n;
    return mean;
    }
    double stdev(double data[], int n)
    {
    double sum_deviation=0.0;
    double mean=0.0;
    int i;
    for(i=0; i<n;++i)
    {
    mean+=data[i];
    }
    mean/=n;
    for(i=0; i<n;++i)
    {
    sum_deviation+=(data[i]-mean)*(data[i]-mean);
    }
    return sqrt(sum_deviation/n);
    }
    void shuffle(int *array, size_t n)
    {
    if (n > 1)
    {
    time_t t;
    size_t i;
    srand((unsigned) time(&t));
    for (i = 0; i < n - 1; i++)
    {
    size_t j = i + rand() / (RAND_MAX / (n - i) + 1);
    int t = array[j];
    array[j] = array[i];
    array[i] = t;
    }
    }
    }

    Then, in my main function I load my data into train_data_v1:
    struct fann_train_data *train_data = NULL, *train_data_v1 = NULL,*test_data = NULL;
    train_data_v1 = fann_read_train_from_file("../train_data/my_train_data.dat");

    Observation: For my scenario, my output data is binary ( 0 or 1 )

    Then I split my data randomly into 80% train vs 20% test:
    fann_shuffle_train_data(train_data_v1);
    int size,size_train,size_test;
    double procent_train=0.8;
    size = (int) fann_length_train_data(train_data_v1);
    size_train = (int) size*procent_train;
    size_test = (int) (size-size_train);

    Then I compute the average and the standard deviation for all my data and store them in the media and standard_deviation variables:

    unsigned int i = 0, j = 0;
    double data[size];
    double media[num_input];
    double standard_deviation[num_input];
    for (j=0; j<num_input; j++)
    {
    for(i = 0; i < size; i++)
    {
    data[i] = (double) train_data_v1->input[i][j];
    }
    media[j]=average(data,(int) size);
    standard_deviation[j]=stdev(data,(int) size);
    }

    I split my data using a vector:

    for(i = 0; i < (int) (vect_imp_size*procent_train); i++)
    {
    vect_imp[i]=1;
    }
    for(i = (int) (vect_imp_size*procent_train); i < vect_imp_size; i++)
    {
    vect_imp[i]=0;
    }
    shuffle(vect_imp,vect_imp_size);

    Then I create my train_data and my test_data networks,

    train_data=fann_create_train(size_train,num_input,num_output);
    test_data=fann_create_train(size_test,num_input,num_output);

    I then populate my train and test data and define my binary output data to be either -1 or 1.

    int tr=0, te=0;
    for(l = 0; l < size; l++)
    {
    if(vect_imp[(int)(vect_imp_size*l/size)]==1 && tr<size_train)
    {
    for (m=0; m<num_input; m++)
    {
    if(standard_deviation[m]!=0.0)
    {
    train_data->input[tr][m]= (double) ((train_data_v1->input[l][m]-media[m])/standard_deviation[m]);
    }
    else
    {
    train_data->input[tr][m]= (double) 0.0;
    }
    }
    for (p=num_output_start; p<num_output_start+num_output; p++)
    {
    train_data->output[tr][p-num_output_start]= (double) -1+2*train_data_v1->output[l][p];
    }
    tr++;
    }
    else
    {
    if(vect_imp[(int)(vect_imp_size*l/size)]==0 && te<size_test)
    {
    for (m=0; m<num_input; m++)
    {
    if(standard_deviation[m]!=0.0)
    {
    test_data->input[te][m]= (double) ((train_data_v1->input[l][m]-media[m])/standard_deviation[m]);
    }
    else
    {
    test_data->input[te][m]= (double) 0.0;
    }
    }
    for (p=num_output_start; p<num_output_start+num_output; p++)
    {
    test_data->output[te][p-num_output_start]= (double) -1+2*train_data_v1->output[l][p];
    }
    te++;
    }
    else
    {
    if ((te>=size_test && tr>=size_train) || te+tr>=size) l=size;
    }
    }
    }

    Then I define my FANN training parameters:

    fann_set_training_algorithm(train_ann,FANN_TRAIN_RPROP);
    fann_set_activation_function_hidden(train_ann,FANN_SIGMOID_SYMMETRIC);
    fann_set_activation_function_output(train_ann,FANN_SIGMOID_SYMMETRIC);
    fann_set_learning_rate(train_ann,0.25f);
    fann_set_learning_momentum(train_ann,0.0f);

    fann_set_train_error_function(train_ann, FANN_ERRORFUNC_LINEAR);
    fann_set_activation_steepness_hidden(train_ann, 0.25f);
    fann_set_activation_steepness_output(train_ann, 0.75f);
    fann_set_quickprop_decay(train_ann,-0.0001f);
    fann_set_quickprop_mu(train_ann,1.75f);
    fann_set_rprop_increase_factor(train_ann,1.2f);
    fann_set_rprop_decrease_factor(train_ann,0.5f);
    fann_set_rprop_delta_min(train_ann,0.0f);
    fann_set_rprop_delta_max(train_ann,50.0f);

    fann_shuffle_train_data(train_data);
    fann_init_weights(train_ann,train_data);
    fann_set_train_stop_function(train_ann,FANN_STOPFUNC_MSE);

    And then I train, storing my ANNs in a MinANN[] vector:

    float TRAIN_ann_MSE = 1.;
    float TEST_ann_MSE = 1.;
    float OCS_ann_MSE = (TRAIN_ann_MSE + TEST_ann_MSE)/2;
    float error;

    for(i = 1; i <= max_epochs; i++)
    {

    error = fann_train_epoch(train_ann, train_data);

    double trainMSE=fann_get_MSE(train_ann);
    double testMSE=-1;
    unsigned int newBitFail=fann_get_bit_fail(train_ann);

    fann_reset_MSE(train_ann);
    fann_test_data(train_ann,test_data);
    testMSE=fann_get_MSE(train_ann);
    fann_test_data(train_ann,train_data);

    if(trainMSE <= desired_error) { desired_error_reached = 0; } else { desired_error_reached =-1; }

    // Memorizing begin
    if(i==1)
    {
    for(q=0;q<3;q++)
    {
    MinANN[q]=fann_copy(train_ann);
    MinTrainingMSE[q]=trainMSE;
    MinTestingMSE[q]=testMSE;
    }
    }

    // Minimum Training MSE
    if(MinTrainingMSE[0]> trainMSE )
    {
    if( MinANN[0]) fann_destroy( MinANN[0]);
    MinANN[0]=fann_copy(train_ann);
    MinTrainingMSE[0]=trainMSE;
    MinTestingMSE[0]=testMSE;

    }

    // Minimum Testing MSE
    if(MinTestingMSE[1]> testMSE )
    {
    if( MinANN[1]) fann_destroy( MinANN[1]);
    MinANN[1]=fann_copy(train_ann);
    MinTrainingMSE[1]=trainMSE;
    MinTestingMSE[1]=testMSE;

    }
    // Minimum (Training MSE + Testing MSE )/2
    if((MinTestingMSE[2]+ MinTrainingMSE[2])> (trainMSE + testMSE) )
    {
    if( MinANN[2]) fann_destroy( MinANN[2]);
    MinANN[2]=fann_copy(train_ann);
    MinTrainingMSE[2]=trainMSE;
    MinTestingMSE[2]=testMSE;

    }

    // print current output
    if(epochs_between_reports && (i % epochs_between_reports == 0 || i == max_epochs || i == 1 || desired_error_reached == 0) )
    {
    printf("Epochs%5d. TRAIN: %f TEST: %f OCS: %f BitFail: %d\n", i,trainMSE ,testMSE,((trainMSE + testMSE))/2 , newBitFail);

    }

    if(desired_error_reached == 0)
    break;
    }

    After my ANN is trained, I save the output:

    char path[512];
    strcpy(path, "../mypath/");
    strcat(path, "min_TRAIN_MSE.net");
    fann_save(MinANN[0], path);

    strcpy(path, "../mypath/");
    strcat(path, "min_TEST_MSE.net");
    fann_save(MinANN[1], path);

    strcpy(path, "../mypath/");
    strcat(path, "min_OCS_MSE.net");
    fann_save(MinANN[2], path);

    I hope this helps.

  36. Andrew says:

    I’m getting an error when I call the f2M_train_on_file command, having previously set up a network using the f2M_create_standard command. The error I’m getting from within an MT4 script is 2016.09.14 22:12:00.134 CreateFANNConfigFile EURUSDbo,M1: stack damaged, check DLL function call in ‘CreateFANNConfigFile.mq4’ (50,4)
    Line 50 is the f2M_train_on_file command, looking like this in the code: f2M_train_on_file(ann,input_full_name,max_epochs,permitted_error);
    Any ideas on how to fix this please? Thanks in advance.

    • emsi says:

      Please use the workaround described by zi10ge.
      In other words redefine the function as follows:
      int f2M_train_on_file(int ann, char &filename[], int max_epoch, double desired_error);
      and use it accordingly.

      I’m about to account for that in the upcoming release.

  37. Gustavo says:

    Hi, I just started using this library. It looks very good, but it is missing some functions. Are there any plans to include functions like fann_train_on_data()? I’m trying to train the net faster.
    Thanks

    • emsi says:

      This question has already been answered.
      You can use FannTool or FannExplorer to train on data sets.
      MQL works on a chart and is tick based. Training on data would take too much time to complete between ticks. You have to handle that outside of MQL.

  38. autratec says:

    Hi, I am facing this issue when I try to compile the source code:

    'path' - parameter conversion not allowed

    the error was related to:
    ann = f2M_create_from_file (path);
    and
    ret = f2M_save (ann, path);

    Can anyone help here?

    • Mete YALCİNER says:

      I changed these two functions like this:
      int
      ann_load (string path)
      {
      int ann = -1;

      /* Load the ANN */
      uchar p[];
      StringToCharArray(path,p,0,-1,CP_ACP);
      ann = f2M_create_from_file (p);
      if (ann != -1) {
      debug (1,
      "ANN: '" + path + "' loaded successfully with handler " + ann);
      }
      if (ann == -1) {

      /* Create ANN */
      ann =
      f2M_create_standard (4, AnnInputs, AnnInputs, AnnInputs / 2 + 1,
      1);
      f2M_set_act_function_hidden (ann, FANN_SIGMOID_SYMMETRIC_STEPWISE);
      f2M_set_act_function_output (ann, FANN_SIGMOID_SYMMETRIC_STEPWISE);
      f2M_randomize_weights (ann, -0.4, 0.4);
      debug (1,
      "ANN: '" + path + "' created successfully with handler " +
      ann);
      }
      if (ann == -1) {
      debug (0, "ERROR INITIALIZING NETWORK!");
      }
      return (ann);
      }

      void
      ann_save (int ann, string path)
      {
      int ret = -1;
      uchar p[];
      StringToCharArray(path,p,0,-1,CP_ACP);
      ret = f2M_save (ann, p);
      debug (1, "f2M_save(" + ann + ", " + path + ") returned: " + ret);
      }

      void
      ann_destroy (int ann)
      {
      int ret = -1;
      ret = f2M_destroy (ann);
      debug (1, "f2M_destroy(" + ann + ") returned: " + ret);
      }

      double
      ann_run (int ann, double &vector[])
      {
      int ret;
      double out;
      ret = f2M_run (ann, vector);
      if (ret < 0) {
      debug (0, "Network RUN ERROR! ann=" + ann);
      return (FANN_DOUBLE_ERROR);
      }
      out = f2M_get_output (ann, 0);
      debug (3, "f2M_get_output(" + ann + ") returned: " + out);
      return (out);
      }
      it worked fine :)
