I am working in MATLAB.

I have data samples of two unrelated variables at 256 time-steps. Their plots, with value on the Y-axis and time-step on the X-axis, are shown below.

Typical plot for the first variable, say Pos: [plot image]

Typical plot for the second variable, say Vel: [plot image]

Now I need to predict the values of these variables at the next 10 time-steps. To compare various machine learning techniques for this, I took the values of the variables at the first 246 time-steps, predicted the next 10 time-steps, and then compared the predictions with the actual values by computing the mean square error, say ms_error.
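The hold-out evaluation described above can be sketched in MATLAB roughly as follows (`pos` stands for one variable's 256 samples, and `some_model_predict` is a hypothetical placeholder for whichever model is under test, e.g. a NAR network or a regression):

```matlab
% pos: 256x1 vector of samples for one variable (placeholder name)
train  = pos(1:246);     % first 246 time-steps, used for fitting
actual = pos(247:256);   % last 10 time-steps, held out for testing

% predicted: 10x1 vector produced by the model being evaluated;
% some_model_predict is a hypothetical helper, not a MATLAB built-in
predicted = some_model_predict(train, 10);

ms_error = mean((predicted - actual).^2);   % mean square error
```

The same loop is then repeated for each candidate technique so the ms_error values can be compared directly.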

I have done this using time series (NAR), linear regression, fuzzy inference systems, and neural networks, but none of these gives an ms_error lower than 2. Can someone suggest a learning algorithm to predict future values for data samples like these two?

asked May 21 '14 at 01:32


nishantsny


One Answer:

If all the data you have are the two plotted sequences, I doubt that machine learning will produce anything useful unless you just get lucky. There is a chance that looking at something like a Fourier or Laplace transform of the signals will reveal some hidden structure that is not apparent in the time domain, in which case you may be able to use that. However, to do any useful prediction on signals as complex (noisy?) as these, I expect you will need either a lot more data or prior knowledge about how these signals are produced.
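As a minimal sketch of the frequency-domain check suggested above, assuming `pos` holds the 256 samples of one variable, the magnitude spectrum can be inspected in MATLAB like this:

```matlab
% pos: 256x1 vector of samples (placeholder name)
N = numel(pos);
P = abs(fft(pos)) / N;            % magnitude spectrum
f = (0:N-1) / N;                  % normalized frequency axis
plot(f(1:N/2), P(1:N/2));         % one-sided spectrum is enough for real data
xlabel('normalized frequency');
ylabel('|FFT| magnitude');
% A few dominant peaks would indicate periodic structure that could
% support extrapolation; a flat, noise-like spectrum would suggest
% there is little for any predictor to exploit.
```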

answered May 22 '14 at 02:12


Daniel Mahler

powered by OSQA

User submitted content is under Creative Commons: Attribution - Share Alike; Other things copyright (C) 2010, MetaOptimize LLC.