
SNYP40A1

Best Open-source Data Smoother?


I am looking for an open-source smoother to down-sample data. I know there are many, such as the Hull Moving Average, but I have data logged at the millisecond scale at non-fixed time intervals, and I want to sample it at fixed time intervals in the second / minute / hour range. Does anyone know of a good smoother for this? The key constraint is that the input data is not logged at a constant frequency.


Aren't sampling and smoothing different issues? I'm not sure exactly what you want to end up with. The 'normal' way of dealing with the data would be to sample it (highest high / lowest low of the sample period) and then smooth it. You could, of course, smooth the data in each sample first (e.g. take an hour's data, average it, and use that price as your sample value for that discrete period).
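That smooth-each-sample-first idea (average all ticks falling in a fixed-width bucket) can be sketched in Java; the class and method names here are purely illustrative, not from any library:

```java
import java.util.Map;
import java.util.TreeMap;

// Illustrative sketch: group non-uniformly spaced ticks into fixed-width
// time buckets and use the mean price of each bucket as its sample value.
class BucketAverager {
    // timesMs and prices are parallel arrays of ticks; bucketMs is the
    // sample period (e.g. 3_600_000 for hourly buckets).
    static TreeMap<Long, Double> average(long[] timesMs, double[] prices, long bucketMs) {
        TreeMap<Long, double[]> acc = new TreeMap<>(); // bucketStart -> {sum, count}
        for (int i = 0; i < timesMs.length; i++) {
            long bucket = (timesMs[i] / bucketMs) * bucketMs;
            double[] sc = acc.computeIfAbsent(bucket, k -> new double[2]);
            sc[0] += prices[i];
            sc[1] += 1;
        }
        TreeMap<Long, Double> out = new TreeMap<>();
        for (Map.Entry<Long, double[]> e : acc.entrySet()) {
            out.put(e.getKey(), e.getValue()[0] / e.getValue()[1]);
        }
        return out;
    }
}
```

Because each bucket's average uses however many ticks actually fell in it, the non-fixed logging frequency doesn't matter; sparse buckets simply average fewer points.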

 

As to actual smoothing algorithms, there are a whole bunch here (and in other places too).


It's basically an exponential moving average... that's what would allow for smoothing and down-sampling in the same pass. The idea is that as data points come in, they are entered into a distribution such that each data point's weight in the distribution decreases with time. Then the average, or preferably the median, of the distribution can be sampled at any time to get the value most closely representing the real-time sample average.

One way to get the sample is to re-evaluate all the data in the distribution to find whatever info is of interest, but I think there is a more efficient way; that's what I am looking for. The key is a smoothing function where the input data does not have a fixed period. It would be used for plotting tick data on a time vs. price chart where each pixel may correspond to seconds or minutes rather than microseconds.

How do most software packages handle this issue? I think they just group everything into time periods and then average each period, as you suggested. That would provide some smoothing... it just seems like it would be more accurate to smooth and down-sample at the same time rather than smoothing already down-sampled data. I'll think about this some more... the accuracy difference might be insignificant.
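One common single-pass way to get that time-decaying weight is an irregular-interval EMA, where the smoothing coefficient depends on the gap since the previous tick. A hedged Java sketch (the class, field, and parameter names are my own, not from any library; `tauMs` is an assumed decay time constant):

```java
// Illustrative sketch: exponential moving average for irregularly spaced
// ticks. Each update weights the new price by 1 - exp(-dt/tau), so a long
// gap between ticks moves the average further toward the new price.
class IrregularEma {
    private final double tauMs; // decay time constant in milliseconds
    private double value;
    private long lastTimeMs;
    private boolean seeded = false;

    IrregularEma(double tauMs) { this.tauMs = tauMs; }

    // Feed one tick; returns the updated smoothed value.
    double update(long timeMs, double price) {
        if (!seeded) {
            value = price;
            lastTimeMs = timeMs;
            seeded = true;
            return value;
        }
        double dt = timeMs - lastTimeMs;
        double alpha = 1.0 - Math.exp(-dt / tauMs); // gap-dependent weight
        value += alpha * (price - value);
        lastTimeMs = timeMs;
        return value;
    }

    double value() { return value; }
}
```

This smooths and down-samples in the same pass in the sense the post describes: feed every tick in, then read `value()` whenever a plotted sample is needed.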


Are you weighting by time or actually by position in the time series? Will later data affect the weights of earlier data? (That becomes processor intensive.) Just a couple of things to consider.

 

What platform btw?


The keyword to search for in the literature is "non-uniform sampling," and most algorithms I'm aware of interpolate to find uniform samples and then do typical DSP (choose your favorite low-pass filter) on the uniform samples. How accurate you need the results to be will dictate how complex the interpolation process is. Of course, for some purposes you could apply a (possibly non-linear) least-squares fitted model, a spline, etc. to the data, and call the resulting model the "smoothed" data with no further filtering.

Edited by RichardTodd
added model suggestion
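That interpolate-then-filter pipeline can be sketched in Java; all names here are illustrative, and linear interpolation plus a simple moving average stand in for whatever interpolation scheme and low-pass filter you actually choose:

```java
// Illustrative sketch: resample non-uniform ticks onto a uniform grid by
// linear interpolation, then low-pass filter the uniform samples with a
// simple moving average.
class UniformResampler {
    // timesMs must be ascending. Returns prices at gridStart,
    // gridStart + stepMs, ..., gridStart + (n-1)*stepMs.
    static double[] interpolate(long[] timesMs, double[] prices,
                                long gridStart, long stepMs, int n) {
        double[] out = new double[n];
        int j = 0;
        for (int i = 0; i < n; i++) {
            long t = gridStart + i * stepMs;
            while (j + 1 < timesMs.length && timesMs[j + 1] <= t) j++;
            if (j + 1 >= timesMs.length || timesMs[j] >= t) {
                out[i] = prices[j]; // clamp before the first / after the last tick
            } else {
                double frac = (double) (t - timesMs[j]) / (timesMs[j + 1] - timesMs[j]);
                out[i] = prices[j] + frac * (prices[j + 1] - prices[j]);
            }
        }
        return out;
    }

    // Trailing moving average; uses an expanding window for the first
    // (window - 1) samples so the output has the same length as the input.
    static double[] movingAverage(double[] x, int window) {
        double[] out = new double[x.length];
        double sum = 0;
        for (int i = 0; i < x.length; i++) {
            sum += x[i];
            if (i >= window) sum -= x[i - window];
            out[i] = sum / Math.min(i + 1, window);
        }
        return out;
    }
}
```

Any other FIR/IIR filter could replace `movingAverage` once the samples are uniform; that is the point of interpolating first.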


 

Weighting by time rather than by position in the time series would be more accurate. If I weighted only by position, then I could basically apply any moving average -- that would essentially be treating the data as uniformly sampled.

I could go even more complicated: weight the data by time as well as volume to reflect more of the instantaneous "force" of the market, or weight by bid vs. ask bias, etc. Although at that point I'd have a new indicator, beyond just plotting the data. For now I simply want to feed in the tick data and, for any given point in time, get an estimate of the price at that point. What I plan to do now is simply take the last trade behind my desired sample point. It's been a while since I took a signals and systems class, but I think this process is referred to as zero-order hold:

 

Continuous/Discrete Conversions of LTI Models :: Operations on LTI Models (Control System Toolbox™)

 

I am developing my own platform in Java -- at least the analysis part.
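Since the analysis code is in Java, the last-trade-behind-the-sample-point (zero-order hold) lookup could be sketched like this, assuming the ticks are stored as sorted parallel arrays (my assumption, not necessarily the poster's actual data structure):

```java
import java.util.Arrays;

// Illustrative sketch: zero-order hold over tick data. For a requested
// sample time t, return the price of the last trade at or before t.
class ZeroOrderHold {
    // timesMs must be ascending; prices is parallel to timesMs.
    static double sample(long[] timesMs, double[] prices, long t) {
        int idx = Arrays.binarySearch(timesMs, t);
        if (idx < 0) idx = -idx - 2; // insertion point minus one = last tick <= t
        if (idx < 0) return prices[0]; // t precedes all ticks; clamp to first
        return prices[idx];
    }
}
```

The binary search makes each chart-pixel lookup O(log n) over the tick history, which matters when each pixel's sample time can fall between millisecond-spaced trades.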


Non-uniform sampling -- that's what I am looking for. I thought about doing least-squares analysis / linear regression, but I think that's probably getting too complicated... I guess all I am really trying to do at this point is plot the data. In NinjaTrader, it looks like they simply group all the data into intervals (1-minute, 5-minute, 233-tick, etc.), plot the close of each bar, and hold that level until the next close. Essentially, they are doing zero-order hold. That's simple and it works, but it seems like there must be an algorithm I could use to get a much smoother plot for very little effort.


SSA (singular spectrum analysis) is a tool that is often used to fill in gaps in univariate time series data. Then you can sample at fixed time intervals. The problem, of course, is that this technique is not causal, so you can't use it in real time, but it will work fine on historical data.


Actually, the longer I look at this problem, the less I see anything wrong with applying a moving average to the data and ignoring the non-uniform sampling issue: run a moving average and, at the given point of interest, just take the last update of the MA. The reason I think this approach is valid is that signal reconstruction, as long as the Nyquist criterion is satisfied, does not require uniform sampling -- I read that statement somewhere on comp.dsp, but I have not been able to verify it. Can anyone confirm?
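One way to run a moving average directly on the non-uniform ticks is to average over a fixed time window rather than a fixed tick count, then read the latest value at the point of interest. A hedged Java sketch (all names are illustrative):

```java
import java.util.ArrayDeque;

// Illustrative sketch: moving average over a fixed time window (not a fixed
// number of ticks), updated tick-by-tick on irregularly spaced data.
class TimeWindowAverage {
    private final long windowMs;
    private final ArrayDeque<double[]> ticks = new ArrayDeque<>(); // {timeMs, price}
    private double sum = 0.0;

    TimeWindowAverage(long windowMs) { this.windowMs = windowMs; }

    // Feed one tick; returns the average over the trailing window.
    double update(long timeMs, double price) {
        ticks.addLast(new double[]{timeMs, price});
        sum += price;
        // Evict ticks strictly older than the trailing window.
        while (ticks.peekFirst()[0] < timeMs - windowMs) {
            sum -= ticks.pollFirst()[1];
        }
        return sum / ticks.size();
    }
}
```

Each tick is added and evicted at most once, so the running sum keeps updates O(1) amortized regardless of how densely the milliseconds are populated.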

