SNYP40A1

Best Open-source Data Smoother?

Recommended Posts

I am looking for an open-source smoother to down-sample data. I know there are many, such as the Hull Moving Average, but my data is logged at the millisecond scale at non-fixed time intervals, and I want to sample it at fixed time intervals on the second / minute / hour range. Does anyone know of a good smoother for this? The key is that the input data is not logged at a constant frequency.

Aren't sampling and smoothing different issues? I'm not sure exactly what you want to end up with. The 'normal' way of dealing with the data would be to sample it (highest high / lowest low of the sample period) and then smooth it. You could of course smooth the data in each sample first (e.g. take an hour's data, average it, and use that price as your sample value for that discrete period).

 

As to actual smoothing algorithms, there are a whole bunch here (and other places too).
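A minimal sketch of the sample-then-smooth idea above, in Java since the OP mentions a Java platform later in the thread; the fixed bucket width and the parallel time/price arrays are illustrative assumptions, not anyone's actual code:

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class BucketAggregator {
    /** Simple per-bucket summary: highest high, lowest low, plain average. */
    public static class Bucket {
        public final long startMs;
        public double high = Double.NEGATIVE_INFINITY;
        public double low = Double.POSITIVE_INFINITY;
        public double sum = 0;
        public int count = 0;
        Bucket(long startMs) { this.startMs = startMs; }
        public double avg() { return sum / count; }
    }

    /** Groups irregularly spaced ticks into fixed-width time buckets. */
    public static List<Bucket> aggregate(long[] times, double[] prices, long bucketMs) {
        Map<Long, Bucket> byStart = new TreeMap<>();       // sorted by bucket start time
        for (int i = 0; i < times.length; i++) {
            long start = (times[i] / bucketMs) * bucketMs; // bucket the tick falls in
            Bucket b = byStart.computeIfAbsent(start, Bucket::new);
            b.high = Math.max(b.high, prices[i]);
            b.low = Math.min(b.low, prices[i]);
            b.sum += prices[i];
            b.count++;
        }
        return new ArrayList<>(byStart.values());
    }
}

Since the per-bucket averages now sit on a uniform grid (apart from empty buckets, which would need the previous value carried forward), any ordinary moving average can be run over them afterwards.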

It's basically an exponential moving average...that's what would allow smoothing and down-sampling in the same pass. The idea is that as data points come in, they are entered into a distribution such that each point's weight decreases with time. The average (or preferably the median) of the distribution can then be sampled at any time to get the value that most closely represents the current state. One way to get that sample is to re-evaluate every point still in the distribution, but I think there is a more efficient way, and that's what I am looking for. The key is a smoothing function where the input data does not have a fixed period.

This is for plotting tick data on a time vs. price chart where each pixel may correspond to seconds or minutes rather than milliseconds. How do most software packages handle this? I think they just group everything into time periods and then average each period, as you suggested. That would provide some smoothing...it just seems like it would be more accurate to smooth and down-sample at the same time rather than smooth already down-sampled data. I'll think about this some more; the accuracy difference might be insignificant.
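One way to get the more efficient update described above, sketched in Java (the class name and decay constant are arbitrary choices): an exponentially time-decayed average whose decay depends on elapsed wall-clock time rather than tick count, so each tick is an O(1) update and the estimate can be read at any instant. The time-decayed median the post prefers does not reduce to two running sums like this; it would still need the whole distribution kept around and re-weighted at query time.

/**
 * Exponentially time-decayed average over irregularly spaced ticks.
 * Each tick's weight decays with elapsed wall-clock time, not tick count,
 * so the update is O(1) per tick and the estimate can be read at any moment.
 */
public class TimeDecayedAverage {
    private final double tauMs;      // decay time constant in milliseconds (e.g. 60_000 for ~1 min)
    private double weightedSum = 0;  // sum of (decayed weight * price)
    private double totalWeight = 0;  // sum of decayed weights
    private long lastTimeMs = Long.MIN_VALUE;

    public TimeDecayedAverage(double tauMs) { this.tauMs = tauMs; }

    /** Feed one tick; ticks may arrive at arbitrary, non-fixed intervals. */
    public void update(long timeMs, double price) {
        if (lastTimeMs != Long.MIN_VALUE) {
            // Decay all previously accumulated weight by the time elapsed since the last tick.
            double decay = Math.exp(-(timeMs - lastTimeMs) / tauMs);
            weightedSum *= decay;
            totalWeight *= decay;
        }
        weightedSum += price;        // the newest tick enters with weight 1
        totalWeight += 1.0;
        lastTimeMs = timeMs;
    }

    /**
     * Current smoothed estimate. Between ticks the decay cancels in the ratio,
     * so the estimate simply holds until the next update.
     */
    public double value() {
        return totalWeight > 0 ? weightedSum / totalWeight : Double.NaN;
    }
}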

Are you weighting by time or actually by position in the time series? Will later data affect the weights of earlier data? (That becomes processor-intensive.) Just a couple of things to consider.

 

What platform btw?

The keyword to search for in the literature is "non-uniform sampling," and most algorithms I'm aware of interpolate to find uniform samples and then do typical DSP (choose your favorite low-pass filter) on the uniform samples. How accurate you need the results to be will dictate how complex the interpolation process is. Of course for some purposes you could apply a (possibly non-linear) least-squares fitted model, or a spline etc. to the data, and call the resulting model the "smoothed" data with no further filtering.
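To make the interpolate-then-filter route concrete, a rough Java sketch (the grid step and filter coefficient are arbitrary illustrations): linearly interpolate the irregular ticks onto a uniform grid, then run a single-pole low-pass over the uniform samples. Any preferred FIR/IIR filter could replace that last step.

public class UniformResampler {
    /**
     * Linearly interpolates irregular samples onto a uniform grid with the given
     * step, starting at times[0]. Assumes at least two samples with strictly
     * increasing timestamps (in milliseconds).
     */
    public static double[] toUniform(long[] times, double[] values, long stepMs) {
        int n = (int) ((times[times.length - 1] - times[0]) / stepMs) + 1;
        double[] out = new double[n];
        int j = 0;                                   // index of the current segment start
        for (int i = 0; i < n; i++) {
            long t = times[0] + i * stepMs;
            while (j < times.length - 2 && times[j + 1] <= t) j++;
            double frac = (double) (t - times[j]) / (times[j + 1] - times[j]);
            frac = Math.max(0, Math.min(1, frac));   // clamp at the ends
            out[i] = values[j] + frac * (values[j + 1] - values[j]);
        }
        return out;
    }

    /** Simple single-pole low-pass (EMA) over the now-uniform samples. */
    public static double[] lowPass(double[] x, double alpha) {
        double[] y = new double[x.length];
        y[0] = x[0];
        for (int i = 1; i < x.length; i++)
            y[i] = alpha * x[i] + (1 - alpha) * y[i - 1];
        return y;
    }
}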

Edited by RichardTodd
added model suggestion

Weighting by time rather than by position in the time series would be more accurate. If I weighted only by position, then I could basically apply any moving average -- that would essentially be treating the data as uniformly sampled. I could go even more complicated: weight the data by time as well as volume to reflect more of the instantaneous "force" of the market, or weight by bid vs. ask bias, etc. Although at that point I have a new indicator rather than just a plot of the data. For now I simply want to feed in the tick data and, for any given point in time, get an estimate of the price at that point. What I plan to do now is simply take the last trade at or before my desired sample point. It's been a while since I took a signals and systems class, but I think this process is referred to as zero-order hold:

 

Continuous/Discrete Conversions of LTI Models :: Operations on LTI Models (Control System Toolbox™)

 

I am developing my own platform in Java -- at least the analysis part.
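A zero-order hold over irregular ticks is only a few lines in Java; this sketch (the array layout is an assumption) takes the last trade at or before each requested sample time in a single pass over both arrays:

public class ZeroOrderHold {
    /**
     * For each requested sample time, returns the price of the last tick at or
     * before that time (zero-order hold). Both arrays must be sorted ascending,
     * with times in milliseconds.
     */
    public static double[] sample(long[] tickTimes, double[] tickPrices, long[] sampleTimes) {
        double[] out = new double[sampleTimes.length];
        int j = 0;
        for (int i = 0; i < sampleTimes.length; i++) {
            while (j + 1 < tickTimes.length && tickTimes[j + 1] <= sampleTimes[i]) j++;
            // If the sample time precedes the first tick, there is nothing to hold yet.
            out[i] = (tickTimes[j] <= sampleTimes[i]) ? tickPrices[j] : Double.NaN;
        }
        return out;
    }
}

Generating the sample times at a fixed step (one per chart pixel, say) gives exactly the hold-the-last-trade behaviour described above.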

Non-uniform sampling -- that's what I am looking for. I thought about doing a least-squares / linear regression fit, but that's probably getting too complicated...all I am really trying to do at this point is plot the data. In NinjaTrader, it looks like they simply group all the data into intervals (1-minute, 5-minute, 233-tick, etc.), plot the close of each bar, and hold that level until the next close. Essentially, they are doing a zero-order hold. That's simple and it works, but it seems like there must be an algorithm that would give a much smoother plot for very little effort.
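One cheap step up from plotting the held close, sketched in Java with the same assumed array layout as the earlier sketches: a time-weighted average price per bar, where each tick is weighted by how long it remained the last trade inside the bar. It is still causal and only slightly more work than a zero-order hold.

public class TimeWeightedBars {
    /**
     * Time-weighted average price for one fixed-width bar: each tick's price is
     * weighted by the time it remained the most recent trade inside the bar.
     * Assumes tick times (ms) are sorted ascending; returns NaN for an empty bar.
     */
    public static double twap(long[] times, double[] prices, long barStartMs, long barEndMs) {
        double weighted = 0;
        long covered = 0;
        double lastPrice = Double.NaN;          // price in effect at the start of each segment
        long lastTime = barStartMs;
        for (int i = 0; i < times.length; i++) {
            if (times[i] < barStartMs) { lastPrice = prices[i]; continue; } // price entering the bar
            if (times[i] >= barEndMs) break;
            if (!Double.isNaN(lastPrice)) {
                weighted += lastPrice * (times[i] - lastTime);
                covered += times[i] - lastTime;
            }
            lastPrice = prices[i];
            lastTime = times[i];
        }
        if (!Double.isNaN(lastPrice)) {          // hold the last price to the end of the bar
            weighted += lastPrice * (barEndMs - lastTime);
            covered += barEndMs - lastTime;
        }
        return covered > 0 ? weighted / covered : Double.NaN;
    }
}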

SSA (singular spectrum analysis) is a tool that is often used to fill in the gaps in univariate time-series data; after that you can sample at fixed time intervals. The problem, of course, is that the technique is not causal, so you can't use it in real time, but it works fine on historical data.
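For reference, the core of a basic (non-causal) SSA reconstruction looks roughly like this in Java, assuming a uniformly sampled input series and Apache Commons Math for the SVD; the window length and rank are user choices, and proper gap filling adds an iterative imputation step on top of this that is not shown:

import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.RealMatrix;
import org.apache.commons.math3.linear.SingularValueDecomposition;

public class SsaSmoother {
    /** Reconstructs the series from its first `rank` SSA components. */
    public static double[] smooth(double[] x, int window, int rank) {
        int n = x.length;
        int k = n - window + 1;                     // number of lagged columns
        // Trajectory (Hankel) matrix: column j holds x[j .. j+window-1].
        double[][] traj = new double[window][k];
        for (int i = 0; i < window; i++)
            for (int j = 0; j < k; j++)
                traj[i][j] = x[i + j];
        SingularValueDecomposition svd =
                new SingularValueDecomposition(new Array2DRowRealMatrix(traj, false));
        double[] s = svd.getSingularValues();
        // Keep only the leading components (the low-frequency "signal").
        RealMatrix xr = new Array2DRowRealMatrix(window, k);
        for (int r = 0; r < Math.min(rank, s.length); r++) {
            RealMatrix u = svd.getU().getColumnMatrix(r);
            RealMatrix v = svd.getV().getColumnMatrix(r);
            xr = xr.add(u.multiply(v.transpose()).scalarMultiply(s[r]));
        }
        // Diagonal averaging (Hankelization) maps the matrix back to a series.
        double[] y = new double[n];
        int[] count = new int[n];
        for (int i = 0; i < window; i++)
            for (int j = 0; j < k; j++) {
                y[i + j] += xr.getEntry(i, j);
                count[i + j]++;
            }
        for (int t = 0; t < n; t++) y[t] /= count[t];
        return y;
    }
}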

Actually, the longer I look at this problem, the less I see wrong with applying a moving average to the data and ignoring the non-uniform sampling issue. Run a moving average and, at the given point of interest, just take the last update of the MA. The reason I think this approach is valid is that signal reconstruction, as long as the Nyquist criterion is satisfied, does not require uniform sampling -- I read that statement somewhere on comp.dsp, but I have not been able to verify it. Can anyone confirm?
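A throwaway sketch of that approach in Java, with an arbitrary window length: a plain N-tick simple moving average updated on every tick, whose latest value is held as the sample at any point of interest.

import java.util.ArrayDeque;
import java.util.Deque;

public class TickSma {
    private final int window;                 // number of most recent ticks to average
    private final Deque<Double> prices = new ArrayDeque<>();
    private double sum = 0;
    private double last = Double.NaN;         // most recent MA value, held between ticks

    public TickSma(int window) { this.window = window; }

    /** Update on every tick, ignoring how far apart the ticks are in time. */
    public void update(double price) {
        prices.addLast(price);
        sum += price;
        if (prices.size() > window) sum -= prices.removeFirst();
        last = sum / prices.size();
    }

    /** Sample at any point of interest: just the last update of the MA. */
    public double sample() { return last; }
}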
