Millis Accuracy Again

If you need it that accurate, I would use one of the Maxim I2C/SPI RTCs, like the DS3231 or DS3234.
Those spit out very accurate time; read it as often as you want to keep the Arduino synced.

Accuracy ±2ppm from 0°C to +40°C
Accuracy ±3.5ppm from -40°C to +85°C

Compare its frequency stability to the best crystal one can find:

Frequency Stability ±10ppm
Frequency Tolerance ±10ppm
Operating Temperature -20°C ~ 70°C
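If you do go the DS3231 route, note that its time registers hold packed BCD (per the datasheet register map), so any read needs a decode step. A minimal sketch of just that conversion, with no I2C code shown:

```cpp
#include <cstdint>
#include <cassert>

// The DS3231's time registers (seconds, minutes, hours, ...) hold
// packed BCD: high nibble = tens digit, low nibble = ones digit.
// Whatever I2C library you read them with, you need this conversion.
inline uint8_t bcd2dec(uint8_t b) { return uint8_t((b >> 4) * 10 + (b & 0x0F)); }
inline uint8_t dec2bcd(uint8_t d) { return uint8_t(((d / 10) << 4) | (d % 10)); }
```

For example, a raw seconds register of 0x59 decodes to 59 seconds.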

Maybe I'll go back to a lookup table.

I initially started this project on an Uno and didn't want to spend RAM on a LUT (didn't think about the flash...)

Actually, I think a Taylor expansion will be good enough; it can be computed in fixed point.
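For what it's worth, here is a sketch of that idea. It is not anyone's actual routine from this thread; it's the standard second-order polynomial approximation of atan for |x| ≤ 1, done entirely in Q15 integer math, accurate to roughly 0.0015 rad:

```cpp
#include <cstdint>
#include <cassert>

// Q15 fixed point: 1.0 == 32768.  Approximation used:
//   atan(x) ~= (pi/4)x + x(1 - |x|)(0.2447 + 0.0663|x|),  |x| <= 1
int32_t atan_q15(int32_t x) {             // x in Q15, range -32768..32768
    const int32_t PI_4 = 25736;           // pi/4 in Q15
    const int32_t C1   = 8020;            // 0.2447 in Q15
    const int32_t C2   = 2173;            // 0.0663 in Q15
    int32_t ax   = x < 0 ? -x : x;
    int32_t corr = C1 + int32_t((int64_t(C2) * ax) >> 15);      // 0.2447 + 0.0663|x|
    int32_t term = int32_t((int64_t(x) * (32768 - ax)) >> 15);  // x(1 - |x|)
    return int32_t((int64_t(PI_4) * x) >> 15)
         + int32_t((int64_t(term) * corr) >> 15);
}
```

All multiplies are widened to 64 bits before the shift, so there is no intermediate overflow; no floating point is touched.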

But my problem isn't arctan! It's the timekeeping! Taylor expansions, fixed point, etc. are all solutions to a problem I don't have, because I'm not sure it's the arctan routine itself that is causing the lost ticks... it could be the SPI library (which I can't show in reproducible code, because my SPI routine requires the MCP3304 ADC to be present).

I've rolled my own millis() using TimerOne. I'll see if that works (I don't expect it will, but...)

The new timer code is in its own module, so I can add a ChronoDot or DS1307 later. I figure a 1PPS RTC is good enough; I'll interpolate the milliseconds from... millis()!

I could also use a DS1390 which has 10ms resolution, but they are not available from the local Farnell/Newark/Element14.

In my designs I use this slightly less expensive crystal
(35 cents vs 48 cents)

Frequency Stability ±50ppm
Frequency Tolerance ±30ppm

You could also sync to the 1 pulse per second from a GPS receiver.

A simple sketch printing the value of millis()/1000 every second shows that my UNO Rev 3 is losing a little over three seconds an hour, pretty consistently. I assume that's how accurate the resonator is.

I'm losing much more than that.. about 120 seconds in 4000+ seconds. :fearful:

I'll probably use an external RTC to sanity-check against millis(). Short-term accuracy of millis() is OK, so this scheme should work.

orly_andico:
I'm losing much more than that.. about 120 seconds in 4000+ seconds. :fearful:

How about posting a sketch that demonstrates the problem? Preferably with extraneous code removed.

With an external clock interrupting every 1 ms sharp and incrementing the value of an unsigned long, you'd be able to use that variable instead of millis().
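One detail worth baking into any such free-running counter: take elapsed times by unsigned subtraction, which stays correct across the 32-bit wraparound (about 49.7 days at one tick per ms). A minimal sketch:

```cpp
#include <cstdint>
#include <cassert>

// Elapsed ticks between two readings of a free-running 32-bit counter.
// Unsigned arithmetic is modulo 2^32, so the result is correct even
// when `now` has wrapped past zero and `start` has not.
inline uint32_t elapsed_ticks(uint32_t start, uint32_t now) {
    return now - start;
}
```

For example, with `start` just below the wrap point and `now` just past it, the subtraction still yields the small positive difference.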

It might be off if another interrupt is running, but ISRs are supposed to be short for a reason.

I wonder how long Serial.print() takes? One way to cut its cost is to limit serial text and send values over as packed binary, only raw or compressed data. Set up a Processing sketch to receive those bytes on the PC, process them, and build the CSV file there; your Arduino will have more cycles left for aiming and data capture.

I promise you, if you can code Arduino, you can code Processing. It will help greatly with resource headaches.
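To illustrate the packed-binary idea (the frame layout below is made up for illustration, not taken from the project): instead of printing decimal text, send a small fixed-size record and reassemble it on the PC side.

```cpp
#include <cstdint>
#include <cassert>

// Hypothetical 9-byte frame: 1 sync byte + 4-byte little-endian
// timestamp + 4-byte little-endian reading.  9 bytes per sample
// versus up to ~20 characters of comma-separated decimal text.
#define FRAME_SYNC 0xA5
#define FRAME_LEN  9

void pack_frame(uint8_t *buf, uint32_t t_ms, int32_t reading) {
    buf[0] = FRAME_SYNC;
    for (int i = 0; i < 4; i++) {
        buf[1 + i] = uint8_t(t_ms >> (8 * i));              // little-endian
        buf[5 + i] = uint8_t(uint32_t(reading) >> (8 * i));
    }
}

// Returns 1 on success, 0 if the sync byte doesn't match.
int unpack_frame(const uint8_t *buf, uint32_t *t_ms, int32_t *reading) {
    if (buf[0] != FRAME_SYNC) return 0;
    uint32_t t = 0, r = 0;
    for (int i = 3; i >= 0; i--) {
        t = (t << 8) | buf[1 + i];
        r = (r << 8) | buf[5 + i];
    }
    *t_ms = t;
    *reading = int32_t(r);
    return 1;
}
```

On the Arduino side you would `Serial.write(buf, FRAME_LEN)`; the unpack half runs in the Processing (or any PC-side) sketch.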

I tried rolling my own millis() using an ISR. The ISR only increments a volatile global (absolute minimum).

It still loses time (about 3 seconds over a 1200-second interval) compared to my NTP-synced laptop clock. So it does seem that something (the arctangent, the SPI library... or Serial.print itself) is causing both millis() and my custom ISR to lose ticks, although I've reduced the frequency of those calls to once per second (hence losing only 3 seconds rather than 60). Time to look at an external RTC. Obviously I can't use its 1PPS output as an interrupt; I'd lose time there too. I'll have to read the RTC in my main loop.
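A side note on reading such a counter from the main loop: on an 8-bit AVR a 32-bit load takes several instructions, so the tick ISR can fire mid-read and tear the value. The usual fix is cli()/sei() around the read; an alternative that keeps interrupts enabled is to re-read until two consecutive snapshots agree. A sketch of the latter (not the code used in this thread):

```cpp
#include <cstdint>
#include <cassert>

// Tick counter incremented by a timer ISR.  `volatile` forces the
// compiler to actually re-read it on every access.
static volatile uint32_t g_ticks = 0;

// Re-read until two consecutive snapshots match.  If the ISR fires
// mid-read and tears the value, the two reads will differ and we
// simply try again; no interrupt disabling required.
uint32_t read_ticks(void) {
    uint32_t a, b;
    do {
        a = g_ticks;
        b = g_ticks;
    } while (a != b);
    return a;
}
```

On a 32-bit part (or any platform where the load is naturally atomic) the loop exits on the first pass, so the cost is negligible.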

Did you try with an external oscillating source?

No, I have not... but if the millis() or other ISR is getting masked, an external source would also lose ticks. I don't have a precise external time base anyway, so the RTC seems the way to go. The PCF8583 looks like a good fit: 8-pin DIP, through-hole, 10 ms resolution.

ok someone just told me that the interrupt priority goes...

External interrupts, USB, WDT, Timer 1, Timer 0, SPI, USART...

So am I right in assuming that it's actually Serial.print() which is screwing with my timekeeping?

millis() / micros() uses Timer 0 right?

So if I hook up an external clock generator, I can get ticks that don't get masked by Serial.print() ?

I sped thru the thread, but I want to say a couple things. I'm sorry if I am redundant on this.

  1. millis() runs every 1.024 ms. That means it is always in error by some amount. Every 40 or so interrupts, millis() increments by 2 to get back on track. Overall, millis() keeps great time, and you can indeed make a clock from it, as CrossRoads indicated, by creating a "rolltime" and always adding 1000 to it. I have one here that loses 30 seconds per day, consistently, because its resonator is off by 300 ppm. My other UNO is only off by a few ppm at room temperature and should keep excellent time on its resonator. I just haven't switched it over to this project because of all the jumper wires. It should be accurate to within a couple of seconds per day using just a ceramic resonator.

  2. Since millis() only needs to handle its timer overflow roughly once per ms, there is no reason other "well behaved" ISRs would make millis() miss a count. OTOH, millis() takes approximately 6 µs to execute and causes all kinds of jitter for everything else.
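The "rolltime" pattern from point 1 can be sketched as follows (a hypothetical `RollClock` type, not anyone's posted code). The key is adding 1000 to the *previous* deadline rather than to "now", so polling latency can never accumulate into drift:

```cpp
#include <cstdint>
#include <cassert>

// Drift-free 1 Hz scheduler: each tick's deadline is the previous
// deadline plus 1000 ms, never "now + 1000".  Late polling delays a
// tick but does not shift subsequent deadlines.
struct RollClock {
    uint32_t rolltime;       // next deadline, in ms
    uint32_t seconds = 0;    // ticks counted so far

    explicit RollClock(uint32_t now_ms) : rolltime(now_ms + 1000) {}

    // Call as often as you like with the current millis() value.
    void poll(uint32_t now_ms) {
        // Signed difference handles the 32-bit millis() wraparound.
        while (int32_t(now_ms - rolltime) >= 0) {
            rolltime += 1000;   // schedule relative to the old deadline
            ++seconds;
        }
    }
};
```

In a sketch, `poll(millis())` would sit in loop(); even if a poll arrives late, the catch-up loop keeps the seconds count exact.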

orly_andico:
ok someone just told me that the interrupt priority goes...

External interrupts, USB, WDT, Timer 1, Timer 0, SPI, USART...

So am I right in assuming that it's actually Serial.print() which is screwing with my timekeeping?

millis() / micros() uses Timer 0 right?

So if I hook up an external clock generator, I can get ticks that don't get masked by Serial.print() ?

They don't get lost, they just get delayed. If an external interrupt and the Timer0 interrupt fire on exactly the same clock cycle, the AVR runs the external interrupt, then one instruction from the main program, then the Timer0 interrupt. (It will NOT stop one interrupt to run another of higher priority; the priority list only comes into play in the extremely rare case that both fire on the very same cycle, or whatever exactly the datasheet says.) You only have a problem when the external interrupt takes so long that two Timer0 interrupts come due while it is running. Otherwise, the one instruction performed between the two interrupts could throw the timestamp off momentarily, but unless that instruction is cli (which is executed as the second instruction in millis()), the time will be correct.

So: the time could be off by up to 2 ms plus the duration of an interrupt at any given moment, but only in extremely rare circumstances, and it will resync (i.e., no drift). However, if you have an interrupt that takes longer than 1 ms to execute, you will start losing time and drifting. Also, if the resonator is off, your time will obviously be off.

Using an RTC with an external interrupt will have the same potential problems except there's no 1.024 thing which could make your time off by 2ms. And their crystals are often better.

I should add that SoftwareSerial can make millis() lose ticks if you use a baud rate of 9600 or slower since it takes excessive liberty with cli(). Use AltSoftSerial instead. It uses a timer (Timer1 I think) to generate the outgoing bits instead of burning up cycles in a loop (with ints disabled) between bits. It's also more clever during receive.

  1. I'm using Serial. Not sure if this is SoftwareSerial or whatever.

  2. Everything I read says that millis() shouldn't lose this much time. But I see what I see. My laptop clock could be wrong (even though it's actively NTP-syncing). But the telescope tracking is off, and this is detectable by watching a star (the ultimate reference). So I know the Arduino is forcing the tracking off, which can only be explained by its losing time.

orly_andico:
No, I have not... but if the millis() or other ISR is getting masked, an external source would also lose ticks. I don't have a precise external time base anyway so the RTC seems the way to go. Looks like the PCF8583 is the way to go, 8-pin DIP, through-hole, 10ms accuracy.

It will only be late by as much time as the current ISR takes to finish; then it should run. The next tick, coming from the external source, will still arrive 1 ms after the last one was sent. And it looks like Serial is the real bugger in this case.

I have Arduino-compatible Teensys. Their Serial goes straight over USB, there's an IDE patch for them, and they're cheap, but I do recommend getting the version with pins unless you are very good at soldering headers very close to surface-mount parts. Not me.

orly_andico:
2) Everything I read says that millis() shouldn't lose this much time. But I see what I see. My laptop clock could be wrong (even though it's actively NTP-syncing). But the telescope tracking is off, and this is detectable by watching a star (the ultimate reference). So I know the Arduino is forcing the tracking off, which can only be explained by its losing time.

No, I'm saying that the SOFTWARE won't lose time. Everyone knows the resonator is only accurate to 0.5%, which could be up to seven minutes over a day, or 18 seconds per hour, or 6 seconds per 1200-second period (twice what you measured).

I have also reproduced this problem on a Digilent Max32, which definitely has a crystal... this is truly weird, because millis() is implemented using a core timer on the PIC32, not via an ISR.

orly_andico:
Here's my problem: over a worm cycle, I cannot get the measured rotational speed to match 86164 seconds. It varies with every run, e.g. 86300, 85950, 86000... about 1000ppm difference.

You clearly have a bug in your sketch. Until you post your code, all this talk of crystals, resonators, and RTC is a waste of time. You need to get the bug fixed before deciding if you do or do not need a more accurate timepiece.

To put this matter in perspective, I have an ATtiny13 sitting on my desk, running from an older generation of the internal oscillator, that is more accurate.

...post your code already.

Here is the latest copy - https://dl.dropboxusercontent.com/u/63497702/read_encoder_v32.7z

It is spread over multiple tabs, so I couldn't figure out how to paste them all into one code block.

The key variables are _tstart (time when we finished calibrating the encoder and are now measuring angles for real) and tcnv (the time the current angle was measured).

So, if we know when we started measuring the angle (_tstart) and we know the time the current angle was taken (tcnv), we just multiply by the sidereal tracking rate (_trackingRate), which is 15.04 arc-seconds per second, and we know what the encoder angle ought to be.

And any deviation from that is a mechanical error that we can correct.

Note that tcnv is set in read_encoder (when we get the current encoder reading) and _tstart is set by the set_origin() function. Both are set from millis().

The problem is that, because millis() can't be trusted, the theoretical encoder angle drifts monotonically in one direction (but not at a constant rate; if it were constant, I could correct for it).
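As a sanity check on the numbers in that scheme (pure arithmetic, no project code): one revolution per sidereal day does give the quoted rate.

```cpp
#include <cassert>
#include <cmath>

// One revolution = 360 * 3600 = 1,296,000 arc-seconds.
// One sidereal day ~= 86,164.1 s, so the tracking rate is
// 1,296,000 / 86,164.1 ~= 15.041 arcsec/s (the 15.04 quoted above).
constexpr double ARCSEC_PER_REV = 360.0 * 3600.0;
constexpr double SIDEREAL_DAY_S = 86164.1;

inline double sidereal_rate_arcsec_per_s() {
    return ARCSEC_PER_REV / SIDEREAL_DAY_S;
}
```

This also shows why the measured worm periods of 86300 or 85950 s (roughly 1000 ppm off) are far outside what a crystal, or even the resonator spec, can explain on its own.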

void read_encoder(long &A, long &B, long &tcnv) {
  int reading;
  int i;

  long t0, t1;

  // clear the accumulators; callers may pass in stale values
  A = 0;
  B = 0;

  t0 = exttimer_millis();

  // this should finish in 5 ms or less @ 32 ksps
  for (i = 0; i < OVERSAMPLING; i++) {
    reading = read_adc(1);
    A += reading;

    reading = read_adc(2);
    B += reading;
  }

  A = A / OVERSAMPLING;
  B = B / OVERSAMPLING;

  t1 = exttimer_millis();

  // tcnv is in milliseconds: the midpoint of the conversion window
  tcnv = (t0 + t1) / 2;
}

long calc_full_angle (long theta, long tcnv) {
  // calculate the full encoder angle (in DECI-arcseconds!)
  _current_encoder_angle = (long) (get_quadrature_count() * 648) + (theta % 648);

  if (_cal) {
    _current_encoder_angle -= _origin_encoder_angle;

    // don't use seconds directly, as angles here are in DECI-arcseconds
    double tElapsed = ((double) (tcnv - _tstart)) / 100;
    // round to nearest: add 0.5 BEFORE truncating to long
    // (the original cast truncated first, discarding the + 0.5)
    _theoretical_encoder_angle = (long) (_trackingRate * tElapsed + 0.5);
  }
  else {
    _theoretical_encoder_angle = _current_encoder_angle;
  }

  return ( _current_encoder_angle );
}