Latest Blogs

Spring at last

Kelling Heath astro camp wasn't too bad weather-wise.


More importantly, the mk3 ScopeDog continued to perform well. I tried out some new altitude drive rollers with Shore 70 hardness urethane sleeves. These worked incredibly well.

Summer break from observing now, so time to get busy in the workshop.

More new mathematics

Since my mk2 ScopeDog I had been calculating tracking rates using classic spherical trigonometry. It's quite straightforward, and it's the method used in professional observatories. But their scope mounts are level!

Whilst the Nexus DSC takes care of positional accuracy with a tilted mount, and the digital finder measures absolute position for refinements, the calculated tracking rates in Az & Alt weren't taking account of any tilt. This means the scope needs to be fairly well levelled when set up. Any tilt results in tracking slowly losing the target over a period of some minutes.

With the mk4 ScopeDog (no Nexus DSC) I really needed to do my own two-star alignment to determine tilt and improve tracking. A couple of days thinking about how to do it weren't very fruitful - just a headache!

Serge at Astrodevices suggested a few ideas, and a little while later I found an excellent document written by Toshimi Taki that covers it - but it was all based on matrix operations. I'd forgotten completely how to do this, so a day of revision followed. Fortunately Python has a lot of built-in matrix operations via NumPy.

Having coded the technique and tried it on some test data, I'm impressed and keen to incorporate it into ScopeDog mk4, and possibly mk3.
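For the curious, the core of a two-star alignment in the Taki style is surprisingly small once expressed with NumPy. This is just my illustrative sketch of the idea, not the actual ScopeDog code: two aligned stars give two pairs of direction vectors (true celestial vs. measured scope), and a cross product supplies the third independent pair needed to solve for the 3x3 transformation matrix that captures the mount tilt.

```python
import numpy as np

def direction(az_deg, alt_deg):
    """Unit direction vector for an (azimuth, altitude) pair, in degrees."""
    az, alt = np.radians(az_deg), np.radians(alt_deg)
    return np.array([np.cos(alt) * np.cos(az),
                     np.cos(alt) * np.sin(az),
                     np.sin(alt)])

def alignment_matrix(cel1, cel2, scope1, scope2):
    """3x3 matrix mapping celestial direction vectors to scope vectors.

    cel1/cel2 are the true (az, alt) of the two alignment stars;
    scope1/scope2 are where the (possibly tilted) mount measured them.
    """
    c1, c2 = direction(*cel1), direction(*cel2)
    s1, s2 = direction(*scope1), direction(*scope2)
    # Cross products give a third, linearly independent vector pair
    c3 = np.cross(c1, c2)
    c3 /= np.linalg.norm(c3)
    s3 = np.cross(s1, s2)
    s3 /= np.linalg.norm(s3)
    C = np.column_stack([c1, c2, c3])
    S = np.column_stack([s1, s2, s3])
    return S @ np.linalg.inv(C)
```

Once the matrix is known, multiplying any target's celestial vector by it gives the scope-frame direction, and tracking rates follow by differencing positions a short time apart.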

mk4 ScopeDog - no encoders!

All these versions of ScopeDog are a way of marking significant developments in the design and functionality.

A recap of versions so far.

mk1 - almost 10 years old now. Built around now-obsolete Phidget modules, with built-in GPS, a Pi 3, and code written in Java.

mk2 - First appeared in 2021. New Phidget modules allowed a repackaging into a much smaller box. Pi 4 running code in Java. Closer integration with the Nexus DSC meant no GPS was needed.

mk3 - 2022. Same basic hardware as mk2 but code now in Python. Digital finder functionality built in. New hand paddle with OLED display and digital finder controls. Vastly improved pointing accuracy and ease of initial alignment.

So why a mk4?

Based on experiences with the mk3, I could see the potential to remove the need for mount encoders completely. The digital finder can quickly determine absolute telescope position, and the drive stepper motors can manage position in between solves.

With a week confined to a caravan at a rainy astro camp, the code rewrite made good progress. Currently I can just power up the scope, point it at an object, and 3 seconds later it is tracking. No alignment needed! I can then do a goto to a new target, initially accomplished using stepper motor step counts; once that is done, a plate-solve automatically corrects any errors and puts the scope on target. This adds about 4 seconds to a goto.
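That goto flow can be sketched in a few lines. This is my toy illustration of the idea, not the real mk4 code: STEPS_PER_DEG and the plate_solve callback are invented for the example.

```python
STEPS_PER_DEG = 1000  # hypothetical drive resolution, for illustration only

def goto(current_deg, target_deg, plate_solve):
    """Dead-reckon a slew by stepper counts, then trim with a plate-solve."""
    steps = round((target_deg - current_deg) * STEPS_PER_DEG)
    reached = current_deg + steps / STEPS_PER_DEG   # where the counts say we are
    actual = plate_solve(reached)                   # where the sky says we are
    trim = round((target_deg - actual) * STEPS_PER_DEG)
    return actual + trim / STEPS_PER_DEG
```

The point is that any accumulated mechanical error, however it arose during the slew, is measured by the solve and stepped out in a single correction pass.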

Only about 10% of the core code needed rewriting, but the most difficult part was establishing a direct WiFi link to SkySafari (previously this had been left to the Nexus DSC).

2023 Observing so far (Spoiler - it's bad!)

Attended the Spring astro camp at Haw Wood Farm in Suffolk last month. Not the worst weather, but almost!

However the caravan I hire is cosy and I take most of my telescope drive development kit. With a lot of time to spare I made good progress on the mk4 ScopeDog code (more in next blog). 

I made the right decision not to take the 18” Dobsonian, but instead my 100mm Miyauchi binoculars. These proved very suitable for quickly taking advantage of the few clear spells in between rain and wind.


mk3 ScopeDog update


After a lot more hours on the scope simulator, I'm happy the mk3 not only works OK but is better than previous versions. The new handbox is a real pleasure to use - adding a text display and buttons to the scope drive opens up new possibilities. The OLED display is nice and can be dimmed by the user to suit.

The simplicity of just plugging a camera straight into the scope drive box is great, and with the scope drive and finder software now integrated, lots more features can be added.

In my view, this is the future for Dobsonian drives!

I'm now experimenting with a 'mk4' - no DSC or encoders needed. It will just use the stepper motor counts and finder solves to manage scope position.

A brief diversion from astronomy

While in the attic I came across a model boat I had made with my Dad, about 55 years ago. It's the Royal Barge that was used to accompany the Royal Yacht Britannia. Some of the cabin tops were warped or missing, and the electrics had long since gone.

With grandchildren, it seemed a waste to let it fester in the attic.


I therefore spent a fun couple of days remaking parts and repurposing a radio control set I had lying around (a long story!).

I bought a new motor and on initial testing found it to be very powerful. I think it is the right power for top speed, but controlling the boat at manoeuvring speed was very difficult. A more sophisticated speed controller (ESC) was required, and possibly even a brushless motor. Lots of money!

My blog followers will know I have been using Raspberry Pi Picos recently. It dawned on me that I could use one to modify the proportionality of the speed control.

The receiver outputs a pulse of variable length, 1 to 2 ms, every 20 ms. All I had to do was measure the pulse length, apply a correction, and recreate the modified pulse train to send to the ESC.

I tried a few corrections and found a simple 'square' law suitable. The graph shows the linear input from the receiver, plus square and cube laws. About 20 lines of MicroPython code were needed. The Pico runs the code on power up (the receiver provides the power).
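On the Pico the actual pulse timing is done with the hardware, but the shaping itself boils down to a few lines. This is my sketch of the idea rather than the code on the boat, assuming the usual RC convention of 1.5 ms as the stop point:

```python
def correct_pulse(width_ms, law=2):
    """Reshape an RC throttle pulse (1.0-2.0 ms range, 1.5 ms = stop).

    law=1 leaves the response linear; law=2 applies the 'square' curve,
    softening control near the stop point while keeping full power at
    the extremes. law=3 would give the cube curve from the graph.
    """
    throttle = (width_ms - 1.5) / 0.5   # normalise to -1 .. +1
    shaped = abs(throttle) ** law
    if throttle < 0:
        shaped = -shaped                # preserve reverse direction
    return 1.5 + 0.5 * shaped
```

The endpoints and the stop point map to themselves, so full ahead, full astern and neutral are untouched; only the mid-range is compressed.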

If necessary, I can use a spare receiver channel to change the law between linear and square, but early tests (in the hot tub!) suggest that won't be needed.

Job done! Pico cost £4.

All I need now is a boating pond.

mk3 ScopeDog tested OK

Winterfest weather wasn't great, but there were enough clear patches for me to test my new mk3 code. In summary, the changes are:

  • All Java code rewritten into Python
  • eFinder code integrated into main ScopeDog code, hence running on same Raspberry Pi
  • ScopeDog can call for a plate-solve automatically at the end of a GoTo and then refine the pointing, typically achieving about one arc-minute absolute accuracy.

Considering how much had changed I was surprised and relieved to find it working so well.

But what was needed was a single hand pad to control all functions. The little 5 way navigation switch needed replacing too, as it was not good with gloves on!

Using the eFinder hand pad as the basis, it was easy to add the ScopeDog joystick to it. The Raspberry Pi Pico board in the hand pad is very flexible - I'm impressed with the Pico. It also gives me the ability to display some ScopeDog functions on the OLED text display. Here's a shot of the new combined hand pad. The new 5-way switch was expensive, but I now see it was worth it.

mk3 ScopeDog getting closer!

Now that the eFinder is working reliably, the next challenge was to integrate it further with ScopeDog. A little while back I revised ScopeDog to use the latest components, including a Raspberry Pi 4B. The Pi has more than enough capacity to run the drives and the plate-solving.

First issue was that ScopeDog was written in Java and the eFinder in Python, which would make complete integration difficult given my coding expertise. So I converted the Java code to Python.

That done, I mashed together the new ScopeDog and eFinder codes. A few minor issues, but it worked almost straight away. The Pi OS and Python interpreter seem pretty good at managing threads, and doing repeated plate-solves has no effect on the drives. The camera just plugs directly into one of the spare USB ports on the ScopeDog Pi.

Next job was to combine the two handpads. First attempt was to use the eFinder handpad as is, which meant using buttons to steer the scope. It worked, but I missed the light-touch joystick I had been used to. So I made a new handpad, with buttons for eFinder and display control, and a joystick for scope control. It included the eFinder OLED text display, which gives useful options to display ScopeDog data. Using a Raspberry Pi Pico in the handpad is giving a lot of flexibility in customising it for new features.

I’ll be taking the mk3 to Winterfest next week to give it its first shakedown for real.

Kelling Heath Autumn 2022

Back from a rather variable Kelling Heath astro camp. Observed for a while on three nights, so at least something!

Did get to try out some telescope changes…..

  • The altitude roller drive worked really well overall. However, the scope does need to be better balanced. With my old belt drive I didn't have to worry much, but with a dew-sodden shroud and a 21mm Ethos, the clutch slipped. Easy enough to add some weights to the back.
  • The new high precision planetary gearboxes on the stepper drives are noticeably better. Much less backlash.
  • The eFinder worked perfectly without a single failed solve. It got quite a lot of interest too.
  • I swapped my x4 Powermate for a x2. Much better suited to me and my eyepiece set.
  • Realised my telescope cover really is worn out. It's been good for about 6 years, but I needed to be more careful about sharp edges on the scope! Now it leaks in the rain.

More eFinder developments

I think the development of the eFinder is nearing its end! The transition to Skyfield to do the maths went well. I then decided to use a standard Raspberry Pi 32-bit OS with a fresh build of the plate-solver installed, rather than use Astroberry. This makes available some useful new features.

Producing a working local build of the solver isn't trivial, I discovered, but thanks to Dustin via the user group I got there and can now reproduce the build quite easily.

This process led me to look more closely at the capabilities of the solver, and in particular its family of supporting programs. One of the most useful is the ability to directly obtain the RA & Dec of any pixel in the image, not just the image centre. I rewrote the offset calibration almost completely, and the result is much more accurate and stable.

A very nice feature of the full build is the ability to annotate captured images with markers and labels for catalogue objects. The GUI control is now very comprehensive, and here is an example screenshot.

I've also switched to using a remote OLED display, rather than the LCD module mounted in the Raspberry Pi housing. The LCD got very sluggish at cold temperatures and wasn't very clear either. The OLED is very high contrast and perfect for dark adaptation. A 3m USB cable connects the display/control box to the Raspberry Pi. Full build instructions can be found here.


© AstroKeith 2022