I started with a Raspberry Pi HQ camera. Its very small pixels are perhaps not ideal, but it is cheap and easy to drive from the RPi. I’m waiting for clear skies to see if it is sensitive enough.

The RPi is a RPi4 4GB, loaded with the Astroberry suite. This provides the standard Raspbian Operating System (OS), a handy set of features to control the camera, and importantly it includes a built-in local copy of Astrometry.net, which is the plate-solving solution I was going to use.

The guide scope started off as a Svbony SV106 50mm F4 unit. While mechanically sound (actually quite good), optically it was rubbish, not helped by the manufacturers putting the objective in backwards! Even after correcting that, the images were far too soft to be of any use. It was replaced with a 50mm William Optics guide scope, which, whilst not as nice mechanically, has very good optics.

Whilst I could use Astroberry with its Kstars & Ekos modules to take images and try plate-solving, this wasn’t going to be a route to the desired end product. This is because I didn’t want a full display (just a button by the eyepiece and a text readout by the finder) and I wanted to use the plate-solve output to automatically ‘nudge’ my 18” Dobsonian right onto target after my existing GoTo had done 95% of the work. There will also be a “where am I now” mode where the eFinder will work out where it is pointing in the sky with no help from the Nexus DSC and not move the scope.

Thus the plate solving application had to talk to the Nexus DSC to find out where the scope ‘should’ be pointing (ie as a result of a SkySafari initiated GoTo) and then talk to my ScopeDog controller to move the scope.

I’ve never really done much programming, but now (in Covid lockdown) was a good time to learn. Python was the obvious language as it is the default language for the RPi, plus many astronomy packages are available. There are many on-line tutorials which helped me get up to just about sufficient proficiency.

Preliminary testing indicates that plate-solving with inputs from the Nexus DSC takes only a couple of seconds, whereas in “where am I now” mode it can take a minute or so. The difference comes from the size and number of index files the program has to search through looking for a match.

IMG 4037

First job was to talk to the Nexus DSC, without upsetting the existing serial comms with ScopeDog. Fortunately the Nexus DSC has a USB port which mirrors the serial port. With help from Serge at AstroDevices I was soon reading out the scope position data.

This was a bit complicated, as the RPi USB ports are ‘hosts’, as is the Nexus DSC’s. Hosts don’t talk to hosts, so I had to add a ‘device’ USB port coming off the RPi GPIO.

IMG 4051

Next was to get the Raspberry Pi running with Astroberry/Kstars/Ekos and accessing the RPi HQ ccd.

The HQ ccd comes with a C/CS mount thread so fitting a 1 1/4” nosepiece was easy.

Here the RPi and ccd are housed in temporary black project boxes to protect them while testing etc.

IMG 4048

The original intended Svbony scope proved to be good mechanically but terrible optically. It was replaced with a William Optics 50mm guidescope. (Nice colour too!)

IMG 4046

The WO 50mm guidescope shown here with the ZWO ASI120MM-S ccd, pointing at a target on the workshop wall.

The code auto-detects if the ZWO ASI ccd is plugged in and uses that one, otherwise it defaults to the permanently connected RPi HQ ccd.

IMG 4047

The two line display, complete with 5 buttons. Very convenient to interface with (RPi i2c bus).

It can show Nexus DSC output, plate-solved RA & Dec, delta Az & Alt, eFinder status and a mode for adjusting exposure, gain, etc.
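As an illustrative sketch (not the actual eFinder code), one of the buttons can step through those display modes with a simple state object; the mode names here are placeholder labels:

```python
# Illustrative sketch: one button steps through the display modes
# listed above; each press selects what the two-line LCD shows next.
# Mode names are placeholders, not the real eFinder labels.
MODES = ["Nexus RA/Dec", "Solved RA/Dec", "Delta Az/Alt", "Status", "Adjust"]

class DisplayState:
    def __init__(self):
        self.index = 0                      # start on the Nexus readout

    def current(self):
        return MODES[self.index]

    def next_mode(self):
        """Advance to the next mode, wrapping back to the first."""
        self.index = (self.index + 1) % len(MODES)
        return self.current()
```

The main loop then only needs to ask the state object what to draw each time round.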

If needed I will still be able to use a tablet to connect to the RPi via wifi and view images etc.

The Python code has been the biggest challenge for me. The languages I learnt at school and uni were nothing like the modern ones, plus I had to get into accessing hardware too. It took a while to come up with a structure that was right, and in the end the main program is just a loop reading the buttons and refreshing the two line display. This runs quickly and gives the buttons and display a responsive feel. Supporting the main loop are a large number of functions (sub-routines in old-speak!) that are called when needed. These are:

Read the Nexus DSC output
This was more about learning how to use the RPi UART functions. Having read data from the Nexus DSC it was a case of parsing the result to extract the RA & Dec and get them ready for display and computations.
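As a hedged sketch of this step: the Nexus DSC answers LX200-style serial commands (‘:GR#’ for RA, ‘:GD#’ for Dec), but the exact reply formats below are assumptions — check the Nexus DSC manual for what your firmware returns. The serial read uses the pyserial package; the parsing is plain string handling:

```python
# Sketch only: query the Nexus DSC over serial with LX200-style
# commands and parse the replies. Port name, baud rate and reply
# formats are assumptions, not taken from the actual eFinder code.
try:
    import serial  # pyserial; only needed when talking to real hardware
except ImportError:
    serial = None

def read_position(port="/dev/ttyUSB0"):
    """Ask the Nexus DSC where the scope 'should' be pointing."""
    with serial.Serial(port, 9600, timeout=1) as nexus:
        nexus.write(b":GR#")                    # request RA
        ra = nexus.read_until(b"#").decode()
        nexus.write(b":GD#")                    # request Dec
        dec = nexus.read_until(b"#").decode()
    return parse_ra(ra), parse_dec(dec)

def parse_ra(reply):
    """Parse 'HH:MM:SS#' into decimal hours."""
    h, m, s = reply.rstrip("#").split(":")
    return int(h) + int(m) / 60 + float(s) / 3600

def parse_dec(reply):
    """Parse a signed DD*MM'SS# style reply into decimal degrees."""
    body = reply.rstrip("#").replace("*", ":").replace("'", ":")
    sign = -1 if body.startswith("-") else 1
    d, m, s = body.lstrip("+-").split(":")
    return sign * (int(d) + int(m) / 60 + float(s) / 3600)
```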

Capture an image (one each for the RPi HQ ccd and the ZWO ASI120MM-S)
The RPi HQ ccd was relatively easy as the RPi OS now comes with built-in access functions (raspistill).
The ZWO ccd was much harder, but fortunately I discovered that Steve Marple had written a Python ‘wrapper’ that handles the ZWO ccd API library.
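A minimal sketch of the capture-with-fallback idea, assuming Steve Marple’s zwoasi wrapper package and a typical location for the ZWO SDK library (both assumptions — exposure, gain and camera set-up are omitted for brevity):

```python
# Sketch: use the ZWO camera if one is detected, otherwise fall back
# to the RPi HQ ccd via raspistill. Library path and parameter values
# are illustrative assumptions.
import subprocess

def raspistill_cmd(filename, exposure_us=1_000_000, iso=800):
    """Build a raspistill command line for the RPi HQ camera."""
    return ["raspistill", "-o", filename,
            "-t", "1",                      # minimal preview delay
            "-ss", str(exposure_us),        # shutter speed in microseconds
            "-ISO", str(iso)]

def capture(filename="capture.jpg"):
    try:
        import zwoasi
        zwoasi.init("/usr/lib/libASICamera2.so")   # assumed SDK path
        if zwoasi.get_num_cameras() > 0:           # ZWO ccd plugged in?
            camera = zwoasi.Camera(0)
            camera.capture(filename=filename)
            return filename
    except Exception:
        pass                                       # no ZWO: fall through
    subprocess.run(raspistill_cmd(filename), check=True)
    return filename
```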

Plate-solve an image
The documentation at astrometry.net is good, but the automatic installation within Astroberry wasn’t obvious. Once I had discovered where the various modules were stored I could write some config instructions so that I could get the astrometry plate-solve to work from Python and bypass the Astroberry suite. The plate-solving within Astroberry is still functioning should I need it.
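Calling the local solver from Python comes down to running solve-field as a subprocess. Passing the Nexus DSC position as a hint (--ra/--dec/--radius) is what makes the solve take seconds rather than a minute: the solver only searches index files near that position. A sketch (search radius and CPU limit are illustrative values):

```python
# Sketch of driving the local astrometry.net solve-field from Python.
# The hint radius and cpulimit values are illustrative choices.
import subprocess

def solve_cmd(image, ra_deg=None, dec_deg=None, radius_deg=5):
    """Build a solve-field command, with an optional position hint."""
    cmd = ["solve-field", image,
           "--overwrite", "--no-plots",     # keep the output minimal
           "--cpulimit", "60"]              # give up after a minute
    if ra_deg is not None and dec_deg is not None:
        # Hint from the Nexus DSC: where the scope 'should' be pointing
        cmd += ["--ra", str(ra_deg), "--dec", str(dec_deg),
                "--radius", str(radius_deg)]
    return cmd

def solve(image, ra_deg=None, dec_deg=None):
    """Run the solver; True if a solution was found."""
    result = subprocess.run(solve_cmd(image, ra_deg, dec_deg),
                            capture_output=True, text=True)
    return result.returncode == 0
```

With no hint supplied this is the slow “where am I now” case; the same call simply omits the --ra/--dec/--radius arguments.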

Move the telescope
Not wishing to mess with the code in my ScopeDog telescope controller, I decided to add an ST4 style guider port to the ScopeDog handbox, and use pulses sent out via the RPi GPIO pins to steer the scope. In effect ScopeDog would see this as if I was using the handbox joystick to centre the target manually. I could then, if I wanted, do an ‘align’ or ’sync’ on SkySafari, but with my eFinder this wouldn’t be needed any more.
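The pulse idea is simple: hold a direction pin high for long enough that the scope, moving at the ‘slow’ joystick rate, covers the required angle. A sketch (the pin numbering and slew rate are illustrative assumptions, and the GPIO object stands in for RPi.GPIO):

```python
# Sketch of the ST4-style nudge: each GPIO pin mimics one joystick
# direction through an opto-isolator; the pulse length sets how far
# the scope moves. Slew rate and pin numbers are assumptions.
import time

SLEW_DEG_PER_SEC = 0.05          # assumed 'slow' joystick rate

def pulse_length(delta_deg, rate=SLEW_DEG_PER_SEC):
    """Seconds to hold a direction pin for a given angular move."""
    return abs(delta_deg) / rate

def nudge(pin, delta_deg, gpio):
    """'Press' one joystick direction long enough to move delta_deg."""
    gpio.output(pin, 1)                  # press
    time.sleep(pulse_length(delta_deg))
    gpio.output(pin, 0)                  # release
```

On the real hardware `gpio` would be the RPi.GPIO module with the pins set up as outputs; passing it in keeps the sketch testable off the Pi.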

So this module has to take the delta in RA & Dec, convert it to a delta in Az & Alt, and convert that to the required pulse lengths to move the scope the right delta Az & Alt. Just straight mathematics (if rather complex) for once.
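That conversion is standard spherical astronomy: turn each (RA, Dec) into (Az, Alt) for the site latitude and local sidereal time using the hour-angle formulae, then difference the two. A minimal version (function names are mine, not the eFinder’s):

```python
# Standard RA/Dec -> Az/Alt conversion via the hour angle, then the
# delta between target and solved positions. Azimuth is measured
# from north through east.
from math import sin, cos, asin, atan2, radians, degrees

def radec_to_altaz(ra_hours, dec_deg, lst_hours, lat_deg):
    """Convert RA/Dec to (az, alt) in degrees for a site and time."""
    ha = radians((lst_hours - ra_hours) * 15)   # hour angle in radians
    dec, lat = radians(dec_deg), radians(lat_deg)
    sin_alt = sin(dec) * sin(lat) + cos(dec) * cos(lat) * cos(ha)
    az = atan2(-sin(ha) * cos(dec) * cos(lat),
               sin(dec) - sin_alt * sin(lat))
    return degrees(az) % 360, degrees(asin(sin_alt))

def delta_altaz(target, solved, lst_hours, lat_deg):
    """Az/Alt move needed to go from the solved position to the target.
    target and solved are (ra_hours, dec_deg) tuples."""
    taz, talt = radec_to_altaz(*target, lst_hours, lat_deg)
    saz, salt = radec_to_altaz(*solved, lst_hours, lat_deg)
    daz = (taz - saz + 180) % 360 - 180         # shortest way round
    return daz, talt - salt
```

The resulting delta Az and delta Alt then just need scaling by the slew rate to get the two pulse lengths.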

Generate an image viewable over wifi from a tablet
The aim is to not need an external display, but I expect at first I will need to see the captured images to give me a clue as to the best settings for image capture and plate-solving. The RPi has built-in VNC, which allows its display, keyboard and mouse functions to be accessed on almost any computer, tablet or phone via wifi. In the field the RPi generates its own wifi hotspot. So this module is a simple routine to display the captured image on the RPi display.

Later I will add a reticule to this display so the eFinder can be aligned with the main telescope.
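Putting the structure together: the main program described above is just a fast loop that polls the buttons, dispatches to functions like those listed, and refreshes the display. A minimal sketch of that shape, with the hardware access passed in as callables (all names here are placeholders):

```python
# Sketch of the main-loop structure: poll buttons, dispatch to a
# handler function on a press, refresh the two-line display. The
# handler names and polling interval are illustrative.
import time

def main_loop(read_buttons, refresh_display, handlers, cycles=None):
    """Run the button/display loop; cycles=None means run forever."""
    n = 0
    while cycles is None or n < cycles:
        button = read_buttons()          # e.g. "select", "up", or None
        if button in handlers:
            handlers[button]()           # e.g. goto_target, where_am_i
        refresh_display()                # keep the LCD current
        time.sleep(0.01)                 # fast poll = responsive feel
        n += 1
```

Keeping the loop this thin is what gives the buttons and display their responsive feel: all the slow work (capture, solve, move) lives in the called functions.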


I made the new handbox, which combines eFinder with ScopeDog functions. Inside it has opto-isolators so that the eFinder can access the joystick circuits and move the scope on to target. It also has a return signal to check that the ScopeDog joystick speed is set to ‘slow’.

IMG 4061


The eFinder is just about finished and has had a couple of nights under clear skies. I need more testing, but it is doing what I wanted: positioning the telescope to within an arcminute or so. The confidence it gives is also amazing. Being able to know with 100% confidence exactly where the scope is pointing really makes for a good visual observing experience.

I realised I needed more access to the set up parameters and images during testing, so I wrote a graphical user interface (GUI) so that I can use a tablet to control and tweak the eFinder.

Another update (April 2021)

I’ve abandoned the Raspberry Pi HQ ccd as its pixels are just too small. This also gave me the flexibility of moving the Raspberry Pi away from the eFinder telescope. The RPi is now in a new box along with the LCD and button module and a 12V to 5V converter.

Bentley Ousley in Kansas contacted me as he was trying to make a similar finder but was stuck on the code. We worked together, with me producing a new version for his ServoCat-driven 20” New Moon scope, and he made me a 3D-printed 72mm f2.7 finder scope. The ASI120MM-S ccd and this f2.7 scope produce excellent images for plate-solving with just 1 second exposures.

After a week’s observing and testing the system is now working very nicely. Plate-solves are nearly 100% successful and the scope is moved to within an arc minute or two of the target position. I’ve even added an ‘auto-track’ mode whereby the eFinder continuously images, solves, and moves the scope.

© AstroKeith 2021