Monday 27 March 2023

Running an INDI server on Windows



We now routinely use an INDI server to control the mount and other devices with AstroDMx Capture for all platforms.


The INDI server always runs on a POSIX compliant operating system, and we frequently use a Raspberry Pi running Raspberry Pi OS, a Debian-based Linux distribution. With this system we can use AstroDMx Capture on any platform to control the mount and any other devices. However, we note that not everyone has a Raspberry Pi computer and that, due to supply-chain shortages, they are, at the time of writing, hard to obtain.


We therefore explored installing an INDI server on the imaging computer itself, which cuts down the amount of hardware required. This is fine for POSIX compliant operating systems such as Linux and macOS, but it cannot be done directly on Windows, because Windows is not a POSIX compliant operating system.


INDI was developed for POSIX compliant operating systems: Linux, macOS, FreeBSD and OpenBSD; all UNIX-like operating systems. macOS is largely POSIX compliant and holds UNIX 03 certification from The Open Group® as conforming to the Single UNIX Specification. It is also noteworthy that two Linux distributions based on CentOS also have UNIX certification: K-UX® and EulerOS®.


The Linux Standard Base (LSB) was established in 2001 to try to standardise the internal structures of Linux-based systems for increased compatibility. It is based on the POSIX specifications, the Single UNIX Specification and other open standards. All of these facts make two important points. Firstly, Linux (a UNIX-like OS) is basically UNIX, but without the expense (who would pay?) of UNIX certification. Secondly, because Linux has ensured from the very start that it is compliant with the same standards as UNIX, the distinction between UNIX and Linux is basically semantic. UNIX seems to be on the decline, with more and more IT vendors moving their investments from UNIX to Linux. New technologies and applications are often not tested or certified on UNIX systems, but are instead developed for Linux platforms. So Linux is now increasingly occupying the niche previously enjoyed by UNIX.


In the early 1980s there were three branches of UNIX development. Firstly, UNIX System III from the Bell Laboratories UNIX Support Group. Secondly, the Berkeley Software Distribution (BSD®) from the University of California at Berkeley. The third branch was Microsoft’s XENIX®, a version of UNIX that ran on the x86 family of processors and was licensed from AT&T Corporation. In fact, in the early 80s, XENIX had the largest installation base of any UNIX system. UNIX fragmentation produced compatibility problems between UNIX versions, which gave rise to the formulation of the POSIX® standard (Portable Operating System Interface for UNIX), an attempt to standardise the system-call interface in order to maintain compatibility between operating systems.


With Windows NT, Microsoft made moves towards POSIX compliance, because it wanted to win a US Air Force contract. The Federal Information Processing Standard FIPS 151 required that some types of government software purchase had to be POSIX compliant, and Microsoft incorporated a minimal Microsoft POSIX subsystem into the first versions of Windows NT. That subsystem did not include a POSIX shell or any UNIX commands, but it was sufficiently POSIX compliant to allow Microsoft to win the contract. A company called Softway Systems, which marketed a POSIX compliant subsystem called OpenNT, later produced a solution released as 'Interix' that sat side by side with Windows as an environmental subsystem and called the NT kernel directly. This meant that it was possible to compile and run POSIX code. Microsoft eventually bought Softway, and after a couple of renamings 'Interix' became 'SUA' (Subsystem for UNIX-based Applications). SUA was deprecated in Windows 8 and removed altogether from Windows 8.1. POSIX returned to Windows with WSL (Windows Subsystem for Linux) in Windows 10, eventually with a real Linux kernel, and finally packaged as a Windows 11 application available from the Microsoft Store.


It is important to understand that being UNIX does not depend on, for example, having a certain kernel; it depends on meeting a number of criteria, mainly POSIX compliance, in order to conform to the Single UNIX Specification. If Microsoft had gone all the way in making Windows compliant, Windows too could have been certified as UNIX. That was obviously not what Microsoft wanted.


Putting an INDI server on a Windows 11 computer to run alongside AstroDMx Capture.


Using an Oracle VM VirtualBox


We have previously written HERE about installing Oracle VM VirtualBox on a Windows machine and running a Linux distribution in it. (That was back in the days before Nicola had ported AstroDMx Capture to Windows, and it was a way of running AstroDMx Capture on a Windows computer.) The procedure shown there can be used to install Oracle VM VirtualBox with a Linux guest OS, but it is, of course, no longer necessary to install AstroDMx Capture into the VM; only the INDI server needs to be installed in the VM.


This is a quick and simple solution and is the one we prefer.
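A sketch of what this looks like inside the Linux guest. The package and driver names below are examples for a Debian/Ubuntu-based guest (the simulator drivers ship with the indi-bin package); drivers for real hardware are installed as separate indi-* packages.

```shell
# Inside the Debian/Ubuntu-based Linux guest:
sudo apt install indi-bin    # installs indiserver and the simulator drivers

# Start the server with the drivers you need, e.g. the simulators:
indiserver -v indi_simulator_telescope indi_simulator_ccd

# AstroDMx Capture on the Windows host then connects to the guest's
# IP address on the default INDI port, 7624.
```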




Using Windows Subsystem for Linux


We have tested installing WSL2 on Windows 11 and then installing an INDI server into it.

However, WSL is not as mature as Oracle VM VirtualBox, and this application is not really why Microsoft put WSL into Windows. They have developers in mind: ‘The Windows Subsystem for Linux allows developers to run a GNU/Linux environment, including most command-line tools, utilities, and applications, directly on Windows’. Basically, because many, if not most, developers work in a Linux environment and Microsoft’s cloud systems largely run on Linux, Microsoft believes that it is no longer necessary for developers to have separate Linux computers. This is all well and good, but Microsoft has neglected to make provision for USB devices to be passed through to WSL; its own documentation instead points to an open-source solution for this. We have tried this and found the WSL approach to work.




We may write a separate article describing exactly how this is done, but unless the user is already using WSL2 for other reasons, we suggest that Oracle VM VirtualBox, or possibly another virtual-machine solution, is the way to run an INDI server on a Windows machine.


Wednesday 22 March 2023

Looking at the Hubble Palette

The data for this article were captured by AstroDMx Capture through a William Optics Super Zenithstar 81mm ED Doublet APO refractor at f/5.5 with a x0.8 reducer/flattener, using an SVBONY SV605MC 14-bit cooled monochrome CMOS camera and Altair narrowband filters.


The Hubble Palette is one of six palettes made by assigning monochrome images taken through SII, H-alpha or OIII narrowband filters to the Red, Green and Blue channels of a resulting false colour image.


The Hubble Palette has S mapped to Red; H mapped to Green and O mapped to Blue in the false colour image.


Palette mappings to RGB from H-alpha, OIII and SII


The Hubble palette is highlighted in the above table of filter mappings to RGB channels.
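These mappings can also be enumerated programmatically; a minimal Python sketch, using this article's shorthand of S = SII, H = H-alpha and O = OIII:

```python
from itertools import permutations

# Each palette name lists the filters assigned to the Red, Green and
# Blue channels, in that order: S = SII, H = H-alpha, O = OIII.
palettes = {"".join(p): dict(zip("RGB", p)) for p in permutations("SHO")}

print(len(palettes))     # 6 possible palettes
print(palettes["SHO"])   # Hubble palette: {'R': 'S', 'G': 'H', 'B': 'O'}
print(palettes["HOS"])   # CFHT palette:   {'R': 'H', 'G': 'O', 'B': 'S'}
```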


The false colour image is generated according to the additive properties of the primary colours of light. So for example, where green light and red light are combined the resulting colour is yellow; where blue light and red light are combined the resulting colour is magenta; where blue light and green light are combined the resulting colour is cyan and where red, green and blue light are combined the result is white.


The additive properties of light



How the elements’ emission lines in the Hubble Palette combine to produce different colours in the false colour image.




Out of the six possible palettes that are available, most astro-imagers choose the Hubble palette, probably because it has an iconic ring to it, being named for the Hubble Space Telescope.

The other named palette, HOS, is the Canada-France-Hawaii Telescope palette. That palette name just doesn’t have the same ring to it!

All of the palettes are equally valid and any one could be used, and for interpretation purposes, the Venn diagram of overlapping colours would have to be re-labelled for that particular palette's element combinations.


Montage of the six available unprocessed palettes made from SII, H-alpha and OIII filtered monochrome images.



The problem with the Hubble palette is that it is so green. This is because H-alpha is usually the dominant element in a nebula and in the Hubble palette it is assigned to the green channel. Therefore green usually dominates a Hubble palette image. The same problem exists for each of the possible palettes; one colour or other usually dominates the raw image.


Astro-imagers usually proceed to reprocess the raw Hubble Palette image to replace much of the green with yellow and gold hues partly for aesthetic reasons. 

Of course, sensu stricto, once the hues of a Hubble palette image are changed, it is no longer a Hubble palette image, whatever we might call it; although it is derived from a Hubble palette.


Below is a basic Hubble palette image of the Horsehead-flame nebula region. It also contains luminance data acquired by capturing monochrome data through an Altair Quadband narrowband filter which is a narrowband filter with two bandpass zones:

1st band: centred on 495 nm, FWHM 35 nm, range 477.5–512.5 nm.

2nd band: centred on 660 nm, FWHM 35 nm, range 642.5–677.5 nm.

These two bands are wide enough to include the emission lines of H-beta and OIII in the 1st band, with H-alpha and SII in the second band.

The luminance layer simply provides structural luminance information but does not change the hue of the image.


Raw Hubble palette image



One way to look closely at the information in the image is to increase, globally and selectively, the saturation of the colours in the image. The hues remain unchanged by the increasing saturation, so this is still a Hubble palette image.


Hubble palette image with enhanced saturation


In this image it is possible to see green regions dominated by H-alpha; yellow regions where there are both SII and H-alpha; magenta regions where there are both OIII and SII; blue regions where there is a lot of OIII; red regions where there is a lot of SII; and cyan regions where there is both OIII and H-alpha.


However, in this example, whilst enhancing the saturation of the colours in the image does reveal more of the composition of the nebula, it neither produces an image that adequately distinguishes the regions of different composition, nor does it produce an image that is any more aesthetically pleasing than the original Hubble palette image.

This is why astro-imagers post-process Hubble palette images as mentioned previously.


Another method is to colourise the individual monochrome images with the appropriate colours of red, green or blue and then combine them as layers with different % opacities. I shall call this the method of fractional channel blending.


This Hubble palette image was produced by layering 25% H-alpha, 50% SII and 100% OIII opacity; followed by global and specific colour saturation enhancements. The Quadband luminance layer was also incorporated for consistency with the previous images.


Hubble palette image by fractional channel blending


Strictly this is still a Hubble palette image and shows clear distribution of the composition of the nebula as well as being aesthetically pleasing.
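The layer compositing behind fractional channel blending can be sketched for a single pixel in pure Python. The layer colours and opacities follow the 25% H-alpha / 50% SII / 100% OIII recipe above; a real implementation would of course operate on whole image arrays:

```python
def composite(layers):
    """'Normal'-mode layer compositing, bottom layer first.
    Each layer is ((r, g, b), opacity), with values in 0..1."""
    r = g = b = 0.0
    for (lr, lg, lb), alpha in layers:
        r = lr * alpha + r * (1 - alpha)
        g = lg * alpha + g * (1 - alpha)
        b = lb * alpha + b * (1 - alpha)
    return (r, g, b)

# One pixel where all three filters recorded full intensity, each
# colourised to its Hubble-palette channel:
pixel = composite([
    ((0, 0, 1), 1.00),   # OIII colourised blue, 100% opacity (bottom)
    ((1, 0, 0), 0.50),   # SII colourised red, 50% opacity
    ((0, 1, 0), 0.25),   # H-alpha colourised green, 25% opacity (top)
])
print(pixel)  # (0.375, 0.25, 0.375)
```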



Processing that selectively changes the hues of various colour components of the image can yield the required rendering of the image.


Hubble Palette image post-processed differently, including selective hue changes to reveal more differentiation in the compositional structure of the nebula.


Whilst this is strictly no longer a true Hubble palette image, it is derived from that palette and does show clear differentiation between the colours (derived from the composition) of the nebula. Moreover, it is an aesthetically pleasing rendering of the Hubble palette image.


It could be argued that one should allow the individual element channels to speak for themselves as monochrome images of the Hubble palette.


SII image




H-alpha image



OIII image



These three monochrome images tell the compositional distribution story perfectly, but not quantitatively. The reason for this is that the images have had to be stretched individually until they are ‘similar’ in their intensity, for composing into an RGB Hubble palette image. This also tells us that the ‘true’ Hubble palette image is not a quantitative image, but only a compositional structure image.


This is the luminance monochrome image captured through an Altair Quadband filter that was used to add luminance information to the original image.



Various images could be used for luminance if it is to be incorporated at all  into the image. Here we have used an image from a filter that passes all of the wavelengths of interest. However, H-alpha or H-alpha + SII could also be used and will influence the luminance in different parts of the image.


I believe that it is desirable to explore the other five palettes that result from mapping H-alpha, SII and OIII to the red, green and blue channels of false colour images.


By using the Hubble palette produced by fractional channel blending, all six possible palettes can be constructed by reassigning the colour channels:



Monday 13 March 2023

Instructions for using AstroDMx Capture V2

AstroDMx Capture version 2 has advanced functionality and is substantially larger than earlier versions.

Nicola has placed instructions for using version 2 HERE.

The new documentation takes the form of an overview of the new AstroDMx GUI followed by a walk through all of the INDI functionality of version 2 using the INDI simulators.

The INDI simulators behave exactly like real mounts, cameras etc. and are used in exactly the same way. In this way it is possible to practise the advanced functionality before trying it on the night sky with a real mount, scope and camera.

The project is under active development and for example, the INDI cameras, while they do work, are still a work in progress.

We shall report here as new features are implemented or other changes are made.

AstroDMx Capture for all platforms can be downloaded HERE

Making Bicolour images from OSC LeNhance filter images using the Gimp 2.10.

The Optolong LeNhance filter is a duo-band narrowband filter with two bandpass regions; one that passes H-alpha light in the red and the other that passes both OIII and H-beta light in the blue-green.


Transmission curve of the LeNhance filter



The rest of the spectrum is effectively blocked which eliminates most light pollution and increases the contrast of the images.


Removing stars


Probably the most effective way to process astronomical images is to use star-removal techniques, process the nebulosity and then add the stars back to the image. The advantage of this is that the stars are not processed along with the nebulosity, and so do not become bloated. Also, any false colours that may have been introduced into the stars by the filter can be reduced by lowering their saturation or otherwise correcting their colours.

The Gimp 2.10 is a cross-platform image processor of the same calibre as Photoshop, and the star-removal tool Starnet++ version 2 is available from GitHub as a plugin for the Gimp 2.10 in Linux or Windows. (It probably works in macOS as well.) This means that virtually all of the processing of LeNhance filtered images can be done in the Gimp (GNU Image Manipulation Program). The Gimp does have noise reduction functions, but at the moment they are not quite as powerful as, for example, Neat Image. However, the Gimp’s noise reduction could still be used to good effect if no other is available.


The example used here is data of NGC 7000, the North America Nebula, captured through an Altair Starwave 60ED doublet refractor using AstroDMx Capture and an SVBONY SV605CC OSC 14-bit CMOS camera with an Optolong LeNhance duo-band narrowband filter. The image is a stack of 22 x 3min FITS exposures. These data were captured when we were still working out the spacing for the field-flattener/reducer, so the field was not yet flat.


Click on any image to get a closer view


The basic image of NGC 7000



The Processing workflow


ALWAYS duplicate an image and work on the duplicate. In this way any changes you make will not affect the original image and mistakes can be easily rectified.


Star removal


The image must be a 16-bit TIFF (linear light).



Select Starnet++



Starnet++ v2 working on star removal will take a few minutes depending on the computer.



Stars are removed


Flatten and save the image

Copy the visible and paste it over the original image (with the stars) as a new layer and select Subtract as the mode (method of blending). This will give an image of the stars.
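What the Subtract blend mode computes can be sketched per pixel on greyscale values (0–255); the pixel values here are invented for illustration:

```python
def subtract_blend(top, bottom):
    """Gimp 'Subtract' blend mode: bottom - top, clamped at zero."""
    return [max(b - t, 0) for t, b in zip(top, bottom)]

original = [12, 200, 47]   # a row of pixels with a bright star in the middle
starless = [12,  21, 47]   # the same row after Starnet++ removed the star

# Starless pasted over the original with Subtract leaves only the stars:
stars = subtract_blend(starless, original)
print(stars)  # [0, 179, 0]
```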



Flatten the image, reduce the saturation a little and apply the Gimp noise reduction. This will have the effect of reducing the noise and reducing the stars a little. Export the image of the stars that will be added back later.


Stars


Working on the starless image.


Export the starless image and denoise it with Neat image or other application, or the Gimp’s own noise reduction function. The denoised image will be worked on from now on.

Using Colours, Components and Decompose.



Uncheck Decompose to layers

The image will then be split into its three monochrome channels as separate images corresponding to Red, Green and Blue.

The three monochrome images can be seen above; the arrows have been added to show which channel is which.


We are going to construct a bicolour image using the red channel (H-alpha) and the sum of the green and blue channels (OIII and H-beta).


Copy the visible of the green-channel image and paste it as a new layer onto the blue-channel image, using Addition as the blending mode.


Flatten the image and then apply Colours and Curves to stretch the image


You may have to come back to this stage (to stretch more or less) depending on the result you obtain in the following steps. 

At this stage you can remove the green channel image to avoid possible confusion.

So we now have a red channel and a blue channel (actually the sum of the original green and blue channels).


Composing a bi-colour image


Using Colours, Components and Compose, we have to specify what will go into each of the 3 colour channels of the RGB bi-colour image we are about to construct.

Put red in the red channel, blue in the green channel and blue in the blue channel and the bi-colour image will be composed.
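In array terms, the two steps above amount to adding the green and blue channels and then composing red/gb/gb into RGB; a pure-Python sketch on a few invented pixel values:

```python
def add_channels(green, blue, white=255):
    """Gimp 'Addition' blend of two channels, clipped at white."""
    return [min(g + b, white) for g, b in zip(green, blue)]

def compose_bicolour(red, gb):
    """Compose RGB with red in R and the summed channel in both G and B,
    so the OIII/H-beta signal renders as cyan."""
    return [(r, c, c) for r, c in zip(red, gb)]

red   = [180,  20]   # H-alpha channel
green = [ 30, 120]   # OIII + H-beta, green component
blue  = [ 25, 200]   # OIII + H-beta, blue component

gb = add_channels(green, blue)
print(gb)                          # [55, 255]  (second pixel clipped)
print(compose_bicolour(red, gb))   # [(180, 55, 55), (20, 255, 255)]
```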


At this stage you will have to decide, based on the result obtained, whether you are satisfied with the stretching of the green+blue image in the previous stage, or whether that stage needs to be re-visited.


In Colours and Hue-Saturation, you can selectively adjust the saturation of the separate colours (in this case red and cyan) if required.

Starless bi-colour image


At this stage it is possible to use Colours and Hue-Saturation to selectively change the hue of one of the colours. In this example, the red has been changed to a golden hue often seen in bi-colour images.


Hue-changed bi-colour starless image


This is a matter of personal preference.


Adding stars back into the images


Remember that the stars now have less false colour and are slightly less prominent than in the original image.


Copy the visible of the stars image and paste it as a new layer onto the starless image selecting Addition as the blending mode. Then flatten the image.


Bi-colour image with stars back


Hue adjusted bi-colour image with stars back


RGB image with stars back


The images in which the stars have been put back have a more pleasing star field with less false star colour and with the stars being less dominant in the image in which the nebulosity can be more readily appreciated.

Bi-colour images can give more insight into the qualitative composition of nebulae than can be deduced from the original RGB image captured through a duo-band narrowband filter such as the LeNhance filter.


Sunday 5 March 2023

Live-Stacking with AstroDMx Capture and stacking software




Live Stacking of images is a technique of value for Electronically Assisted Astronomy (EAA). EAA is of particular value for outreach work where members of the public are shown astronomical objects in real time.

Although it is possible to use EAA during astro-imaging sessions, it is not normally used in these circumstances.

EAA uses Live Stacking of images: as each image (often a deep-sky image) is captured, it is stacked with the previously captured images and the result is displayed on the computer screen. In this way, the viewer can observe the image building up on the screen and becoming less noisy, with more details becoming visible.


AstroDMx Capture may well have EAA Live Stacking built into the software at some future date, but it is possible to do EAA and Live Stacking with AstroDMx Capture by using it in conjunction with Live-Stacking software in a completely seamless way.


Three programs have been tested and found to work simply and seamlessly with AstroDMx Capture for Live Stacking. They are:


Deep Sky Stacker Live 4.2.6

ASTAP version 2023.01.21

Siril-1.2.0-beta1


It is likely that one of these programs would be used routinely to stack and maybe partially process the images captured during an imaging session.


Deep Sky Stacker Live is a Windows only program.

Siril-1.2.0-beta1 is cross platform and works on Linux, macOS and Windows.

ASTAP version 2023.01.21 is cross platform and works on Linux, macOS and Windows.


All three programs work in the same way. They monitor a user specified directory (folder) and when a new image appears in that directory, it is stacked with the previous images that have appeared there. The current stacked image is displayed.


AstroDMx Capture saves its images into date-time-stamped folders by default.


Click on an image to get a closer view


The directories (folders) generated by AstroDMx Capture during an imaging session





So, for example, AstroDMx Capture created a folder on 2023-03-02.

Within that folder AstroDMx Capture created other folders that would contain the various sets of images captured. One of these folders was created at 19-48-15 when its contents started to be captured. When opened, it can be seen that it contained image files called S2HHead_000001_19-48-15_data.fits; S2HHead_000002_19-53-15_data.fits etc.


This folder was created when AstroDMx Capture started to capture the data on this object (SII on the Horsehead nebula).

It is this folder (19-48-15) that would be specified as the folder to be monitored as soon as the first image had been captured.

Whichever program was being used would then proceed to Live-Stack the images as they were captured by AstroDMx Capture and placed in the folder.


Deep Sky Stacker Live


The directory (folder) to be monitored is selected


Select the Stacked image tab. Click on the red Stack button and stacking begins.


When just the first image is captured and displayed



When all 20 of the images have been stacked and displayed

It can be seen that the 20-image stack is more defined and less noisy than the first, single image. In fact, the signal-to-noise ratio increases as the square root of the stack size.
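A quick stdlib-Python simulation illustrates the square-root rule: averaging N frames leaves the signal unchanged while the noise standard deviation falls by √N (the signal and noise figures here are arbitrary):

```python
import random
import statistics

random.seed(1)
SIGNAL, NOISE_SIGMA, N, PIXELS = 100.0, 10.0, 20, 5000

def frame():
    """One simulated exposure: constant signal plus Gaussian noise."""
    return [SIGNAL + random.gauss(0, NOISE_SIGMA) for _ in range(PIXELS)]

single = frame()
stacked = [sum(p) / N for p in zip(*(frame() for _ in range(N)))]

snr_single = SIGNAL / statistics.stdev(single)
snr_stacked = SIGNAL / statistics.stdev(stacked)
print(round(snr_stacked / snr_single, 1))  # close to sqrt(20), about 4.5
```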



Siril Live Stacking


The Home folder is set to the folder to be monitored.


Live Stacking is started by clicking the red button.


When the first image is captured it is displayed


After all 20 images have been captured and stacked

Again, it can be seen that the stack of 20 images is more defined and less noisy.



ASTAP Live-Stacking


The folder to be monitored is selected


The button is clicked to start Live Stacking


When the first image has been captured and displayed


After all 20 images have been captured, stacked and displayed

As before, it can be seen that the stack of 20 images is more defined and less noisy. This, of course, is why EAA is so valuable for outreach.


With ASTAP it is then important to click the button ‘Rename all files back to original’. This is because during the stacking process, the files have an underscore added to the filename extension as they are stacked. Clicking on the button removes the underscores from the filename extensions.


In each case it is a matter of moments to start the Live Stacking process. The window in which the Live-Stacked image is displayed is quite separate from the preview window of AstroDMx Capture. It is best to display this window on a separate desktop or even an external monitor.


In this way, it is possible to simply and seamlessly Live-Stack the images being captured by AstroDMx Capture, whatever operating system is being used.