Home made solar (thermal) panel

In January 2015, I posted an update on my attempts to make my home more energy efficient. So far, I have concentrated on insulation and monitoring the effects. This link describes the progress.


This summer (2016), I decided to try out an idea mentioned by another member, Donald. The idea was that you could make a solar thermal panel by painting a central heating radiator black and putting it in a double-glazed, insulated box. That is what I have done, using a 20-year-old central heating panel left over from when I removed my central heating several years ago.

This is a graph of the temperature produced by the panel from the end of July to the end of August (2016). The graph mid-line is 50 degrees Celsius. Continuous live temperatures from the panel can be seen here:- https://personal.xively.com/feeds/940955947

[Graph: panel temperature, screenshot from 2016-08-28 10:28:30]

The panel is connected to a towel rail in the bathroom and heats the rail by thermo-cycling, so no pump should be required, except…

The towel rail cannot dissipate all the heat produced by the panel on hot days; eventually, when the temperature passes 70 degrees Celsius, stagnation occurs, thermo-cycling stops and the towel rail cools down. To combat this effect on hot days, there is an Arduino sensing the panel temperature. When the temperature hits 70, the Arduino sends a signal to turn the pump on, which restarts the circulation.
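For the curious, the Arduino's rule can be sketched like this. This is just an illustrative shell version, not the code running on the Arduino itself; the 70 degree switch-on point comes from the description above, but the 65 degree switch-off point is an assumed hysteresis value to stop the pump chattering around the threshold.

```shell
# Illustrative sketch of the pump control rule (not the real Arduino code).
# Switch-on at 70 C is from the text; the 65 C switch-off threshold is an
# assumed hysteresis value to stop the pump cycling rapidly around 70 C.
pump_state() {
  temp=$1    # current panel temperature in degrees Celsius
  state=$2   # current pump state: "on" or "off"
  if [ "$temp" -ge 70 ]; then
    echo on                     # stagnation imminent: force circulation
  elif [ "$temp" -le 65 ]; then
    echo off                    # thermo-cycling can take over again
  else
    echo "$state"               # between thresholds: keep current state
  fi
}
```

For example, `pump_state 72 off` prints `on`, while between the two thresholds the pump simply keeps its current state.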

What I am mostly interested in is how the panel will perform during the autumn, winter and spring. When the results are in, I will be able to decide whether it is worth adding additional panels to heat my hot water cylinder.

Update November 2016: The panel is not producing any usable heat now. I have drained the system and switched the towel rail back to electric mode.

Full Circle Podcast Update

We haven’t posted much about how the podcast recordings have been going since Blackpool LUG took over producing the main show of Full Circle Magazine back in May 2012. We have now produced four episodes, the last being released between Christmas and New Year.

We have learned a lot of lessons in that time. The first show was recorded in Tony’s front room using a Zoom H2 portable audio recorder positioned on a coffee table in the middle of the group of presenters. As it turned out when listening back during post-production editing, we were all sat too far away from the microphone; although the H2 has two very sensitive diaphragms, the acoustics of the room made it sound like we were whispering on a mountain. With some advice from experienced podcaster and former sound engineer Dan Lynch we were able to produce something listenable, and coupled with some great content from the UK’s first ever Raspberry Jam, our first episode came off pretty well.

For our second episode we realised that we didn’t really have the equipment needed to record a reasonable studio show, and we were all struggling to find enough time to convene in one location to record, so we turned to a technological solution: VoIP and video conferencing. We had originally settled on an open source solution, Mumble, which is audio conferencing/VoIP software. It comprises two elements: the server, which all the clients connect to (we used one of the presenters’ private servers to host the recording), and the Mumble client, which can connect to any of the public Mumble servers or to a private server, providing you have its address. We had difficulty getting it to work for all of the hosts, so we abandoned the idea and tried the proprietary solution, Skype, using its conference call option; however, we again ran into technical problems and had to move to another medium. All four presenters are members of the Google+ social network, which has a feature called Google Hangouts that allows users to hold video calls with each other, not just one to one but as a group. So as a last ditch hope we tried that, and we were all able to get a stable connection with reasonable audio and not too much lag.

We weren’t able to record all of our audio together through Google Hangouts, a feature which is standard with Mumble and, with the aid of a plugin application called Skype Call Recorder, is also achievable with Skype. So we each recorded our own end of the podcast using Sound Recorder or Audacity, which seemed like a good idea at the time. We were a little too adventurous with the content for the show: we tried to cover all the news from Google I/O, two interviews and two reviews of events. This, coupled with close to an hour spent getting set up and breakdowns in communication with one of the presenters, produced an editing nightmare for poor old Les to sort out.
The end result was a 2 hour show which was difficult to listen to, as the pace of the show was shot due to the breakdowns, and it took nearly 2 weeks to edit. Once again there were plenty of lessons to be learnt from this show.

Then came unconference season: Oggcamp, Barcamps Blackpool and Liverpool, and a host of other events that the presenters were involved in meant there was not far short of a 2 month break from recording. Episode 31 was recorded late in October and at the start of the recording featured only 2 of the hosts; Jon joined halfway through. Again it was recorded remotely using Google Hangouts, and once again there was some difficulty in getting set up with a VoIP solution. The recording was far from ideal: in Jon’s haste to join, he ended up recording the other presenters’ voices with his microphone, which caused a headache in post-production as it was difficult to separate the voices from the recording. The result was very poor quality audio for Jon, which then needed to be balanced against the other audio tracks. The net result is a very low volume podcast, with Jon being nearly unintelligible in some places. Once again the edit turned into a nightmare, taking 2 weeks to complete with a far from desirable result, and it sparked a certain amount of debate within the team on how to improve the podcast’s quality. In the end the team took the decision to release the podcast despite the poor quality of the audio, due to the long absence and the amount of time which had been spent on the edit. Once again there were lessons to be learnt!!

The recording of the fourth episode was deliberately delayed to coincide with the festive period so the guys could do a review-of-the-year special. We got a good solid connection on Skype, and even though one of the recordings failed we still had a great recording, thanks to the redundancy built into the setup: Olly was using a mixer to mix both Les’s and Tony’s voices onto one channel and his own voice onto the other, piped to a solid state recorder, while Les was using Skype Call Recorder to record a mix of Olly and Tony on one channel and his own voice on another. Some sad news, though: Jon is moving back to New Zealand in February and, needing to organise things for this, was unable to make the recording. The recording went well, and the experience of recording three podcasts showed, as the guys were more organised and got through the content very quickly; the whole thing took 1 hour 30 minutes to record. This also helped in post-production: very little editing was required beyond removal of the errs and ums, pauses, and one breakdown while the guys did some research. The beds, intro and outro music and the recording of the intro voice-over were added; this was done in two days, and the episode was released between Christmas and New Year. There have been a lot of positive comments regarding Episode 32, including from Robin Catling, former main show lead presenter and now sidepod presenter, who commented: “Much better sound quality! And Skype worked! Yay! Just need to polish the mix for the beginning of the show and give it a bit more oomph and it’s there. In the groove inside a handful of shows. It only took us a year with the old team.” We are nearly there, and all the presenters are really happy with the result!!

We are looking forward to a fantastic 2013 and we hope you will join us on our continuing journey into podcasting!!

MP3 Feed for your podcatcher:  http://fullcirclemagazine.org/category/podcast/feed

OGG Feed for your podcatcher: http://fullcirclemagazine.org/category/podcast/feed/atom

If you don’t use a podcatcher here are the direct links:

Full Circle Podcast Episode 29: The Great Train Poddery
Full Circle Podcast Episode 30: Better Late Than Never
Full Circle Podcast Episode 31: The Difficult Third Episode!!
Full Circle Podcast Episode 32: The Year That Was….Well Nearly!!

Book Review: Podcasting Hacks – Tips and Tools for Blogging Out Loud By Jack D. Herrington

In July the LUG took over presenting the main podcast for Full Circle Magazine. The four hosts, Les, Jon, Olly & Tony, all had their own ideas about how a podcast should sound, but had little idea of what was involved in producing a good quality podcast.

So it was decided that a bit of background research was required. This book, published by O’Reilly, came up quite a few times as recommended reading for people intending to start out in podcasting. It did not disappoint: it starts from the very beginning by outlining what exactly a podcast is, the different audio formats, how podcasts are published to the web, and the best ways to find and listen to them.

Once you’ve made your way through the first few chapters, you begin to delve more deeply into what makes a good podcast and how to produce one. The author gets you thinking about the content of your intended podcast and talks about what is good and what is not. You then move on through simple vocal techniques to the equipment that is a must (and some nice-to-haves); the author discusses in some detail what is available and makes some recommendations based on his experience. The book then moves on to more advanced vocal coaching, setting up a home studio, improving audio quality and publicising your podcast, including blogging. It closes by discussing what was then the new medium of video blogging, as YouTube had begun to dominate the broadcast-yourself scene.

Some indication of costs is given for the various pieces of equipment, but a note of caution here: this book was written in 2005, and a lot of the equipment is now cheaply available through eBay and the like. It does, though, give you an idea of what to look for and which manufacturers to look out for. It is also worth mentioning that although Jack Herrington is the main author, he has drawn on his contacts to help him write the book from every angle. Quite a few chapters have been written by guest authors who have a specialism in a particular area; for instance, there are a couple of chapters written by a well known American voice coach who has worked with a lot of voice and radio artists. This helps make this a very well rounded book on the subject.


To summarise, this is an excellent book which will help guide you into the world of podcasting, and I would say it is essential reading for anyone contemplating starting a podcast. Although the book was written 7 years ago and the equipment featured is fairly dated, it still gives you all the knowledge you need to choose the right equipment for your needs and to purchase with confidence.

Setting up a MythTV system based on Mythbuntu


This document records the steps that the Blackpool Linux User Group took to get a MythTV system up and working.

Goals of our MythTV machine:

To provide a usable EPG (electronic program guide) for upcoming programmes
To schedule and record Freeview programmes directly or via the EPG
To play DVDs (and retrieve info from online sources, e.g. IMDB)
To play video files (formats?)

Note: playing music and viewing images is not a priority for the group (and MythTV is notably weak in this area).

MythTV Software

Mythbuntu (download from http://www.mythbuntu.org/) is a distribution of MythTV built on Xubuntu, with many pre-configured customisations. It has its own application specifically for managing the configuration of the hardware, software and associated data, such as the EPG and remotes.

Why we chose Mythbuntu

We are all familiar with Ubuntu to a greater or lesser degree
The Mythbuntu ‘distro’ is mature, officially supported by Canonical, is hardware-savvy and takes away many of the frustrations around getting MythTV working

Specifications of the machine

TBA (actual/minimum/recommended) 
DVD Burner
Special considerations: noise and power consumption.
Specific MythTV hardware needed:

Digital TV card(s). Here you will find a list of compatible cards:

A TV aerial or a satellite dish; an old Sky minidish will be perfectly OK.
Wireless controller (we used a Microsoft-branded media centre remote) http://www.mythtv.org/wiki/LIRC
Speakers are a good idea! http://www.mythtv.org/wiki/Sound_card

Installing Mythbuntu

Insert the disk and follow the on-screen instructions (this first part was almost too easy – see Problems)
Note: We took over the whole disk as part of the installation. Initially the process failed at the point in the installation where GRUB is installed. We eventually resolved this by replacing the hard drive.
The official mythtv guide is here:- http://www.mythtv.org/docs/

Configuring Mythbuntu

(not sure what was done here? It just seemed to work straightaway, though we haven’t configured any tuner cards yet)
Configuring the media centre remote
Exited the MythTV frontend application (this drops back to the Xubuntu desktop)
Ran the Control Centre application, which can be found in the Multimedia section of the Applications menu
Ran the remote mapping scripts by selecting our specific model in the Control Centre application
Quit this and went back into the MythTV frontend

So far this appears to work just fine without additional tweaking. Here you can find a list of supported infrared receivers: http://www.lirc.org/html/table.html

Configuring MythTV

MythTV consists of a backend process and a frontend process. The backend must be configured using mythtv-setup, which can be found on the Mythbuntu settings menu (menu name to be confirmed). The mythtv-setup program offers to stop the backend process, as most changes will not take effect until the process is started again.


The first option to be configured is named General; here you can set up the IP address of the backend server. If you only intend to have one computer running both the backend and the frontend process, you can leave the default values as localhost. If you intend to have other mythfrontend clients, you will need to enter the IP address or the name of the backend server.

Capture cards

The next option which needs to be configured is Capture Cards. Here you tell the system how it is going to receive the television signals. There are many different types of capture card which can be used, but we are going to use the DVB DTV option, which is the correct option for both PCI and USB devices.

Video Sources

This option is used to group your capture cards together; even if you have just one capture card, you have to create an entry here. Each group should contain cards which provide the same channels, so you would have one group for Freeview, one for Freesat and another for Sky. We will just create one group and name it Freeview.
Input Connections
This option allows you to link your capture cards to the Video Sources group you have just created.
Channel Editor
Once everything above has been configured, you can use this screen to add channels. If you are using digital capture cards, as we are, you can just use the channel scan button to scan for and add the available channels.
Storage Directories
This configuration allows you to set various locations for the storage of different files. Each storage type can have multiple physical storage locations, in which case MythTV will search each location for the required files. So if you run out of space for your recordings, you can add another drive without having to move your existing recordings; MythTV will then just use the disk with the most free space (depending on other settings).
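The "most free space" rule can be illustrated with a small shell sketch. This is not MythTV's actual implementation, just an illustration of the idea: given df -Pk style output, print the mount point with the most available space.

```shell
# Illustrative sketch only -- MythTV implements this rule internally.
# Reads `df -Pk`-style output on stdin and prints the mount point
# (last column) with the largest "Available" (4th) column.
most_free() {
  awk 'NR > 1 && $4 + 0 > best { best = $4; dir = $6 } END { print dir }'
}
```

For example, `df -Pk /recordings1 /recordings2 | most_free` would print whichever location has the most free space.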
Other things to look at (once the main box is working)

Using MythTV as a UPnP/DLNA media server for other devices, e.g. media players, PCs, XBOX, phones, etc.
Using SFF/embedded hardware, e.g. Revo, eeePC, etc. to build a small combined frontend/backend system
Using MythArchive to export recordings to DVDs and other media.

Usage

The official MythTV wiki has an excellent user manual which you can find here:
These instructions are for version 0.24 of MythTV; the version we are using at the moment is 0.23-fixes, but daily usage has not changed very much.


On Saturday we had some difficulty getting the Avermedia PCI DVB-T card working with Mythbuntu 10.04.
I took it home and built a Mythbuntu 10.04 system with this card in it, and I had the same problem, i.e. I could not scan for channels.

Looking in the messages log

(tail -f /var/log/messages) I could see that the system could not find or load the firmware for this card. After some research I found that we were not the only people who have had difficulties with this card, but there is a solution. Apparently the system incorrectly identifies the card at startup and tries to load the wrong firmware, which fails and makes the card unusable.
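The log check can also be scripted rather than watched with tail. This is a rough sketch: the exact wording of the kernel's firmware error varies between releases, so the "firmware" plus "fail/error" pattern here is an assumption, not the precise message we saw.

```shell
# Rough sketch of the log check described above. The exact error text
# varies between kernels, so the pattern is an assumption; pass the
# log file path (on our system, /var/log/messages).
count_firmware_errors() {
  grep -i firmware "$1" | grep -c -i -E 'fail|error'
}
```

Running `count_firmware_errors /var/log/messages` and getting a non-zero count suggests the card's firmware is not loading.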
The solution is to update the kernel within 10.04; this is done by just asking the package manager to update the kernel and then rebooting. Once this is done, the system will correctly identify the card.
Next we have to download and install the firmware for this card. This can be done by installing the Ubuntu firmware package using the package manager; the package is named linux-firmware-nonfree.
This appears to work and allows you to configure MythTV and scan for channels. When I performed the scan at home I found 41 TV channels, and the 5 I tested were all watchable, even though the BBC channels were picked up from the transmitter in Wales instead of Winter Hill.
After rebooting the computer I found another problem when using this card with Mythbuntu: the mythbackend process is configured to start up before the system has finished configuring the hardware. Loading the firmware takes time, and the card is not ready until this has finished.
The solution to this is to modify /etc/init/mythtv-backend.conf as follows. Replace the line

start on (local-filesystems and net-device-up IFACE=lo)

with

start on (local-filesystems and net-device-up IFACE=lo and started udev-finish)

This makes the startup process a bit longer, but it does make sure the hardware is ready.
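The same edit can be applied non-interactively with sed; this is a sketch wrapped in a function so it can be tried on a copy of the file first (on the real system, run it with sudo on /etc/init/mythtv-backend.conf, and back the file up beforehand).

```shell
# Sketch: apply the one-line change to the mythtv-backend upstart conf.
# Pass the path to the conf file; on the real system this is
# /etc/init/mythtv-backend.conf and needs sudo.
patch_backend_conf() {
  sed -i \
    's/^start on (local-filesystems and net-device-up IFACE=lo)$/start on (local-filesystems and net-device-up IFACE=lo and started udev-finish)/' \
    "$1"
}
```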

More Problems

After making these changes we found that we could still not scan for channels. To fix this we had to use the backend setup program to delete all of the configured capture cards, then add the capture card again. Now the channel scan found lots of channels, but it was missing the BBC channels. We did not try to resolve this, but a good start would be to use the command-line scan utilities to check which channels can be found.
dvbscan dvb-t/uk-WinterHill > channels.conf
scan dvb-t/uk-WinterHill > channels.conf
These utilities are not provided by MythTV; they come with the DVB utilities (the dvb-apps package). The uk-WinterHill argument is an initial tuning file installed with these utilities, which provides the details of this transmitter.
This page http://www.ukfree.tv/txdetail.php?a=SD660144 provides technical details of the Freeview transmissions from Winter Hill, so if you are missing one group of channels you can work out which multiplex/frequency you are missing. MythTV allows you to enter the details manually and then perform a quicker scan, so that you can get just the channels you are missing. If you do look at this site you will notice that there are 4 HD channels on multiplex PSB3; we cannot receive these, as to do so you need a DVB-T2 card.

Very basic Linux

These links should help you in your quest to understand how Linux ‘ticks’.


If you download the kernel ‘bare.i’ from the Slackware/bootdisks link, then put it on a floppy with:-

dd if=bare.i of=/dev/fd0
you will then have a floppy with a Linux kernel that can partially boot.

“I managed to get this far on Sunday!- Colin”

This endorsement from Colin indicates that the download link works, the ‘dd’ command works and the resulting floppy disk boots.

Now we need to use this command on a second floppy disk; this will format the disk.

fdformat /dev/fd0

Then create a file system on the second floppy disk.

mkfs.ext2 /dev/fd0

Having made the file system, the floppy needs to be ‘mounted’.

mount /dev/fd0 /mnt/floppy

If /mnt/floppy does not exist, it needs to be created.

Finally, the file hierarchy needs to be created on the floppy. Follow the instructions here:-


Overview of above instructions :

A root filesystem must contain everything needed to support a full Linux system. To be able to do this, the disk must include the minimum requirements for a Linux system:

The basic file system structure,

Minimum set of directories: /dev, /proc, /bin, /etc, /lib, /usr, /tmp,

Three of these directories will be empty on the root filesystem, so they only need to be created with ‘mkdir’ (make directory). The /proc directory is basically a stub under which the proc filesystem is placed. The directories /mnt and /usr are only mount points for use after the boot/root system is running; hence, again, these directories only need to be created. The rest need special attention; see the instructions at the buildroot link.
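Creating the empty skeleton can be sketched as a small script. This is illustrative only: it takes a root directory argument so it can be tried safely anywhere; on the real boot floppy the root would be the mount point (/mnt/floppy), and /dev, /bin, /etc and /lib still need populating afterwards as the instructions describe.

```shell
# Sketch: create the minimum set of empty directories under $1.
# On the real boot floppy, $1 would be the mount point (/mnt/floppy);
# /dev, /bin, /etc and /lib still need populating afterwards.
make_skeleton() {
  root=$1
  for d in dev proc bin etc lib usr tmp mnt; do
    mkdir -p "$root/$d"
  done
}
```

For example, `make_skeleton /mnt/floppy` creates the directory stubs on the mounted floppy.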

I tried this on Saturday (9th April) but got a message saying the directories could not be created as they already existed, yet I could only find lost+found. It later transpired I must have been looking at my laptop’s hard drive rather than at the floppy, but I still had problems. I’d like to try the uncompressed variety when we next meet, but sadly I think I need a little more “handholding” as to how exactly to do it (sorry Mike!)

Basic set of utilities: sh, ls, cp, mv, etc.,

Minimum set of config files: rc, inittab, fstab, etc.,

Devices: /dev/hd*, /dev/tty*, /dev/fd0, etc.,

Runtime library to provide basic functions used by utilities.

Look here for full details:-



to find out how to build a basic file system (root disk). These two disks combined make a very basic Linux system.

Then maybe add some basic tools:-


Start by learning how to strip a working system down to the bare essentials needed to run one or two commands, so you know what you actually need. An excellent practical place to do this is the Linux BootDisk HOWTO (link above), or for a more theoretical approach try From PowerUp to Bash Prompt.

To learn how to build a working Linux system entirely from source code, the place to go is the Linux From Scratch project.


They have an entire book of step-by-step instructions you can read online or download. Be sure to check out the other sections of their main page, including Beyond Linux From Scratch, Hardened Linux From Scratch, their Hints directory, and their LiveCD project. (They also have mailing lists which are better sources of answers to Linux-system building questions than the busybox list.)

If you want an automated yet customisable system builder which produces a BusyBox- and uClibc-based system, try buildroot, which is another project by the maintainer of uClibc (Erik Andersen). Download the tarball, extract it, unset CC, and run make. For more instructions, see the website.