Published: Saturday, 19 September 2015 07:02
For a few years I have been frustrated at having no time to write up my image analysis methodology into academic journal papers. I am becoming worried that the world is moving on (e.g., from image analysis to video analysis) and may not consider it feasible to backtrack to use my methods, far superior though they may be to the ones the world is using. My one published paper on algorithms was only the first in a series documenting a large body of work.
Due to a concatenation of circumstances, which from my point of view amounts to intransigence by The University of Queensland, I am not employed in image analysis and have to write up this work in my "spare time". I returned to fishery population dynamics, which I find very interesting although it has a much smaller mathematical community than image analysis. I also make a large investment of my time in keeping up and improving my skills as a classical pianist, which I feel it would be a crime against humanity not to do. I sometimes wish that my skill as a musician had been given to somebody else so that I could have been a listener instead of a performer, but I can't do anything about that.
Anyway, to return to the point, I am a big fan of the continuous wavelet transform (CWT) as opposed to the discrete wavelet transform (DWT). Most people prefer the DWT and its relatives but in my view that is only because the appropriate theory for the CWT has never been published and hence practitioners don't know how best to apply the CWT. In technical terms, the CWT uses what is called an "overcomplete basis" which means that practitioners can tune it to fit their image and choose which wavelet coefficients they want to use to represent their image. With the DWT there is very little choice: we take what the DWT provides whether it accurately fits our image or not.
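The freedom that the overcomplete basis gives can be seen in a few lines of code. The sketch below computes a naive CWT by direct convolution, using the Mexican hat wavelet as an illustration; the function names and the choice of a 5-standard-deviation support are my own, not from the paper. The point is simply that the scale grid is whatever we choose it to be, with nothing forcing powers of two on us.

```python
import numpy as np

def mexican_hat(x):
    """Mexican hat (Ricker) wavelet: second derivative of a Gaussian."""
    return (1.0 - x**2) * np.exp(-x**2 / 2.0)

def cwt_direct(signal, scales, wavelet=mexican_hat):
    """Naive CWT by direct convolution: one row of coefficients per scale.

    The scales are freely chosen, illustrating the 'overcomplete'
    freedom of the CWT: nothing restricts them to powers of two.
    """
    n = len(signal)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Sample the wavelet on a support wide enough for scale s.
        half = int(np.ceil(5 * s))
        x = np.arange(-half, half + 1) / s
        kernel = wavelet(x) / np.sqrt(s)   # L2 normalisation
        out[i] = np.convolve(signal, kernel, mode="same")
    return out

# Any scale grid we like: dyadic, several voices per octave, or irregular.
sig = np.sin(2 * np.pi * np.arange(512) / 64.0)
coeffs = cwt_direct(sig, scales=[2.0, 2.38, 2.83, 3.36, 4.0])
print(coeffs.shape)  # (5, 512)
```

This direct approach is of course slow; the pyramidal and FFT-based algorithms discussed below exist precisely to avoid recomputing these convolutions from scratch at every scale.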
For the CWT I am also a big fan of pyramidal algorithms. These algorithms operate in the spatial domain as opposed to the frequency domain (Fourier transform). They are based on two-scale relations. Roughly speaking, their computational efficiency comes from calculating the wavelet transform (WT) at a particular scale from the already-computed WT at a finer scale. Pyramidal algorithms are faster than frequency-domain algorithms based on the Fast Fourier Transform (FFT), but not by as big a margin as we might expect. The FFT is actually an astonishingly efficient way of doing things, and spatial-domain algorithms have to be very craftily designed to do any better. The comparable speeds of the best spatial-domain and frequency-domain algorithms are described in my published paper, which as far as I know is the only place where such a comparison can be found.
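The cascade idea can be sketched as follows. This is a generic dyadic "à trous" style cascade, shown only to illustrate how each level is computed cheaply from the previous, finer level; it is not the WT-FIR-1 single-pass algorithm of my paper, and the particular filters are arbitrary placeholders.

```python
import numpy as np

def pyramidal_cascade(signal, h, g, levels):
    """Sketch of a dyadic pyramidal ('a trous') cascade.

    h: low-pass smoothing filter; g: detail (wavelet) filter.
    Each level's coefficients come from the smoothed output of the
    previous, finer level -- the source of the speed-up: the work per
    level stays O(n * filter length) instead of growing with scale.
    """
    approx = np.asarray(signal, dtype=float)
    details = []
    for j in range(levels):
        # Insert 2**j - 1 zeros between filter taps (the "holes").
        hj = np.zeros((len(h) - 1) * 2**j + 1)
        hj[:: 2**j] = h
        gj = np.zeros((len(g) - 1) * 2**j + 1)
        gj[:: 2**j] = g
        details.append(np.convolve(approx, gj, mode="same"))
        approx = np.convolve(approx, hj, mode="same")
    return details

sig = np.random.default_rng(0).standard_normal(256)
d = pyramidal_cascade(sig,
                      h=np.array([0.25, 0.5, 0.25]),
                      g=np.array([-0.5, 1.0, -0.5]),
                      levels=4)
print(len(d), d[0].shape)  # 4 (256,)
```

Note how the filters stay short at every level; only the spacing of their taps changes. That is what a two-scale relation buys you computationally.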
As a word of caution about pyramidal algorithms, I have to warn that they are not suitable for modulated wavelets, i.e., wavelets that involve a factor of exp(ix). The archetypal modulated wavelet is the Gabor wavelet, or more correctly the Morlet wavelet, which is very similar to the Gabor wavelet but meets the technical definition of a wavelet. Many other modulated wavelets have been developed over the years; see the book "Two-dimensional Wavelets and Their Relatives" by my friend Jean-Pierre Antoine et al., who has been very kind to me despite having to put up with my suffering chronic fatigue and consequently being out of contact for lengthy periods of time.
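The Gabor/Morlet distinction is easy to make concrete. A wavelet must have zero mean (zero content at zero frequency); the Gabor function, a modulated Gaussian, narrowly fails this, and the Morlet wavelet subtracts a small correction term so that the mean is exactly zero. The sketch below checks this numerically (the function names are mine; omega0 = 5 is a conventional choice of centre frequency).

```python
import numpy as np

def gabor(x, omega0=5.0):
    """Gabor function: a modulated Gaussian. Its mean is small but not
    exactly zero, so strictly it fails the wavelet admissibility condition."""
    return np.exp(1j * omega0 * x) * np.exp(-x**2 / 2.0)

def morlet(x, omega0=5.0):
    """Morlet wavelet: the Gabor function minus a correction term that
    makes the integral (zero-frequency content) exactly zero."""
    correction = np.exp(-omega0**2 / 2.0)
    return (np.exp(1j * omega0 * x) - correction) * np.exp(-x**2 / 2.0)

# Approximate the integrals by a Riemann sum on a dense grid.
x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]
print(abs(gabor(x).sum() * dx))   # small but nonzero (~1e-5)
print(abs(morlet(x).sum() * dx))  # essentially zero
```

For omega0 = 5 the correction is tiny, which is why the two are so often conflated; but only the corrected version is, strictly, a wavelet.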
In my opinion, however, modulated wavelets are not desirable for wavelet analysis because their zero crossings (values of x at which the real or imaginary part of the wavelet is zero) tend to be very evenly spaced. This creates two problems: (1) modulated wavelets are very sensitive to noise at particular frequencies, and (2) they can miss important frequencies in a signal if those frequencies don't correspond very closely to frequencies that are used in the wavelet analysis. Practical wavelet analysis is a discrete technique. It is not possible to examine all possible frequencies, much as we might like to. Typically, the scale increases geometrically, by a constant factor at each step, as we advance from very fine to very coarse scales. This factor is characterised by the number of "voices per octave", or steps per doubling of scale. A coarse analysis would use one voice per octave (which most implementations of the DWT do). A fine analysis might use four or even twelve voices per octave. Whatever style of analysis is used, there is still plenty of room between successive scales for a signal to slip through the cracks in the analysis. Extreme regularity of zero crossings tends, in my experience, to grossly aggravate this problem.
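The voices-per-octave convention translates directly into a geometric scale grid. A minimal sketch (the function name and end-point handling are my own):

```python
import numpy as np

def scale_grid(s_min, s_max, voices_per_octave):
    """Geometric scale grid: each step multiplies the scale by
    2**(1/voices), so 'voices' steps double the scale (one octave)."""
    n = int(np.floor(voices_per_octave * np.log2(s_max / s_min))) + 1
    step = 2.0 ** (1.0 / voices_per_octave)
    return s_min * step ** np.arange(n)

coarse = scale_grid(1.0, 16.0, voices_per_octave=1)  # dyadic: 1, 2, 4, 8, 16
fine = scale_grid(1.0, 16.0, voices_per_octave=4)    # 17 scales over 4 octaves
print(coarse)
print(len(fine))
```

Even at four voices per octave, adjacent scales differ by about 19 percent, which gives a sense of how much room remains between them for a signal component to fall through.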
Back to frequency-domain algorithms, the major thing that lets them down is the gritty detail of interpolating the wavelet to the frequencies used in the discrete Fourier transform (DFT). This is in my opinion a major problem and there is no satisfactory means of solving it. For anisotropic wavelets (e.g., directional wavelets), the problem is exacerbated by the need to rotate the wavelet and then still interpolate the wavelet's Fourier transform to the original non-rotated points of the DFT. Therefore I would still prefer pyramidal algorithms even if they offered no computational speed advantages over frequency-domain algorithms.
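To show where the interpolation problem enters, here is a sketch of the frequency-domain approach for the one easy case: a wavelet whose Fourier transform is known in closed form, again the Mexican hat. The function names are mine and normalisation constants are omitted; the comments mark the step that becomes problematic when no closed form is available.

```python
import numpy as np

def mexican_hat_ft(omega):
    """Closed-form Fourier transform of the Mexican hat wavelet,
    up to a constant factor: omega**2 * exp(-omega**2 / 2)."""
    return omega**2 * np.exp(-omega**2 / 2.0)

def cwt_fft(signal, scales):
    """CWT via the FFT: multiply the signal's DFT by the wavelet's
    Fourier transform sampled at the DFT frequencies, then invert.

    Because the wavelet's FT is known in closed form here, evaluating
    it at the DFT frequencies is exact.  When only a sampled wavelet
    is available -- or when an anisotropic wavelet must be rotated, in
    2-D -- its FT has to be *interpolated* to these frequencies, and
    that is the step with no satisfactory solution.
    """
    n = len(signal)
    sig_hat = np.fft.fft(signal)
    omega = 2 * np.pi * np.fft.fftfreq(n)  # DFT frequencies, rad/sample
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Dilation by s in space is contraction by s in frequency.
        psi_hat = np.sqrt(s) * mexican_hat_ft(s * omega)
        out[i] = np.fft.ifft(sig_hat * np.conj(psi_hat)).real
    return out

sig = np.sin(2 * np.pi * np.arange(512) / 64.0)
coeffs = cwt_fft(sig, scales=[2.0, 4.0, 8.0])
print(coeffs.shape)  # (3, 512)
```

Each scale costs one inverse FFT, which is why this approach is hard to beat on speed; the objection above is to its accuracy at the interpolation step, not its cost.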
My published paper is the first to show how an isotropic two-scale relation and its associated pyramidal algorithm can be used to implement the CWT for an anisotropic wavelet. The pyramidal algorithm can use the original Cartesian pixels with no need for rotation, and thereby overcomes a huge number of problems that can be caused by interpolation of a rotated wavelet to the original pixels.
The next steps in describing the body of work I need to publish on the CWT will be taken up in the next instalment of this blog.
Published: Saturday, 04 May 2013 04:06
IEEE Transactions on Signal Processing vol. 61 no. 1, pp. 28–37 (2013)
Fast FIR algorithms for the continuous wavelet transform from constrained least squares
George M. Leigh, Member, IEEE
Copyright © 2012 IEEE. Personal use of this material is permitted. However, permission to use this material for any other purposes must be obtained from the IEEE by sending a request to
This work is part of AMIRA International Project P843, supported by the Australian Research Council and mining industry participants Anglo Platinum, AngloGold Ashanti, Barrick Gold, BHP Billiton, Codelco, Newcrest Mining, Newmont Mining, OZ Minerals, Peñoles, Rio Tinto, Teck Cominco, Vale, Vale Inco, Xstrata Copper, Datamine Group, Geotek, Golder Associates, ioGlobal, and Metso Minerals.
G. M. Leigh conducted this work at the University of Queensland, Sustainable Minerals Institute, Julius Kruttschnitt Mineral Research Centre, Isles Rd, Indooroopilly, Qld 4068, Australia. He is currently with the Department of Agriculture, Fisheries and Forestry, PO Box 6097, St Lucia, Qld 4067, Australia (phone: +61-7-3255-4532; fax: +61-7-3346-2167; e-mail:
Abstract—New algorithms for the continuous wavelet transform are developed that are easy to apply, each consisting of a single-pass finite impulse response (FIR) filter, and several times faster than the fastest existing algorithms. The single-pass filter algorithm, named WT-FIR-1, is made possible by applying constraint equations to least-squares estimation of filter coefficients, which removes the need for separate low-pass and high-pass filters. Non-dyadic two-scale relations are developed and it is shown that filters based on them can work more efficiently than dyadic ones. Example applications to the Mexican hat wavelet are presented.
Index Terms—Algorithm design and analysis, Continuous wavelet transforms, Finite impulse response filter, Signal processing algorithms
Published: Saturday, 15 August 2015 06:30
The introduction of a weekly personal blog marks a change of direction for this site, to provide a greater frequency of content. Some of this content will necessarily not be of the standard that I originally envisaged for the site, but I think it is for the best. Genuine "mathematical discoveries" are hard to come by and less frequent than is desirable for a functional website.
To kick this off I am very lucky to be able to present the PhD thesis of my long-standing friend and colleague Bill Hearn (CSIRO retired), one of the great intuitive thinkers of fishery population dynamics. I was proud to shake Bill's hand in 1986 when he told me his thesis had been approved. One humorous memory is of when I was relaxing, walking to Hobart's Salamanca Market one Saturday, and saw Bill wave as he drove by to work on his thesis over the weekend.
Although it dates from 1986, Bill's thesis "Mathematical Methods of Evaluating Marine Fisheries" is still current and still contains a mine of material that has not been published in academic journals.
Bill in front of the Beerwah Hotel where we had lunch when he visited me in Brisbane in 2011.
Published: Friday, 03 May 2013 08:11
The mathematical academic journal system should function in the same way as this site, by accepting high quality work and de-emphasizing the rest. In practice, for a variety of reasons, this does not happen.
Major problem 1 is that large numbers of publications are required for people employed in research to advance their careers. The academic system accommodates this need, with the result that most academic articles provide no new knowledge.
Major problem 2 is that true mathematical research ability is a rare commodity, and there are more jobs involving mathematical research than there are capable researchers to fill them. Employees still fulfil their career need to publish papers, and many then become reviewers to decide on the merits of new articles. A harsh critic might say that the lunatics are running the asylum.
Major problem 3 is that many journal editors don't do their jobs properly. They reject articles for nonsensical reasons and fail to appreciate that referees either don't understand the work or have selfish interests in blocking it. That said, one must acknowledge that being a journal editor is a thankless task. Journal editors have to deal with large amounts of unoriginal, poorly written or plagiarized work, and then find qualified and willing reviewers for what remains. They also have to maintain a regular feed of articles, and maintain the journal's reputation, subscriptions and finances.
There are pleasant surprises. Editors accept articles despite negative referees' reports. And most referees are altruistic, genuinely reviewing work according to its perceived merits, even if it promotes research directions that are contrary to the referees' own work. Some submissions are even rejected due to lack of originality.
The major need is for quality rather than quantity of published articles. There is no substitute for taking the time to understand and appreciate good work. This extends to institutions rating their staff, job selection committees, funding agencies, and professional organizations, all of whom need to understand how rare it is to produce a research article of true, lasting quality. Unfortunately, the current trend is in the opposite direction: the academic world is obsessed with journals' "impact factors" and many funding agencies provide funds on a per-article basis. It must be acknowledged that employers understandably try to keep clear of researchers who produce something outstanding early in their careers but never do so again; a continued publication record offers a small degree of protection against that.
Published: Saturday, 05 September 2015 07:27
Chronic fatigue is pertinent to this site insofar as it has had a huge effect on my work and has been the major limitation to my maintaining the site properly. I am in the process of rectifying the site maintenance. Possibly a brief outline of my history with fatigue will help others who may have it.
I first noticed that I had chronic fatigue in late 2009 when I stood up after eating breakfast and immediately felt like I had been hit by a train. Doctors were particularly unhelpful in telling me that there was no such thing as chronic fatigue. They nevertheless referred me to all sorts of tests, none of which showed anything. I could stagger in to work and didn't so much have trouble in doing the work as I did in allocating my time and priorities. Allocating time and priorities sounds easy to fix but it was not. It was a crisis and I was very unproductive for the duration of it.
Immediate causes several months prior to the fatigue were physical stress (lack of sleep finishing my PhD), emotional stress (work-related dispute) and a very nasty and long-lasting virus. In late 2010 my parents both died and my work moved to a new building with chemical fumes that affected me very badly. In early 2011 my parents' cat whom I had adopted was killed crossing the road.
In mid-2011 I discussed my fatigue with my friend Marc, who used to practise as a naturopath. He suggested that gluten (more specifically gliadin, a protein component of gluten) was a major contributor to fatigue. Sure enough, he turned out to be correct and I began to improve after going off it. Another friend put me onto a water-soluble mineral supplement that worked wonders, albeit at a much smaller dose than the recommended one. There was still a long journey ahead but I was on the way to full health and energy levels. In early 2012 I relocated to another work building, forced by nosebleeds. I hadn't associated the building with my fatigue, but the fatigue lessened dramatically immediately after I moved. Many other foods also produced fatigue reactions. These were very difficult to diagnose because there was sometimes a time lag of up to two days between eating the food and getting the reaction. It turned out that in addition to gluten I was getting fatigue from many brewed liquids (soy sauce, even gluten free, beer and wine), most fruits and cashew nuts.
Four years after beginning the recovery I can claim to be fully recovered and to have more energy than I have ever had before. I actually had the fatigue all my life before it became chronic. My diet is less restrictive than one might fear. I can eat foods containing wheat flour if they are hard-baked (pizza, hard biscuits, pies, some bread). Somehow baking chemically changes the gluten in wheat and renders it safe for me to eat. I can eat some fruits (strawberries, stone fruits, persimmons, pomegranates and grapes) and drink English, New Zealand and Italian beers, along with a few beers from other countries: Yanjing (Chinese), Guinness (Irish) and Corona (Mexican). The few Australian beers that I can drink are very expensive and not as good as the imports. Despite the famed German purity laws there is no German beer that I can drink.
I will be a while picking up the pieces of what didn't get done while I had the fatigue. There are many friends and international colleagues I haven't contacted for years, and there's a huge backlog of work to write up. My major memory of my time with fatigue is that I can't remember it! It must have been so horrible that the mind has suppressed it and left me with a memory of what happened but no memory of what life was like during that time. I have been one of the lucky chronic fatigue sufferers in that I was still able to turn up to work. Still, it is not something that I would wish on anybody.
Bear with me and I will gradually clear the backlog of things that need to be done!