We describe a potential problem with the use of standardisation techniques that fit growth curves directly to measurement data. The existence of medium-frequency variance (considered here to represent timescales of decades to a century) in the common climate-related forcing of tree growth can bias the removal of supposed "non-climate" variance, distorting the external forcing signal in tree-growth chronologies. This distortion is most prevalent at the ends of chronologies, and the term "trend distortion" is used to describe it. The rationale for this discussion is the idea that the common forcing signal can be removed from multiple series of ring measurements so that it does not bias the chronology standardisation process. As a simple first attempt to mitigate the trend distortion problem, the common signal is divided into the original measurement data, constituting an empirical "signal-free" standardisation approach. This can reduce or remove the distortion in the expressed external forcing signal. However, as with previous data-adaptive curve-fitting approaches, the overall trend of the resulting chronology has an arbitrary slope after "signal-free" standardisation and must be adjusted. Hence, the use of signal-free methods limits the preservation of long-timescale variance to timescales no longer than the chronology itself. Trend distortion is described and demonstrated using simulated and measured ring-width series. Signal-free methods are developed and used to minimise trend distortion in chronologies produced using so-called "conservative" standardisation methods, applied here to ring-width measurements from northern Scandinavia and also to samples from Canada. Some of the limitations of traditional standardisation, and some of the potential benefits and limitations of using signal-free methods in conjunction with traditional standardisation, are presented and discussed.
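The iterative logic behind the empirical "signal-free" approach can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes a straight-line fit as a stand-in for a "conservative" growth curve, a complete (gap-free) matrix of aligned ring-width series, and a simple mean chronology. The function names (`fit_curve`, `signal_free_chronology`) and the fixed iteration count are hypothetical choices for this sketch. The key step is dividing the estimated common signal back into the raw measurements before refitting the growth curves, so that the common forcing does not bias the curve fits.

```python
import numpy as np

def fit_curve(series):
    """Fit a straight line to one series -- a stand-in here for a
    'conservative' growth curve (assumption of this sketch)."""
    t = np.arange(len(series))
    slope, intercept = np.polyfit(t, series, 1)
    return slope * t + intercept

def signal_free_chronology(measurements, n_iter=10):
    """Empirical 'signal-free' standardisation sketch.

    measurements: 2-D array (n_series, n_years), gap-free and aligned
    to the same calendar years (simplifying assumptions).
    Returns the mean-index chronology, normalised to mean 1.
    """
    n_series, n_years = measurements.shape
    sf = measurements.copy()              # "signal-free" measurements
    chron = np.ones(n_years)
    for _ in range(n_iter):
        # 1. Fit growth curves to the current signal-free measurements,
        #    so the common signal does not distort the fitted curves.
        curves = np.array([fit_curve(s) for s in sf])
        # 2. Compute indices from the ORIGINAL measurements and the
        #    refitted curves.
        indices = measurements / curves
        # 3. The chronology (common signal) is the mean index per year.
        chron = indices.mean(axis=0)
        chron /= chron.mean()             # normalise to mean 1
        # 4. Divide the common signal back into the raw measurements
        #    to update the signal-free measurements for the next pass.
        sf = measurements / chron
    return chron
```

With synthetic data built as (individual linear growth trend) × (common sinusoidal signal), the recovered chronology tracks the imposed common signal closely, because the linear curve fits no longer absorb part of that signal once it has been divided out of the measurements.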