Title: Approaches for improving image quality in magnetic induction tomography
Authors: Roula, Mohammed A.
Issue Date: 2-May-2012
Citation: Maimaitijiang, Y., Roula, M. A. and Kahlert, J. (2010) 'Approaches for improving image quality in magnetic induction tomography', Physiological Measurement, 31(8), pp. 147-156.
Abstract: Magnetic induction tomography (MIT) is a contactless and non-invasive method for imaging the passive electrical properties of objects. Measuring the weak signal produced by eddy currents within biological soft tissues can be challenging in the presence of noise and the large signals resulting from the direct excitation–detection coil coupling. To detect haemorrhagic stroke in the brain, for instance, high measurement accuracy is required to enable images with enough contrast to differentiate between normal and haemorrhaged brain tissues. The reconstructed images are often very sensitive to inevitable measurement noise from the environment, system instabilities and patient-related artefacts such as movement and sweating. We propose methods for mitigating signal noise and improving image reconstruction. We evaluated and compared the use of a range of wavelet transforms for signal denoising. Adaptive regularization methods including the L-curve, generalized cross validation (GCV) and noise estimation were also compared. We evaluated all of the described methods with measurements of in vitro tissues resembling a peripheral haemorrhagic cerebral stroke, created by placing a bio-membrane package filled with 10 ml of blood in a 100 ml swine brain. We show that wavelet packet denoising combined with adaptive regularization can improve the quality of reconstructed images.
Appears in Collections: Engineering and Technology
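The abstract describes denoising measured signals by thresholding wavelet coefficients before image reconstruction. The following is a minimal sketch of that general idea using a single-level Haar transform with soft thresholding; it is illustrative only, and the function names are hypothetical. The paper itself evaluates a range of wavelet families and wavelet-packet decompositions, not this simplified variant.

```python
# Sketch of wavelet-threshold denoising (single-level Haar basis).
# Hypothetical helper names; the paper uses richer wavelet packets.

def haar_forward(x):
    """One level of the orthonormal Haar transform: approximation + detail."""
    s = 2 ** -0.5
    approx = [(x[i] + x[i + 1]) * s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) * s for i in range(0, len(x), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one Haar level, reconstructing the signal exactly."""
    s = 2 ** -0.5
    x = []
    for a, d in zip(approx, detail):
        x.append((a + d) * s)
        x.append((a - d) * s)
    return x

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero; small (noise-like) ones vanish."""
    return [max(abs(c) - t, 0.0) * (1.0 if c >= 0 else -1.0) for c in coeffs]

def denoise(x, t):
    """Threshold the detail band only, keeping the coarse structure."""
    approx, detail = haar_forward(x)
    return haar_inverse(approx, soft_threshold(detail, t))
```

The design choice mirrors the abstract: noise spreads thinly across many small detail coefficients, while the signal of interest concentrates in a few large ones, so shrinking the small coefficients suppresses noise with little distortion of the underlying signal.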
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.